WorldWideScience

Sample records for comprehensive computational analysis

  1. Comprehensive analysis of a Radiology Operations Management computer system.

    Science.gov (United States)

    Arenson, R L; London, J W

    1979-11-01

    The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.

  2. Computed microtomography and X-ray fluorescence analysis for comprehensive analysis of structural changes in bone.

    Science.gov (United States)

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor

    2013-01-01

    This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computed microtomography and X-ray fluorescence analysis. We present the algorithms used for the reconstruction of tomographic images, which make it possible to work with the high-noise projections that are typical for samples of this nature. Owing to their ruggedness, small size, amniote status and a number of other valuable features, reptiles are an attractive model object for long-duration orbital experiments on unmanned spacecraft. Possible changes in their bone tissue under the influence of spaceflight are the subject of discussion among biologists from laboratories around the world.
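
    As a rough illustration of the reconstruction problem described above (this is not the authors' algorithm, whose details are given in the paper), the following sketch simulates noisy projections of a standard phantom and reconstructs them with filtered back-projection from scikit-image; the phantom, noise level and number of angles are arbitrary assumptions.

      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon

      image = shepp_logan_phantom()                        # synthetic test object
      angles = np.linspace(0.0, 180.0, 180, endpoint=False)
      sinogram = radon(image, theta=angles)                # simulated projections
      noisy = sinogram + np.random.default_rng(0).normal(0.0, 0.05 * sinogram.max(), sinogram.shape)
      reconstruction = iradon(noisy, theta=angles)         # filtered back-projection
      print("mean absolute error:", float(np.abs(reconstruction - image).mean()))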

  3. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    Science.gov (United States)

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
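
    Perseus itself is an interactive desktop platform rather than a code library, so the following Python sketch only illustrates one of the analysis steps listed above (a two-sample test across a protein abundance matrix followed by Benjamini-Hochberg multiple-hypothesis correction); the data are simulated and the thresholds are arbitrary.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      abund = rng.normal(25.0, 1.0, size=(1000, 6))   # log2 intensities: 1000 proteins x 6 samples
      abund[:50, 3:] += 2.5                            # spike in 50 "regulated" proteins
      group_a, group_b = abund[:, :3], abund[:, 3:]

      t, p = stats.ttest_ind(group_a, group_b, axis=1)

      # Benjamini-Hochberg adjustment for multiple-hypothesis testing
      order = np.argsort(p)
      ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
      q = np.minimum.accumulate(ranked[::-1])[::-1]
      q_values = np.empty_like(q)
      q_values[order] = q

      print("proteins with q < 0.05:", int((q_values < 0.05).sum()))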

  4. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  5. Analysis of a comprehensive quality assurance program with computer-enhanced monitors

    International Nuclear Information System (INIS)

    Arenson, R.L.; Mintz, M.C.; Goldstein, E.; Stevens, J.F.; Jovais, C.

    1987-01-01

    The authors' quality assurance (QA) program provides communication pathways among its constituent committees, which include patient care, professional review, medical staff, missed case, quality control, safety, and management committees. The QA monitors are based on data from these committees but also include data from the information management system, such as patient delays, contrast reactions, incidents, complications, time-flow analyses, film library retrieval, cancellations, missing reports, and missing clinical data. Committee data include complaints, missed diagnoses, patient identification problems, and equipment failure. The QA monitors have now been incorporated into summary reports as part of their computer networks. A systematic method for follow-up ensures corrective action and documentation. Examples of improved quality of care resulting from this approach include reductions in delays for report signature and in repeat films.

  6. Comprehensive Analysis of Trends and Emerging Technologies in All Types of Fuel Cells Based on a Computational Method

    Directory of Open Access Journals (Sweden)

    Takaya Ogawa

    2018-02-01

    Fuel cells have been attracting significant attention recently as highly efficient and eco-friendly energy generators. Here, we have comprehensively reviewed all types of fuel cells using computational analysis based on a citation network that detects emerging technologies objectively and provides interdisciplinary data to compare trends. This comparison shows that the technologies of solid oxide fuel cells (SOFCs) and of electrolytes in polymer electrolyte fuel cells (PEFCs) are at the mature stage, whereas those of biofuel cells (BFCs) and of catalysts in PEFCs are currently garnering attention. This does not mean, however, that the challenges of SOFCs and PEFC electrolytes have been overcome. SOFCs need to be operated at lower temperatures, approximately 500 °C. Electrolytes in PEFCs still suffer from a severe decrease in proton conductivity at low relative humidity and from their high cost. Catalysts in PEFCs are becoming attractive as a means to reduce the platinum catalyst cost. The emerging technologies in PEFC catalysts are mainly heteroatom-doped graphene/carbon nanotubes for metal-free catalysts and supports for iron- or cobalt-based catalysts. BFCs have also received attention for wastewater treatment and as miniaturized energy sources. Of particular interest in BFCs are membrane reactors in microbial fuel cells and membrane-less enzymatic biofuel cells.

  7. Molecular and Thermodynamic Properties of Zwitterions versus Ionic Liquids: A Comprehensive Computational Analysis to Develop Advanced Separation Processes.

    Science.gov (United States)

    Moreno, Daniel; Gonzalez-Miquel, Maria; Ferro, Victor R; Palomar, Jose

    2018-04-05

    Zwitterion ionic liquids (ZIs) are compounds in which both counterions are covalently tethered, conferring unique characteristics on them; however, most of their properties are still unknown, representing a bottleneck to exploiting their practical applications. Herein, the molecular and fluid properties of ZIs and their mixtures were explored by means of quantum chemical analysis based on density functional theory (DFT) and the COSMO-RS method, and compared against homologous ionic liquids (ILs) to provide a comprehensive overview of the effect of the distinct structures on their physicochemical and thermodynamic behavior. Overall, ZIs were revealed to be compounds with higher polarity and stronger hydrogen-bonding capacity, implying higher density, viscosity, and melting point, and even lower volatility than structurally similar ILs. The phase equilibrium of binary and ternary systems supports stronger attractive interactions between ZIs and polar compounds, whereas higher liquid-liquid immiscibility with nonpolar compounds may be expected. Ultimately, the performance of ZIs in the wider context of separation processes is illustrated, while providing molecular insights to allow their selection and design for relevant applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
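
    To make the contrast between the normalized and natural-frequency formats discussed above concrete, the short sketch below works through a hypothetical screening problem in both formats; the base rate, hit rate and false-alarm rate are invented for illustration and are not taken from the article.

      # Probability format vs. natural-frequency format for the same Bayesian problem
      base_rate = 0.01          # P(condition)
      sensitivity = 0.80        # P(positive | condition)
      false_alarm = 0.096       # P(positive | no condition)

      # Normalized format: Bayes' theorem with probabilities
      posterior = (sensitivity * base_rate) / (
          sensitivity * base_rate + false_alarm * (1 - base_rate))

      # Natural-frequency format: imagine 1000 people
      n = 1000
      affected_and_pos = base_rate * n * sensitivity            # 8 people
      healthy_and_pos = (1 - base_rate) * n * false_alarm       # about 95 people
      posterior_freq = affected_and_pos / (affected_and_pos + healthy_and_pos)

      print(round(posterior, 3), round(posterior_freq, 3))      # both ~0.078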

  9. Comprehensive computational design of ordered peptide macrocycles

    Science.gov (United States)

    Hosseinzadeh, Parisa; Bhardwaj, Gaurav; Mulligan, Vikram Khipple; Shortridge, Matthew D.; Craven, Timothy W.; Pardo-Avila, Fátima; Rettie, Stephen A.; Kim, David E.; Silva, Daniel-Adriano; Ibrahim, Yehia M.; Webb, Ian K.; Cort, John R.; Adkins, Joshua N.; Varani, Gabriele; Baker, David

    2018-01-01

    Mixed-chirality peptide macrocycles such as cyclosporine are among the most potent therapeutics identified to date, but there is currently no way to systematically search the structural space spanned by such compounds. Natural proteins do not provide a useful guide: Peptide macrocycles lack regular secondary structures and hydrophobic cores, and can contain local structures not accessible with L-amino acids. Here, we enumerate the stable structures that can be adopted by macrocyclic peptides composed of L- and D-amino acids by near-exhaustive backbone sampling followed by sequence design and energy landscape calculations. We identify more than 200 designs predicted to fold into single stable structures, many times more than the number of currently available unbound peptide macrocycle structures. Nuclear magnetic resonance structures of 9 of 12 designed 7- to 10-residue macrocycles, and three 11- to 14-residue bicyclic designs, are close to the computational models. Our results provide a nearly complete coverage of the rich space of structures possible for short peptide macrocycles and vastly increase the available starting scaffolds for both rational drug design and library selection methods. PMID:29242347

  10. Comprehensive computational design of ordered peptide macrocycles

    Energy Technology Data Exchange (ETDEWEB)

    Hosseinzadeh, Parisa; Bhardwaj, Gaurav; Mulligan, Vikram K.; Shortridge, Matthew D.; Craven, Timothy W.; Pardo-Avila, Fatima; Rettie, Stephan A.; Kim, David E.; Silva, Daniel A.; Ibrahim, Yehia M.; Webb, Ian K.; Cort, John R.; Adkins, Joshua N.; Varani, Gabriele; Baker, David

    2017-12-14

    Mixed-chirality peptide macrocycles such as cyclosporine are among the most potent therapeutics identified to date, but there is currently no way to systematically search through the structural space spanned by such compounds for new drug candidates. Natural proteins do not provide a useful guide: peptide macrocycles lack regular secondary structures and hydrophobic cores and have different backbone torsional constraints. Hence the development of new peptide macrocycles has been approached by modifying natural products or using library selection methods; the former is limited by the small number of known structures, and the latter by the limited size and diversity accessible through library-based methods. To overcome these limitations, here we enumerate the stable structures that can be adopted by macrocyclic peptides composed of L- and D-amino acids. We identify more than 200 designs predicted to fold into single stable structures, many times more than the number of currently available unbound peptide macrocycle structures. We synthesize and characterize by NMR twelve 7- to 10-residue macrocycles, 9 of which have structures very close to the design models in solution. NMR structures of three 11- to 14-residue bicyclic designs are also very close to the computational models. Our results provide a nearly complete coverage of the rich space of structures possible for short peptide-based macrocycles, unparalleled for other molecular systems, and vastly increase the available starting scaffolds for both rational drug design and library selection methods.

  12. Learning to reason about speakers' alternatives in sentence comprehension : A computational account

    NARCIS (Netherlands)

    Hendriks, Petra; van Rijn, Hedderik; Valkenier, Bea

    2007-01-01

    We present a computational simulation study of the acquisition of pronouns and reflexives. The computational simulation is based on an Optimality Theory analysis, and is shown to account for the well-known observation that in English and many other languages the correct comprehension of pronouns

  13. Monochromatic computed microtomography using laboratory and synchrotron sources and X-ray fluorescence analysis for comprehensive analysis of structural changes in bones

    Science.gov (United States)

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Gulimova, Victoriya; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Zolotov, Denis; Prun, Victor; Shaefer, Gerald; Asadchikov, Victor

    2015-01-01

    A combination of X-ray tomography at different wavelengths and X-ray fluorescence analysis was applied to the study of two types of bone tissue changes: prolonged presence in microgravity conditions and age-related bone growth. The proximal tail vertebrae of geckos were selected for investigation because they do not bear the supporting load in locomotion, which allows them to be considered an independent indicator of gravitational influence. For the gecko vertebrae, no significant differences in elemental composition were revealed between the flight samples and the synchronous control samples. In addition, gecko bone tissue samples from the jaw apparatus, spine and shoulder girdle were measured. The dynamics of structural changes during bone tissue growth were studied using samples of a human fetal hand. The hands of human fetuses of 11–15 weeks were studied. Autonomous zones of calcium accumulation were found not only in individual fingers but in each of the investigated phalanges. The results obtained are discussed. PMID:26089762

  14. Monochromatic computed microtomography using laboratory and synchrotron sources and X-ray fluorescence analysis for comprehensive analysis of structural changes in bones.

    Science.gov (United States)

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Gulimova, Victoriya; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Zolotov, Denis; Prun, Victor; Shaefer, Gerald; Asadchikov, Victor

    2015-06-01

    A combination of X-ray tomography at different wavelengths and X-ray fluorescence analysis was applied to the study of two types of bone tissue changes: prolonged presence in microgravity conditions and age-related bone growth. The proximal tail vertebrae of geckos were selected for investigation because they do not bear the supporting load in locomotion, which allows them to be considered an independent indicator of gravitational influence. For the gecko vertebrae, no significant differences in elemental composition were revealed between the flight samples and the synchronous control samples. In addition, gecko bone tissue samples from the jaw apparatus, spine and shoulder girdle were measured. The dynamics of structural changes during bone tissue growth were studied using samples of a human fetal hand. The hands of human fetuses of 11-15 weeks were studied. Autonomous zones of calcium accumulation were found not only in individual fingers but in each of the investigated phalanges. The results obtained are discussed.

  15. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  16. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    Science.gov (United States)

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
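
    As a small illustration of two of the simpler image-quality quantities named above (noise magnitude and contrast-to-noise ratio), the sketch below computes them from ROI pixel values using one common CNR definition; it is not part of the described framework, and the ROI values are simulated.

      import numpy as np

      def roi_stats(roi_hu):
          """Mean CT number and noise magnitude (standard deviation) of an ROI, in HU."""
          roi_hu = np.asarray(roi_hu, dtype=float)
          return roi_hu.mean(), roi_hu.std(ddof=1)

      def contrast_to_noise(lesion_roi, background_roi):
          """One common CNR definition: lesion-background contrast divided by background noise."""
          lesion_mean, _ = roi_stats(lesion_roi)
          bkg_mean, bkg_noise = roi_stats(background_roi)
          return abs(lesion_mean - bkg_mean) / bkg_noise

      rng = np.random.default_rng(1)
      background = rng.normal(40.0, 8.0, size=(50, 50))   # hypothetical background ROI, HU
      lesion = rng.normal(55.0, 8.0, size=(20, 20))       # hypothetical low-contrast lesion ROI, HU
      print("noise magnitude:", round(roi_stats(background)[1], 2))
      print("CNR:", round(contrast_to_noise(lesion, background), 2))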

  17. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  18. Comprehensive analysis of transport aircraft flight performance

    Science.gov (United States)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance

  19. Incorporating Computers into Classroom: Effects on Learners’ Reading Comprehension in EFL Context

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansarin

    2017-10-01

    Owing to the importance of computer-assisted reading and considering the prominent role of learners in this respect, the present study investigated (1) the effects of the computer as a supplemental tool to support and improve Iranian EFL learners' reading comprehension in comparison with equivalent non-technological, traditional print-based treatments, and (2) EFL learners' attitudes and perceptions towards the computer-assisted reading course. To this purpose, 111 randomly selected EFL learners participated in the study. The subjects were divided into a control and an experimental group. Both groups received 10 reading lessons, either through computers or through an instructor-led method. The statistical analysis revealed no significant difference between the learners who had access to reading supports on a computer screen and their counterparts in the traditional reading classes. Learners were also allowed to express their ideas on a 5-point Likert scale. The purpose of the attitude questionnaire was to find out more about the participants and their experiences with computer-assisted reading. Results of the attitude questionnaire supported the conclusion that computers may enhance EFL learners' motivation and interest towards learning, but they do not enhance comprehension. The findings of this study support the view that technology should supplement, not supplant, teachers and that people read less accurately and less comprehensively on screens than on paper.

  20. COMAN: a web server for comprehensive metatranscriptomics analysis.

    Science.gov (United States)

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  1. Applicability of computer-aided comprehensive tool (LINDA: LINeament Detection and Analysis) and shaded digital elevation model for characterizing and interpreting morphotectonic features from lineaments

    Science.gov (United States)

    Masoud, Alaa; Koike, Katsuaki

    2017-09-01

    Detection and analysis of linear features related to surface and subsurface structures have been deemed necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing the control parameters required by conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with a graphical user interface in Visual Basic. This tool automates the detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows forms a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEM and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is presented to demonstrate the capability of LINDA and the correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both agree with the location and direction of V-shaped valleys, dykes and reference fault data. Through the case studies, LINDA's applicability in highlighting dominant structural trends is demonstrated, which can aid understanding of geodynamic frameworks in any region.
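
    The snippet below is a generic stand-in for the kind of automated linear-feature detection described above (edge detection plus a probabilistic Hough transform on a synthetic grid), not LINDA's actual segment-grouping algorithm; the grid, parameters and azimuth convention are arbitrary assumptions.

      import numpy as np
      from skimage.feature import canny
      from skimage.transform import probabilistic_hough_line

      # Synthetic shaded-relief grid with one bright diagonal "lineament" plus noise
      shaded = np.zeros((256, 256))
      idx = np.arange(256)
      shaded[idx, idx] = 1.0
      shaded += 0.1 * np.random.default_rng(2).random((256, 256))

      edges = canny(shaded, sigma=2.0)                         # edge pixels
      segments = probabilistic_hough_line(edges, threshold=10,
                                          line_length=25, line_gap=3)

      # Rough strike (azimuth, degrees) of each detected segment, for trend analysis
      for (x0, y0), (x1, y1) in segments[:5]:
          azimuth = np.degrees(np.arctan2(x1 - x0, y0 - y1)) % 180
          print(f"segment azimuth: {azimuth:.1f} deg")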

  2. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today … on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  3. Comprehensive proteomic analysis of human pancreatic juice

    DEFF Research Database (Denmark)

    Grønborg, Mads; Bunkenborg, Jakob; Kristiansen, Troels Zakarias

    2004-01-01

    Proteomic technologies provide an excellent means for analysis of body fluids for cataloging protein constituents and identifying biomarkers for early detection of cancers. The biomarkers currently available for pancreatic cancer, such as CA19-9, lack adequate sensitivity and specificity … contributing to late diagnosis of this deadly disease. In this study, we carried out a comprehensive characterization of the "pancreatic juice proteome" in patients with pancreatic adenocarcinoma. Pancreatic juice was first fractionated by 1-dimensional gel electrophoresis and subsequently analyzed by liquid … in this study could be directly assessed for their potential as biomarkers for pancreatic cancer by quantitative proteomics methods or immunoassays.

  4. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  5. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    Science.gov (United States)

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based foreign language listening comprehension tests designed in MOODLE, a dynamic e-learning environment, with or without immediate feedback, together with the State-Trait Anxiety Inventory (STAI), were administered and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners increase attention in foreign language listening comprehension.

  6. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  7. Real analysis a comprehensive course in analysis, part 1

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 1 is devoted to real analysis. From one point of view, it presents the infinitesimal calculus of the twentieth century with the ultimate integral calculus (measure theory)

  8. Harmonic analysis a comprehensive course in analysis, part 3

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 3 returns to the themes of Part 1 by discussing pointwise limits (going beyond the usual focus on the Hardy-Littlewood maximal function by including ergodic theorems and m

  9. A Comprehensive Analysis of Marketing Journal Rankings

    Science.gov (United States)

    Steward, Michelle D.; Lewis, Bruce R.

    2010-01-01

    The purpose of this study is to offer a comprehensive assessment of journal standings in Marketing from two perspectives. The discipline perspective of rankings is obtained from a collection of published journal ranking studies during the past 15 years. The studies in the published ranking stream are assessed for reliability by examining internal…

  10. The Comprehension Problems of Children with Poor Reading Comprehension despite Adequate Decoding: A Meta-Analysis.

    Science.gov (United States)

    Spencer, Mercedes; Wagner, Richard K

    2018-06-01

    The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language (vocabulary, listening comprehension, storytelling ability, and semantic and syntactic knowledge). Results indicated that children with SCD had deficits in oral language (d = -0.78, 95% CI [-0.89, -0.68]), but these deficits were not as severe as their deficit in reading comprehension (d = -2.78, 95% CI [-3.01, -2.54]). When compared to reading comprehension age-matched normal readers, the oral language skills of the two groups were comparable (d = 0.32, 95% CI [-0.49, 1.14]), which suggests that the oral language weaknesses of children with SCD represent a developmental delay rather than developmental deviance. Theoretical and practical implications of these findings are discussed.
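
    For readers unfamiliar with the effect-size notation used above, the snippet below computes a standardized mean difference (Cohen's d) and a large-sample approximate confidence interval from made-up group summaries; it does not reproduce the meta-analytic aggregation used in the study.

      import math

      def cohens_d_with_ci(mean1, sd1, n1, mean2, sd2, n2, z=1.96):
          """Standardized mean difference (Cohen's d) and an approximate 95% CI."""
          pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
          d = (mean1 - mean2) / pooled_sd
          se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))  # large-sample approximation
          return d, (d - z * se, d + z * se)

      # Hypothetical group summaries: children with SCD vs. typical readers on an oral language measure
      d, ci = cohens_d_with_ci(mean1=92.0, sd1=12.0, n1=40, mean2=101.5, sd2=12.5, n2=40)
      print(f"d = {d:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")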

  11. Analysis of Comprehensive Utilization of Coconut Waste

    OpenAIRE

    Zheng, Kan; Liang, Dong; Zhang, Xirui

    2013-01-01

    This paper describes and analyzes the coconut cultivation in China, and the current comprehensive utilization of waste resources generated during cultivation and processing of coconut. The wastes generated in the process of cultivation include old coconut tree trunk, roots, withered coconut leaves, coconut flower and fallen cracking coconut, mainly used for biogas extraction, direct combustion and power generation, brewing, pharmacy, and processing of building materials; the wastes generated ...

  12. A comprehensive computing initiative for MFE. Revision 1

    International Nuclear Information System (INIS)

    Cohen, R.H.; Crotinger, J.A.; Baldwin, D.E.

    1996-01-01

    The authors propose that a national initiative be launched to develop a comprehensive simulation facility for MFE. The facility would consist of physics codes developed by the national MFE community, tightly but flexibly coupled through a programmable shell, enabling effectively simultaneous solution of the models in the various codes. The word "facility" is chosen to convey the notion that this is where one would go to conduct numerical experiments, using a full set of modules to describe an entire device, a coupled subset to describe particular aspects of a device, or a combination of the facility's modules plus the user's own physics.

  13. On the Computation of Comprehensive Boolean Gröbner Bases

    Science.gov (United States)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal I in a Boolean polynomial ring B(Ā, X̄) with main variables X̄ and parameters Ā can be obtained by simply computing a usual Boolean Gröbner basis of I, regarding both X̄ and Ā as variables, with a certain block term order such that X̄ ≫ Ā. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field GF(2), enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over GF(2). Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
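
    As a toy illustration (not the authors' Risa/Asir implementation), the sketch below computes a Gröbner basis over GF(2) with SymPy, adding the field equations v**2 + v to enforce Booleanness and using lex order with the main variables listed first as a simple stand-in for a block order with X̄ ≫ Ā; the ideal itself is made up.

      from sympy import symbols, groebner

      # Main variables x, y and parameter a; field equations v**2 + v force v**2 = v over GF(2).
      x, y, a = symbols('x y a')
      ideal = [x*y + a*x + 1, y + a]
      field_eqs = [x**2 + x, y**2 + y, a**2 + a]

      # lex order with the main variables listed first plays the role of a block order x, y >> a.
      G = groebner(ideal + field_eqs, x, y, a, order='lex', modulus=2)
      for g in G.exprs:
          print(g)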

  14. Comprehensive data analysis of human ureter proteome

    Directory of Open Access Journals (Sweden)

    Sameh Magdeldin

    2016-03-01

    A comprehensive human ureter proteome dataset was generated from OFFGel-fractionated ureter samples. Our results showed that among 2217 non-redundant ureter proteins, 751 protein candidates (33.8%) were detected in urine as urinary proteins/polypeptides or exosomal proteins. On the other hand, comparing the ureter protein hits (48) that are not present in the corresponding databases against the urinary bladder and prostate Human Protein Atlas databases pinpointed 21 proteins that might be unique to ureter tissue. In conclusion, this finding offers future perspectives for possible identification of ureter disease-associated biomarkers such as ureter carcinoma. In addition, Cytoscape GO annotation was examined on the final ureter dataset to better understand the proteins' molecular functions, biological processes, and cellular components. The ureter proteomic dataset published in this article will provide a valuable resource for researchers working in the field of urology and urine biomarker discovery.

  15. Comprehensive adaptive mesh refinement in wrinkling prediction analysis

    NARCIS (Netherlands)

    Selman, A.; Meinders, Vincent T.; Huetink, Han; van den Boogaard, Antonius H.

    2002-01-01

    A discretisation error indicator, a contact-free wrinkling indicator and a wrinkling-with-contact indicator are, in a challenging task, brought together and used in a comprehensive approach to wrinkling prediction analysis in thin sheet metal forming processes.

  16. Comprehensive analysis of information dissemination in disasters

    Science.gov (United States)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided if an effective pre-warning system had been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) in disasters in urban areas, plays a critical role in improving response time and reducing the number of deaths and economic losses. In this paper we have developed a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. This model can also be used to disseminate information to evacuees making real-time evacuation plans. We analyzed the information dissemination models for pre-warning in disasters by considering 14 media: short message service (SMS), phone, television, radio, news portals, WeChat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  17. Comprehensive metabolic panel

    Science.gov (United States)

    Metabolic panel - comprehensive; Chem-20; SMA20; Sequential multi-channel analysis with computer-20; SMAC20; Metabolic panel 20 ... Chernecky CC, Berger BJ. Comprehensive metabolic panel (CMP) - blood. In: ... Tests and Diagnostic Procedures. 6th ed. St Louis, MO: ...

  18. Green-Ampt approximations: A comprehensive analysis

    Science.gov (United States)

    Ali, Shakir; Islam, Adlul; Mishra, P. K.; Sikka, Alok K.

    2016-04-01

    Green-Ampt (GA) model and its modifications are widely used for simulating infiltration process. Several explicit approximate solutions to the implicit GA model have been developed with varying degree of accuracy. In this study, performance of nine explicit approximations to the GA model is compared with the implicit GA model using the published data for broad range of soil classes and infiltration time. The explicit GA models considered are Li et al. (1976) (LI), Stone et al. (1994) (ST), Salvucci and Entekhabi (1994) (SE), Parlange et al. (2002) (PA), Barry et al. (2005) (BA), Swamee et al. (2012) (SW), Ali et al. (2013) (AL), Almedeij and Esen (2014) (AE), and Vatankhah (2015) (VA). Six statistical indicators (e.g., percent relative error, maximum absolute percent relative error, average absolute percent relative errors, percent bias, index of agreement, and Nash-Sutcliffe efficiency) and relative computer computation time are used for assessing the model performance. Models are ranked based on the overall performance index (OPI). The BA model is found to be the most accurate followed by the PA and VA models for variety of soil classes and infiltration periods. The AE, SW, SE, and LI model also performed comparatively better. Based on the overall performance index, the explicit models are ranked as BA > PA > VA > LI > AE > SE > SW > ST > AL. Results of this study will be helpful in selection of accurate and simple explicit approximate GA models for solving variety of hydrological problems.
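
    For reference, the implicit Green-Ampt relation that the explicit models listed above approximate can be written as Ks·t = F − ψΔθ·ln(1 + F/(ψΔθ)) and solved numerically for the cumulative infiltration F; the sketch below does this with a root finder, using made-up loam-like parameters. The explicit approximations themselves replace this root-finding with closed-form expressions and are not reproduced here.

      import math
      from scipy.optimize import brentq

      def green_ampt_F(t, Ks, psi, d_theta):
          """Cumulative infiltration F(t) from the implicit Green-Ampt equation
          Ks*t = F - psi*d_theta*ln(1 + F/(psi*d_theta)), solved numerically."""
          def residual(F):
              return F - psi * d_theta * math.log(1.0 + F / (psi * d_theta)) - Ks * t
          return brentq(residual, 1e-9, 1e3)   # F bracketed between ~0 and a large value

      # Hypothetical loam-like parameters: Ks (cm/h), wetting-front suction psi (cm), moisture deficit
      Ks, psi, d_theta = 1.0, 11.0, 0.3
      for t in (0.5, 1.0, 2.0):
          F = green_ampt_F(t, Ks, psi, d_theta)
          rate = Ks * (1.0 + psi * d_theta / F)   # instantaneous infiltration rate
          print(f"t = {t:.1f} h: F = {F:.2f} cm, f = {rate:.2f} cm/h")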

  19. Comprehensive analysis of a medication dosing error related to CPOE.

    Science.gov (United States)

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  20. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the hardware considerations that influence the choice of programming language and of the methods of analysis and compilation. In the first part, the author proposes attempts at the formalization of Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler, or analytic grammar, for the Fortran language.

  1. Comprehensive cluster analysis with Transitivity Clustering.

    Science.gov (United States)

    Wittkop, Tobias; Emig, Dorothea; Truss, Anke; Albrecht, Mario; Böcker, Sebastian; Baumbach, Jan

    2011-03-01

    Transitivity Clustering is a method for partitioning biological data into groups of similar objects, such as genes. It provides integrated access to various functions addressing each step of a typical cluster analysis. To facilitate this, Transitivity Clustering is accessible online and offers three user-friendly interfaces: a powerful stand-alone version, a web interface, and a collection of Cytoscape plug-ins. In this paper, we describe three major workflows: (i) protein (super)family detection with Cytoscape, (ii) protein homology detection with incomplete gold standards and (iii) clustering of gene expression data. This protocol guides the user through the most important features of Transitivity Clustering and takes ∼1 h to complete.
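
    Transitivity Clustering itself solves a weighted transitive graph projection (cluster editing) problem; the sketch below is only a naive stand-in that thresholds a pairwise similarity matrix and reports connected components, to illustrate the kind of input and output involved. The similarity values and cutoff are invented.

      import numpy as np
      from scipy.sparse import csr_matrix
      from scipy.sparse.csgraph import connected_components

      # Hypothetical pairwise similarity matrix for 6 objects (e.g. sequence similarity scores)
      sim = np.array([
          [0, 9, 8, 1, 0, 0],
          [9, 0, 7, 0, 1, 0],
          [8, 7, 0, 0, 0, 1],
          [1, 0, 0, 0, 9, 8],
          [0, 1, 0, 9, 0, 9],
          [0, 0, 1, 8, 9, 0],
      ], dtype=float)

      threshold = 5.0                              # user-chosen similarity cutoff
      adjacency = csr_matrix(sim >= threshold)     # keep only "similar enough" pairs
      n_clusters, labels = connected_components(adjacency, directed=False)
      print(n_clusters, labels)                    # 2 clusters: [0 0 0 1 1 1]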

  2. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing these last two components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks and switch architectures and buffering strategies. Provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for ana...
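
    As a minimal illustration of the discrete-time Markov chain analysis mentioned above, the sketch below computes the stationary distribution of a small, made-up transition matrix by solving the left-eigenvector problem; it is not taken from the book.

      import numpy as np

      # Hypothetical 3-state discrete-time Markov chain (e.g. a buffer that is empty/partially full/full)
      P = np.array([
          [0.6, 0.4, 0.0],
          [0.2, 0.5, 0.3],
          [0.0, 0.4, 0.6],
      ])

      # The stationary distribution pi satisfies pi P = pi with sum(pi) = 1:
      # take the left eigenvector for eigenvalue 1 and normalize it.
      eigvals, eigvecs = np.linalg.eig(P.T)
      pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
      pi = pi / pi.sum()
      print("stationary distribution:", np.round(pi, 4))
      print("check pi @ P == pi:", np.allclose(pi @ P, pi))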

  4. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    Science.gov (United States)

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  5. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  6. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Background: Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results: We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
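
    CGHPRO itself is a Java desktop application, so the sketch below is only a schematic illustration of the first steps such array-CGH tools perform (per-clone log2 test/reference ratios followed by global median normalization and naive gain/loss thresholds); the intensities and thresholds are invented, and real breakpoint detection would use Circular Binary Segmentation or an HMM as described above.

      import numpy as np

      rng = np.random.default_rng(3)
      reference = rng.normal(1000.0, 50.0, size=200)          # reference-channel intensities per clone
      test = reference * rng.normal(1.0, 0.05, size=200)      # test channel, mostly balanced
      test[60:80] *= 1.5                                       # simulated single-copy gain

      log2_ratio = np.log2(test / reference)
      log2_ratio -= np.median(log2_ratio)                      # global median normalization

      gain = log2_ratio > 0.3                                  # naive per-clone calling thresholds
      loss = log2_ratio < -0.3
      print("clones called gained:", int(gain.sum()), "lost:", int(loss.sum()))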

  7. Comprehensive physical analysis of bond wire interfaces in power modules

    DEFF Research Database (Denmark)

    Popok, Vladimir; Pedersen, Kristian Bonderup; Kristensen, Peter Kjær

    2016-01-01

    causing failures. In this paper we present a review on the set of our experimental and theoretical studies allowing comprehensive physical analysis of changes in materials under active power cycling with focus on bond wire interfaces and thin metallisation layers. The developed electro-thermal and thermo...

  8. Identification Of Protein Vaccine Candidates Using Comprehensive Proteomic Analysis Strategies

    Science.gov (United States)

    2007-12-01

    Doctoral dissertation by James G. Rohrbough on the identification of protein vaccine candidates using comprehensive proteomic analysis strategies; the indexed excerpt contains only acknowledgement and reference-list fragments rather than an abstract.

  9. Comprehensive School Reform and Achievement: A Meta-Analysis

    Science.gov (United States)

    Borman, Geoffrey D.; Hewes, Gina M.; Overman, Laura T.; Brown, Shelly

    2003-01-01

    This meta-analysis reviews research on the achievement effects of comprehensive school reform (CSR) and summarizes the specific effects of 29 widely implemented models. There are limitations on the overall quantity and quality of the research base, but the overall effects of CSR appear promising. The combined quantity, quality, and statistical…

  10. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing >10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
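
    The ~10^12 advantage quoted above can be checked directly from the figures in the abstract. The short worked computation below uses only those quoted numbers; the result is approximate.

```python
# Figures quoted in the abstract (approximate).
brain_ops, brain_watts, brain_cm3 = 1e16, 20.0, 1200.0
super_ops, super_watts, super_cm3 = 1e15, 3e6, 1500.0 * 1e6   # 1500 m^3 -> cm^3

brain_eff = brain_ops / (brain_watts * brain_cm3)   # ops/s per W per cm^3
super_eff = super_ops / (super_watts * super_cm3)

print(f"brain : {brain_eff:.2e} ops/s/W/cm^3")
print(f"HPC   : {super_eff:.2e} ops/s/W/cm^3")
print(f"ratio : {brain_eff / super_eff:.1e}")       # ~2e12, i.e. the ~10^12 advantage
```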

  11. Are There Gender Differences in Emotion Comprehension? Analysis of the Test of Emotion Comprehension.

    Science.gov (United States)

    Fidalgo, Angel M; Tenenbaum, Harriet R; Aznar, Ana

    2018-01-01

    This article examines whether there are gender differences in understanding the emotions evaluated by the Test of Emotion Comprehension (TEC). The TEC provides a global index of emotion comprehension in children 3-11 years of age, which is the sum of the nine components that constitute emotion comprehension: (1) recognition of facial expressions, (2) understanding of external causes of emotions, (3) understanding of desire-based emotions, (4) understanding of belief-based emotions, (5) understanding of the influence of a reminder on present emotional states, (6) understanding of the possibility to regulate emotional states, (7) understanding of the possibility of hiding emotional states, (8) understanding of mixed emotions, and (9) understanding of moral emotions. We used the answers to the TEC given by 172 English girls and 181 boys from 3 to 8 years of age. First, the nine components into which the TEC is subdivided were analysed for differential item functioning (DIF), taking gender as the grouping variable. To evaluate DIF, the Mantel-Haenszel method and logistic regression analysis were used applying the Educational Testing Service DIF classification criteria. The results show that the TEC did not display gender DIF. Second, when absence of DIF had been corroborated, it was analysed for differences between boys and girls in the total TEC score and its components controlling for age. Our data are compatible with the hypothesis of independence between gender and level of comprehension in 8 of the 9 components of the TEC. Several hypotheses are discussed that could explain the differences found between boys and girls in the belief component. Given that the Belief component is basically a false belief task, the differences found seem to support findings in the literature indicating that girls perform better on this task.
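
    To illustrate the Mantel-Haenszel step of the DIF analysis described above, the following is a minimal sketch with hand-rolled arithmetic and hypothetical counts; it computes the common odds ratio across age strata for one dichotomous TEC component with gender as the grouping variable. The Educational Testing Service classification thresholds and the logistic-regression check used in the study are not reproduced here.

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds ratio across strata.

    Each stratum is a 2x2 table ((a, b), (c, d)):
    rows = reference/focal group (e.g. boys/girls),
    cols = item passed / failed.
    """
    num = sum(a * d / (a + b + c + d) for (a, b), (c, d) in tables)
    den = sum(b * c / (a + b + c + d) for (a, b), (c, d) in tables)
    return num / den

# hypothetical counts for one TEC component, stratified by age group
strata = [
    ((40, 10), (38, 12)),   # age 4
    ((45,  8), (44,  9)),   # age 6
    ((50,  5), (49,  6)),   # age 8
]
print(round(mantel_haenszel_or(strata), 2))   # a value near 1 suggests no gender DIF
```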

  12. Computerized comprehensive data analysis of Lung Imaging Database Consortium (LIDC)

    International Nuclear Information System (INIS)

    Tan Jun; Pu Jiantao; Zheng Bin; Wang Xingwei; Leader, Joseph K.

    2010-01-01

    Purpose: The Lung Image Database Consortium (LIDC) is the largest public CT image database of lung nodules. In this study, the authors present a comprehensive and up-to-date analysis of this dynamically growing database with the help of a computerized tool, aiming to assist researchers in optimally using this database for lung cancer related investigations. Methods: The authors developed a computer scheme to automatically match the nodule outlines marked manually by radiologists on CT images. A large variety of characteristics regarding the annotated nodules in the database, including volume, spiculation level, elongation, interobserver variability, as well as the intersection of delineated nodule voxels and the overlapping ratio between the same nodules marked by different radiologists, are automatically calculated and summarized. The scheme was applied to analyze all 157 examinations with complete annotation data currently available in the LIDC dataset. Results: The scheme summarizes the statistical distributions of the abovementioned geometric and diagnosis features. Among the 391 nodules, (1) 365 (93.35%) have principal axis length ≤20 mm; (2) 120, 75, 76, and 120 were marked by one, two, three, and four radiologists, respectively; and (3) 122 (32.48%) have maximum volume overlapping ratios ≥80% for the delineations of two radiologists, while 198 (50.64%) have maximum volume overlapping ratios <60%. The results also showed that 72.89% of the nodules were assessed with a malignancy score between 2 and 4, and only 7.93% of these nodules were considered severely malignant (malignancy ≥4). Conclusions: This study demonstrates that LIDC contains examinations covering a diverse distribution of nodule characteristics and that it can be a useful resource for assessing the performance of nodule detection and/or segmentation schemes.
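
    A minimal sketch of a volume overlapping ratio between two radiologists' delineations of the same nodule is shown below, treating each delineation as a set of voxel coordinates. This uses a Jaccard-style definition for illustration; the exact normalisation used in the LIDC analysis may differ, and the voxel sets are hypothetical.

```python
def overlap_ratio(voxels_a, voxels_b):
    """Volume overlapping ratio of two nodule delineations.

    Defined here as |A ∩ B| / |A ∪ B|; the LIDC analysis may normalise
    differently (e.g. by the smaller delineation).
    """
    a, b = set(voxels_a), set(voxels_b)
    return len(a & b) / len(a | b)

# toy delineations as (slice, row, col) voxel indices from two readers
reader1 = {(0, 10, 10), (0, 10, 11), (0, 11, 10), (0, 11, 11)}
reader2 = {(0, 10, 11), (0, 11, 11), (0, 11, 12)}
print(round(overlap_ratio(reader1, reader2), 2))   # 0.4
```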

  13. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample given, followed by a discussion of the present status and future development plans

  14. Comprehensive automatic assessment of retinal vascular abnormalities for computer-assisted retinopathy grading.

    Science.gov (United States)

    Joshi, Vinayak; Agurto, Carla; VanNess, Richard; Nemeth, Sheila; Soliz, Peter; Barriga, Simon

    2014-01-01

    One of the most important signs of systemic disease that presents on the retina is vascular abnormality, such as in hypertensive retinopathy. Manual analysis of fundus images by human readers is qualitative and lacks accuracy, consistency and repeatability. Present semi-automatic methods for vascular evaluation are reported to increase accuracy and reduce reader variability, but they require extensive reader interaction, thus limiting the software-aided efficiency. Automation thus holds a twofold promise: first, to decrease variability while increasing accuracy, and second, to increase efficiency. In this paper we propose fully automated software as a second-reader system for comprehensive assessment of the retinal vasculature, which aids readers in the quantitative characterization of vessel abnormalities in fundus images. This system provides the reader with objective measures of vascular morphology such as tortuosity and branching angles, as well as highlights of areas with abnormalities such as artery-venous nicking, copper and silver wiring, and retinal emboli, in order for the reader to make a final screening decision. To test the efficacy of our system, we evaluated the change in performance of a newly certified retinal reader when grading a set of 40 color fundus images with and without the assistance of the software. The results demonstrated an improvement in the reader's performance with the software assistance, in terms of accuracy of detection of vessel abnormalities, determination of retinopathy, and reading time. This system enables the reader to make computer-assisted vasculature assessments with high accuracy and consistency, at a reduced reading time.

  15. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    Science.gov (United States)

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  16. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words systems analysis are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  17. Operator theory a comprehensive course in analysis, part 4

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 4 focuses on operator theory, especially on a Hilbert space. Central topics are the spectral theorem, the theory of trace class and Fredholm determinants, and the study of

  18. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  19. Using miscue analysis to assess comprehension in deaf college readers.

    Science.gov (United States)

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  20. The Comprehension Problems for Second-Language Learners with Poor Reading Comprehension Despite Adequate Decoding: A Meta-Analysis

    Science.gov (United States)

    Spencer, Mercedes; Wagner, Richard K.

    2017-01-01

    We conducted a meta-analysis of 16 existing studies to examine the nature of the comprehension problems for children who were second-language learners with poor reading comprehension despite adequate decoding. Results indicated that these children had deficits in oral language (d = -0.80), but these deficits were not as severe as their reading…

  1. Comprehensive Analysis Competence and Innovative Approaches for Sustainable Chemical Production.

    Science.gov (United States)

    Appel, Joerg; Colombo, Corrado; Dätwyler, Urs; Chen, Yun; Kerimoglu, Nimet

    2016-01-01

    Humanity currently sees itself facing enormous economic, ecological, and social challenges. Sustainable products and production in specialty chemistry are an important strategic element to address these megatrends. In addition, digitalization and global connectivity will create new opportunities for the industry. One such aspect is examined in this paper: the development of comprehensive analysis of production networks for more sustainable production, an area in which the need for innovative solutions arises. Examples from data analysis, advanced process control and automated performance monitoring are shown. These efforts have a significant impact on improved yields, reduced energy and water consumption, and better product performance in the application of the products.

  2. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, avoiding the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, a functionality rarely available in similar programs. Given its characteristics of quick prototyping and simplicity when incorporating correlated routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
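
    As an illustration of the semivariogram tool mentioned above, the following is a minimal sketch of a classical omnidirectional experimental semivariogram, assuming numpy is available. The function and the toy dataset are hypothetical; directional and cross-semivariograms, kriging and jackknifing (all provided by the described program) are not reproduced here.

```python
import numpy as np

def experimental_semivariogram(coords, values, lags, tol):
    """Classical omnidirectional semivariogram estimator.

    gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2 over pairs whose separation
    distance falls within lag +/- tol.  Directional and cross-semivariograms
    add an angle filter and a second variable, respectively.
    """
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    gammas = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)   # each pair counted once
        n = mask.sum()
        gammas.append(sq[mask].sum() / (2 * n) if n else np.nan)
    return np.array(gammas)

# toy transect of four soil samples
coords = [(0, 0), (1, 0), (2, 0), (3, 0)]
values = [1.0, 1.2, 1.9, 2.5]
print(experimental_semivariogram(coords, values, lags=[1, 2, 3], tol=0.5))
```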

  3. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  4. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  5. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, the historical perspective and recent advances are reviewed regarding computational technologies to evaluate the transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began when the program of SIMMER-series computer code development was initiated in the late 1970s in the USA. Successful applications of the latest SIMMER-II in the USA, western Europe and Japan have proved its effectiveness but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II applications through the 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)

  6. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants are among the most important tasks of process control. Research in this field is very intensive due to increasing requirements for the security and economy of process control and due to a remarkable increase in the efficiency of digital electronics. This publication concerns the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concepts of diagnosis and therapy control, modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)

  7. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  8. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Science.gov (United States)

    Ma, Jianhua

    2018-01-01

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., into automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., the physiological data, and the Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject’s persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual’s facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model. PMID:29495343

  9. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.

    Science.gov (United States)

    Guo, Ao; Ma, Jianhua

    2018-02-25

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., into automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., the physiological data, and the Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  10. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Directory of Open Access Journals (Sweden)

    Ao Guo

    2018-02-01

    Full Text Available A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., into automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., the physiological data, and the Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject’s persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual’s facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  11. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments, which more recently has been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analysis is that the organizational aspects of the courses are found to be related to the way in which the input elements for the teaching and learning process are constructed.

  12. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
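
    A minimal sketch of the solution-vector idea described above is shown below: components are ordered by their inlet dependencies and updated in that order at each time step. The component names, the toy update rule and the use of Python's graphlib are illustrative assumptions; PCTAP itself is C++ and its outlet functions are far more detailed.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# hypothetical thermal-fluid network: each component lists the components
# feeding its inlet (its upstream dependencies)
inlet_deps = {
    "pump":       [],
    "cold_plate": ["pump"],
    "heat_exch":  ["cold_plate"],
    "tank":       ["heat_exch"],
}

# build the "solution vector": components ordered by inlet dependency
solution_vector = list(TopologicalSorter(inlet_deps).static_order())

state = {name: {"T_out": 20.0} for name in solution_vector}   # degrees C

def update(name, inlet_temps, dt):
    """Toy update rule standing in for each component's outlet function."""
    t_in = sum(inlet_temps) / len(inlet_temps) if inlet_temps else 20.0
    heat = 5.0 if name == "cold_plate" else 0.0    # cold plate adds heat
    cool = -3.0 if name == "heat_exch" else 0.0    # heat exchanger removes it
    return t_in + (heat + cool) * dt

dt = 1.0
for step in range(3):                       # simulate a few time steps
    for name in solution_vector:            # execute outlet functions in order
        inlets = [state[up]["T_out"] for up in inlet_deps[name]]
        state[name]["T_out"] = update(name, inlets, dt)

print(solution_vector, {k: round(v["T_out"], 1) for k, v in state.items()})
```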

  13. A comprehensive risk analysis of coastal zones in China

    Science.gov (United States)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
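
    The risk-matrix idea above (a weighted probability score multiplied by a weighted severity score) can be sketched as follows. The indicator weights and city scores are hypothetical placeholders; the paper derives its weights with AHP and the Delphi Method, and maps the results in ArcGIS.

```python
# AHP-style weights (hypothetical; the paper derives them with AHP + Delphi)
prob_weights = {"economy": 0.4, "society": 0.35, "living": 0.25}
sev_weights  = {"exposure": 0.55, "disasters": 0.45}

def risk_index(city):
    """Risk Matrix Approach: probability score x severity score."""
    prob = sum(prob_weights[k] * city[k] for k in prob_weights)
    sev  = sum(sev_weights[k] * city[k] for k in sev_weights)
    return prob * sev

cities = {
    # indicator scores normalised to 0-1 (illustrative values only)
    "city_A": {"economy": 0.8, "society": 0.6, "living": 0.7, "exposure": 0.9, "disasters": 0.5},
    "city_B": {"economy": 0.5, "society": 0.4, "living": 0.6, "exposure": 0.3, "disasters": 0.4},
}
for name, scores in cities.items():
    print(name, round(risk_index(scores), 3))
```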

  14. Comprehensive two-dimensional liquid chromatographic analysis of poloxamers.

    Science.gov (United States)

    Malik, Muhammad Imran; Lee, Sanghoon; Chang, Taihyun

    2016-04-15

    Poloxamers are low molar mass triblock copolymers of poly(ethylene oxide) (PEO) and poly(propylene oxide) (PPO), with a number of applications as non-ionic surfactants. Comprehensive one- and two-dimensional liquid chromatographic (LC) analysis of these materials is proposed in this study. The separation of oligomers of both types (PEO and PPO) is demonstrated for several commercial poloxamers. This is accomplished at the critical conditions for one of the blocks while the other block remains under interaction conditions. Reversed phase LC at the CAP of PEO allowed oligomeric separation of the triblock copolymers with regard to the PPO block, whereas normal phase LC at the CAP of PPO renders oligomeric separation with respect to the PEO block. The oligomeric separations with regard to PEO and PPO are coupled online (comprehensive 2D-LC) to reveal two-dimensional contour plots by unconventional 2D IC×IC (interaction chromatography) coupling. The study provides chemical composition mapping of both PEO and PPO, equivalent to combined molar mass and chemical composition mapping, for several commercial poloxamers. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    Science.gov (United States)

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or the potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures is such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.) with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database. ClassyFire has been used to
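
    To illustrate the idea of "unambiguous, computable structural rules", the following is a minimal sketch that assumes RDKit is installed and uses toy SMARTS patterns as stand-ins for category rules. These rules and category names are purely illustrative and are not the actual ChemOnt/ClassyFire rules.

```python
from rdkit import Chem   # assumes RDKit is installed; ClassyFire's own rules are far richer

# toy structural rules in the spirit of taxonomy categories (illustrative SMARTS only)
RULES = [
    ("Carboxylic acids", Chem.MolFromSmarts("[CX3](=O)[OX2H1]")),
    ("Primary amines",   Chem.MolFromSmarts("[NX3;H2][#6]")),
    ("Benzenoids",       Chem.MolFromSmarts("c1ccccc1")),
]

def classify(smiles):
    """Assign every category whose structural rule matches the molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return ["<unparseable>"]
    return [name for name, patt in RULES if mol.HasSubstructMatch(patt)] or ["Unclassified"]

print(classify("OC(=O)c1ccccc1"))   # benzoic acid -> carboxylic acid + benzenoid
print(classify("CCN"))              # ethylamine  -> primary amine
```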

  16. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market of the personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity, which if using with powerful software, can enhance immensely the engineers capabilities. This paper focuses on the possibilities which opened in piping stress analysis by the widespread distribution of personal computers, on the necessary changes in the software and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper resumes with personal views of the author and experiences gained during interactive graphic piping software development for personal computers. (orig./GL)

  17. Advanced complex analysis a comprehensive course in analysis, part 2b

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2B provides a comprehensive look at a number of subjects of complex analysis not included in Part 2A. Presented in this volume are the theory of conformal metrics (includ

  18. Dose tracking and dose auditing in a comprehensive computed tomography dose-reduction program.

    Science.gov (United States)

    Duong, Phuong-Anh; Little, Brent P

    2014-08-01

    Implementation of a comprehensive computed tomography (CT) radiation dose-reduction program is a complex undertaking, requiring an assessment of baseline doses, an understanding of dose-saving techniques, and an ongoing appraisal of results. We describe the role of dose tracking in planning and executing a dose-reduction program and discuss the use of the American College of Radiology CT Dose Index Registry at our institution. We review the basics of dose-related CT scan parameters, the components of the dose report, and the dose-reduction techniques, showing how an understanding of each technique is important in effective auditing of "outlier" doses identified by dose tracking. Copyright © 2014 Elsevier Inc. All rights reserved.
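
    The auditing of "outlier" doses identified by dose tracking can be sketched as below; this is a minimal illustration with hypothetical CTDIvol values and a simple per-protocol z-score rule, not the ACR Dose Index Registry workflow, which compares against registry benchmarks.

```python
import statistics

# hypothetical tracked CTDIvol values (mGy) grouped by protocol
tracked = {
    "chest_routine": [8.1, 7.9, 9.0, 8.4, 25.3, 8.8, 7.5],
    "head_noncon":   [55.0, 52.1, 58.7, 95.0, 54.3],
}

def dose_outliers(values, k=1.5):
    """Flag exams more than k standard deviations above the protocol mean.

    Real dose-audit programs often compare against registry percentiles
    rather than a local z-score; this only illustrates the auditing step.
    """
    mean, sd = statistics.mean(values), statistics.stdev(values)
    return [v for v in values if v > mean + k * sd]

for protocol, doses in tracked.items():
    print(protocol, dose_outliers(doses))
```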

  19. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  20. Stochastic biological response to radiation. Comprehensive analysis of gene expression

    International Nuclear Information System (INIS)

    Inoue, Tohru; Hirabayashi, Yoko

    2012-01-01

    The authors explain that the effect of radiation on biological systems is stochastic, in accordance with the laws of physics and in contrast to chemical effects, using Cs-137 gamma-ray (GR) and benzene (BZ) exposures of mice and the resulting comprehensive analyses of gene expression as examples. Single GR irradiation is performed with a Gamma Cell 40 (CSR) on C57BL/6 or C3H/He mice at 0, 0.6 and 3 Gy. BZ is given orally at 150 mg/kg/day for 5 days x 2 weeks. Bone marrow cells are sampled 1 month after the exposure. Comprehensive gene expression is analyzed with the GeneChip Mouse Genome 430 2.0 Array (Affymetrix), and the data are processed by programs for normalization, statistics, network generation, functional analysis, etc. GR irradiation brings about changes in gene expression that can be classified into common genes, which vary consistently with dose, and stochastic genes, which vary stochastically within each dose: e.g., with the Welch t-test, significant differences are found between 0/3 Gy (dose-specific difference, 455 probe sets (pbs), within 2113 stochastic pbs), 0/0.6 Gy (267 within 1284 pbs) and 0.6/3 Gy (532 pbs); and with one-way analysis of variance (ANOVA) and hierarchical/dendrographic analyses, 520 pbs are shown to comprise 226 dose-dependent and 294 dose-specific pbs. It is also shown that at 3 Gy the expression of common genes is rather suppressed, including those related to the proliferation/apoptosis of B/T cells, as is that of stochastic genes related to cell division/signaling. A Venn diagram of the common genes among the above 520 pbs, the 2113 stochastic pbs at 3 Gy and the 1284 pbs at 0.6 Gy shows 29, 2 and 4 overlapping genes, respectively, indicating that only 35 pbs overlap in total. Network analysis of the changes induced by GR shows rather high expression of genes around the cAMP response element binding protein (CREB) hub at 0.6 Gy, and rather variable expression around the CREB hub together with suppressed expression around the kinesin hub at 3 Gy; in the network for BZ exposure, unchanged or low expression around the p53 hub and suppression
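
    The per-probe-set Welch t-test screening mentioned above can be sketched as follows, assuming numpy and scipy are available. The expression matrix and the number of responding probe sets are simulated for illustration; multiple-testing correction, ANOVA and the network analysis of the study are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# hypothetical expression matrix: 1000 probe sets x 4 replicates per dose group
n_probesets, n_rep = 1000, 4
expr_0gy = rng.normal(8.0, 1.0, size=(n_probesets, n_rep))
expr_3gy = rng.normal(8.0, 1.0, size=(n_probesets, n_rep))
expr_3gy[:50] += 1.5          # pretend 50 probe sets respond to 3 Gy

# Welch t-test (unequal variances) probe set by probe set, 0 Gy vs 3 Gy
t, p = stats.ttest_ind(expr_0gy, expr_3gy, axis=1, equal_var=False)
significant = np.where(p < 0.01)[0]
print(len(significant), "probe sets differ between 0 and 3 Gy at p < 0.01")
```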

  1. An updated comprehensive techno-economic analysis of algae biodiesel.

    Science.gov (United States)

    Nagarajan, Sanjay; Chou, Siaw Kiang; Cao, Shenyan; Wu, Chen; Zhou, Zhi

    2013-10-01

    Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53-0.85/L (2012 USD values). However, the costs of land and transesterification were only roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, quotes from vendors, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42-0.97/L. Additional improvements for cost-effective algae cultivation and biodiesel production around the globe were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  3. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  4. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…
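
    A minimal sketch of the quantile multiple regression approach named in the title is shown below, assuming pandas and statsmodels are available. The predictor names (decoding, vocabulary) are hypothetical stand-ins for the study's pressure points, and the data are simulated, not the study's sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 245   # matches the study's sample size; the data here are simulated

# hypothetical pressure-point predictors and a simulated comprehension outcome
df = pd.DataFrame({
    "decoding":   rng.normal(0, 1, n),
    "vocabulary": rng.normal(0, 1, n),
})
df["comprehension"] = 0.4 * df["decoding"] + 0.5 * df["vocabulary"] + rng.normal(0, 1, n)

# fit the same model at several conditional quantiles of reading comprehension;
# quantile-varying slopes would suggest a pressure point matters more for
# weaker or stronger comprehenders
model = smf.quantreg("comprehension ~ decoding + vocabulary", df)
for q in (0.1, 0.5, 0.9):
    res = model.fit(q=q)
    print(f"q={q}: decoding={res.params['decoding']:.2f}, "
          f"vocabulary={res.params['vocabulary']:.2f}")
```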

  5. Correlates of Early Reading Comprehension Skills: A Componential Analysis

    Science.gov (United States)

    Babayigit, Selma; Stainthorp, Rhona

    2014-01-01

    This study had three main aims. First, we examined to what extent listening comprehension, vocabulary, grammatical skills and verbal short-term memory (VSTM) assessed prior to formal reading instruction explained individual differences in early reading comprehension levels. Second, we examined to what extent the three common component skills,…

  6. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681 (May 2016), US Army Research Laboratory: Batch Computed Tomography Analysis of Projectiles, by Michael C Golt and Matthew S Bratcher, Weapons and Materials Research. Recoverable abstract fragment: "...values to account for projectile variability in the ballistic evaluation of armor." Subject terms: computed tomography, CT, BS41, projectiles.

  7. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology –oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  8. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available This article describes the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association analysis. Software supporting the implementation of these methods was developed.

  9. A Comprehensive Sensitivity Analysis of a Data Center Network with Server Virtualization for Business Continuity

    Directory of Open Access Journals (Sweden)

    Tuan Anh Nguyen

    2015-01-01

    Full Text Available Sensitivity assessment of availability for data center networks (DCNs) is of paramount importance in the design and management of cloud computing based businesses. Previous work has presented a performance modeling and analysis of a fat-tree based DCN using queuing theory. In this paper, we present a comprehensive availability modeling and sensitivity analysis of a DCell-based DCN with server virtualization for business continuity using stochastic reward nets (SRN). We use SRN in modeling to capture complex behaviors and dependencies of the system in detail. The models take into account (i) two DCell configurations, respectively composed of two and three physical hosts in a DCell0 unit, (ii) failure modes and corresponding recovery behaviors of hosts, switches, and VMs, and the VM live migration mechanism within and between DCell0s, and (iii) dependencies between subsystems (e.g., between a host and VMs and between switches and VMs in the same DCell0). The constructed SRN models are analyzed in detail with regard to various metrics of interest to investigate the system's characteristics. A comprehensive sensitivity analysis of system availability is carried out in consideration of the major impacting parameters in order to observe the system's complicated behaviors and find the bottlenecks of system availability. The analysis results show the availability improvement, capability of fault tolerance, and business continuity of the DCNs complying with the DCell network topology. This study provides a basis for the design and management of DCNs for business continuity.
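
    A greatly simplified sketch of the availability sensitivity idea is shown below: steady-state availability of repairable units combined in series, with a one-at-a-time perturbation of each component's MTTF. The component parameters and the series structure are hypothetical; the paper's SRN models capture far richer behaviour (live migration, subsystem dependencies, DCell topology).

```python
def availability(mttf, mttr):
    """Steady-state availability of a repairable unit: MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

# hypothetical baseline parameters (hours) for host, switch and VM
baseline = {"host": (2000.0, 4.0), "switch": (5000.0, 2.0), "vm": (800.0, 0.5)}

def system_availability(params):
    """Toy series model: the unit works only if host, switch and VM all work."""
    a = 1.0
    for mttf, mttr in params.values():
        a *= availability(mttf, mttr)
    return a

base = system_availability(baseline)
print(f"baseline availability: {base:.6f}")

# one-at-a-time sensitivity: +10% MTTF for each component
for name, (mttf, mttr) in baseline.items():
    perturbed = dict(baseline)
    perturbed[name] = (mttf * 1.1, mttr)
    delta = system_availability(perturbed) - base
    print(f"{name:6s}: +10% MTTF -> availability change {delta:+.6f}")
```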

  10. Retrospective analysis of outcomes from two intensive comprehensive aphasia programs.

    Science.gov (United States)

    Persad, Carol; Wozniak, Linda; Kostopoulos, Ellina

    2013-01-01

    Positive outcomes from intensive therapy for individuals with aphasia have been reported in the literature. Little is known about the characteristics of individuals who attend intensive comprehensive aphasia programs (ICAPs) and what factors may predict who makes clinically significant changes when attending such programs. Demographic data on participants from 6 ICAPs showed that individuals who attend these programs spanned the entire age range (from adolescence to late adulthood), but they generally tended to be middle-aged and predominantly male. Analysis of outcome data from 2 of these ICAPs found that age and gender were not significant predictors of improved outcome on measures of language ability or functional communication. However, time post onset was related to clinical improvement in functional communication as measured by the Communication Activities of Daily Living, second edition (CADL-2). In addition, for one sample, initial severity of aphasia was related to outcome on the Western Aphasia Battery-Revised, such that individuals with more severe aphasia tended to show greater recovery compared to those with mild aphasia. Initial severity of aphasia also was highly correlated with changes in CADL-2 scores. These results suggest that adults of all ages with aphasia in either the acute or chronic phase of recovery can continue to show positive improvements in language ability and functional communication with intensive treatment.

  11. Neural activity associated with metaphor comprehension: spatial analysis.

    Science.gov (United States)

    Sotillo, María; Carretié, Luis; Hinojosa, José A; Tapia, Manuel; Mercado, Francisco; López-Martín, Sara; Albert, Jacobo

    2005-01-03

    Though neuropsychological data indicate that the right hemisphere (RH) plays a major role in metaphor processing, other studies suggest that, at least during some phases of this processing, a RH advantage may not exist. The present study explores, through a temporally agile neural signal--the event-related potentials (ERPs)--, and through source-localization algorithms applied to ERP recordings, whether the crucial phase of metaphor comprehension presents or not a RH advantage. Participants (n=24) were submitted to a S1-S2 experimental paradigm. S1 consisted of visually presented metaphoric sentences (e.g., "Green lung of the city"), followed by S2, which consisted of words that could (i.e., "Park") or could not (i.e., "Semaphore") be defined by S1. ERPs elicited by S2 were analyzed using temporal principal component analysis (tPCA) and source-localization algorithms. These analyses revealed that metaphorically related S2 words showed significantly higher N400 amplitudes than non-related S2 words. Source-localization algorithms showed differential activity between the two S2 conditions in the right middle/superior temporal areas. These results support the existence of an important RH contribution to (at least) one phase of metaphor processing and, furthermore, implicate the temporal cortex with respect to that contribution.

  12. A comprehensive overview on the foundations of formal concept analysis

    Directory of Open Access Journals (Sweden)

    K. Sumangali

    2017-12-01

    Full Text Available The immersion in voluminous collections of data is inevitable almost everywhere. The invention of mathematical models to analyse the patterns and trends of the data is an emerging necessity to extract and predict useful information in any Knowledge Discovery from Data (KDD) process. Formal Concept Analysis (FCA) is an efficient mathematical model used in the process of KDD, specially designed to portray the structure of the data in a context and depict the underlying patterns and hierarchies in it. Due to the huge increase in the application of FCA in various fields, the number of research and review articles on FCA has risen to a large extent. This review differs from existing ones in presenting a comprehensive survey of the fundamentals of FCA in a compact and crisp manner to benefit beginners, and it focuses on the scalability issues in FCA. Further, we present the generic anatomy of FCA apart from its origin and growth at a primary level.

  13. Comprehensive proteomic analysis of the wheat pathogenic fungus Zymoseptoria tritici.

    Science.gov (United States)

    Yang, Fen; Yin, Qi

    2016-01-01

    Zymoseptoria tritici causes Septoria tritici blotch disease of wheat. To obtain a comprehensive protein dataset of this fungal pathogen, proteomes of Z. tritici growing in nutrient-limiting and rich media and in vivo at a late stage of wheat infection were fractionated by 1D gel or strong cation exchange (SCX) chromatography and analyzed by LC-MS/MS. A total of 5731, 5376 and 3168 Z. tritici proteins were confidently identified from these conditions, respectively. Of these in vitro and in planta proteins, 9 and 11% were predicted to contain signal peptides, respectively. Functional classification analysis revealed the proteins were involved in the various cellular activities. Comparison of three distinct protein expression profiles demonstrates the elevated carbohydrate, lipid and secondary metabolisms, transport, protein processing and energy production specifically in the host environment, in contrast to the enhancement of signaling, defense, replication, transcription and cell division in vitro. The data provide useful targets towards a better understanding of the molecular basis of Z. tritici growth, development, stress response and pathogenicity. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Comprehensive Study on Lexicon-based Ensemble Classification Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Łukasz Augustyniak

    2015-12-01

    Full Text Available We propose a novel method for counting sentiment orientation that outperforms supervised learning approaches in time and memory complexity and is not statistically significantly different from them in accuracy. Our method consists of a novel approach to generating unigram, bigram and trigram lexicons. The proposed method, called frequentiment, is based on calculating the frequency of features (words) in the document and averaging their impact on the sentiment score as opposed to documents that do not contain these features. Afterwards, we use ensemble classification to improve the overall accuracy of the method. What is important is that the frequentiment-based lexicons with sentiment threshold selection outperform other popular lexicons and some supervised learners, while being 3–5 times faster than the supervised approach. We compare 37 methods (lexicons, ensembles with lexicon's predictions as input and supervised learners) applied to 10 Amazon review data sets and provide the first statistical comparison of the sentiment annotation methods that include ensemble approaches. It is one of the most comprehensive comparisons of domain sentiment analysis in the literature.
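
    The lexicon-plus-threshold classification step can be sketched as below. The lexicon weights are purely illustrative; the actual frequentiment method generates its lexicons automatically by measuring how the presence of each feature shifts review scores relative to documents lacking it, and adds ensemble classification on top.

```python
# toy unigram sentiment lexicon (illustrative weights only; frequentiment
# lexicons are generated automatically from the review corpus)
lexicon = {"great": 1.0, "love": 0.8, "poor": -0.9, "broken": -1.0, "ok": 0.1}

def lexicon_score(text, threshold=0.0):
    """Average the lexicon weights of the words present and threshold the result."""
    hits = [lexicon[w] for w in text.lower().split() if w in lexicon]
    if not hits:
        return "neutral"
    score = sum(hits) / len(hits)
    return "positive" if score > threshold else "negative"

print(lexicon_score("I love this great little gadget"))   # positive
print(lexicon_score("arrived broken and poor quality"))   # negative
```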

  15. A Comprehensive Analysis of Alternative Splicing in Paleopolyploid Maize

    Directory of Open Access Journals (Sweden)

    Wenbin Mei

    2017-05-01

    Full Text Available Identifying and characterizing alternative splicing (AS) enables our understanding of the biological role of transcript isoform diversity. This study describes the use of publicly available RNA-Seq data to identify and characterize the global diversity of AS isoforms in maize using the inbred lines B73 and Mo17, and a related species, sorghum. Identification and characterization of AS within maize tissues revealed that genes expressed in seed exhibit the largest differential AS relative to other tissues examined. Additionally, differences in AS between the two genotypes B73 and Mo17 are greatest within genes expressed in seed. We demonstrate that changes in the level of alternatively spliced transcripts (intron retention and exon skipping) do not solely reflect differences in total transcript abundance, and we present evidence that intron retention may act to fine-tune gene expression across seed development stages. Furthermore, we have identified temperature-sensitive AS in maize and demonstrate that drought-induced changes in AS involve distinct sets of genes in reproductive and vegetative tissues. Examining our identified AS isoforms within B73 × Mo17 recombinant inbred lines (RILs) identified splicing QTL (sQTL). Of the cis-sQTL-regulated junctions, 43.3% are identified as alternatively spliced junctions in our analysis, while 10 Mb windows on each side of 48.2% of trans-sQTLs overlap with splicing-related genes. Using sorghum as an out-group enabled direct examination of loss or conservation of AS between homeologous genes representing the two subgenomes of maize. We identify several instances where AS isoforms that are conserved between one maize homeolog and its sorghum ortholog are absent from the second maize homeolog, suggesting that these AS isoforms may have been lost after the maize whole genome duplication event. This comprehensive analysis provides new insights into the complexity of AS in maize.

  16. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others are research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric function theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  17. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data.

    Science.gov (United States)

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie; Zhang, Gong

    2018-01-04

    Translation is a key regulatory step, linking transcriptome and proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, at both the transcriptome and translatome levels. The translation indices (translation ratios, elongation velocity index and translational efficiency) can be calculated to quantitatively evaluate translation initiation efficiency and elongation velocity, respectively. All datasets were analyzed using a unified, robust, accurate and experimentally-verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, relieving biologists of the complex searching, analyzing and comparing of huge sequencing datasets without requiring local computational power. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
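
    For orientation only, translational efficiency is commonly defined as the ratio of translatome (Ribo-seq or RNC-seq) abundance to matched mRNA-seq abundance per gene; the sketch below uses that generic definition with toy counts and gene lengths, and is not TranslatomeDB's actual FANSe3/edgeR pipeline:

      import numpy as np

      def rpkm(counts, gene_lengths_kb, total_reads):
          # Reads per kilobase of transcript per million mapped reads.
          return counts / (gene_lengths_kb * (total_reads / 1e6))

      # toy counts for three hypothetical genes
      lengths_kb = np.array([1.2, 3.0, 0.8])
      ribo = np.array([300, 120, 50])     # Ribo-seq / RNC-seq counts
      mrna = np.array([600, 900, 40])     # matched mRNA-seq counts

      # translational efficiency = translatome abundance / transcriptome abundance
      te = rpkm(ribo, lengths_kb, ribo.sum()) / rpkm(mrna, lengths_kb, mrna.sum())
      print(dict(zip(["geneA", "geneB", "geneC"], np.round(te, 2))))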

  18. A comprehensive computational model of sound transmission through the porcine lung.

    Science.gov (United States)

    Dai, Zoujun; Peng, Ying; Henry, Brian M; Mansy, Hansen A; Sandler, Richard H; Royston, Thomas J

    2014-09-01

    A comprehensive computational simulation model of sound transmission through the porcine lung is introduced and experimentally evaluated. This "subject-specific" model utilizes parenchymal and major airway geometry derived from x-ray CT images. The lung parenchyma is modeled as a poroviscoelastic material using Biot theory. A finite element (FE) mesh of the lung that includes airway detail is created and used in comsol FE software to simulate the vibroacoustic response of the lung to sound input at the trachea. The FE simulation model is validated by comparing simulation results to experimental measurements using scanning laser Doppler vibrometry on the surface of an excised, preserved lung. The FE model can also be used to calculate and visualize vibroacoustic pressure and motion inside the lung and its airways caused by the acoustic input. The effect of diffuse lung fibrosis and of a local tumor on the lung acoustic response is simulated and visualized using the FE model. In the future, this type of visualization can be compared and matched with experimentally obtained elastographic images to better quantify regional lung material properties to noninvasively diagnose and stage disease and response to treatment.

  19. Comprehensive Behavioral Analysis of Activating Transcription Factor 5-Deficient Mice

    Directory of Open Access Journals (Sweden)

    Mariko Umemura

    2017-07-01

    Full Text Available Activating transcription factor 5 (ATF5 is a member of the CREB/ATF family of basic leucine zipper transcription factors. We previously reported that ATF5-deficient (ATF5-/- mice demonstrated abnormal olfactory bulb development due to impaired interneuron supply. Furthermore, ATF5-/- mice were less aggressive than ATF5+/+ mice. Although ATF5 is widely expressed in the brain, and involved in the regulation of proliferation and development of neurons, the physiological role of ATF5 in the higher brain remains unknown. Our objective was to investigate the physiological role of ATF5 in the higher brain. We performed a comprehensive behavioral analysis using ATF5-/- mice and wild type littermates. ATF5-/- mice exhibited abnormal locomotor activity in the open field test. They also exhibited abnormal anxiety-like behavior in the light/dark transition test and open field test. Furthermore, ATF5-/- mice displayed reduced social interaction in the Crawley’s social interaction test and increased pain sensitivity in the hot plate test compared with wild type. Finally, behavioral flexibility was reduced in the T-maze test in ATF5-/- mice compared with wild type. In addition, we demonstrated that ATF5-/- mice display disturbances of monoamine neurotransmitter levels in several brain regions. These results indicate that ATF5 deficiency elicits abnormal behaviors and the disturbance of monoamine neurotransmitter levels in the brain. The behavioral abnormalities of ATF5-/- mice may be due to the disturbance of monoamine levels. Taken together, these findings suggest that ATF5-/- mice may be a unique animal model of some psychiatric disorders.

  20. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    Science.gov (United States)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of comprehensive benefit analysis of water resources is to maximize the combined social, economic and ecological benefits. To address the shortcomings of the traditional analytic hierarchy process in water resources evaluation, this study proposes a comprehensive benefit evaluation index covering the social, economic and environmental systems; determines the index weights by an improved fuzzy analytic hierarchy process (AHP); calculates the relative index of comprehensive water resources benefit; and analyzes the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on water resources data for Xiangshui County, 20 main benefit assessment factors for the 5 districts belonging to the county were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, and that the social economy has room for further development under the current water resources situation.
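
    A minimal sketch of the underlying multi-objective aggregation (hypothetical indicators, ranges and AHP-style weights, not the study's actual data) is to normalize each indicator to [0, 1] and form a weighted sum into a composite benefit index:

      import numpy as np

      # hypothetical indicator values for one district and assumed min/max ranges
      indicators = {"gdp_per_capita": 52.0, "water_supply_rate": 0.92, "ecological_flow": 0.61}
      ranges = {"gdp_per_capita": (20.0, 80.0), "water_supply_rate": (0.5, 1.0), "ecological_flow": (0.3, 0.9)}
      weights = {"gdp_per_capita": 0.40, "water_supply_rate": 0.35, "ecological_flow": 0.25}  # AHP-style weights

      def normalize(value, lo, hi):
          # min-max normalization of a benefit-type indicator to [0, 1]
          return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

      composite = sum(weights[k] * normalize(indicators[k], *ranges[k]) for k in indicators)
      print(f"composite benefit index: {composite:.4f}")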

  1. Using Primary Language Support via Computer to Improve Reading Comprehension Skills of First-Grade English Language Learners

    Science.gov (United States)

    Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle

    2012-01-01

    Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 First-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…

  2. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  3. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  4. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  5. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  6. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  7. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and other social sciences. However, there are very few detailed accounts of the computations within the model; articles more often focus on explaining mediation analysis conceptually rather than mathematically. The purpose of the current paper is therefore to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. First, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code implementing the computations is provided, along with a script to carry out a power analysis and a complete example.
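
    The paper's examples are in R; as a minimal Python sketch of the same quantities, the snippet below estimates the indirect effect a*b on simulated data, computes its Sobel test and a percentile bootstrap confidence interval. The coefficients, sample size and seed are chosen arbitrarily for illustration:

      import numpy as np

      def coef_and_se(X, y, idx):
          # OLS of y on X (intercept added); return coefficient `idx` and its standard error.
          Xd = np.column_stack([np.ones(len(y)), X])
          beta = np.linalg.lstsq(Xd, y, rcond=None)[0]
          resid = y - Xd @ beta
          sigma2 = resid @ resid / (len(y) - Xd.shape[1])
          cov = sigma2 * np.linalg.inv(Xd.T @ Xd)
          return beta[idx + 1], np.sqrt(cov[idx + 1, idx + 1])

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(size=n)                 # path a: X -> M
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)       # path b: M -> Y (plus direct effect c')

      a, sa = coef_and_se(x.reshape(-1, 1), m, 0)
      b, sb = coef_and_se(np.column_stack([x, m]), y, 1)

      # Sobel test for the indirect effect a*b
      z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
      print("indirect effect:", a * b, "Sobel z:", z)

      # Percentile bootstrap of the indirect effect
      boots = []
      for _ in range(1000):
          i = rng.integers(0, n, n)
          ab, _ = coef_and_se(x[i].reshape(-1, 1), m[i], 0)
          bb, _ = coef_and_se(np.column_stack([x[i], m[i]]), y[i], 1)
          boots.append(ab * bb)
      print("95% bootstrap CI:", np.percentile(boots, [2.5, 97.5]))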

  8. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axis. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypothesis.

  9. Comprehensive spectral analysis of Cyg X-1 using RXTE data

    International Nuclear Information System (INIS)

    Shahid, Rizwan; Jaaffrey, S. N. A.; Misra, Ranjeev

    2012-01-01

    spectra with Γ < 1.6, despite a large number having Γ ∼ 1.65. This comprehensive analysis lays the framework by which more detailed and sophisticated broadband observations may be understood. (research papers)

  10. Comprehensive proteomic analysis of the wheat pathogenic fungus Zymoseptoria tritici

    DEFF Research Database (Denmark)

    Yang, Fen; Yin, Qi

    2016-01-01

    Zymoseptoria tritici causes Septoria tritici blotch disease of wheat. To obtain a comprehensive protein dataset of this fungal pathogen, proteomes of Z. tritici growing in nutrient-limiting and rich media and in vivo at a late stage of wheat infection were fractionated by 1D gel or strong cation...

  11. Advanced AEM by Comprehensive Analysis and Modeling of System Drift

    Science.gov (United States)

    Schiller, Arnulf; Klune, Klaus; Schattauer, Ingrid

    2010-05-01

    The quality of the assessment of risks arising from environmental hazards strongly depends on the spatial and temporal distribution of the data collected in a survey area. Natural hazards generally emerge over wide areas, as in the case of volcanoes or landslides. Conventional surface measurements are restricted to a few lines or locations and often cannot be conducted in difficult terrain, so they yield only a spatially and temporally limited data set and therefore limit the reliability of risk analysis. Aero-geophysical measurements potentially provide a valuable tool for completing the data set, as they can be performed over a wide area, even above difficult terrain, within a short time. A most desirable opportunity in the course of such measurements is the ascertainment of the dynamics of potentially hazardous environmental processes, which necessitates repeated and reproducible measurements. Current HEM systems cannot accomplish this adequately due to their system-immanent drift and, in some cases, poor signal-to-noise ratio. To develop comprehensive concepts for advancing state-of-the-art HEM systems into a valuable tool for data acquisition in risk assessment or hydrological problems, different studies have been undertaken which form the contents of the presented work conducted in the course of the project HIRISK (Helicopter Based Electromagnetic System for Advanced Environmental Risk Assessment - FWF L-354 N10, supported by the Austrian Science Fund). The methodology is based upon two paths: A - comprehensive experimental testing on an existing HEM system serving as an experimental platform; B - the setup of a numerical model which is continuously refined according to the experimental data. The model then serves to simulate the experimental as well as alternative configurations and to analyze their drift behavior. Finally, concepts for minimizing the drift are derived and tested. Different test series - stationary on ground as well

  12. Basic complex analysis a comprehensive course in analysis, part 2a

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2A is devoted to basic complex analysis. It interweaves three analytic threads associated with Cauchy, Riemann, and Weierstrass, respectively. Cauchy's view focuses on th

  13. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  14. Computer-assisted functional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  15. Comprehensive analysis of RNA-Seq data reveals extensive RNA editing in a human transcriptome

    DEFF Research Database (Denmark)

    Peng, Zhiyu; Cheng, Yanbing; Tan, Bertrand Chin-Ming

    2012-01-01

    RNA editing is a post-transcriptional event that recodes hereditary information. Here we describe a comprehensive profile of the RNA editome of a male Han Chinese individual based on analysis of ∼767 million sequencing reads from poly(A)(+), poly(A)(-) and small RNA samples. We developed a computational pipeline that carefully controls for false positives while calling RNA editing events from genome and whole-transcriptome data of the same individual. We identified 22,688 RNA editing events in noncoding genes and introns, untranslated regions and coding sequences of protein-coding genes. Most changes (∼93%) converted A to I(G), consistent with known editing mechanisms based on adenosine deaminase acting on RNA (ADAR). We also found evidence of other types of nucleotide changes; however, these were validated at lower rates. We found 44 editing sites in microRNAs (miRNAs), suggesting a potential...

  16. Development validation and use of computer codes for inelastic analysis

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    A finite element scheme is a system which provides routines to carry out the operations that are common to all finite element programs. The list of items that can be provided as standard by a finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporating boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem, which includes a computer model of the PFR intermediate heat exchanger

  17. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  18. A Comprehensive study on Cloud Green Computing: To Reduce Carbon Footprints Using Clouds

    OpenAIRE

    Chiranjeeb Roy Chowdhury, Arindam Chatterjee, Alap Sardar, Shalabh Agarwal, Asoke Nath

    2013-01-01

    Cloud computing and green computing are two of the most emergent areas in information and communication technology (ICT), with immense applications across the globe. The future trends of ICT will be more towards cloud computing and green computing. Due to tremendous improvements in computer networks, people now prefer network-based computing to in-house computing. In any business sector, daily business and individual computing are now migrating from individual hard drives...

  19. DFT computational analysis of piracetam

    Science.gov (United States)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method within the finite-field approach. The stability of the molecule has been analyzed by using NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of atomic charges is also calculated. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.
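
    For reference, the mean polarizability, polarizability anisotropy and total first hyperpolarizability quoted in such DFT/NLO studies are usually assembled from the Cartesian tensor components using the standard definitions below (generic formulas, not values specific to piracetam):

      \alpha_0 = \tfrac{1}{3}\,(\alpha_{xx} + \alpha_{yy} + \alpha_{zz})

      \Delta\alpha = \tfrac{1}{\sqrt{2}}\left[(\alpha_{xx}-\alpha_{yy})^2 + (\alpha_{yy}-\alpha_{zz})^2 + (\alpha_{zz}-\alpha_{xx})^2 + 6(\alpha_{xy}^2+\alpha_{yz}^2+\alpha_{xz}^2)\right]^{1/2}

      \beta_0 = \sqrt{\beta_x^2 + \beta_y^2 + \beta_z^2}, \qquad \beta_i = \beta_{iii} + \beta_{ijj} + \beta_{ikk}, \quad i,j,k \in \{x,y,z\}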

  20. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
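
    GRESS works by transforming FORTRAN source so that derivatives are propagated alongside the original calculation; as a conceptual stand-in (not the GRESS implementation), the same derivative propagation can be sketched with forward-mode, dual-number differentiation in Python, applied here to a toy model response with hypothetical numbers:

      from dataclasses import dataclass

      @dataclass
      class Dual:
          val: float
          der: float  # derivative with respect to the chosen input parameter

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other, 0.0)
              return Dual(self.val + other.val, self.der + other.der)
          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other, 0.0)
              return Dual(self.val * other.val,
                          self.der * other.val + self.val * other.der)
          __rmul__ = __mul__

      def model(k, t=2.0):
          # toy response standing in for an iterative simulation output: y = k*t + k*k
          return k * t + k * k

      k = Dual(3.0, 1.0)        # seed derivative dk/dk = 1
      y = model(k)
      print("y =", y.val, "dy/dk =", y.der)   # dy/dk = t + 2k = 8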

  1. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    For optimal utilization of the 150 kV ion accelerator facilities and to master the analysis technique using the ion accelerator, research and development of low-energy PIXE technology has been carried out. The R and D for the hardware of the low-energy PIXE installation in P3TM has been carried on since the year 2000. To support the R and D of the PIXE accelerator facilities in harmony with the R and D of the PIXE hardware, the development of PIXE software for analysis is also needed. The development of a database for PIXE analysis software using a Turbo Pascal computer code is reported in this paper. This computer code computes the ionization cross-section, the fluorescence yield and the stopping power of elements, and it also computes the attenuation coefficients for X-ray energies. The computer code is named PIXEDASIS and it is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user: it takes input from the keyboard, and its output is shown on the PC monitor and can also be printed. The performance test of PIXEDASIS shows that it can be operated well and that it provides data in agreement with data from other literature. (author)

  2. iSmaRT: a toolkit for a comprehensive analysis of small RNA-Seq data.

    Science.gov (United States)

    Panero, Riccardo; Rinaldi, Antonio; Memoli, Domenico; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Milanesi, Luciano; Weisz, Alessandro; Giurato, Giorgio

    2017-03-15

    The interest in investigating the biological roles of small non-coding RNAs (sncRNAs) is increasing, due to the pleiotropic effects these molecules exert in many biological contexts. While several methods and tools are available to study microRNAs (miRNAs), only a few focus on novel classes of sncRNAs, in particular PIWI-interacting RNAs (piRNAs). To overcome these limitations, we implemented iSmaRT (integrative Small RNA Tool-kit), an automated pipeline to analyze smallRNA-Seq data. iSmaRT is a collection of bioinformatics tools and in-house algorithms, interconnected through a Graphical User Interface (GUI). In addition to performing comprehensive analyses on miRNAs, it implements specific computational modules to analyze piRNAs, predicting novel ones and identifying their RNA targets. A smallRNA-Seq dataset generated from brain samples of Huntington's Disease patients was used here to illustrate iSmaRT performance, demonstrating how the pipeline can provide, in a rapid and user-friendly way, a comprehensive analysis of different classes of sncRNAs. iSmaRT is freely available on the web at ftp://labmedmolge-1.unisa.it (User: iSmart - Password: password). aweisz@unisa.it or ggiurato@unisa.it. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  3. ECG Signal Processing, Classification and Interpretation A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  4. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  5. A comprehensive review of the use of computational intelligence methods in mineral exploration

    Directory of Open Access Journals (Sweden)

    Habibollah Bazdar

    2017-11-01

    Full Text Available Introduction: Mineral exploration is a process by which it is decided whether or not continuing exploration at the end of each stage will be cost-effective. This decision depends upon many factors, including technical, economic, social and other related factors. All new methods used in mineral exploration are meant to make this decision making simpler. In recent years, advanced computational intelligence methods have been used for modeling in many disciplines of science, including mineral exploration. Although the results of applying these methods show good performance, it is essential to determine the mineral potential in terms of geology, mineralogy, petrology and other factors for a final decision. The purpose of this paper is to provide a comprehensive survey of mineral exploration research and of the different applications of computational intelligence techniques in this field during the last decades. Materials and methods: Artificial neural networks and their application in mineral exploration: An artificial neural network (ANN) is a set of connected units or nodes that try to function like the neurons of the human brain (Jorjani et al., 2008). The processing capability of the network stems from the connections between the units and the connection weights, which either originate from learning or are predetermined (Monjezi and Dehghani, 2008). The ANN method has been applied in different branches of mining exploration over the last decades (Brown et al., 2000; Leite and de Souza Filho, 2009; Porwal et al., 2003). Support vector machines (SVM) and their application in mineral exploration: SVM uses a set of examples with known class information to build a hyperplane separating samples of different classes. This initial dataset is known as a training set, and every sample within it is characterized by the features upon which the classification is based (Smirnoff et al., 2008). The SVM classifier is a
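
    As a minimal, generic illustration of the SVM workflow the review describes (train on labeled samples, then classify), the sketch below fits an RBF-kernel SVM to synthetic evidence-layer features; the feature names, data and resulting accuracy are entirely hypothetical:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)
      n = 400
      # hypothetical evidence layers, e.g. Cu anomaly, magnetic anomaly, distance to fault
      X = rng.normal(size=(n, 3))
      # synthetic "mineralized" label loosely tied to the first two features
      y = (X[:, 0] + 0.8 * X[:, 1] - 0.3 * X[:, 2]
           + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))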

  6. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the construction of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. Graphics output from actual safety analyses is used to illustrate the capabilities of each code. 5 refs., 10 figs.

  7. Acute myocardial ischemia after aortic valve replacement: A comprehensive diagnostic evaluation using dynamic multislice spiral computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lembcke, Alexander [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany)]. E-mail: alexander.lembcke@gmx.de; Hein, Patrick A. [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Enzweiler, Christian N.H. [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Hoffmann, Udo [Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Klessen, Christian [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Dohmen, Pascal M. [Department of Cardiovascular Surgery, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany)

    2006-03-15

    We describe the case of a 72-year-old man presenting with endocarditis and clinical signs of acute myocardial ischemia after biological aortic valve replacement. A comprehensive cardiac dynamic multislice spiral computed tomography demonstrated: (1) an endocarditic vegetation of the aortic valve; (2) a subvalvular leakage feeding a paravalvular pseudoaneurysm based on an aortic root abscess with subsequent compromise of the systolic blood flow in the left main coronary artery and the resulting myocardial perfusion deficit.

  8. Acute myocardial ischemia after aortic valve replacement: A comprehensive diagnostic evaluation using dynamic multislice spiral computed tomography

    International Nuclear Information System (INIS)

    Lembcke, Alexander; Hein, Patrick A.; Enzweiler, Christian N.H.; Hoffmann, Udo; Klessen, Christian; Dohmen, Pascal M.

    2006-01-01

    We describe the case of a 72-year-old man presenting with endocarditis and clinical signs of acute myocardial ischemia after biological aortic valve replacement. A comprehensive cardiac dynamic multislice spiral computed tomography demonstrated: (1) an endocarditic vegetation of the aortic valve; (2) a subvalvular leakage feeding a paravalvular pseudoaneurysm based on an aortic root abscess with subsequent compromise of the systolic blood flow in the left main coronary artery and the resulting myocardial perfusion deficit

  9. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: • Two types of uncertainty methods for keff Monte Carlo computations are examined. • The sampling method has the fewest restrictions on the perturbation but demands more computing resources. • The analytical method is limited to small perturbations of material properties. • Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because the administrative margin of subcriticality has a substantial impact on the economics and safety of nuclear fuel cycle operations, the recent growing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
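
    The sampling-based approach can be caricatured as follows: draw the uncertain inputs from assumed distributions, evaluate keff for each draw, and report the spread. The sketch below substitutes a hypothetical linear response surface for the actual Monte Carlo transport calculation, so all parameters and numbers are illustrative only:

      import numpy as np

      rng = np.random.default_rng(1)

      def keff_surrogate(enrichment, density):
          # stand-in for a full Monte Carlo transport run (hypothetical response surface)
          return 0.90 + 0.02 * (enrichment - 4.0) + 0.05 * (density - 10.5)

      n = 1000
      enrichment = rng.normal(4.0, 0.05, n)   # wt% U-235, assumed 1-sigma uncertainty
      density = rng.normal(10.5, 0.1, n)      # g/cm^3, assumed 1-sigma uncertainty

      samples = keff_surrogate(enrichment, density)
      print(f"keff = {samples.mean():.5f} +/- {samples.std(ddof=1):.5f}")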

  10. Revealing and Quantifying the Impaired Phonological Analysis Underpinning Impaired Comprehension in Wernicke's Aphasia

    Science.gov (United States)

    Robson, Holly; Keidel, James L.; Lambon Ralph, Matthew A.; Sage, Karen

    2012-01-01

    Wernicke's aphasia is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in Wernicke's aphasia, a view consistent with current functional neuroimaging which finds…

  11. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  12. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
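
    For orientation, a Brune-type stress drop is obtained from the seismic moment and the corner frequency through the source radius; the sketch below uses the standard Brune relations with an assumed shear-wave speed and a hypothetical example event (it is not the authors' attenuation-corrected procedure):

      import numpy as np

      def brune_stress_drop(M0, fc, beta=3500.0, k=0.37):
          # Brune-model stress drop (Pa) from seismic moment M0 (N*m) and corner frequency fc (Hz);
          # r = k*beta/fc is the source radius, with k = 0.37 for the classic Brune model.
          r = k * beta / fc
          return 7.0 * M0 / (16.0 * r ** 3)

      # example: a magnitude 2.0 event (log10 M0 = 1.5*Mw + 9.1) with a 15 Hz corner frequency
      M0 = 10 ** (1.5 * 2.0 + 9.1)
      print(f"stress drop ~ {brune_stress_drop(M0, 15.0) / 1e6:.2f} MPa")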

  13. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis
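
    A minimal single-program multiple-data sketch of the idea (not the authors' cluster configuration) splits the pixel array across worker processes, each performing the same per-pixel temporal analysis on its own chunk; the synthetic fringe data and chunk count below are purely illustrative:

      import numpy as np
      from multiprocessing import Pool

      def analyse_chunk(chunk):
          # Per-pixel temporal analysis: dominant temporal frequency index of each pixel's trace.
          # chunk has shape (n_pixels, n_frames)
          spectra = np.abs(np.fft.rfft(chunk - chunk.mean(axis=1, keepdims=True), axis=1))
          return spectra.argmax(axis=1)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n_pixels, n_frames = 10000, 256
          t = np.arange(n_frames)
          # synthetic temporal fringe data: each pixel oscillates at its own frequency plus noise
          freqs = rng.integers(3, 30, n_pixels)
          data = (np.cos(2 * np.pi * freqs[:, None] * t / n_frames)
                  + 0.1 * rng.normal(size=(n_pixels, n_frames)))

          chunks = np.array_split(data, 4)     # one chunk per worker (SPMD-style decomposition)
          with Pool(processes=4) as pool:
              result = np.concatenate(pool.map(analyse_chunk, chunks))
          print("fraction of pixels recovered correctly:", (result == freqs).mean())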

  14. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
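
    The simple channel-summation approach attributed to STOAV can be sketched as the gross counts in the peak region minus a baseline estimated from neighbouring channels; the toy spectrum and region limits below are hypothetical, and SAMPO-style function fitting is not shown:

      import numpy as np

      def net_peak_area(spectrum, lo, hi, bg_width=5):
          # Net peak area by channel summation: gross counts in [lo, hi] minus a flat
          # baseline estimated from bg_width channels on each side of the peak.
          gross = spectrum[lo:hi + 1].sum()
          left = spectrum[lo - bg_width:lo].mean()
          right = spectrum[hi + 1:hi + 1 + bg_width].mean()
          baseline = 0.5 * (left + right) * (hi - lo + 1)
          return gross - baseline

      # toy spectrum: flat background of ~50 counts with a Gaussian peak at channel 100
      rng = np.random.default_rng(3)
      ch = np.arange(200)
      spectrum = rng.poisson(50, 200) + (1000.0 * np.exp(-0.5 * ((ch - 100) / 3.0) ** 2)).astype(int)
      print("net area ~", net_peak_area(spectrum, 91, 109))   # true area ~ 1000*3*sqrt(2*pi) ~ 7500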

  15. A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.

    Science.gov (United States)

    Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram

    2017-11-01

    Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MS n proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analyses are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MS n data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectrum for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan
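
    GlycoPAT's decoy glycopeptides are generated by scrambling the peptide sequence and perturbing the sugar masses; once target and decoy scores are in hand, a generic target-decoy FDR estimate (sketched below with simulated score distributions, not GlycoPAT's actual scoring) is the number of decoys passing a score cutoff divided by the number of targets passing it:

      import numpy as np

      def fdr_at(targets, decoys, cutoff):
          # Target-decoy estimate: decoy hits above the cutoff stand in for false target hits.
          n_t = np.sum(targets >= cutoff)
          n_d = np.sum(decoys >= cutoff)
          return n_d / max(n_t, 1)

      def score_threshold(targets, decoys, fdr=0.01):
          # Lower the cutoff from the top score until the estimated FDR would exceed the limit.
          best = np.inf
          for cutoff in np.sort(np.concatenate([targets, decoys]))[::-1]:
              if fdr_at(targets, decoys, cutoff) <= fdr:
                  best = cutoff
              else:
                  break
          return best

      rng = np.random.default_rng(7)
      targets = np.concatenate([rng.normal(6, 1, 800), rng.normal(2, 1, 200)])  # true + false target matches
      decoys = rng.normal(2, 1, 1000)                                           # decoy matches (all false)
      cutoff = score_threshold(targets, decoys, fdr=0.01)
      print(f"score cutoff for 1% FDR: {cutoff:.2f} "
            f"({np.sum(targets >= cutoff)} glycopeptide-spectrum matches accepted)")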

  16. Statistical Analysis of a Comprehensive List of Visual Binaries

    Directory of Open Access Journals (Sweden)

    Kovaleva D.

    2015-12-01

    Full Text Available Visual binary stars are the most abundant class of observed binaries. The most comprehensive list of data on visual binaries compiled recently by cross-matching the largest catalogues of visual binaries allowed a statistical investigation of observational parameters of these systems. The dataset was cleaned by correcting uncertainties and misclassifications, and supplemented with available parallax data. The refined dataset is free from technical biases and contains 3676 presumably physical visual pairs of luminosity class V with known angular separations, magnitudes of the components, spectral types, and parallaxes. We also compiled a restricted sample of 998 pairs free from observational biases due to the probability of binary discovery. Certain distributions of observational and physical parameters of stars of our dataset are discussed.

  17. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de

  18. A comprehensive methodology for the analysis of highway sight distance

    OpenAIRE

    Castro Malpica, María; Santos Berbel, César de; Iglesias Martínez, Luis

    2017-01-01

    As one of the main elements of geometric design, sight distance must be considered carefully for the safe and efficient operation of highways. An application developed on geographic information systems (GIS) was conceived for the three-dimensional estimation of sight distance on highways, as opposed to conventional two-dimensional techniques, which may underestimate or overestimate the actual visibility conditions. It is capable of computing the available sight distance of a highway section ...

  19. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  20. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  1. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  2. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.

  3. Comprehensive evaluation of anomalous pulmonary venous connection by electron beam computed tomography as compared with ultrasound

    International Nuclear Information System (INIS)

    Zhang Shaoxiong; Dai Ruping; Bai Hua; He Sha; Jing Baolian

    1999-01-01

    Objective: To investigate the clinical value of electron beam computed tomography (EBCT) in diagnosis of anomalous pulmonary venous connection. Methods: Retrospective analysis on 14 cases with anomalous pulmonary venous connection was performed using EBCT volume scan. The slice thickness and scan time were 3 mm and 100 ms respectively. Non-ionic contrast medium was applied. Three dimensional reconstruction of EBCT images were carried out on all cases. Meanwhile, ultrasound echocardiography was performed on all patients. Conventional cardiovascular angiography was performed on 8 patients and 2 cases received operation. Results: Ten patients with total anomalous pulmonary venous connection, including 6 cases of supra-cardiac type and 4 cases of cardiac type, were proved by EBCT examination. Among them, 3 cases of abnormal pulmonary venous drainage were not revealed by conventional cardiovascular angiography. Among four patients with partial pulmonary venous connection, including cardiac type in 2 cases, supra-cardiac type and infra-cardiac type in 1 case respectively, only one of them was demonstrated by echocardiography. Conclusion: EBCT has significant value in diagnosis of anomalous pulmonary venous connection which may not be detectable with echocardiography or even cardiovascular angiography

  4. Cognitive Load for Configuration Comprehension in Computer-Supported Geometry Problem Solving: An Eye Movement Perspective

    Science.gov (United States)

    Lin, John Jr-Hung; Lin, Sunny S. J.

    2014-01-01

    The present study investigated (a) whether the perceived cognitive load was different when geometry problems with various levels of configuration comprehension were solved and (b) whether eye movements in comprehending geometry problems showed sources of cognitive loads. In the first investigation, three characteristics of geometry configurations…

  5. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the aerospace, power generation, and oil and gas industries. Efficiency of these machines is often an important factor and has led to the continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. Methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  6. Identification Of Protein Vaccine Candidates Using Comprehensive Proteomic Analysis Strategies

    National Research Council Canada - National Science Library

    Rohrbough, James G

    2007-01-01

    Presented in this dissertation are proteomic analysis studies focused on identifying proteins to be used as vaccine candidates against Coccidioidomycosis, a potentially fatal human pulmonary disease...

  7. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules, but the latter are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals

  8. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  9. Comprehensive Case Analysis on Participatory Approaches, from Nexus Perspectives

    Science.gov (United States)

    Masuhara, N.; Baba, K.

    2014-12-01

    According to the messages from the Bonn2011 Conference, involving local communities fully and effectively in the planning and implementation processes related to the water, energy and food nexus is strongly needed for local ownership and commitment. Participatory approaches such as deliberative polling, "joint fact-finding" and so on have been applied to resolve various environmental disputes; however, the drivers and barriers in such processes have not been analyzed comprehensively, especially in Japan. Our research aims to explore solutions for conflicts in the context of the water-energy-food nexus in local communities. To achieve this, we clarify the drivers and barriers of each approach applied so far in water, energy and food policy, focusing on how scientific facts are dealt with. Our primary hypothesis is that multi-issue solutions through policy integration will be more effective for conflicts in the context of the water-energy-food nexus than single-issue solutions for each policy. One of the key factors in formulating effective solutions is to integrate "scientific fact (expert knowledge)" and "local knowledge". Given this primary hypothesis, we assume more specifically that it is effective for consensus building to provide opportunities to resolve disagreements of "framing", so that stakeholders can point experts to where scientific facts are needed and experts can reach a common understanding of the scientific facts in the early stage of the process. To verify the hypotheses, we are developing a database of cases in which such participatory approaches have been applied to resolve environmental disputes, based on a literature survey of journal articles and public documents on Japanese cases. At present, the database is still under construction, but the results so far suggest that the conditions of framing and of providing scientific information are important driving factors for problem solving and consensus building, and that it is important to refine these conditions.

  10. A Comprehensive Classification and Evolutionary Analysis of Plant Homeobox Genes

    OpenAIRE

    Mukherjee, Krishanu; Brocchieri, Luciano; Bürglin, Thomas R.

    2009-01-01

    The full complement of homeobox transcription factor sequences, including genes and pseudogenes, was determined from the analysis of 10 complete genomes from flowering plants, moss, Selaginella, unicellular green algae, and red algae. Our exhaustive genome-wide searches resulted in the discovery in each class of a greater number of homeobox genes than previously reported. All homeobox genes can be unambiguously classified by sequence evolutionary analysis into 14 distinct classes also charact...

  11. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  12. Instructional Systems Development: Conceptual Analysis and Comprehensive Bibliography

    Science.gov (United States)

    1976-02-01

    M. P., Daily, J. T. An analysis of elementary pilot performance from instructors' comments. Amer. Psychol., 1946, 1, 292. Creelman, J. A. Evaluation ... of approach training procedures. Report No. 2, Project No. NM 001-I09-T7, U. S. Naval School of Aviation Medicine, Pensacola, Florida, 1955. Creelman

  13. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  14. A Comprehensive Analysis of Authorship in Radiology Journals.

    Science.gov (United States)

    Dang, Wilfred; McInnes, Matthew D F; Kielar, Ania Z; Hong, Jiho

    2015-01-01

    The purpose of our study was to investigate authorship trends in radiology journals and whether International Committee of Medical Journal Editors (ICMJE) recommendations have had an impact on these trends. A secondary objective was to explore other variables associated with authorship trends. A retrospective, bibliometric analysis of 49 clinical radiology journals published from 1946-2013 was conducted. The following data were exported from MEDLINE (1946 to May 2014) for each article: authors' full names, year of publication, primary author institution information, language of publication and publication type. Microsoft Excel Visual Basic for Applications scripts were programmed to categorize the extracted data. Statistical analysis was performed to determine the overall mean number of authors per article over time, the impact of ICMJE guidelines, authorship frequency per journal, country of origin, article type and language of publication. 216,271 articles from 1946-2013 were included. A univariate analysis of the mean authorship frequency per year of all articles yielded a linear relationship between time and authorship frequency. The mean number of authors per article in 1946 (1.42) was found to have increased consistently by 0.07 authors/article per year (R² = 0.9728). Authorship frequency was further analyzed by journal, country of origin, article type and language of publication. Overall authorship for 49 radiology journals across 68 years has increased markedly, with no demonstrated impact from ICMJE guidelines. A higher number of authors per article was seen in articles from higher impact journals, European and Asian countries, original research articles, and journals that explicitly endorse the ICMJE guidelines.
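
    The kind of per-year trend analysis described above can be sketched in a few lines; the snippet below is only an illustration (the records list and values are hypothetical, not the study's data), showing how a mean-authors-per-year series could be built from exported MEDLINE fields and fitted with a linear trend.

```python
# Hypothetical sketch of the authorship-trend regression described above.
import numpy as np
from collections import defaultdict

# (publication year, number of authors) pairs; made-up values for illustration
records = [(1950, 1), (1950, 2), (1980, 3), (1980, 4), (2010, 6), (2010, 7)]

by_year = defaultdict(list)
for year, n_authors in records:
    by_year[year].append(n_authors)

years = np.array(sorted(by_year))
mean_authors = np.array([np.mean(by_year[y]) for y in years])

slope, intercept = np.polyfit(years, mean_authors, deg=1)   # authors/article per year
r_squared = np.corrcoef(years, mean_authors)[0, 1] ** 2
print(f"trend: {slope:.3f} authors/article per year, R^2 = {r_squared:.4f}")
```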

  15. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how the ECAP programme, developed by I.B.M., can be used to study the behaviour of an operational amplifier from different points of view: direct-current, alternating-current and transient-state analysis, optimisation of the open-loop gain, and study of the reliability. (author) [fr

  16. Computer controlled automated assay for comprehensive studies of enzyme kinetic parameters.

    Directory of Open Access Journals (Sweden)

    Felix Bonowski

    Full Text Available Stability and biological activity of proteins are highly dependent on their physicochemical environment. The development of realistic models of biological systems necessitates quantitative information on the response to changes of external conditions like pH, salinity and concentrations of substrates and allosteric modulators. Changes in just a few variable parameters rapidly lead to large numbers of experimental conditions, which go beyond the experimental capacity of most research groups. We implemented a computer-aided experimenting framework ("robot lab assistant") that allows us to parameterize abstract, human-readable descriptions of micro-plate-based experiments with variable parameters and execute them on a conventional 8-channel liquid handling robot fitted with a sensitive plate reader. A set of newly developed R packages translates the instructions into machine commands, executes them, collects the data and processes it without user interaction. By combining script-driven experimental planning, execution and data analysis, our system can react to experimental outcomes autonomously, allowing outcome-based iterative experimental strategies. The framework was applied in a response-surface-model-based iterative optimization of buffer conditions and an investigation of substrate-, allosteric-effector-, pH- and salt-dependent activity profiles of pyruvate kinase (PYK). A diprotic model of enzyme kinetics was used to model the combined effects of changing pH and substrate concentrations. The 8 parameters of the model could be estimated from a single two-hour experiment using nonlinear least-squares regression. The model with the estimated parameters successfully predicted the pH and PEP dependence of initial reaction rates, while the PEP-concentration-dependent shift of optimal pH could only be reproduced with a set of manually tweaked parameters. Differences between model predictions and experimental observations at low pH suggest additional protonation
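
    As a rough illustration of the diprotic fitting step mentioned above, the sketch below fits a bell-shaped two-pKa activity model to made-up pH/rate data with nonlinear least squares; it is written in Python rather than the authors' R packages, and the model form, parameter names and data are assumptions for illustration only.

```python
# Hedged sketch: fitting a diprotic (bell-shaped) pH-activity model.
import numpy as np
from scipy.optimize import curve_fit

def diprotic_rate(pH, v_max, pKa1, pKa2):
    """Maximum rate scaled by a two-protonation-state (diprotic) factor."""
    return v_max / (1.0 + 10.0 ** (pKa1 - pH) + 10.0 ** (pH - pKa2))

pH_obs = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5, 9.0])   # made-up data
v_obs = np.array([0.08, 0.21, 0.46, 0.74, 0.88, 0.83, 0.60, 0.31, 0.12])

(v_max, pKa1, pKa2), _ = curve_fit(diprotic_rate, pH_obs, v_obs, p0=[1.0, 6.0, 8.0])
print(f"v_max={v_max:.2f}, pKa1={pKa1:.2f}, pKa2={pKa2:.2f}")
```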

  17. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was raised in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem into a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms have contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, and the rule-based and network-searching work that followed not only expanded the databases but also built new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica turned traditional design into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, have been applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this a promising field with good prospects.

  18. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    Science.gov (United States)

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
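
    A drastically simplified sketch of the Lagrangian coupling idea follows: sample the local light intensity along a particle trajectory with a Beer-Lambert field and average a light-limited growth law over that history. It is not the authors' CFD/radiation model, and every parameter value below is an assumption for illustration.

```python
# Hedged sketch of trajectory-averaged, light-limited algal growth.
import numpy as np

I0 = 250.0            # incident light at the illuminated wall (assumed units)
k_ext = 200.0         # effective extinction coefficient, m^2/kg (assumed)
biomass = 0.5         # biomass concentration, kg/m^3 (assumed)
mu_max = 1.2 / 3600   # maximum specific growth rate, 1/s (assumed)
K_I = 100.0           # half-saturation light intensity (assumed)

def local_light(depth_from_wall):
    """Beer-Lambert attenuation with distance from the illuminated wall."""
    return I0 * np.exp(-k_ext * biomass * depth_from_wall)

def specific_growth(I):
    """Monod-type light-limited growth kinetics."""
    return mu_max * I / (K_I + I)

# wall distances (m) visited by one tracer particle, standing in for a CFD trajectory
rng = np.random.default_rng(0)
trajectory_depths = rng.uniform(0.0, 0.01, size=10_000)

mu_avg = specific_growth(local_light(trajectory_depths)).mean()
print(f"estimated specific growth rate: {mu_avg * 3600:.3f} 1/h")
```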

  19. Peripheral blood smear image analysis: A comprehensive review

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2014-01-01

    Full Text Available Peripheral blood smear image examination is a part of the routine work of every laboratory. The manual examination of these images is tedious, time-consuming and suffers from interobserver variation. This has motivated researchers to develop different algorithms and methods to automate peripheral blood smear image analysis. Image analysis itself consists of a sequence of steps: image segmentation, feature extraction and selection, and pattern classification. The image segmentation step addresses the problem of extracting the object or region of interest from the complicated peripheral blood smear image. Support vector machines (SVM) and artificial neural networks (ANNs) are two common approaches to image segmentation. Feature extraction and selection aims to derive descriptive characteristics of the extracted object, which are similar within the same object class and different between different objects. This facilitates the last step of the image analysis process: pattern classification. The goal of pattern classification is to assign a class to the selected features from a group of known classes. There are two types of classifier learning algorithms: supervised and unsupervised. Supervised learning algorithms predict the class of the object under test using training data of known classes. The training data have a predefined label for every class, and the learning algorithm can utilize these data to predict the class of a test object. Unsupervised learning algorithms use unlabeled training data and divide them into groups using similarity measurements. Unsupervised learning algorithms predict the group to which a new test object belongs based on the training data, without assigning an explicit class to that object. ANN, SVM, decision trees and K-nearest neighbors are possible approaches to classification. Increased discrimination may be obtained by combining several classifiers.
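
    The segmentation, feature extraction and classification pipeline the review describes can be pictured with a tiny supervised-learning sketch; the per-cell feature table below is synthetic and merely stands in for descriptors extracted from segmented cells, so this illustrates the idea rather than any of the reviewed systems.

```python
# Hedged sketch of a supervised cell-classification step with an SVM.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 4))      # synthetic features: area, perimeter, intensity, circularity
y = rng.integers(0, 3, size=300)   # synthetic cell-class labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```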

  20. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Full Text Available Introduction: Integrated care could not only fix up fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how ‘integrated care’ has developed. There is a need for a systematic bibliometric analysis studying the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy ‘(delivery of health care, integrated [MeSH Terms] OR integrated care [Title/Abstract])’ without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis by the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering by SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) all documents were recorded in 1646 different journals, of which 28 are core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains: the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provide researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.
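
    For readers who want to reproduce the growth-curve side of such a bibliometric study, a hedged sketch of per-year PubMed counts via the NCBI E-utilities is shown below; it is not the authors' pipeline, the year range is arbitrary, and a real study would add error handling and respect NCBI usage policies (API key, rate limits).

```python
# Hedged sketch: counting PubMed records per publication year for a MeSH/title query.
import time
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
QUERY = ('"delivery of health care, integrated"[MeSH Terms] '
         'OR "integrated care"[Title/Abstract]')

counts = {}
for year in range(1990, 1996):                      # small range for illustration
    params = {
        "db": "pubmed", "term": QUERY, "retmode": "json",
        "mindate": str(year), "maxdate": str(year), "datetype": "pdat", "retmax": 0,
    }
    reply = requests.get(ESEARCH, params=params, timeout=30).json()
    counts[year] = int(reply["esearchresult"]["count"])
    time.sleep(0.4)                                  # stay under the public request limit

print(counts)
```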

  1. Emotions while awaiting lung transplantation: A comprehensive qualitative analysis

    Science.gov (United States)

    Brügger, Aurelia; Aubert, John-David

    2014-01-01

    Patients awaiting lung transplantation are at risk of negative emotional and physical experiences. How do they talk about emotions? Semi-structured interviews were performed (15 patients). Categorical analysis focusing on emotion-related descriptions was organized into positive–negative–neutral descriptions: for primary and secondary emotions, evaluation processes, coping strategies, personal characteristics, emotion descriptions associated with physical states, (and) contexts were listed. Patients develop different strategies to maintain positive identity and attitude, while preserving significant others from extra emotional load. Results are discussed within various theoretical and research backgrounds, in emphasizing their importance in the definition of emotional support starting from the patient’s perspective. PMID:28070345

  2. Comprehensive safeguards evaluation methods and societal risk analysis

    International Nuclear Information System (INIS)

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures.

  3. Emotions while awaiting lung transplantation: A comprehensive qualitative analysis.

    Science.gov (United States)

    Brügger, Aurelia; Aubert, John-David; Piot-Ziegler, Chantal

    2014-07-01

    Patients awaiting lung transplantation are at risk of negative emotional and physical experiences. How do they talk about emotions? Semi-structured interviews were performed (15 patients). Categorical analysis focusing on emotion-related descriptions was organized into positive-negative-neutral descriptions: for primary and secondary emotions, evaluation processes, coping strategies, personal characteristics, emotion descriptions associated with physical states, (and) contexts were listed. Patients develop different strategies to maintain positive identity and attitude, while preserving significant others from extra emotional load. Results are discussed within various theoretical and research backgrounds, in emphasizing their importance in the definition of emotional support starting from the patient's perspective.

  4. Emotions while awaiting lung transplantation: A comprehensive qualitative analysis

    Directory of Open Access Journals (Sweden)

    Aurelia Brügger

    2014-12-01

    Full Text Available Patients awaiting lung transplantation are at risk of negative emotional and physical experiences. How do they talk about emotions? Semi-structured interviews were performed (15 patients). Categorical analysis focusing on emotion-related descriptions was organized into positive–negative–neutral descriptions: for primary and secondary emotions, evaluation processes, coping strategies, personal characteristics, emotion descriptions associated with physical states, (and) contexts were listed. Patients develop different strategies to maintain positive identity and attitude, while preserving significant others from extra emotional load. Results are discussed within various theoretical and research backgrounds, in emphasizing their importance in the definition of emotional support starting from the patient’s perspective.

  5. A comprehensive fuel nuclide analysis at the reprocessing plant

    International Nuclear Information System (INIS)

    Arenz, H.J.; Koch, L.

    1983-01-01

    The composition of spent fuel can be determined by various methods. They rely partially on different information. Therefore, a synopsis of the results of all methods permits the detection of systematic errors and their explanation. Methods for determining the masses of fuel nuclides at the reprocessing input point range from pure calculations (shipper data) to purely experimental determinations (volumetric analysis). In between, a mix of ''fresh'' experimental results and ''historical'' data is used to establish a material balance. Deviations in the results obtained by the individual methods can be attributed to the information source, which is unique for the method in question. The methodology of the approach consists of three steps: by paired comparison of the operator analysis (usually volumetric or gravimetric) with remeasurements, the error components are determined on a batch-by-batch basis. Using the isotope correlation technique, the operator data as well as the remeasurements are checked on an inter-batch basis for outliers, precision and bias. Systematic errors can be uncovered by inter-lab comparison of remeasurements and confirmed by using historical information. Experience collected during the reprocessing of LWR fuel at two reprocessing plants proves the flexibility and effectiveness of this approach. An example is presented to demonstrate its capability in detecting outliers and determining systematic errors. (author)

  6. Clinical applications of cone beam computed tomography in endodontics: A comprehensive review.

    Science.gov (United States)

    Cohenca, Nestor; Shemesh, Hagay

    2015-09-01

    The use of cone beam computed tomography (CBCT) in endodontics has been extensively reported in the literature. Compared with traditional spiral computed tomography, limited field of view (FOV) CBCT delivers a fraction of the effective absorbed dose of radiation. The purpose of this manuscript is to review the applications and advantages of CBCT in managing advanced endodontic problems and complications while reducing radiation exposure during complex endodontic procedures. The benefits of the added diagnostic information provided by intraoperative CBCT images in select cases justify the risk associated with the limited level of radiation exposure.

  7. COMPREHENSIVE REVIEW OF AES AND RSA SECURITY ALGORITHMS IN CLOUD COMPUTING

    OpenAIRE

    Shubham Kansal*, Harkiran Kaur

    2017-01-01

    Cloud Computing is referred to as a revolutionary approach that has changed IT and business integration. It has benefits for almost every type of IT requirement: enterprises can use it to cut their IT costs, and individuals can use it as a storage solution with a disaster recovery solution. One major problem that exists with Cloud Computing, in the present scenario, is the security and privacy of the data. Encryption is the most important part of the security if you own a priva...

  8. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  9. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative execution of calibration and alignment, data distribution to regional sites, and end-user analysis. Grid tools provided by the LCG project were also exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  10. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies, providing global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor-quality GNSS observation data is also increasing, and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric products. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely from station to station, and the data quality of an individual station persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters, used in combination, can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
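
    One of the basic quantities such GNSS-based ionospheric work builds on is the slant total electron content obtained from the geometry-free combination of dual-frequency observations; the sketch below shows that standard relation for GPS L1/L2 pseudoranges. The pseudorange values are invented, and real processing also needs inter-frequency bias removal plus the quality screening the paper proposes.

```python
# Hedged sketch: slant TEC from the geometry-free dual-frequency combination.
F1 = 1575.42e6   # GPS L1 carrier frequency, Hz
F2 = 1227.60e6   # GPS L2 carrier frequency, Hz

def slant_tec_tecu(p1_m, p2_m):
    """Slant TEC in TEC units from L1/L2 pseudoranges given in metres."""
    factor = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))   # electrons/m^2 per metre of delay difference
    return factor * (p2_m - p1_m) / 1.0e16                # 1 TECU = 1e16 electrons/m^2

# hypothetical single-epoch pseudoranges (metres)
print(f"{slant_tec_tecu(22_000_000.0, 22_000_003.5):.1f} TECU")
```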

  11. COMPREHENSIVE ANALYSIS OF PREBIOTIC PROPENAL UP TO 660 GHz

    Energy Technology Data Exchange (ETDEWEB)

    Daly, A. M.; Bermúdez, C.; Kolesniková, L.; Alonso, J. L., E-mail: Adam.M.Daly@jpl.nasa.gov [Grupo de Espectroscopia Molecular (GEM), Edificio Quifima, Área de Química-Física, Laboratorios de Espectroscopia y Bioespectroscopia, Parque Científico UVa, Unidad Asociada CSIC, Universidad de Valladolid, E-47011 Valladolid (Spain)

    2015-06-22

    Since the interstellar detection of propenal is based on only two rotational transitions in the centimeter wave region, its high-resolution rotational spectrum has been measured up to 660 GHz and fully characterized by the assignment of more than 12,000 transitions, providing direct laboratory data to the astronomical community. Spectral assignments and analysis include transitions from the ground state of the trans and cis isomers, three trans-¹³C isotopologues, and ten excited vibrational states of the trans form. Combining new millimeter and submillimeter data with those from the far-infrared region has yielded the most precise set of spectroscopic constants of trans-propenal obtained to date. Newly determined rotational constants, centrifugal distortion constants, vibrational energies, and Coriolis and Fermi interaction constants are given with high accuracy and were used to predict transition frequencies and intensities over a wide frequency range. The results of this work should facilitate further astronomical observations of propenal in the interstellar medium.

  12. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  13. Computer Assisted Comprehension of Distant Worlds: Understanding Hunger Dynamics in Africa.

    Science.gov (United States)

    Moseley, William G.

    2001-01-01

    Describes a computer program called RiskMap. Explains that after completing an assignment on rural economics and hunger dynamics in Africa, students showed an increased level of understanding and felt that using RiskMap was helpful in learning the material. Includes references. (DAJ)

  14. Quantifying cross-linguistic influence with a computational model : A study of case-marking comprehension

    NARCIS (Netherlands)

    Matusevych, Yevgen; Alishahi, Afra; Backus, Albert

    2017-01-01

    Cross-linguistic influence (CLI) is one of the key phenomena in bilingual and second language learning. We propose a method for quantifying CLI in the use of linguistic constructions with the help of a computational model, which acquires constructions in two languages from bilingual input. We focus

  15. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-31

    Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis. Matthew ... of vital importance for hydrocarbon-fueled propulsion systems: fuel thermal performance as indicated by physical and chemical effects of cooling passage ... analysis. The selection and acquisition of a set of chemically diverse fuels is pivotal for a successful outcome since test method validation and

  16. Pixel-based analysis of comprehensive two-dimensional gas chromatograms (color plots) of petroleum

    DEFF Research Database (Denmark)

    Furbo, Søren; Hansen, Asger B.; Skov, Thomas

    2014-01-01

    We demonstrate how to process comprehensive two-dimensional gas chromatograms (GC × GC chromatograms) to remove nonsample information (artifacts), including background and retention time shifts. We also demonstrate how this, combined with further reduction of the influence of irrelevant information, allows for data analysis without integration or peak deconvolution (pixel-based analysis).

  17. Maintenance analysis method and operational feedback: a comprehensive maintenance management

    International Nuclear Information System (INIS)

    Mathieu Riou; Victor Planchon

    2006-01-01

    Full text of publication follows: The current periodic inspection program carried out on the COGEMA LOGISTICS casks is required by regulations and approved by the competent Authority. Safety and the casks' conformity to the corresponding certificate of approval are thereby guaranteed. Nonetheless, experience showed that some maintenance operations did not seem relevant or were redundant. It was therefore decided to completely rethink our maintenance program to reach the following objectives: - Set up the 'a minima' inspection operations required to guarantee safety and conformity to the certificate of approval, - Optimize criteria and periodicities of inspections taking into account: operational feedback, routine inspections carried out for each transport, regulations, environmental impact (ALARA, waste reduction,...), cost-effectiveness (reduction of the cask's immobilization period,...). - Set up a maintenance program in the Safety Analysis Reports that: stands alone (no need to check the specification or the certificate of approval to have the complete list of inspections mandatory to guarantee safety), gives objectives instead of means of control. This approach then needs to be re-evaluated by the competent Authority. The study's scope has been limited to the TN TM 12 cask family, which is intensively used; COGEMA LOGISTICS has extensive operational feedback on these casks. After agreement by the Authority, and in accordance with its requirements, the study will then be extended to the other casks belonging to the COGEMA LOGISTICS cask fleet. Currently, the term 'maintenance' is linked to 'Base maintenance' and 'Main maintenance' and implicitly means that the cask is immobilized for a given period. To emphasize the modifications, the term 'maintenance' is no longer used and is substituted by 'periodic upkeep'. By changing the name, COGEMA LOGISTICS wants to emphasize that: some operations can for instance be carried out while the cask is unloaded, periodicities are thought in terms of

  18. ANALYSIS, SELECTION AND RANKING OF FOREIGN MARKETS. A COMPREHENSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    LIVIU NEAMŢU

    2013-12-01

    Full Text Available Choosing the appropriate markets for growth and development is essential for a company that wishes to expand its business through international economic exchanges. In this case, however, foreign market research alone is not sufficient, even though it is an important chapter in the decision technology and an indispensable condition for achieving the firm's objectives. Whereas in marketing on the national market the market is already defined and requires no more than prospection and segmentation, in the case of the international market there is, beyond the research process, a need for a selection of markets and their ranking. Companies with this intention know little or nothing about the conditions offered by one new market or another. Therefore, they must go, step by step, through a complex, multilevel analysis process composed of selection and ranking of markets, followed by proper research through exploration and segmentation, which can lead to choosing the most profitable markets. In this regard, within this study, we propose a multi-criteria model for the selection and ranking of international development markets, allowing companies access to those markets which are in compliance with the company's development strategy.

  19. Analyse Factorielle d'une Batterie de Tests de Comprehension Orale et Ecrite (Factor Analysis of a Battery of Tests of Listening and Reading Comprehension). Melanges Pedagogiques, 1971.

    Science.gov (United States)

    Lonchamp, F.

    This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…

  20. Comprehensive analysis of NuMA variation in breast cancer

    Directory of Open Access Journals (Sweden)

    Aittomäki Kristiina

    2008-03-01

    Full Text Available Abstract Background A recent genome wide case-control association study identified NuMA region on 11q13 as a candidate locus for breast cancer susceptibility. Specifically, the variant Ala794Gly was suggested to be associated with increased risk of breast cancer. Methods In order to evaluate the NuMa gene for breast cancer susceptibility, we have here screened the entire coding region and exon-intron boundaries of NuMa in 92 familial breast cancer patients and constructed haplotypes of the identified variants. Five missense variants were further screened in 341 breast cancer cases with a positive family history and 368 controls. We examined the frequency of Ala794Gly in an extensive series of familial (n = 910) and unselected (n = 884) breast cancer cases and controls (n = 906), with a high power to detect the suggested breast cancer risk. We also tested if the variant is associated with histopathologic features of breast tumors. Results Screening of NuMA resulted in identification of 11 exonic variants and 12 variants in introns or untranslated regions. Five missense variants that were further screened in breast cancer cases with a positive family history and controls, were each carried on a unique haplotype. None of the variants, or the haplotypes represented by them, was associated with breast cancer risk although due to low power in this analysis, very low risk alleles may go unrecognized. The NuMA Ala794Gly showed no difference in frequency in the unselected breast cancer case series or familial case series compared to control cases. Furthermore, Ala794Gly did not show any significant association with histopathologic characteristics of the tumors, though Ala794Gly was slightly more frequent among unselected cases with lymph node involvement. Conclusion Our results do not support the role of NuMA variants as breast cancer susceptibility alleles.

  1. Eye laterality: a comprehensive analysis in refractive surgery candidates.

    Science.gov (United States)

    Linke, Stephan J; Druchkiv, Vasyl; Steinberg, Johannes; Richard, Gisbert; Katz, Toam

    2013-08-01

    To explore eye laterality (higher refractive error in one eye) and its association with refractive state, spherical/astigmatic anisometropia, age and sex in refractive surgery candidates. Medical records of 12 493 consecutive refractive surgery candidates were filtered. Refractive error (subjective and cycloplegic) was measured in each subject and correlated with eye laterality. Only subjects with corrected distance visual acuity (CDVA) of >20/22 in each eye were enrolled, to exclude amblyopia. Associations between eye laterality and refractive state were analysed by means of t-test, chi-squared test, Spearman's correlation and multivariate logistic regression analysis, respectively. There was no statistically significant difference in spherical equivalent between right (-3.47 ± 2.76 D) and left eyes (-3.47 ± 2.76 D, p = 0.510; Pearson's r = 0.948). The analysis revealed (I) a higher rate of right eye laterality for anisometropia >2.5 D in myopic (-5.64 ± 2.5 D versus -4.92 ± 2.6 D; p = 0.001) and in hyperopic (4.44 ± 1.69 D versus 3.04 ± 1.79 D; p = 0.025) subjects, (II) a tendency for left eye cylindrical laterality in myopic subjects, and (III) a higher prevalence of left eye laterality in myopic male subjects. (IV) Age did not show any significant impact on laterality. Over the full refractive spectrum, this study confirmed the previously described strong interocular refractive correlation but revealed a statistically significantly higher rate of right eye laterality for anisometropia >2.5 D. In general, our results support the use of data from one eye only in studies of ocular refraction. © 2013 The Authors. Acta Ophthalmologica © 2013 Acta Ophthalmologica Scandinavica Foundation.

  2. Comprehensive analysis of NuMA variation in breast cancer

    International Nuclear Information System (INIS)

    Kilpivaara, Outi; Rantanen, Matias; Tamminen, Anitta; Aittomäki, Kristiina; Blomqvist, Carl; Nevanlinna, Heli

    2008-01-01

    A recent genome wide case-control association study identified NuMA region on 11q13 as a candidate locus for breast cancer susceptibility. Specifically, the variant Ala794Gly was suggested to be associated with increased risk of breast cancer. In order to evaluate the NuMa gene for breast cancer susceptibility, we have here screened the entire coding region and exon-intron boundaries of NuMa in 92 familial breast cancer patients and constructed haplotypes of the identified variants. Five missense variants were further screened in 341 breast cancer cases with a positive family history and 368 controls. We examined the frequency of Ala794Gly in an extensive series of familial (n = 910) and unselected (n = 884) breast cancer cases and controls (n = 906), with a high power to detect the suggested breast cancer risk. We also tested if the variant is associated with histopathologic features of breast tumors. Screening of NuMA resulted in identification of 11 exonic variants and 12 variants in introns or untranslated regions. Five missense variants that were further screened in breast cancer cases with a positive family history and controls, were each carried on a unique haplotype. None of the variants, or the haplotypes represented by them, was associated with breast cancer risk although due to low power in this analysis, very low risk alleles may go unrecognized. The NuMA Ala794Gly showed no difference in frequency in the unselected breast cancer case series or familial case series compared to control cases. Furthermore, Ala794Gly did not show any significant association with histopathologic characteristics of the tumors, though Ala794Gly was slightly more frequent among unselected cases with lymph node involvement. Our results do not support the role of NuMA variants as breast cancer susceptibility alleles

  3. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author’s website and SpringerLink.
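
    As a taste of the SVD-based regression material mentioned above, the short sketch below solves a least-squares problem through the singular value decomposition and checks it against the library solver; it uses Python/NumPy rather than the book's MATLAB codes, and the data are synthetic.

```python
# Hedged sketch: least-squares regression via the singular value decomposition.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 3))                   # design matrix
x_true = np.array([2.0, -1.0, 0.5])
b = A @ x_true + 0.01 * rng.normal(size=50)    # noisy observations

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)                 # pseudoinverse (minimum-norm) solution

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)  # library reference solution
print(np.allclose(x_svd, x_ref))
```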

  4. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  5. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    Science.gov (United States)

    2016-12-08

    ... contexts businesses are also hesitant to make their proprietary data available to the cloud [1]. While in general sensitive data can be protected by ... project and scientific advances made towards each of the research thrusts throughout the project duration. Cloud computing enables

  6. Diagnostic and therapeutic implications of genetic heterogeneity in myeloid neoplasms uncovered by comprehensive mutational analysis

    Directory of Open Access Journals (Sweden)

    Sarah M. Choi

    2017-01-01

    Full Text Available While growing use of comprehensive mutational analysis has led to the discovery of innumerable genetic alterations associated with various myeloid neoplasms, the under-recognized phenomenon of genetic heterogeneity within such neoplasms creates a potential for diagnostic confusion. Here, we describe two cases where expanded mutational testing led to amendment of an initial diagnosis of chronic myelogenous leukemia with subsequent altered treatment of each patient. We demonstrate the power of comprehensive testing in ensuring appropriate classification of genetically heterogeneous neoplasms, and emphasize thoughtful analysis of molecular and genetic data as an essential component of diagnosis and management.

  7. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, but using neutrons as the penetrating particle, there is in practice a nondestructive technique named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. It is possible to carry out a computed analysis of a film using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions, optical densities. The illuminator has been built specially for this application but can be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of Carl Zeiss Jena type, which has been adapted for this application. The video camera assures the capture of the image, which is stored and processed by the computer. A special program, SIMAG-NG, has been developed at INR Pitesti which, together with the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of some nuclear fuel pins alongside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  8. Social sciences via network analysis and computation

    CERN Document Server

    Kanduc, Tadej

    2015-01-01

    In recent years information and communication technologies have gained significant importance in the social sciences. Because there is such rapid growth of knowledge, methods and computer infrastructure, research can now seamlessly connect interdisciplinary fields such as business process management, data processing and mathematics. This study presents some of the latest results, practices and state-of-the-art approaches in network analysis, machine learning, data mining, data clustering and classifications in the contents of social sciences. It also covers various real-life examples such as t

  9. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are the most challenging issues in water engineering. Artificial neural network (ANN) is a common approach used to tackle this problem by using viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedure, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as, taxonomy and hybrid ANN models structures. Finally, current challenges and recommendations for future researches are indicated and new hybrid approaches are proposed.
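
    A toy example of the kind of ANN-based prediction the review surveys is given below: a small multilayer perceptron trained on lagged values of a synthetic streamflow-like series. The series, lag depth and network size are assumptions for illustration, not a model from any of the reviewed studies.

```python
# Hedged sketch: lag-based ANN prediction of a synthetic hydrological series.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
t = np.arange(1000)
flow = 50 + 20 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, size=t.size)

lags = 3  # predict flow[t] from the three previous values
X = np.column_stack([flow[lag: len(flow) - lags + lag] for lag in range(lags)])
y = flow[lags:]

split = 800
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print(f"test R^2: {model.score(X[split:], y[split:]):.3f}")
```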

  10. A comprehensive approach for computation and implementation of efficient electricity transmission network charges

    Energy Technology Data Exchange (ETDEWEB)

    Olmos, Luis; Perez-Arriaga, Ignacio J. [Instituto de Investigacion Tecnologica, Universidad Pontificia Comillas, Alberto Aguilera, 23, 28015 Madrid (Spain)

    2009-12-15

    This paper presents a comprehensive design of electricity transmission charges that are meant to recover regulated network costs. In addition, these charges must be able to meet a set of inter-related objectives. Most importantly, they should encourage potential network users to internalize transmission costs in their location decisions, while interfering as little as possible with the short-term behaviour of the agents in the power system, since this should be left to regulatory instruments in the operation time range. The paper also addresses all those implementation issues that are essential for the sound design of a system of transmission network charges: stability and predictability of the charges; a fair and efficient split between generation and demand charges; temporary measures to account for the low loading of most new lines; and the number and definition of the scenarios to be employed for the calculation, as well as the format of the final charges to be adopted: capacity, energy or per-customer charges. The application of the proposed method is illustrated with a realistic numerical example that is based on a single scenario of the 2006 winter peak in the Spanish power system. (author)

  11. Comprehensive analysis of shielding effectiveness for HDPE, BPE and concrete as candidate materials for neutron shielding

    International Nuclear Information System (INIS)

    Dhang, Prosenjit; Verma, Rishi; Shyam, Anurag

    2015-01-01

    In a compact accelerator-based DD neutron generator, the deuterium ions generated by the ion source are accelerated after extraction and bombard a deuterated titanium target. The emitted neutrons have a typical energy of ∼2.45 MeV. Utilization of these compact accelerator-based neutron generators, with yields of up to 10⁹ neutrons/second (DD), is under active consideration in many research laboratories for conducting active neutron interrogation experiments. An adequately shielded laboratory is mandatory for the effective and safe utilization of these generators for the intended applications. In this context, we report a comprehensive analysis of the shielding effectiveness of High Density Polyethylene (HDPE), Borated Polyethylene (BPE) and concrete as candidate materials for neutron shielding. In the shielding calculations, the neutron-induced scattering and absorption gamma dose has also been considered along with the neutron dose. In general, materials with a high hydrogen content are best suited for neutron shielding. The choice of shielding material is also driven by practical issues like economic viability and the availability of space. Our computational analysis results reveal that BPE sheets require the minimum wall thickness to attain a similar level of attenuation of the neutron and gamma dose. The added advantage of using borated polyethylene is that it reduces both the neutron and gamma dose by absorbing neutrons and producing lithium and alpha particles. It has also been realized that three factors must necessarily be considered when determining the optimum thickness of any shielding material: the use factor, the occupancy factor and the workload factor. (author)
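
    A back-of-the-envelope version of the thickness comparison such an analysis performs can be written with simple exponential attenuation; the tenth-value layers below are placeholder numbers, not the paper's computed values, so the sketch only illustrates how a dose-reduction target translates into wall thickness.

```python
# Hedged sketch: shield thickness from a dose-reduction factor and tenth-value layers.
import math

source_dose_rate = 100.0   # uSv/h at the wall without shielding (assumed)
target_dose_rate = 1.0     # uSv/h design limit outside the wall (assumed)

# hypothetical tenth-value layers for ~2.45 MeV neutrons, in cm (placeholders)
tenth_value_layer = {"HDPE": 12.0, "BPE": 10.0, "Concrete": 21.0}

decades = math.log10(source_dose_rate / target_dose_rate)   # attenuation in factors of 10
for material, tvl in tenth_value_layer.items():
    print(f"{material}: ~{decades * tvl:.0f} cm for a factor-{10**decades:.0f} reduction")
```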

  12. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  13. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
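
    To make the stochastic perturbation idea concrete, the sketch below uses SymPy (instead of MAPLE) to expand a hypothetical response function about the mean of a random parameter and read off second-order estimates of its mean and variance; the response function and symbol names are assumptions for illustration.

```python
# Hedged sketch: second-order stochastic perturbation moments via symbolic differentiation.
import sympy as sp

b, b0, sigma = sp.symbols("b b0 sigma", positive=True)
f = 1 / b**2   # hypothetical response function of the random parameter b

# second-order perturbation estimates for E[b] = b0 and Var(b) = sigma**2
mean_estimate = f.subs(b, b0) + sp.Rational(1, 2) * sp.diff(f, b, 2).subs(b, b0) * sigma**2
var_estimate = sp.diff(f, b).subs(b, b0) ** 2 * sigma**2

print(sp.simplify(mean_estimate))   # E[f(b)] ~ f(b0) + f''(b0) * sigma**2 / 2
print(sp.simplify(var_estimate))    # Var[f(b)] ~ (f'(b0))**2 * sigma**2
```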

  14. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, both the addition of new computers to the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
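
    A minimal sketch of a compartmental virus model of the kind described, with recruitment of new computers, removal of old ones and cure through antivirus software; the equations and parameter values below are illustrative stand-ins, not the model from the paper.

```python
import numpy as np
from scipy.integrate import odeint

# S: susceptible computers, I: infected computers.
# b: rate at which new computers join, d: removal rate of old computers,
# beta: infection contact rate, r: cure rate via antivirus software.
b, d, beta, r = 5.0, 0.01, 0.0005, 0.1

def virus_model(y, t):
    S, I = y
    dS = b - beta * S * I - d * S + r * I
    dI = beta * S * I - (d + r) * I
    return [dS, dI]

t = np.linspace(0, 400, 2000)
sol = odeint(virus_model, [500.0, 1.0], t)

# For this illustrative model the disease-free equilibrium has S* = b/d and the
# basic reproduction number is R0 = beta*S*/(d+r); the virus persists when R0 > 1.
R0 = beta * (b / d) / (d + r)
print("R0 =", round(R0, 2), "| infected computers at t=400:", round(sol[-1, 1], 1))
```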

  15. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis has been carried out for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  16. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Suleman Khan

    2014-01-01

    Full Text Available Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, confines in carrying out forensics in MCC is interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. It implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  17. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    Science.gov (United States)

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, confines in carrying out forensics in MCC is interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. It implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  18. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Science.gov (United States)

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through cloud has made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, confines in carrying out forensics in MCC is interrelated with the autonomous cloud hosting companies and their policies for restricted access to the digital content in the back-end cloud platforms. It implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  19. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
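
    The computer-calculus idea behind GRESS and ADGEN can be sketched with forward-mode automatic differentiation: each parameter is seeded with a unit derivative, the model is evaluated on dual numbers, and the resulting sensitivities are combined with parameter variances in the spirit of the deterministic uncertainty analysis. The toy model below is purely illustrative and is unrelated to PRESTO-II.

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation: value plus derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def exp(x):
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

def model(k1, k2):
    # Toy response standing in for a large computer model R(k1, k2).
    return k1 * exp(Dual(-1.0) * k2) + 2.0 * k2

# Sensitivities dR/dk1 and dR/dk2, obtained by seeding one parameter at a time.
dR_dk1 = model(Dual(3.0, 1.0), Dual(0.5, 0.0)).der
dR_dk2 = model(Dual(3.0, 0.0), Dual(0.5, 1.0)).der

# First-order deterministic propagation of parameter variances to the result.
var_k1, var_k2 = 0.04, 0.01
var_R = dR_dk1**2 * var_k1 + dR_dk2**2 * var_k2
print(f"dR/dk1={dR_dk1:.3f}  dR/dk2={dR_dk2:.3f}  Var(R)={var_R:.4f}")
```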

  20. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  1. Impact of comprehensive two-dimensional gas chromatography with mass spectrometry on food analysis.

    Science.gov (United States)

    Tranchida, Peter Q; Purcaro, Giorgia; Maimone, Mariarosa; Mondello, Luigi

    2016-01-01

    Comprehensive two-dimensional gas chromatography with mass spectrometry has been on the separation-science scene for about 15 years. This three-dimensional method has made a great positive impact on various fields of research, and among these that related to food analysis is certainly at the forefront. The present critical review is based on the use of comprehensive two-dimensional gas chromatography with mass spectrometry in the untargeted (general qualitative profiling and fingerprinting) and targeted analysis of food volatiles; attention is focused not only on its potential in such applications, but also on how recent advances in comprehensive two-dimensional gas chromatography with mass spectrometry will potentially be important for food analysis. Additionally, emphasis is devoted to the many instances in which straightforward gas chromatography with mass spectrometry is a sufficiently-powerful analytical tool. Finally, possible future scenarios in the comprehensive two-dimensional gas chromatography with mass spectrometry food analysis field are discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Comprehensive two-dimensional gas chromatography for the analysis of organohalogenated micro-contaminants

    NARCIS (Netherlands)

    Korytar, P.; Haglund, P.; Boer, de J.; Brinkman, U.A.Th.

    2006-01-01

    We explain the principles of comprehensive two-dimensional gas chromatography (GC × GC), and discuss key instrumental aspects - with emphasis on column combinations and mass spectrometric detection. As the main item of interest, we review the potential of GC × GC for the analysis of

  3. Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones

    Science.gov (United States)

    2015-07-01

    An analyzer conforms to specifications defined by the Security Toolbox; specifically, an analyzer encapsulates a…

  4. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  5. Use of the Comprehensive Inversion method for Swarm satellite data analysis

    DEFF Research Database (Denmark)

    Sabaka, T. J.; Tøffner-Clausen, Lars; Olsen, Nils

    2013-01-01

    An advanced algorithm, known as the “Comprehensive Inversion” (CI), is presented for the analysis of Swarm measurements to generate a consistent set of Level-2 data products to be delivered by the Swarm “Satellite Constellation Application and Research Facility” (SCARF) to the European Space Agency...

  6. L2 Reading Comprehension and Its Correlates: A Meta-Analysis

    Science.gov (United States)

    Jeon, Eun Hee; Yamashita, Junko

    2014-01-01

    The present meta-analysis examined the overall average correlation (weighted for sample size and corrected for measurement error) between passage-level second language (L2) reading comprehension and 10 key reading component variables investigated in the research domain. Four high-evidence correlates (with 18 or more accumulated effect sizes: L2…

  7. Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture

    Science.gov (United States)

    To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...

  8. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    Science.gov (United States)

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily

  9. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming the blood flow to be laminar, with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angled vessels, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
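
    The simplest analytical cross-check for such simulations is fully developed Poiseuille flow in a rigid straight tube, for which mean velocity, pressure gradient and wall shear stress have closed forms; the viscosity, flow rate and radius below are rough illustrative values, not the geometry or conditions of the presented aorta model.

```python
import math

mu = 3.5e-3    # dynamic viscosity of blood, Pa*s (assumed)
Q  = 8.3e-5    # volumetric flow rate, m^3/s (roughly 5 L/min, assumed)
R  = 0.0125    # vessel radius, m (assumed)

# Fully developed laminar (Poiseuille) flow in a rigid straight tube:
v_mean = Q / (math.pi * R**2)             # mean velocity, m/s
tau_w  = 4.0 * mu * Q / (math.pi * R**3)  # wall shear stress, Pa
dp_dx  = 8.0 * mu * Q / (math.pi * R**4)  # pressure gradient, Pa/m

print(f"mean velocity {v_mean:.2f} m/s, wall shear {tau_w:.2f} Pa, dp/dx {dp_dx:.1f} Pa/m")
```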

  10. A Comprehensive and Cost-Effective Computer Infrastructure for K-12 Schools

    Science.gov (United States)

    Warren, G. P.; Seaton, J. M.

    1996-01-01

    Since 1993, NASA Langley Research Center has been developing and implementing a low-cost Internet connection model, including system architecture, training, and support, to provide Internet access for an entire network of computers. This infrastructure allows local area networks which exceed 50 machines per school to independently access the complete functionality of the Internet by connecting to a central site, using state-of-the-art commercial modem technology, through a single standard telephone line. By locating high-cost resources at this central site and sharing these resources and their costs among the school districts throughout a region, a practical, efficient, and affordable infrastructure for providing scalable Internet connectivity has been developed. As the demand for faster Internet access grows, the model has a simple expansion path that eliminates the need to replace major system components and retrain personnel. Observations of Internet usage within an environment, particularly school classrooms, have shown that after an initial period of 'surfing,' the Internet traffic becomes repetitive. By automatically storing requested Internet information on a high-capacity networked disk drive at the local site (network-based disk caching), then updating this information only when it changes, well over 80 percent of the Internet traffic that leaves a location can be eliminated by retrieving the information from the local disk cache.

  11. Succinate overproduction: A case study of computational strain design using a comprehensive Escherichia coli kinetic model

    Directory of Open Access Journals (Sweden)

    Ali eKhodayari

    2015-01-01

    Full Text Available Computational strain design prediction accuracy has been the focus of many recent efforts through the selective integration of kinetic information into metabolic models. In general, kinetic model prediction quality is determined by the range and scope of genetic and/or environmental perturbations used during parameterization. In this effort, we apply the k-OptForce procedure on a kinetic model of E. coli core metabolism constructed using the Ensemble Modeling (EM) method and parameterized using multiple mutant strains' data under aerobic respiration with glucose as the carbon source. Minimal interventions are identified that improve succinate yield under both aerobic and anaerobic conditions to test the fidelity of model predictions under both genetic and environmental perturbations. Under aerobic conditions, k-OptForce identifies interventions that match existing experimental strategies, pointing to a number of unexplored flux redirections such as routing glyoxylate flux through the glycerate metabolism to improve succinate yield. Many of the identified interventions rely on the kinetic descriptions and would not be discoverable by a purely stoichiometric description. In contrast, under fermentative (anaerobic) conditions, k-OptForce fails to identify key interventions including up-regulation of anaplerotic reactions and elimination of competitive fermentative products. This is due to the fact that the pathways activated under anaerobic conditions were not properly parameterized, as only aerobic flux data were used in the model construction. This study sheds light on the importance of condition-specific model parameterization and provides insight into how to augment kinetic models so as to correctly respond to multiple environmental perturbations.

  12. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
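
    A minimal sketch of how minimal cut sets can be derived for a small fault tree by expanding OR and AND gates top-down and pruning non-minimal sets; this illustrates the concept only and is not the FTAP algorithm, which also handles complemented state variables (prime implicants) and modularisation.

```python
from itertools import product

# A small fault tree: TOP is an OR gate over gate G1 and basic event C;
# G1 is an AND gate over basic events A and B.
tree = {
    "TOP": ("OR",  ["G1", "C"]),
    "G1":  ("AND", ["A", "B"]),
}

def cut_sets(node):
    """Return the cut sets (frozensets of basic events) for a node."""
    if node not in tree:                      # basic event
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                          # union of the children's cut set lists
        return [cs for sets in child_sets for cs in sets]
    # AND gate: merge every combination of one cut set per child
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Discard any cut set that strictly contains another cut set."""
    return [s for s in sets if not any(t < s for t in sets)]

print(minimal(cut_sets("TOP")))   # minimal cut sets: {A, B} and {C}
```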

  13. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  14. [Semantic Network Analysis of Online News and Social Media Text Related to Comprehensive Nursing Care Service].

    Science.gov (United States)

    Kim, Minji; Choi, Mona; Youm, Yoosik

    2017-12-01

    As comprehensive nursing care service has gradually expanded, it has become necessary to explore the various opinions about it. The purpose of this study is to explore the large amount of text data regarding comprehensive nursing care service extracted from online news and social media by applying a semantic network analysis. The web pages of the Korean Nurses Association (KNA) News, major daily newspapers, and Twitter were crawled by searching the keyword 'comprehensive nursing care service' using Python. A morphological analysis was performed using KoNLPy. Nodes on a 'comprehensive nursing care service' cluster were selected, and frequency, edge weight, and degree centrality were calculated and visualized with Gephi for the semantic network. A total of 536 news pages and 464 tweets were analyzed. In the KNA News and major daily newspapers, 'nursing workforce' and 'nursing service' were highly rated in frequency, edge weight, and degree centrality. On Twitter, the most frequent nodes were 'National Health Insurance Service' and 'comprehensive nursing care service hospital.' The nodes with the highest edge weight were 'national health insurance,' 'wards without caregiver presence,' and 'caregiving costs.' 'National Health Insurance Service' was highest in degree centrality. This study provides an example of how to use atypical big data for a nursing issue through semantic network analysis to explore diverse perspectives surrounding the nursing community through various media sources. Applying semantic network analysis to online big data to gather information regarding various nursing issues would help to explore opinions for formulating and implementing nursing policies. © 2017 Korean Society of Nursing Science
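
    A minimal sketch of the co-occurrence and centrality steps of such a semantic network analysis, using networkx on a few hypothetical, already-tokenised posts; the study itself tokenised Korean text with KoNLPy and visualised the network in Gephi.

```python
from itertools import combinations
import networkx as nx

# Hypothetical, already-tokenised posts (the study used KoNLPy for Korean morphemes).
posts = [
    ["comprehensive", "nursing", "care", "service", "nursing", "workforce"],
    ["national", "health", "insurance", "service", "caregiving", "costs"],
    ["nursing", "workforce", "nursing", "service", "wards"],
]

G = nx.Graph()
for tokens in posts:
    for a, b in combinations(sorted(set(tokens)), 2):   # co-occurrence within a post
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1                       # edge weight = co-occurrence count
        else:
            G.add_edge(a, b, weight=1)

# Degree centrality, one of the measures used to rank nodes in the study.
centrality = nx.degree_centrality(G)
for node, c in sorted(centrality.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{node:15s} {c:.2f}")
```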

  15. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, an analysis of the statistical variations in subthreshold MOSFET high-frequency characteristics, defined in terms of gate capacitance and transition frequency, is presented, and comprehensive analytical models of such variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET manufacturing process, have been taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, as they are precise mathematical expressions in terms of physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm BSIM4-based benchmarks and have been found to be very accurate, with average percentage errors smaller than 5%. Hence, the performed analysis yields models which are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET-based VHF circuits, systems and applications.

  16. POLYMAT-C: a comprehensive SPSS program for computing the polychoric correlation matrix.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2015-09-01

    We provide a free noncommercial SPSS program that implements procedures for (a) obtaining the polychoric correlation matrix between a set of ordered categorical measures, so that it can be used as input for the SPSS factor analysis (FA) program; (b) testing the null hypothesis of zero population correlation for each element of the matrix by using appropriate simulation procedures; (c) obtaining valid and accurate confidence intervals via bootstrap resampling for those correlations found to be significant; and (d) performing, if necessary, a smoothing procedure that makes the matrix amenable to any FA estimation procedure. For the main purpose (a), the program uses a robust unified procedure that allows four different types of estimates to be obtained at the user's choice. Overall, we hope the program will be a very useful tool for the applied researcher, not only because it provides an appropriate input matrix for FA, but also because it allows the researcher to carefully check the appropriateness of the matrix for this purpose. The SPSS syntax, a short manual, and data files related to this article are available as Supplemental materials that are available for download with this article.

  17. Analysis on the security of cloud computing

    Science.gov (United States)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it will lead a revolution in IT and the information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems; this is the difficult point in improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  18. Incremental ALARA cost/benefit computer analysis

    International Nuclear Information System (INIS)

    Hamby, P.

    1987-01-01

    Commonwealth Edison Company has developed and is testing an enhanced Fortran computer program to be used for cost/benefit analysis of radiation-reduction projects at its six nuclear power facilities and corporate technical support groups. This paper describes a macro-driven IBM mainframe program comprising two different types of analyses: an Abbreviated Program with fixed costs and base values, and an extended Engineering Version for a detailed, more thorough and time-consuming approach. The extended engineering version breaks radiation exposure costs down into two components: health-related costs and replacement labor costs. According to user input, the program automatically adjusts these two cost components and applies the derivation to company economic analyses such as replacement power costs, carrying charges, debt interest, and capital investment cost. The results from one or more program runs using different parameters may be compared in order to determine the most appropriate ALARA dose-reduction technique. Benefits of this particular cost/benefit analysis technique include the flexibility to accommodate a wide range of user data and pre-job preparation, as well as the use of proven and standardized company economic equations
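
    A minimal sketch of the abbreviated type of calculation such a program performs: the monetised value of the dose averted by each candidate dose-reduction project is compared with its implementation cost. The project names, costs, doses and the dollars-per-person-rem value are placeholder assumptions, not Commonwealth Edison figures.

```python
# Candidate dose-reduction projects: (name, project cost in $, person-rem averted).
projects = [
    ("Additional temporary shielding", 40_000, 12.0),
    ("Remote handling tooling",        95_000, 18.0),
    ("Procedure/workflow change",      10_000,  3.5),
]

DOLLARS_PER_PERSON_REM = 10_000  # assumed monetary value of averted dose

for name, cost, rem_averted in projects:
    benefit = rem_averted * DOLLARS_PER_PERSON_REM
    net = benefit - cost
    print(f"{name:32s} benefit ${benefit:>9,.0f}  cost ${cost:>8,.0f}  net ${net:>9,.0f}")
```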

  19. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  20. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
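
    A minimal sketch of the discrete-event idea using the simpy library: service requests arrive with exponential inter-arrival times and compete for a limited pool of virtual servers, and the observed waiting times stand in for the performance measures discussed above. All parameters are illustrative, not values from the NASA model.

```python
import random
import simpy

SIM_TIME, SERVERS = 1000.0, 4
ARRIVAL_MEAN, SERVICE_MEAN = 2.0, 6.0      # minutes (illustrative)
waits = []

def request(env, pool):
    arrived = env.now
    with pool.request() as req:
        yield req                                          # queue for a free server
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

def generator(env, pool):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(request(env, pool))

random.seed(42)
env = simpy.Environment()
pool = simpy.Resource(env, capacity=SERVERS)               # the "cloud" server pool
env.process(generator(env, pool))
env.run(until=SIM_TIME)
print(f"{len(waits)} requests served, mean wait {sum(waits) / len(waits):.2f} min")
```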

  1. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  2. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra, on the use of computer codes for complex systems analysis. The computer codes dealt with are the CAFTS-SALP software package, FRANTIC, FTAP, the RALLY computer code package, and the BOUNDS codes. Two reference study cases were executed with each code. The results obtained from the logic/probabilistic analysis, as well as the computation times, are compared

  3. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  4. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  5. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    Science.gov (United States)

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. (1)H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 l-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.
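
    The multivariate step of such a study is typically a principal component analysis of the binned spectra; a minimal sketch with a synthetic samples-by-bins intensity matrix (not the garlic data) in which one spectral region grows systematically with processing time.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic "binned spectra": 20 samples x 200 bins, with one spectral region
# that grows systematically with processing time (e.g., 5-HMF signals).
X = rng.normal(0.0, 1.0, size=(20, 200))
processing_day = np.arange(20)
X[:, 50:60] += processing_day[:, None] * 0.5

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(abs(np.corrcoef(scores[:, 0], processing_day)[0, 1]))  # PC1 tracks the trend
```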

  6. Comprehensive sequence analysis of nine Usher syndrome genes in the UK National Collaborative Usher Study

    OpenAIRE

    Le Quesne Stabej, Polona; Saihan, Zubin; Rangesh, Nell; Steele-Stallard, Heather B; Ambrose, John; Coffey, Alison; Emmerson, Jenny; Haralambous, Elene; Hughes, Yasmin; Steel, Karen P; Luxon, Linda M; Webster, Andrew R; Bitner-Glindzicz, Maria

    2011-01-01

    Background Usher syndrome (USH) is an autosomal recessive disorder comprising retinitis pigmentosa, hearing loss and, in some cases, vestibular dysfunction. It is clinically and genetically heterogeneous with three distinctive clinical types (I–III) and nine Usher genes identified. This study is a comprehensive clinical and genetic analysis of 172 Usher patients and evaluates the contribution of digenic inheritance. Methods The genes MYO7A, USH1C, CDH23, PCDH15, USH1G, USH2A, GPR98, WHRN, CLR...

  7. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  8. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  9. Computational systems analysis of dopamine metabolism.

    Directory of Open Access Journals (Sweden)

    Zhen Qi

    2008-06-01

    Full Text Available A prominent feature of Parkinson's disease (PD) is the loss of dopamine in the striatum, and many therapeutic interventions for the disease are aimed at restoring dopamine signaling. Dopamine signaling includes the synthesis, storage, release, and recycling of dopamine in the presynaptic terminal and activation of pre- and post-synaptic receptors and various downstream signaling cascades. As an aid that might facilitate our understanding of dopamine dynamics in the pathogenesis and treatment in PD, we have begun to merge currently available information and expert knowledge regarding presynaptic dopamine homeostasis into a computational model, following the guidelines of biochemical systems theory. After subjecting our model to mathematical diagnosis and analysis, we made direct comparisons between model predictions and experimental observations and found that the model exhibited a high degree of predictive capacity with respect to genetic and pharmacological changes in gene expression or function. Our results suggest potential approaches to restoring the dopamine imbalance and the associated generation of oxidative stress. While the proposed model of dopamine metabolism is preliminary, future extensions and refinements may eventually serve as an in silico platform for prescreening potential therapeutics, identifying immediate side effects, screening for biomarkers, and assessing the impact of risk factors of the disease.

  10. Computational Analysis of Pharmacokinetic Behavior of Ampicillin

    Directory of Open Access Journals (Sweden)

    Mária Ďurišová

    2016-07-01

    Full Text Available The objective of this study was to perform a computational analysis of the pharmacokinetic behavior of ampicillin, using data from the literature. A method based on the theory of dynamic systems was used for modeling purposes. The method has been introduced to pharmacokinetics with the aim of contributing to the knowledge base in pharmacokinetics by providing a modeling method which enables researchers to develop mathematical models of various pharmacokinetic processes in an identical way, using identical model structures. A few examples of successful use of the modeling method considered here in pharmacokinetics can be found in full-text articles available free of charge at the website of the author, and in the example given in this study. The modeling method employed in this study can be used to develop a mathematical model of the pharmacokinetic behavior of any drug, under the condition that the pharmacokinetic behavior of the drug under study can be at least partially approximated using linear models.
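
    The paper itself uses a dynamic-systems modeling method; as a more conventional minimal sketch of pharmacokinetic model fitting, a one-compartment model with first-order absorption can be fitted by nonlinear least squares. The sampling times, concentrations and parameters below are synthetic, not the ampicillin data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_compartment(t, ka, ke, c0):
    """Oral one-compartment model: first-order absorption ka, elimination ke."""
    return c0 * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

# Synthetic sampling times (h) and concentrations (mg/L) with noise.
t_obs = np.array([0.25, 0.5, 1, 2, 3, 4, 6, 8])
rng = np.random.default_rng(1)
c_obs = one_compartment(t_obs, 1.8, 0.35, 10.0) + rng.normal(0, 0.2, t_obs.size)

popt, _ = curve_fit(one_compartment, t_obs, c_obs, p0=[1.0, 0.2, 5.0])
ka, ke, c0 = popt
print(f"ka={ka:.2f} 1/h  ke={ke:.2f} 1/h  elimination half-life={np.log(2) / ke:.2f} h")
```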

  11. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  12. Comprehensive optimisation of China’s energy prices, taxes and subsidy policies based on the dynamic computable general equilibrium model

    International Nuclear Information System (INIS)

    He, Y.X.; Liu, Y.Y.; Du, M.; Zhang, J.X.; Pang, Y.X.

    2015-01-01

    Highlights: • Energy policy is defined as a compilation of energy price, tax and subsidy policies. • The maximisation of total social benefit is the optimisation objective. • A more rational carbon tax ranges from 10 to 20 Yuan/ton under the current situation. • The optimal coefficient pricing is more conducive to maximising total social benefit. - Abstract: Under conditions of increasingly serious environmental pollution, a rational energy policy is of practical significance for energy conservation and emission reduction. This paper defines energy policy as the compilation of energy prices, taxes and subsidy policies. Moreover, it establishes an optimisation model of China's energy policy based on the dynamic computable general equilibrium model, which maximises the total social benefit, in order to explore the combined influences of a carbon tax, the sales pricing mechanism and the renewable energy fund policy. The results show that when the change rates of gross domestic product and the consumer price index are ±2% and ±5%, and the renewable energy supply structure ratio is 7%, the more reasonable carbon tax ranges from 10 to 20 Yuan/ton, and the optimal coefficient pricing mechanism is more conducive to the objective of maximising the total social benefit. From the perspective of optimising the overall energy policies, if the upper limit of the change rate in the consumer price index is 2.2%, the existing renewable energy fund should be improved

  13. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  14. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy

  15. iDC: A comprehensive toolkit for the analysis of residual dipolar couplings for macromolecular structure determination

    International Nuclear Information System (INIS)

    Wei Yufeng; Werner, Milton H.

    2006-01-01

    Measurement of residual dipolar couplings (RDCs) has become an important method for the determination and validation of protein or nucleic acid structures by NMR spectroscopy. A number of toolkits have been devised for the handling of RDC data which run in the Linux/Unix operating environment and require specifically formatted input files. The outputs from these programs, while informative, require format modification prior to the incorporation of this data into commonly used personal computer programs for manuscript preparation. To bridge the gap between analysis and publication, an easy-to-use, comprehensive toolkit for RDC analysis has been created, iDC. iDC is written for the WaveMetrics Igor Pro mathematics program, a widely used graphing and data analysis software program that runs on both Windows PC and Mac OS X computers. Experimental RDC values can be loaded into iDC using simple data formats accessible to Igor's tabular data function. The program can perform most useful RDC analyses, including alignment tensor estimation from a histogram of RDC occurrence versus values and order tensor analysis by singular value decomposition (SVD). SVD analysis can be performed on an entire structure family at once, a feature missing in other applications of this kind. iDC can also import from and export to several different commonly used programs for the analysis of RDC data (DC, PALES, REDCAT) and can prepare formatted files for RDC-based refinement of macromolecular structures using XPLOR-NIH, CNS and ARIA. The graphical user interface provides an easy-to-use I/O for data, structures and formatted outputs
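
    A minimal sketch of the SVD/least-squares order tensor fit that such toolkits perform: each RDC is a linear function of the five independent Saupe matrix elements, so stacking one row per measured coupling gives a linear system. The bond vectors, tensor values and noise level below are synthetic, and the maximal dipolar coupling constant is absorbed into the fitted elements for simplicity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic unit internuclear vectors (e.g., N-H bonds) in the molecular frame.
v = rng.normal(size=(40, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
x, y, z = v[:, 0], v[:, 1], v[:, 2]

# Each coupling: D = Syy*(y^2 - x^2) + Szz*(z^2 - x^2) + 2*Sxy*x*y + 2*Sxz*x*z + 2*Syz*y*z,
# with the maximal dipolar coupling absorbed into the Saupe elements for simplicity.
A = np.column_stack([y**2 - x**2, z**2 - x**2, 2 * x * y, 2 * x * z, 2 * y * z])

s_true = np.array([2.0e-4, 6.0e-4, 1.0e-4, -3.0e-4, 2.5e-4])   # synthetic order tensor
rdc = A @ s_true + rng.normal(0, 1e-5, size=40)                 # "measured" couplings

s_fit, *_ = np.linalg.lstsq(A, rdc, rcond=None)                 # SVD-based least squares
print(np.allclose(s_fit, s_true, atol=5e-5))                    # tensor recovered
```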

  16. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  17. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their wide acceptance by users. The consequences and impact of uninterrupted work in front of computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great efforts, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of them are blurred vision, visual fatigue and Dry Eye Syndrome (DES), due to inappropriate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to computer displays, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  18. The Effects of Visual Attention Span and Phonological Decoding in Reading Comprehension in Dyslexia: A Path Analysis

    OpenAIRE

    Chen, C.; Schneps, M.; Masyn, K.; Thomson, J.

    2016-01-01

    Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension whil...

  19. Comprehensive experimental analysis of nonlinear dynamics in an optically-injected semiconductor laser

    Directory of Open Access Journals (Sweden)

    Kevin Schires

    2011-09-01

    Full Text Available We present the first comprehensive experimental study, to our knowledge, of the routes between nonlinear dynamics induced in a semiconductor laser under external optical injection based on an analysis of time-averaged measurements of the optical and RF spectra and phasors of real-time series of the laser output. The different means of analysis are compared for several types of routes and the benefits of each are discussed in terms of the identification and mapping of the nonlinear dynamics. Finally, the results are presented in a novel audio/video format that describes the evolution of the dynamics with the injection parameters.

  20. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives; people use computers to bring convenience to their own lives, but at the same time there are many network information problems that require attention. This paper analyzes the information security of computer networks based on "big data" analysis, and puts forward some solutions.

  1. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  2. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is a major challenge in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. Statistical frequency ranking and topical graph-based ranking are two potentially powerful and leading unsupervised approaches in this area devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, in order to identify the major sources of error in these approaches. For the experimental analysis, we have selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactic errors and semantic errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that the performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
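
    A minimal sketch of the frequency-based baseline (TF-IDF) on a toy corpus using scikit-learn; KP-Miner adds candidate filtering and boosting factors, and TopicRank adds topical graph ranking, on top of this kind of scoring. The example documents are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "Ontology learning extracts key concepts from domain text collections.",
    "Graph based ranking scores candidate concepts by their topical centrality.",
    "Frequency based methods weight candidate terms with tf-idf statistics.",
]

# Uni- and bigram candidates, English stop words removed.
vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
X = vec.fit_transform(docs)
terms = vec.get_feature_names_out()

# Top-scoring candidate concepts for the first document.
row = X[0].toarray().ravel()
top = sorted(zip(terms, row), key=lambda kv: -kv[1])[:5]
print([term for term, score in top if score > 0])
```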

  3. Computer science: Data analysis meets quantum physics

    Science.gov (United States)

    Schramm, Steven

    2017-10-01

    A technique that combines machine learning and quantum computing has been used to identify the particles known as Higgs bosons. The method could find applications in many areas of science. See Letter p.375

  4. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    Science.gov (United States)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  5. Analysis On Security Of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Zunnurain Hussain

    2017-01-01

    Full Text Available In this paper the authors discuss the security issues and challenges faced by the industry in securing cloud computing and how these problems can be tackled. Cloud computing is a modern technique for sharing resources, such as data sharing and file sharing, without setting up one's own infrastructure, instead using third-party resources to avoid huge investment. It is very challenging these days to secure the communication between two users, although people use different encryption techniques.

  6. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.
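
    A minimal sketch of the tune-estimation step: take the spectrum of a (synthetic) turn-by-turn signal and locate the betatron line as the strongest peak in the fractional-tune range. Real Schottky spectra are incoherent and require substantially more averaging and processing, so this only illustrates the basic idea; it is not the algorithm from the report.

```python
import numpy as np

N = 4096                       # number of turns
q_true = 0.31                  # fractional betatron tune (synthetic)
turns = np.arange(N)
rng = np.random.default_rng(7)
signal = np.cos(2 * np.pi * q_true * turns + 1.0) + 0.5 * rng.normal(size=N)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(N)))
tunes = np.fft.rfftfreq(N, d=1.0)            # frequency axis in tune units (per turn)
peak = np.argmax(spectrum[1:]) + 1           # skip the DC bin
print(f"estimated tune: {tunes[peak]:.4f}")  # close to 0.31
```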

  7. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the newly developed computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed in JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling which occurs in the analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series messages from the code are shown on the terminal, so man-machine communication is possible; and further, data can be put in through a keyboard, so case studies according to the results of the analysis are possible. (auth.)
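
    The same kind of multiple-correlation and spectrum analysis can be sketched today with scipy.signal: estimate the cross power spectral density and coherence between two noisy channels sharing a common component. The signals below are synthetic, not the multivariable experimental data the code was built for.

```python
import numpy as np
from scipy import signal

fs = 100.0                       # sampling frequency, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)

common = np.sin(2 * np.pi * 5.0 * t)             # shared 5 Hz component
x = common + 0.5 * rng.normal(size=t.size)
y = 0.8 * common + 0.5 * rng.normal(size=t.size)

f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)        # cross power spectral density
f, Cxy = signal.coherence(x, y, fs=fs, nperseg=1024)  # squared coherence
print(f"peak coherence {Cxy.max():.2f} at {f[np.argmax(Cxy)]:.1f} Hz")
```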

  8. Comprehensive analysis of a straw-fired power plant in Vojvodina

    Directory of Open Access Journals (Sweden)

    Urošević Dragan M.

    2012-01-01

    Full Text Available In recent years, renewable energy sources have played an increasingly important role in potential energy production. The integration of renewable energy technologies into existing national energy systems has therefore become a major challenge for many countries. Given the importance of this matter, this paper presents a comprehensive analysis for the implementation of a biomass (straw) power plant. The analysis considers several key indicators: availability of biomass, regulation, reduction of greenhouse gas emissions, location, land use, electricity price and social impacts. It also takes into account the favorable price for electricity produced from biomass under national feed-in tariffs. To demonstrate all of the above indicators, a region in Serbia (the Province of Vojvodina) with significant biomass potential, especially in straw, is selected. The results of the analysis are validated through environmental and social aspects. Special attention is given to identifying risks for this application.

  9. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta) was a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005). CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van

  10. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the coming period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  11. Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites

    Science.gov (United States)

    Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.

    2012-08-01

    With the flourishing development of China's Internet market, user demand for map services of all kinds is rising continually and carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and define them as important strategic products of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model and to analyse their problems according to the evaluation results. Corresponding solutions are then proposed, providing theoretical and practical guidance for the future development of fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation and research; (b) a comprehensive evaluation index system for online map service websites is constructed, covering functions, layout, interaction design, colour and positioning, combined with data indexes such as timeliness, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy mathematical evaluation model, solving the difficulty of measuring map websites quantitatively.
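
    The fuzzy comprehensive evaluation step can be illustrated with a small sketch: an evaluation vector is obtained by composing an indicator weight vector with a membership matrix over the rating grades. The indicators, weights and membership degrees below are invented and are not the paper's data.

```python
# Minimal sketch of a fuzzy comprehensive evaluation of the kind described.
import numpy as np

indicators = ["functions", "layout", "interaction design", "timeliness", "accuracy"]
grades = ["excellent", "good", "fair", "poor"]

weights = np.array([0.30, 0.15, 0.15, 0.20, 0.20])    # assumed indicator weights, sum to 1

# Membership matrix R: row i gives the degree to which indicator i belongs to each grade
R = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.3, 0.4, 0.2, 0.1],
    [0.6, 0.3, 0.1, 0.0],
    [0.4, 0.4, 0.1, 0.1],
])

evaluation = weights @ R                               # weighted-average composition operator
evaluation /= evaluation.sum()                         # normalise

for grade, value in zip(grades, evaluation):
    print(f"{grade:10s} {value:.3f}")
print("overall grade:", grades[int(np.argmax(evaluation))])
```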

  12. COMPREHENSIVE EVALUATION AND ANALYSIS OF CHINA’S MAINSTREAM ONLINE MAP SERVICE WEBSITES

    Directory of Open Access Journals (Sweden)

    H. Zhang

    2012-08-01

    Full Text Available With the flourishing development of China's Internet market, user demand for map services of all kinds is rising continually and carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and define them as important strategic products of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model and to analyse their problems according to the evaluation results. Corresponding solutions are then proposed, providing theoretical and practical guidance for the future development of fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation and research; (b) a comprehensive evaluation index system for online map service websites is constructed, covering functions, layout, interaction design, colour and positioning, combined with data indexes such as timeliness, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy mathematical evaluation model, solving the difficulty of measuring map websites quantitatively.

  13. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  14. SANDPIPER I (A comprehensive analysis programme for liquid moderated UO2 lattices)

    International Nuclear Information System (INIS)

    Alpiar, R.A.

    1962-04-01

    Methods of calculation for light water moderated reactors have recently been reviewed in AEEW R64. Calculation schemes for lattice parameters were presented which depended on the use of a number of IBM 704 and Ferranti MERCURY computer programmes. SANDPIPER I is a comprehensive MERCURY programme designed to cover all the operations with a degree of accuracy adequate for survey calculations. The present version is restricted to regular or near regular UO 2 pin type lattices moderated by H 2 O, D 2 O, or organic liquids; it is planned to allow for greater flexibility in later versions of the programme. The present version is written in Autocode and requires a 4-drum machine. (author)

  15. THE ROLE OF GENDER IN READING COMPREHENSION: AN ANALYSIS OF COLLEGE-LEVEL EFL STUDENTS’ COMPREHENSION OF DIFFERENT GENRES

    Directory of Open Access Journals (Sweden)

    Didem Koban Koç

    2016-07-01

    Full Text Available The purpose of the present study is to examine the effects of gender on comprehending different types of genre. The study involved 60 first-year college students (30 males and 30 females) who were taking an advanced reading course at a government university in Turkey. The students were given three reading passages of different genres, namely historical fiction, essay and fantasy, and were asked to answer comprehension questions related to the passages. Descriptive statistics, one-way ANOVA and repeated measures ANOVA were employed to analyse the relationship between gender and the test scores for each text type. The results showed that (1) the participants, in general, were significantly better at understanding the essay than historical fiction and fantasy, (2) there was not a statistically significant difference between males and females regarding comprehension of the different types of genres, and (3) both the male and female participants were significantly better at understanding the essay than historical fiction and fantasy. The study offers suggestions regarding incorporating different types of genre in the classroom.

  16. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using the publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute to advances in preclinical research on pulmonary diseases.
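
    The expected-lung-volume check described above can be sketched as follows; the regression data, the new scan's volumes and the 25% deviation threshold are synthetic stand-ins, not the authors' values.

```python
# Illustrative sketch of the expected-lung-volume check: fit a regression
# between rib-cage volume and lung volume on normal scans, then flag a new
# scan whose segmented lung volume deviates strongly from the prediction.
import numpy as np

rng = np.random.default_rng(0)
rib_cage_ml = rng.uniform(40.0, 90.0, size=30)                 # approximated rib cage volumes
lung_ml = 0.45 * rib_cage_ml + rng.normal(0.0, 1.5, size=30)   # "normal" lung volumes

slope, intercept = np.polyfit(rib_cage_ml, lung_ml, deg=1)     # simple linear regression

def severe_pathology(rib_cage_volume: float, segmented_volume: float,
                     rel_tol: float = 0.25) -> bool:
    """True if the segmented lung volume deviates from the expectation by > rel_tol."""
    expected = slope * rib_cage_volume + intercept
    return abs(expected - segmented_volume) / expected > rel_tol

print(severe_pathology(rib_cage_volume=70.0, segmented_volume=18.0))   # likely True
```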

  17. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  18. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
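
    In its simplest discrete form, the forward model amounts to histogramming the distances between every pair of sample points in the structure. The sketch below does exactly that on random coordinates; the paper's recursive spherical-Bessel formulation is not reproduced here.

```python
# Sketch of the forward model in its simplest discrete form: the pair
# distribution function as a histogram of all pairwise distances. The points
# are random stand-ins for a real electron-density sampling.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
points = rng.normal(size=(500, 3)) * 20.0        # hypothetical scattering centres [Angstrom]

distances = pdist(points)                        # all pairwise distances
hist, edges = np.histogram(distances, bins=100, range=(0.0, distances.max()))
p_of_r = hist / hist.sum()                       # normalised pair distribution P(r)

r = 0.5 * (edges[:-1] + edges[1:])
print(f"most probable pair distance ~ {r[np.argmax(p_of_r)]:.1f} Angstrom")
```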

  19. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  20. Introducing People – Genre Analysis and Oral Comprehension and Oral Production Tasks

    Directory of Open Access Journals (Sweden)

    Keila Rocha Reis de Carvalho

    2012-02-01

    Full Text Available This paper aims at presenting an analysis of the genre introducing people and at suggesting listening comprehension and oral production tasks. This work was developed according to the characterization of the rhetorical organization of situations taken from seventeen films that contain the genre under analysis. Although several studies in the ESP area carried out recently (Andrade, 2003; Cardoso, 2003; Shergue, 2003; Belmonte, 2003; Serafini, 2003) have identified listening comprehension and oral production as the abilities that should be prioritized in an English course, much needs to be done, especially concerning the oral genres that take into account the language the learners of English as a second language need in their target situation. This work is based on Hutchinson & Waters' (1987) theoretical background on ESP, Swales' (1990) genre analysis, Ramos' (2004) pedagogical proposal, and also on Ellis' (2003) task concept. The familiarization of learners of English as a second language with this genre will provide them with the opportunity to better understand and use the English language in their academic and professional life.

  1. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress that amalgamates numerous stress conditions resulting in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19) under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, our analysis suggests that optimal preconditioning toward drying was predicted to be alkaline and high-pressure stress preconditioning. Taken together, we believe the presented strategy may serve as a prototypic example for the analysis and utility of employing quantitative-mass-spectrometry-based proteomics to study bacterial physiology.
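
    One common way to call differential expression in a dataset of this kind is a per-protein t-test followed by Benjamini-Hochberg correction; the sketch below illustrates that generic approach on a synthetic intensity matrix and is not the authors' exact statistical pipeline.

```python
# Sketch of a generic differential-expression call between a stress condition
# and a control: per-protein t-tests plus Benjamini-Hochberg FDR control.
# The intensity matrix is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_proteins = 1900
control = rng.normal(20.0, 0.2, size=(n_proteins, 3))    # log2 intensities, 3 replicates
stressed = rng.normal(20.0, 0.2, size=(n_proteins, 3))
stressed[:100] += 2.0                                     # spike in 100 "regulated" proteins

t_stat, p_values = stats.ttest_ind(stressed, control, axis=1)

# Benjamini-Hochberg adjustment at a 5% false discovery rate
order = np.argsort(p_values)
adjusted = p_values[order] * n_proteins / (np.arange(n_proteins) + 1)
adjusted = np.minimum.accumulate(adjusted[::-1])[::-1]    # enforce monotonicity
significant = np.empty(n_proteins, dtype=bool)
significant[order] = adjusted <= 0.05

print(f"{significant.sum()} proteins called differentially expressed")
```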

  2. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Computational and Physical Analysis of Catalytic Compounds

    Science.gov (United States)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important to use their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalytic ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The graph of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.

  4. Classification and Analysis of Computer Network Traffic

    DEFF Research Database (Denmark)

    Bujlow, Tomasz

    2014-01-01

    various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks...

  5. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  6. Analysis of airways in computed tomography

    DEFF Research Database (Denmark)

    Petersen, Jens

    Chronic Obstructive Pulmonary Disease (COPD) is major cause of death and disability world-wide. It affects lung function through destruction of lung tissue known as emphysema and inflammation of airways, leading to thickened airway walls and narrowed airway lumen. Computed Tomography (CT) imaging...

  7. Affect and Learning : a computational analysis

    NARCIS (Netherlands)

    Broekens, Douwe Joost

    2007-01-01

    In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation

  8. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    Science.gov (United States)

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. Copyright © 2016 John Wiley & Sons, Inc.
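
    Two of the steps MetaboAnalyst automates, normalisation and PCA, can be illustrated with a short sketch on a synthetic concentration table; this shows the generic workflow only and is not MetaboAnalyst's own code.

```python
# Sketch of a generic metabolomics preprocessing and PCA step on a synthetic
# samples-by-metabolites concentration table.
import numpy as np

rng = np.random.default_rng(7)
X = rng.lognormal(mean=2.0, sigma=0.8, size=(24, 150))   # 24 samples, 150 metabolites

# Log-transform and autoscale (mean-centre, unit variance per metabolite)
X = np.log2(X)
X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA via singular value decomposition
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                                            # sample scores
explained = S**2 / np.sum(S**2)

print(f"PC1 and PC2 explain {100 * explained[:2].sum():.1f}% of the variance")
print("first sample's scores:", np.round(scores[0, :2], 2))
```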

  9. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  10. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general-purpose computer system, THESEUS, is described; its initial use has been for magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  11. Computational analysis of nanofluids: A review

    Science.gov (United States)

    Qureshi, M. Zubair Akbar; Ashraf, Muhammad

    2018-02-01

    Nanofluids and heat transfer enhancement in real systems continue to be a widely researched area of nanotechnology. An effort has been made to give a comprehensive review of the development over time of different aspects of nanofluids. The characteristic features of nanofluids, for example, the dispersion of the nanoparticle volume fraction, the thermophoresis phenomenon, Brownian motion, improvement in thermal conductivity, and especially heat transfer enhancement, have been addressed from a mathematical perspective. The influence of important parameters such as particle loading, material, size and shape factor, base fluid type, temperature, additives, clustering and pH value has been considered. In addition, a summary chart is presented for a better understanding of the mathematical structure of Newtonian as well as non-Newtonian nanofluids. Some important results are discussed with a view to future work. This review article will be helpful for scientists and researchers.
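
    One of the classical relations a review of this kind typically covers is the Maxwell model for the effective thermal conductivity of a dilute suspension of spherical particles; a small sketch follows, with illustrative (not authoritative) property values for water and CuO.

```python
# Classical Maxwell model for the effective thermal conductivity of a dilute
# suspension of spherical nanoparticles. Property values are only illustrative.
def maxwell_k_eff(k_f: float, k_p: float, phi: float) -> float:
    """Effective conductivity of a nanofluid with particle volume fraction phi."""
    return k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / (k_p + 2 * k_f - phi * (k_p - k_f))

k_water, k_cuo = 0.613, 20.0        # W/(m K), approximate values for water and CuO
for phi in (0.01, 0.02, 0.05):
    print(f"phi = {phi:.2f}:  k_eff ~ {maxwell_k_eff(k_water, k_cuo, phi):.3f} W/(m K)")
```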

  12. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  13. Comprehensive analysis of temporal alterations in cellular proteome of Bacillus subtilis under curcumin treatment.

    Directory of Open Access Journals (Sweden)

    Panga Jaipal Reddy

    Full Text Available Curcumin is a natural dietary compound with antimicrobial activity against various gram-positive and gram-negative bacteria. This study aims to investigate the proteome-level alterations in Bacillus subtilis due to curcumin treatment and to identify its molecular/cellular targets in order to understand the mechanism of action. We have performed a comprehensive proteomic analysis of the B. subtilis AH75 strain at different time intervals of curcumin treatment (20, 60 and 120 min after drug exposure, three replicates) to compare the protein expression profiles using two complementary quantitative proteomic techniques, 2D-DIGE and iTRAQ. To the best of our knowledge, this is the first comprehensive longitudinal investigation describing the effect of curcumin treatment on the B. subtilis proteome. The proteomics analysis revealed several interesting targets such as UDP-N-acetylglucosamine 1-carboxyvinyltransferase 1, putative septation protein SpoVG and ATP-dependent Clp protease proteolytic subunit. Further, in silico pathway analysis using DAVID and KOBAS revealed modulation of pathways related to fatty acid metabolism and cell wall synthesis, which are crucial for cell viability. Our findings revealed that curcumin treatment led to inhibition of cell wall and fatty acid synthesis, in addition to differential expression of many crucial proteins involved in modulation of bacterial metabolism. Findings obtained from the proteomics analysis were further validated using the 5-cyano-2,3-ditolyl tetrazolium chloride (CTC) assay for respiratory activity, the resazurin assay for metabolic activity, and a membrane integrity assay based on potassium and inorganic phosphate leakage measurement. The gene expression analysis of selected cell wall biosynthesis enzymes strengthened the proteomics findings and indicated the major effect of curcumin on cell division.

  14. Computational analysis of ozonation in bubble columns

    International Nuclear Information System (INIS)

    Quinones-Bolanos, E.; Zhou, H.; Otten, L.

    2002-01-01

    This paper presents a new computational ozonation model based on the principle of computational fluid dynamics along with the kinetics of ozone decay and microbial inactivation to predict the performance of ozone disinfection in fine bubble columns. The model can be represented using a mixture two-phase flow model to simulate the hydrodynamics of the water flow and using two transport equations to track the concentration profiles of ozone and microorganisms along the height of the column, respectively. The applicability of this model was then demonstrated by comparing the simulated ozone concentrations with experimental measurements obtained from a pilot scale fine bubble column. One distinct advantage of this approach is that it does not require the prerequisite assumptions such as plug flow condition, perfect mixing, tanks-in-series, uniform radial or longitudinal dispersion in predicting the performance of disinfection contactors without carrying out expensive and tedious tracer studies. (author)
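
    Reduced to a well-mixed batch form, the two kinetic ingredients named above are first-order ozone decay coupled to Chick-Watson inactivation; the sketch below integrates that pair of equations with illustrative rate constants, and is not the paper's CFD model.

```python
# Minimal batch (well-mixed) sketch of ozone decay coupled to Chick-Watson
# microbial inactivation. Rate constants are illustrative, not fitted values.
import numpy as np
from scipy.integrate import solve_ivp

k_d = 0.05          # assumed ozone decay constant [1/s]
k_n = 0.8           # assumed inactivation rate constant [L/(mg s)]
c0, n0 = 2.0, 1e6   # initial ozone [mg/L] and microorganism count [CFU/L]

def rhs(t, y):
    c, n = y
    return [-k_d * c,            # first-order ozone self-decay
            -k_n * c * n]        # Chick-Watson inactivation, first order in C and N

sol = solve_ivp(rhs, (0.0, 120.0), [c0, n0], t_eval=np.linspace(0.0, 120.0, 7))
for t, c, n in zip(sol.t, *sol.y):
    print(f"t = {t:5.1f} s   ozone = {c:5.3f} mg/L   survivors = {n:12.3e}")
```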

  15. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    Directory of Open Access Journals (Sweden)

    Ikeda Noriaki

    2006-10-01

    Full Text Available Abstract Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the
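
    The core calculation that MIX, STATA's metan and CMA all share is inverse-variance pooling of study effects; the sketch below shows fixed-effect pooling of log odds ratios for a few made-up studies.

```python
# Sketch of fixed-effect inverse-variance meta-analysis pooling; the study
# odds ratios and confidence intervals are invented for illustration.
import math

# (odds ratio, lower 95% CI, upper 95% CI) for a few hypothetical studies
studies = [(0.85, 0.60, 1.20), (1.10, 0.80, 1.50), (0.95, 0.70, 1.30)]

weights, weighted_effects = [], []
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the 95% CI
    w = 1.0 / se**2                                   # inverse-variance weight
    weights.append(w)
    weighted_effects.append(w * log_or)

pooled_log_or = sum(weighted_effects) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled OR = {math.exp(pooled_log_or):.2f} "
      f"(95% CI {math.exp(pooled_log_or - 1.96 * pooled_se):.2f}"
      f"-{math.exp(pooled_log_or + 1.96 * pooled_se):.2f})")
```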

  16. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  17. [Recent advances in analysis of petroleum geological samples by comprehensive two-dimensional gas chromatography].

    Science.gov (United States)

    Gao, Xuanbo; Chang, Zhenyang; Dai, Wei; Tong, Ting; Zhang, Wanfeng; He, Sheng; Zhu, Shukui

    2014-10-01

    Abundant geochemical information can be acquired by analyzing the chemical compositions of petroleum geological samples. The information obtained from the analysis provides scientific evidence for petroleum exploration. However, these samples are complicated and can be easily influenced by physical (e.g. evaporation, emulsification, natural dispersion, dissolution and sorption), chemical (photodegradation) and biological (mainly microbial degradation) weathering processes. Therefore, it is very difficult to analyze petroleum geological samples, and they cannot be effectively separated by traditional gas chromatography/mass spectrometry. A newly developed separation technique, comprehensive two-dimensional gas chromatography (GC x GC), has unique advantages in complex sample analysis, and recently it has been applied to petroleum geological samples. This article mainly reviews the research progress of the last five years, the main problems, and future research directions for GC x GC applied in the area of petroleum geology.

  18. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  19. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  20. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  1. Practical guidelines for the comprehensive analysis of ChIP-seq data.

    Directory of Open Access Journals (Sweden)

    Timothy Bailey

    Full Text Available Mapping the chromosomal locations of transcription factors, nucleosomes, histone modifications, chromatin remodeling enzymes, chaperones, and polymerases is one of the key tasks of modern biology, as evidenced by the Encyclopedia of DNA Elements (ENCODE Project. To this end, chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq is the standard methodology. Mapping such protein-DNA interactions in vivo using ChIP-seq presents multiple challenges not only in sample preparation and sequencing but also for computational analysis. Here, we present step-by-step guidelines for the computational analysis of ChIP-seq data. We address all the major steps in the analysis of ChIP-seq data: sequencing depth selection, quality checking, mapping, data normalization, assessment of reproducibility, peak calling, differential binding analysis, controlling the false discovery rate, peak annotation, visualization, and motif analysis. At each step in our guidelines we discuss some of the software tools most frequently used. We also highlight the challenges and problems associated with each step in ChIP-seq data analysis. We present a concise workflow for the analysis of ChIP-seq data in Figure 1 that complements and expands on the recommendations of the ENCODE and modENCODE projects. Each step in the workflow is described in detail in the following sections.

  2. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
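
    The reason such analyses parallelise so well is that each resampling or permutation iteration is independent; the sketch below spreads a permutation test over local worker processes on synthetic data, which is the same pattern a cluster or cloud node exploits.

```python
# Sketch of an embarrassingly parallel permutation test: every iteration is
# independent, so the work can be spread over worker processes. Data are synthetic.
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, size=50)
group_b = rng.normal(0.4, 1.0, size=50)
observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

def one_permutation(seed: int) -> float:
    """Difference of means after a random relabelling of the samples."""
    local = np.random.default_rng(seed)
    perm = local.permutation(pooled)
    return perm[:50].mean() - perm[50:].mean()

if __name__ == "__main__":
    with Pool(processes=4) as pool:                      # 4 workers, adjust to the machine
        null = pool.map(one_permutation, range(10_000))  # independent iterations
    p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
    print(f"permutation p-value: {p_value:.4f}")
```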

  3. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of the computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing the computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers for high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
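
    Building the co-authorship graph itself is straightforward: every pair of authors on a paper becomes an edge, and repeated collaborations increase the edge weight. The sketch below does this with networkx on invented author lists; the study's actual input was Scopus records from 2004-2013.

```python
# Sketch of building a co-authorship network from article author lists and
# ranking authors by degree. The author lists are invented.
import itertools
import networkx as nx

papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Choi"],
    ["Lee", "Park", "Choi"],
    ["Park", "Jung"],
]

G = nx.Graph()
for authors in papers:
    for a, b in itertools.combinations(authors, 2):   # every author pair on a paper
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1                    # repeated collaborations add weight
        else:
            G.add_edge(a, b, weight=1)

ranking = sorted(G.degree(), key=lambda kv: kv[1], reverse=True)
print("co-authors per author:", ranking)
```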

  4. Isogeometric analysis : a calculus for computational mechanics

    NARCIS (Netherlands)

    Benson, D.J.; Borst, de R.; Hughes, T.J.R.; Scott, M.A.; Verhoosel, C.V.; Topping, B.H.V.; Adam, J.M.; Pallarés, F.J.; Bru, R.; Romero, M.L.

    2010-01-01

    The first paper on isogeometric analysis appeared only five years ago [1], and the first book appeared last year [2]. Progress has been rapid. Isogeometric analysis has been applied to a wide variety of problems in solids, fluids and fluid-structure interactions. Superior accuracy to traditional

  5. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise, even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application work flows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries, among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple work flow execution framework to integrate, deploy, and execute pipelines and applications and (2) a progress service to monitor work flows and sub

  6. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  7. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  8. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...
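
    For the shear-frame model mentioned above, the natural frequencies follow from the generalised eigenvalue problem K phi = omega^2 M phi with lumped storey masses and inter-storey stiffnesses; the sketch below solves a three-storey example with invented values.

```python
# Sketch of the eigenvalue problem a shear-frame dynamic analysis solves:
# lumped storey masses, inter-storey stiffnesses, natural frequencies from
# K phi = omega^2 M phi. All numbers are illustrative.
import numpy as np
from scipy.linalg import eigh

m = np.array([2.0e5, 2.0e5, 1.5e5])        # assumed storey masses [kg]
k = np.array([3.0e8, 2.5e8, 2.0e8])        # assumed inter-storey stiffnesses [N/m]

M = np.diag(m)
K = np.array([
    [k[0] + k[1], -k[1],        0.0],
    [-k[1],       k[1] + k[2], -k[2]],
    [0.0,         -k[2],        k[2]],
])

eigvals, modes = eigh(K, M)                # generalised symmetric eigenproblem
freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)
print("natural frequencies [Hz]:", np.round(freqs_hz, 2))
```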

  9. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. computer/typewriter use ≥4 vs. computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for both types of studies. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study are needed. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. FUNCTIONAL ANALYSIS OF FUTURE MUSIC ART TEACHERS’ TRAINING FOR SINGING ACTIVITY OF COMPREHENSIVE SCHOOL SENIOR STUDENTS

    Directory of Open Access Journals (Sweden)

    Ma Chen

    2017-04-01

    Full Text Available The article presents a functional analysis of future music art teachers' training for the singing activity of comprehensive school senior students. The issue is important because improving the training of educators and musicians contributes not only to professional self-actualisation, but also to encouraging the young generation to engage thoroughly with works of music art and to develop creatively during group music lessons. Extracurricular singing activity also plays an important part: it reveals artistic images to students, enriches their creative experience, forms their spiritual world, develops independent thinking and awakens creativity. The author identifies the main functions of future music art teachers' training: the system-and-value, information, communication, creative-and-transformative, and projective functions, and pays special attention to characterizing the features of each. The system-and-value function relates to the need to analyze the results of the educational process, which contributes to students' productive problem solving and to the main tasks of music training. The information function forms the subject background of music teachers' artistic and pedagogical activities. The communicative function is realized in a teacher's ability to develop students' initiative to plan cooperative activities, distribute duties, carry out instructions, coordinate joint work, and create situations for exerting educational influence. The analysis of the pedagogical and methodological literature shows that the creative-and-transformative function is manifested in the creative use of pedagogical and methodological ideas under specific pedagogical conditions. The projective function is thought to promote the fullest realization of the content of comprehensive and art education. Functional analysis of students' training of art faculties at pedagogical universities to

  11. Comprehensive RNA-Seq Analysis on the Regulation of Tomato Ripening by Exogenous Auxin.

    Directory of Open Access Journals (Sweden)

    Jiayin Li

    Full Text Available Auxin has been shown to modulate the fruit ripening process. However, the molecular mechanisms underlying auxin regulation of fruit ripening are still not clear. Illumina RNA sequencing was performed on mature green cherry tomato fruit 1 and 7 days after auxin treatment, with untreated fruit as a control. The results showed that exogenous auxin maintained system 1 ethylene synthesis and delayed the onset of system 2 ethylene synthesis and the ripening process. At the molecular level, genes associated with stress resistance were significantly up-regulated, but genes related to carotenoid metabolism, cell degradation and energy metabolism were strongly down-regulated by exogenous auxin. Furthermore, genes encoding DNA demethylases were inhibited by auxin, whereas genes encoding cytosine-5 DNA methyltransferases were induced, which contributed to the maintenance of high methylation levels in the nucleus and thus inhibited the ripening process. Additionally, exogenous auxin altered the expression patterns of ethylene and auxin signaling-related genes that were induced or repressed in the normal ripening process, suggesting significant crosstalk between these two hormones during tomato ripening. The present work is the first comprehensive transcriptome analysis of auxin-treated tomato fruit during ripening. Our results provide comprehensive insights into the effects of auxin on the tomato ripening process and the mechanism of crosstalk between auxin and ethylene.

  12. A Principal Component Analysis/Fuzzy Comprehensive Evaluation for Rockburst Potential in Kimberlite

    Science.gov (United States)

    Pu, Yuanyuan; Apel, Derek; Xu, Huawei

    2018-02-01

    Kimberlite is an igneous rock which sometimes bears diamonds. Most of the diamonds mined in the world today are found in kimberlite ores. Burst potential in kimberlite has not been investigated, because kimberlite is mostly mined using open-pit mining, which poses very little threat of rock bursting. However, as the mining depth keeps increasing, the mines convert to underground mining methods, which can pose a threat of rock bursting in kimberlite. This paper focuses on the burst potential of kimberlite at a diamond mine in northern Canada. A combined model with the methods of principal component analysis (PCA) and fuzzy comprehensive evaluation (FCE) is developed to process data from 12 different locations in kimberlite pipes. Based on calculated 12 fuzzy evaluation vectors, 8 locations show a moderate burst potential, 2 locations show no burst potential, and 2 locations show strong and violent burst potential, respectively. Using statistical principles, a Mahalanobis distance is adopted to build a comprehensive fuzzy evaluation vector for the whole mine and the final evaluation for burst potential is moderate, which is verified by a practical rockbursting situation at mine site.

  13. Analysis of oxidised heavy paraffinic products by high temperature comprehensive two-dimensional gas chromatography.

    Science.gov (United States)

    Potgieter, H; Bekker, R; Beigley, J; Rohwer, E

    2017-08-04

    Heavy petroleum fractions are produced during crude and synthetic crude oil refining processes and they need to be upgraded to useable products to increase their market value. Usually these fractions are upgraded to fuel products by hydrocracking, hydroisomerization and hydrogenation processes. These fractions are also upgraded to other high-value commercial products like lubricant oils and waxes by distillation, hydrogenation, and oxidation and/or blending. Oxidation of hydrogenated heavy paraffinic fractions produces high-value products that contain a variety of oxygenates, and the characterization of these heavy oxygenates is very important for the control of oxidation processes. Traditionally, titrimetric procedures are used to monitor oxygenate formation; however, these procedures are tedious and lack selectivity toward specific oxygenate classes in complex matrices. Comprehensive two-dimensional gas chromatography (GC×GC) is a way of increasing peak capacity for the comprehensive analysis of complex samples. Other groups have used HT-GC×GC to extend the carbon number range attainable by GC×GC and have optimised HT-GC×GC parameters for the separation of aromatics, nitrogen-containing compounds as well as sulphur-containing compounds in heavy petroleum fractions. HT-GC×GC column combinations for the separation of oxygenates in oxidised heavy paraffinic fractions are optimised in this study. The advantages of the HT-GC×GC method in the monitoring of the oxidation reactions of heavy paraffinic fraction samples are illustrated. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  15. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  16. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  19. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  20. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  1. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Some computer systems have been studied for the management of nuclear power reactors. The software systems COSMARD and DEXUS, which have been developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data system development for virtual reality. (author)

  2. System Matrix Analysis for Computed Tomography Imaging

    Science.gov (United States)

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high-quality CT images be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  3. Computational analysis of sequence selection mechanisms.

    Science.gov (United States)

    Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron

    2004-04-01

    Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.

  4. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  5. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small scale cluster. Each processor involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs the Ubuntu 14.04 LINUX environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, using a simple MPI Hello program written in the C language. Additionally, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were done by executing the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors the time required to solve the problem decreases; the time required for the calculation is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware which is capable of higher computing power than a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
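
    The communication test mentioned above was a simple MPI Hello program written in C. A comparable sketch in Python using mpi4py (an assumed substitute for the C/MPICH2 setup described in the record) illustrates the same rank/size check plus a minimal message-passing round trip.

        # hello_mpi.py -- minimal communication test for a small cluster, in the
        # spirit of the MPI Hello program described above (the original was in C).
        # Run with e.g.: mpiexec -n 8 python hello_mpi.py
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()                 # id of this process
        size = comm.Get_size()                 # total number of processes
        node = MPI.Get_processor_name()

        print(f"Hello from rank {rank} of {size} on node {node}")

        # Simple check that messages actually pass between processes:
        # rank 0 gathers one value from every other rank.
        gathered = comm.gather(rank, root=0)
        if rank == 0:
            print("gathered ranks:", gathered)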

  6. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG&G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  7. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    Science.gov (United States)

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  8. Generic dynamic wind turbine models for power system stability analysis: A comprehensive review

    DEFF Research Database (Denmark)

    Honrubia-Escribano, A.; Gómez-Lázaro, E.; Fortmann, J.

    2018-01-01

    In recent years, international working groups, mainly from the International Electrotechnical Commission (IEC) and the Western Electricity Coordinating Council (WECC), have made a major effort to develop generic (also known as simplified or standard) dynamic wind turbine models to be used for power system stability analysis. These models are required by power system operators to conduct the planning and operation activities of their networks, since the use of detailed manufacturer models is not practical. This paper presents a comprehensive review of the work done in this field, based on the results obtained by the IEC and WECC working groups in the course of their research, which have motivated the publication of the IEC 61400-27 in February 2015. The final published versions of the generic models developed according to the existing four wind turbine technology types are detailed, highlighting

  9. Comprehensive analysis of the specificity of transcription activator-like effector nucleases

    DEFF Research Database (Denmark)

    Juillerat, Alexandre; Dubois, Gwendoline; Valton, Julien

    2014-01-01

    A key issue when designing and using DNA-targeting nucleases is specificity. Ideally, an optimal DNA-targeting tool has only one recognition site within a genomic sequence. In practice, however, almost all designer nucleases available today can accommodate one to several mutations within their target site. The ability to predict the specificity of targeting is thus highly desirable. Here, we describe the first comprehensive experimental study focused on the specificity of the four commonly used repeat variable diresidues (RVDs; NI:A, HD:C, NN:G and NG:T) incorporated in transcription activator-like effector nucleases (TALEN). The analysis of >15 500 unique TALEN/DNA cleavage profiles allowed us to monitor the specificity gradient of the RVDs along a TALEN/DNA binding array and to present a specificity scoring matrix for RVD/nucleotide association. Furthermore, we report that TALEN can only

  10. Computational Analysis of Spray Jet Flames

    Science.gov (United States)

    Jain, Utsav

    There is a boost in the utilization of renewable sources of energy, but because of high-energy-density applications, combustion will never be obsolete. Spray combustion is a type of multiphase combustion which has tremendous engineering applications in different fields, varying from energy conversion devices to rocket propulsion systems. Developing accurate computational models for turbulent spray combustion is vital for improving the design of combustors and making them energy efficient. Flamelet models have been extensively used for gas phase combustion because of their relatively low computational cost to model the turbulence-chemistry interaction using a low-dimensional manifold approach. This framework is designed for gas phase non-premixed combustion and its implementation is not very straightforward for multiphase and multi-regime combustion such as spray combustion. This is because of the use of a conserved scalar and various flamelet-related assumptions. Mixture fraction has been popularly employed as a conserved scalar and hence used to parameterize the characteristics of gaseous flamelets. However, for spray combustion, the mixture fraction is not monotonic and does not give a unique mapping in order to parameterize the structure of spray flames. In order to develop a flamelet-type model for spray flames, a new variable called the mixing variable is introduced which acts as an ideal conserved scalar and takes into account the convection and evaporation of fuel droplets. In addition to the conserved scalar, it has been observed that though gaseous flamelets can be characterized by the conserved scalar and its dissipation, this might not be true for spray flamelets. Droplet dynamics has a significant influence on the spray flamelet and because of effects such as flame penetration of droplets and oscillation of droplets across the stagnation plane, it becomes important to accommodate their influence in the flamelet formulation. In order to recognize the

  11. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  12. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanisms of action, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  13. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  14. A comprehensive analysis of high-magnitude streamflow and trends in the Central Valley, California

    Science.gov (United States)

    Kocis, T. N.; Dahlke, H. E.

    2017-12-01

    California's climate is characterized by the largest precipitation and streamflow variability observed within the conterminous US. This, combined with chronic groundwater overdraft of 0.6-3.5 km³ yr⁻¹, creates the need to identify additional surface water sources available for groundwater recharge using methods such as agricultural groundwater banking, aquifer storage and recovery, and spreading basins. High-magnitude streamflow, i.e. flow above the 90th percentile, that exceeds environmental flow requirements and current surface water allocations under California water rights, could be a viable source of surface water for groundwater banking. Here, we present a comprehensive analysis of the magnitude, frequency, duration and timing of high-magnitude streamflow (HMF "metrics") over multiple time periods for 93 stream gauges covering the Sacramento, San Joaquin and Tulare basins in California. In addition, we present trend analyses conducted on the same dataset and all HMF metrics using generalized additive models, the Mann-Kendall trend test, and the Signal to Noise Ratio test. The results of the comprehensive analysis show, in short, that in an average year with HMF approximately 3.2 km³ of high-magnitude flow is exported from the entire Central Valley to the Sacramento-San Joaquin Delta, often at times when environmental flow requirements of the Delta and major rivers are exceeded. High-magnitude flow occurs, on average, during 7 and 4.7 out of 10 years in the Sacramento River and the San Joaquin-Tulare Basins, respectively, from just a few storm events (5-7 1-day peak events) lasting for a total of 25-30 days between November and April. Preliminary trend tests suggest that all HMF metrics show limited change over the last 50 years. As a whole, the results suggest that there is sufficient unmanaged surface water physically available to mitigate long-term groundwater overdraft in the Central Valley.
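
    The abstract defines high-magnitude flow via the 90th percentile and lists the Mann-Kendall test among the trend methods. A minimal sketch of both ingredients for a single gauge is given below; the synthetic daily flow series, the 30-year record length and the rough volume bookkeeping are illustrative assumptions, not the study's data or exact metric definitions.

        # Sketch: high-magnitude flow (above the 90th percentile) metrics and a
        # Mann-Kendall trend statistic for a single gauge. The daily flow series,
        # record length and units below are synthetic stand-ins.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        years = 30
        daily_flow = rng.gamma(shape=2.0, scale=30.0, size=365 * years)   # m^3/s

        threshold = np.percentile(daily_flow, 90)      # 90th-percentile flow
        hmf_days = daily_flow > threshold
        print(f"threshold = {threshold:.1f} m^3/s, HMF days per year = {hmf_days.sum() / years:.1f}")

        # Very rough annual volume of flow above the threshold (m^3 per year).
        excess = np.where(hmf_days, daily_flow - threshold, 0.0)
        annual_volume = excess.reshape(years, 365).sum(axis=1) * 86400

        def mann_kendall(x):
            """Mann-Kendall S statistic with a normal approximation (no tie correction)."""
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            p = 2 * (1 - stats.norm.cdf(abs(z)))
            return s, z, p

        s, z, p = mann_kendall(annual_volume)
        print(f"Mann-Kendall on annual HMF volume: S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}")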

  15. Computed tomographic analysis of urinary calculi

    International Nuclear Information System (INIS)

    Naito, Akira; Ito, Katsuhide; Ito, Shouko

    1986-01-01

    Computed tomography (CT) was employed in an effort to analyze the chemical composition of urinary calculi. Twenty-three surgically removed calculi were scanned in a water bath (in vitro study). Fourteen of them were scanned in the body (in vivo study). The calculi consisted of four types: mixed calcium oxalate and phosphate, mixed calcium carbonate and phosphate, magnesium ammonium phosphate, and uric acid. The in vitro study showed that the mean and maximum CT values of uric acid stones were significantly lower than those of the other three types of stones. This indicated that stones with less than 450 HU are composed of uric acid. In the in vivo study, CT did not help to differentiate the three types of urinary calculi, except for uric acid stones. Regarding the mean CT values, there was no correlation between the in vitro and in vivo studies. An experiment with commercially available drugs showed that CT values of urinary calculi were not dependent upon the composition, but dependent upon the density of the calculi. (Namekawa, K.)
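
    The 450 HU cut-off reported above amounts to a one-line decision rule. A hedged sketch is shown below; only the 450 HU threshold comes from the record, while the example measurements and the classify_stone helper are hypothetical.

        # Sketch of the decision rule suggested by the in vitro results above:
        # stones whose CT numbers stay below ~450 HU are flagged as uric acid.
        # The example measurements and the helper name are hypothetical.
        def classify_stone(mean_hu, max_hu, threshold_hu=450):
            """Coarse composition guess from the CT attenuation of a calculus."""
            if max_hu < threshold_hu:
                return "likely uric acid"
            return "calcium/phosphate-type (not separable by CT value alone)"

        measurements = {"stone A": (320, 410), "stone B": (780, 1150)}
        for name, (mean_hu, max_hu) in measurements.items():
            print(name, "->", classify_stone(mean_hu, max_hu))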

  16. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

    Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in the academic environment to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories we were able to examine 20, and identified that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the necessity of adapting actions for these environments that promote informational security, minimizing the incidence of external and/or internal attacks on the systems.

  17. Classification and Analysis of Computer Network Traffic

    OpenAIRE

    Bujlow, Tomasz

    2014-01-01

    Traffic monitoring and analysis can be done for multiple different reasons: to investigate the usage of network resources, assess the performance of network applications, adjust Quality of Service (QoS) policies in the network, log the traffic to comply with the law, or create realistic models of traffic for academic purposes. We define the objective of this thesis as finding a way to evaluate the performance of various applications in a high-speed Internet infrastructure. To satisfy the obje...

  18. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe will be compared with experimental results.

  19. Normal-Gamma-Bernoulli Peak Detection for Analysis of Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry Data.

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang

    2017-01-01

    Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much increased separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide applications of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized using the gamma distribution, and a new peak detection algorithm using the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches, namely the fast Fourier transform (FFT) and the first- and second-order delta methods (D1 and D2), are introduced. The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.
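
    The abstract notes that the normal-gamma combination at the heart of the NGB model has no closed-form solution and is handled numerically, with the FFT as one of the three approaches. The sketch below only illustrates that single ingredient, the FFT-based convolution of a normal (noise) density with a gamma (signal) density; the grid and distribution parameters are illustrative, and this is not the NGB peak detection algorithm itself.

        # Numerical density of S = N + G, the sum of an independent normal (noise)
        # and gamma (signal) component, obtained by FFT-based convolution of the
        # two sampled densities. Grid and parameters are illustrative only.
        import numpy as np
        from scipy import stats
        from scipy.signal import fftconvolve

        x = np.linspace(-50.0, 300.0, 4096)
        dx = x[1] - x[0]
        f_noise = stats.norm.pdf(x, loc=0.0, scale=5.0)       # normal part
        f_signal = stats.gamma.pdf(x, a=3.0, scale=20.0)      # gamma part

        # Linear convolution of the two densities, truncated back to the grid length.
        density_sum = fftconvolve(f_noise, f_signal)[: len(x)] * dx
        print("probability mass captured:", density_sum.sum() * dx)   # close to 1.0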

  20. Comprehensive structural analysis of the HCPB demo blanket under thermal, mechanical, electromagnetic and radiation induced loads

    International Nuclear Information System (INIS)

    Boccaccini, L.V.; Norajitra, P.; Ruatto, P.; Scaffidi-Argentina, F.

    1998-01-01

    For the helium-cooled pebble bed (HCPB) blanket, which is one of the two reference concepts studied within the European Demo Development Program, a comprehensive finite element (FEM) structural analysis has been performed. The analysis refers to the steady-state operating conditions of an outboard blanket segment. On the basis of a three-dimensional model of radial-toroidal sections of the segment box, thermal stresses caused by the temperature gradients in the blanket structure have been calculated. Furthermore, the mechanical loads due to coolant pressure in normal operating conditions as well as an accidental over-pressurization of the blanket box have been accounted for. The stresses caused by a central plasma major disruption from an initial current of 20 MA to zero in 20 ms have been also taken into account. Radiation-induced dimensional changes of breeder and multiplier material caused by both helium production and neutron damage, have also been evaluated and discussed. All the above loads have been combined as input for a FEM stress analysis and the resulting stress distribution has been evaluated according to the American Society of Mechanical Engineers (ASME) norms. (orig.)

  1. Comprehensive neutron cross-section and secondary energy distribution uncertainty analysis for a fusion reactor

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; LaBauve, R.J.; Young, P.G.

    1980-05-01

    On the example of General Atomic's well-documented Power Generating Fusion Reactor (PGFR) design, this report exercises a comprehensive neutron cross-section and secondary energy distribution (SED) uncertainty analysis. The LASL sensitivity and uncertainty analysis code SENSIT is used to calculate reaction cross-section sensitivity profiles and integral SED sensitivity coefficients. These are then folded with covariance matrices and integral SED uncertainties to obtain the resulting uncertainties of three calculated neutronics design parameters: two critical radiation damage rates and a nuclear heating rate. The report documents the first sensitivity-based data uncertainty analysis, which incorporates a quantitative treatment of the effects of SED uncertainties. The results demonstrate quantitatively that the ENDF/B-V cross-section data files for C, H, and O, including their SED data, are fully adequate for this design application, while the data for Fe and Ni are at best marginally adequate because they give rise to response uncertainties up to 25%. Much higher response uncertainties are caused by cross-section and SED data uncertainties in Cu (26 to 45%), tungsten (24 to 54%), and Cr (up to 98%). Specific recommendations are given for re-evaluations of certain reaction cross-sections, secondary energy distributions, and uncertainty estimates

  2. Comprehensive Method for Culturing Embryonic Dorsal Root Ganglion Neurons for Seahorse Extracellular Flux XF24 Analysis.

    Science.gov (United States)

    Lange, Miranda; Zeng, Yan; Knight, Andrew; Windebank, Anthony; Trushina, Eugenia

    2012-01-01

    Changes in mitochondrial dynamics and function contribute to progression of multiple neurodegenerative diseases including peripheral neuropathies. The Seahorse Extracellular Flux XF24 analyzer provides a comprehensive assessment of the relative state of glycolytic and aerobic metabolism in live cells making this method instrumental in assessing mitochondrial function. One of the most important steps in the analysis of mitochondrial respiration using the Seahorse XF24 analyzer is plating a uniform monolayer of firmly attached cells. However, culturing of primary dorsal root ganglion (DRG) neurons is associated with multiple challenges, including their propensity to form clumps and detach from the culture plate. This could significantly interfere with proper analysis and interpretation of data. We have tested multiple cell culture parameters including coating substrates, culture medium, XF24 microplate plastics, and plating techniques in order to optimize plating conditions. Here we describe a highly reproducible method to obtain neuron-enriched monolayers of securely attached dissociated primary embryonic (E15) rat DRG neurons suitable for analysis with the Seahorse XF24 platform.

  3. Comprehensive method for culturing embryonic dorsal root ganglion neurons for Seahorse Extracellular Flux XF24 Analysis

    Directory of Open Access Journals (Sweden)

    Miranda L. Lange

    2012-12-01

    Full Text Available Changes in mitochondrial dynamics and function contribute to progression of multiple neurodegenerative diseases including peripheral neuropathies. The Seahorse Extracellular Flux XF24 analyzer provides a comprehensive assessment of the relative state of glycolytic and aerobic metabolism in live cells making this method instrumental in assessing mitochondrial function. One of the most important steps in the analysis of mitochondrial respiration using the Seahorse XF24 analyzer is plating a uniform monolayer of firmly attached cells. However, culturing of primary dorsal root ganglion (DRG) neurons is associated with multiple challenges, including their propensity to form clumps and detach from the culture plate. This could significantly interfere with proper analysis and interpretation of data. We have tested multiple cell culture parameters including coating substrates, culture medium, XF24 microplate plastics, and plating techniques in order to optimize plating conditions. Here we describe a highly reproducible method to obtain neuron-enriched monolayers of securely attached dissociated primary embryonic (E15) rat DRG neurons suitable for analysis with the Seahorse XF24 platform.

  4. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
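
    The record states that 1D and 2D hydrologic inverse problems were posed to a D-Wave 2X annealer but does not spell out the problem encoding. The toy sketch below shows one common way such a problem can be cast: a linearized forward model with binary unknowns is rewritten as a QUBO and minimized, here by exhaustive enumeration as a classical stand-in for the annealer. The forward matrix, the binary parameterization and the noise-free data are all assumptions for illustration, not the formulation used in the study.

        # Toy sketch: a small inverse problem with binary unknowns mapped to a QUBO,
        # min_q q^T Q q, and solved by brute force as a classical stand-in for the
        # annealer. The forward matrix and synthetic data are invented.
        import itertools
        import numpy as np

        rng = np.random.default_rng(4)
        n = 6                                    # number of binary unknowns
        A = rng.normal(size=(8, n))              # linearized forward model (hypothetical)
        q_true = rng.integers(0, 2, size=n)
        d = A @ q_true                           # synthetic, noise-free observations

        # ||A q - d||^2 becomes a QUBO: couplings from A^T A, linear terms (using
        # q_i^2 = q_i for binary variables) folded onto the diagonal.
        Q = A.T @ A
        np.fill_diagonal(Q, np.diag(A.T @ A) - 2.0 * (A.T @ d))

        best = min(itertools.product([0, 1], repeat=n),
                   key=lambda q: np.array(q) @ Q @ np.array(q))
        print("recovered:", best)
        print("true     :", tuple(int(v) for v in q_true))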

  5. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    Science.gov (United States)

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last

  6. Primary treatments for clinically localised prostate cancer: a comprehensive lifetime cost-utility analysis.

    Science.gov (United States)

    Cooperberg, Matthew R; Ramakrishna, Naren R; Duff, Steven B; Hughes, Kathleen E; Sadownik, Sara; Smith, Joseph A; Tewari, Ashutosh K

    2013-03-01

    WHAT'S KNOWN ON THE SUBJECT? AND WHAT DOES THE STUDY ADD?: Multiple treatment alternatives exist for localised prostate cancer, with few high-quality studies directly comparing their comparative effectiveness and costs. The present study is the most comprehensive cost-effectiveness analysis to date for localised prostate cancer, conducted with a lifetime horizon and accounting for survival, health-related quality-of-life, and cost impact of secondary treatments and other downstream events, as well as primary treatment choices. The analysis found minor differences, generally slightly favouring surgical methods, in quality-adjusted life years across treatment options. However, radiation therapy (RT) was consistently more expensive than surgery, and some alternatives, e.g. intensity-modulated RT for low-risk disease, were dominated - that is, both more expensive and less effective than competing alternatives. To characterise the costs and outcomes associated with radical prostatectomy (open, laparoscopic, or robot-assisted) and radiation therapy (RT: dose-escalated three-dimensional conformal RT, intensity-modulated RT, brachytherapy, or combination), using a comprehensive, lifetime decision analytical model. A Markov model was constructed to follow hypothetical men with low-, intermediate-, and high-risk prostate cancer over their lifetimes after primary treatment; probabilities of outcomes were based on an exhaustive literature search yielding 232 unique publications. In each Markov cycle, patients could have remission, recurrence, salvage treatment, metastasis, death from prostate cancer, and death from other causes. Utilities for each health state were determined, and disutilities were applied for complications and toxicities of treatment. Costs were determined from the USA payer perspective, with incorporation of patient costs in a sensitivity analysis. Differences in quality-adjusted life years across treatments were modest, ranging from 10.3 to
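
    The core computational engine described above is a Markov cohort model that accumulates discounted costs and quality-adjusted life years cycle by cycle. A heavily simplified sketch of that mechanism is given below; the three-state structure, transition probabilities, utilities, costs and 3% discount rate are all invented for illustration and are not parameters from the 232-publication evidence base.

        # Toy Markov cohort model in the spirit of the cost-utility analysis above.
        # States: remission, recurrence, dead. All numbers are illustrative only.
        import numpy as np

        P = np.array([[0.93, 0.05, 0.02],      # yearly transition probabilities
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        utility = np.array([0.85, 0.70, 0.0])      # QALY weight per year in each state
        cost = np.array([1500.0, 12000.0, 0.0])    # yearly cost per state (USD)
        discount = 0.03

        cohort = np.array([1.0, 0.0, 0.0])         # everyone starts in remission
        qalys = total_cost = 0.0
        for year in range(40):                     # approximate lifetime horizon
            d = 1.0 / (1.0 + discount) ** year
            qalys += d * (cohort @ utility)
            total_cost += d * (cohort @ cost)
            cohort = cohort @ P                    # advance one Markov cycle

        print(f"discounted QALYs = {qalys:.2f}, discounted cost = ${total_cost:,.0f}")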

  7. Comprehensive validation of computational fluid dynamics simulations of in-vivo blood flow in patient-specific cerebral aneurysms

    NARCIS (Netherlands)

    Sun, Q.; Groth, A.; Aach, T.

    2012-01-01

    Purpose: Recently, image-based computational fluid dynamic (CFD) simulations have been proposed to investigate the local hemodynamics inside human cerebral aneurysms. It is suggested that the knowledge of the computed three-dimensional flow fields can be used to assist clinical risk assessment and

  8. Cafts: computer aided fault tree analysis

    International Nuclear Information System (INIS)

    Poucet, A.

    1985-01-01

    The fault tree technique has become a standard tool for the analysis of the safety and reliability of complex systems. In spite of the costs, which may be high for a complete and detailed analysis of a complex plant, the fault tree technique is popular and its benefits are fully recognized. Due to this, applications of these codes have mostly been restricted to simple academic examples and rarely concern complex, real-world systems. In this paper an interactive approach to fault tree construction is presented. The aim is not to replace the analyst, but to offer him an intelligent tool which can assist him in modeling complex systems. Using the CAFTS method, the analyst interactively constructs a fault tree in two phases: (1) in the first phase he generates an overall failure logic structure of the system, the macrofault tree; in this phase CAFTS features an expert system approach to assist the analyst, making use of a knowledge base containing generic rules on the behavior of subsystems and components; (2) in the second phase the macrofault tree is further refined and transformed into a fully detailed and quantified fault tree; in this phase a library of plant-specific component failure models is used
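
    The quantification step that tools of this kind automate can be reduced, for independent basic events, to propagating probabilities through AND and OR gates. The sketch below shows that arithmetic on a made-up three-input tree; the gate structure, event names and probabilities are illustrative, not taken from the CAFTS case studies.

        # Minimal sketch of quantifying a small fault tree with independent basic
        # events, the kind of arithmetic CAFTS-style tools automate once the tree
        # is built. Gate structure, event names and probabilities are invented.
        def p_or(*ps):
            """Probability that at least one of the independent input events occurs."""
            q = 1.0
            for p in ps:
                q *= (1.0 - p)
            return 1.0 - q

        def p_and(*ps):
            """Probability that all of the independent input events occur."""
            out = 1.0
            for p in ps:
                out *= p
            return out

        pump_a, pump_b, valve, power = 1e-3, 1e-3, 5e-4, 2e-5
        top_event = p_or(p_and(pump_a, pump_b), valve, power)   # illustrative top gate
        print(f"top event probability ~ {top_event:.2e}")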

  9. RDS; A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Mohd Faiz Salim; Ridha Roslan; Mohd Rizal Mamat

    2013-01-01

    Full-text: Deterministic Safety Analysis (DSA) is one of the mandatory requirements conducted for the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analysis under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of the Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper shall describe the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the Integral Test Facility LOBI-MOD2 and the associated Test A1-83. Data and information from various reports and drawings were referred to in preparing the RDS. The results showed that by developing RDS, it has become possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes solving of contradictions and finally assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, the language barrier in accessing foreign information and the limitation of resources. Some possible improvements are suggested to overcome these challenges. (author)

  10. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat

    2014-01-01

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements conducted for Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analysis under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by University of Pisa. Based on this methodology, the use of Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper shall describe the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for an Integral Test Facility of LOBI-MOD2 and associated Test A1-83. Data and information from various reports and drawings were referred in preparing the RDS. The results showed that by developing RDS, it has made possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes solving of contradictions and finally assisting in developing thermal hydraulic input regardless of whichever code selected. However, some disadvantages were also recognized such as the need for experience in making engineering judgments, language barrier in accessing foreign information and limitation of resources. Some possible improvements are suggested to overcome these challenges

  11. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my [Nuclear Energy Department, Tenaga Nasional Berhad, Level 32, Dua Sentral, 50470 Kuala Lumpur (Malaysia); Roslan, Ridha [Nuclear Installation Division, Atomic Energy Licensing Board, Batu 24, Jalan Dengkil, 43800 Dengkil, Selangor (Malaysia); Ibrahim, Mohd Rizal Mamat [Technical Support Division, Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements conducted for Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analysis under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by University of Pisa. Based on this methodology, the use of Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper shall describe the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for an Integral Test Facility of LOBI-MOD2 and associated Test A1-83. Data and information from various reports and drawings were referred in preparing the RDS. The results showed that by developing RDS, it has made possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes solving of contradictions and finally assisting in developing thermal hydraulic input regardless of whichever code selected. However, some disadvantages were also recognized such as the need for experience in making engineering judgments, language barrier in accessing foreign information and limitation of resources. Some possible improvements are suggested to overcome these challenges.

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  14. Prioritizing strategies for comprehensive liver cancer control in Asia: a conjoint analysis

    Directory of Open Access Journals (Sweden)

    Bridges John FP

    2012-10-01

    Full Text Available Abstract Background Liver cancer is a complex and burdensome disease, with Asia accounting for 75% of known cases. Comprehensive cancer control requires the use of multiple strategies, but various stakeholders may have different views as to which strategies should have the highest priority. This study identified priorities across multiple strategies for comprehensive liver cancer control (CLCC) from the perspective of liver cancer clinical, policy, and advocacy stakeholders in China, Japan, South Korea and Taiwan. Concordance of priorities was assessed across the region and across respondent roles. Methods Priorities for CLCC were examined as part of a cross-sectional survey of liver cancer experts. Respondents completed several conjoint-analysis choice tasks to prioritize 11 strategies. In each task, respondents judged which of two competing CLCC plans, consisting of mutually exclusive and exhaustive subsets of the strategies, would have the greatest impact. The dependent variable was the chosen plan, which was then regressed on the strategies of different plans. The restricted least squares (RLS) method was utilized to compare aggregate and stratified models, and t-tests and Wald tests were used to test for significance and concordance, respectively. Results Eighty respondents (69.6%) were eligible and completed the survey. Their primary interests were hepatitis (26%), hepatocellular carcinoma (HCC) (58%), metastatic liver cancer (10%) and transplantation (6%). The most preferred strategies were monitoring at-risk populations (p<0.001), clinician education (p<0.001), and national guidelines (p<0.001). Most priorities were concordant across sites except for three strategies: transplantation infrastructure (p=0.009) was valued lower in China, measuring social burden (p=0.037) was valued higher in Taiwan, and national guidelines (p=0.025) was valued higher in China. Priorities did not differ across stakeholder groups (p=0.438). Conclusions Priorities for CLCC in Asia include monitoring at-risk populations, clinician education, national guidelines

  15. Prioritizing strategies for comprehensive liver cancer control in Asia: a conjoint analysis.

    Science.gov (United States)

    Bridges, John F P; Dong, Liming; Gallego, Gisselle; Blauvelt, Barri M; Joy, Susan M; Pawlik, Timothy M

    2012-10-30

    Liver cancer is a complex and burdensome disease, with Asia accounting for 75% of known cases. Comprehensive cancer control requires the use of multiple strategies, but various stakeholders may have different views as to which strategies should have the highest priority. This study identified priorities across multiple strategies for comprehensive liver cancer control (CLCC) from the perspective of liver cancer clinical, policy, and advocacy stakeholders in China, Japan, South Korea and Taiwan. Concordance of priorities was assessed across the region and across respondent roles. Priorities for CLCC were examined as part of a cross-sectional survey of liver cancer experts. Respondents completed several conjoint-analysis choice tasks to prioritize 11 strategies. In each task, respondents judged which of two competing CLCC plans, consisting of mutually exclusive and exhaustive subsets of the strategies, would have the greatest impact. The dependent variable was the chosen plan, which was then regressed on the strategies of different plans. The restricted least squares (RLS) method was utilized to compare aggregate and stratified models, and t-tests and Wald tests were used to test for significance and concordance, respectively. Eighty respondents (69.6%) were eligible and completed the survey. Their primary interests were hepatitis (26%), hepatocellular carcinoma (HCC) (58%), metastatic liver cancer (10%) and transplantation (6%). The most preferred strategies were monitoring at-risk populations (p<0.001), clinician education (p<0.001), and national guidelines (p<0.001). Most priorities were concordant across sites except for three strategies: transplantation infrastructure (p=0.009) was valued lower in China, measuring social burden (p=0.037) was valued higher in Taiwan, and national guidelines (p=0.025) was valued higher in China. Priorities did not differ across stakeholder groups (p=0.438). Priorities for CLCC in Asia include monitoring at-risk populations
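
    As a rough illustration of the analysis described above (not the authors' code), the sketch below fits a linear probability model for a paired conjoint task in Python: simulated plan choices are regressed on differences in strategy composition, and a Wald test compares two coefficients. All strategy names, weights, and data are hypothetical.

```python
# Minimal sketch, assuming simulated data; strategy names and weights are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
strategies = ["monitoring", "clinician_education", "national_guidelines", "transplant_infra"]
n_tasks = 400

# Difference in strategy composition between plan A and plan B for each choice task.
x_diff = (rng.integers(0, 2, (n_tasks, len(strategies)))
          - rng.integers(0, 2, (n_tasks, len(strategies))))

# Simulate choices: plans containing higher-weighted strategies are chosen more often.
true_weights = np.array([0.8, 0.6, 0.5, 0.2])
chose_a = ((x_diff @ true_weights + rng.normal(0, 1, n_tasks)) > 0).astype(float)

# Regress the chosen plan on strategy differences; t-tests on coefficients rank strategies.
model = sm.OLS(chose_a, sm.add_constant(x_diff)).fit()
print(model.summary())

# A Wald test of coefficient equality, analogous to testing concordance of priorities.
print(model.wald_test("x1 = x2"))
```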

  16. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods applied to chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
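
    For orientation only, a minimal sketch of two of the measures mentioned above (mean CT value and a density-mask-style fraction) on a simulated Hounsfield-unit volume is shown below; the thresholds are illustrative assumptions, not validated clinical cut-offs.

```python
# Illustrative sketch only: density-histogram and density-mask style metrics on a
# simulated lung CT volume (Hounsfield units); thresholds are assumed, not clinical.
import numpy as np

rng = np.random.default_rng(1)
lung = rng.normal(-800, 120, size=(64, 64, 32))     # hypothetical aerated-lung voxels

mean_ct = lung.mean()                                # mean CT value of the "whole lung"
hist, edges = np.histogram(lung, bins=50, range=(-1000, 0))   # density histogram
dense_fraction = np.mean((lung > -700) & (lung < -250))       # density-mask fraction

print(f"mean CT value: {mean_ct:.1f} HU")
print(f"high-attenuation fraction: {dense_fraction:.1%}")
```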

  17. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  18. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper

  19. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytic instrument, which is based on the principle that the liquid containing hydrocarbons can give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanism and electronics into one, and is small, light and practical, so it can be used for surface water sample analysis in oil field and impurity analysis of other materials

  20. A Computational Discriminability Analysis on Twin Fingerprints

    Science.gov (United States)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. Establishing the degree of similarity between twins' prints is critical to the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%~1.7% higher EER than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
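
    The following sketch illustrates how an equal error rate (EER) of the kind quoted above can be computed from match scores; the genuine and twin-impostor score distributions are simulated assumptions, not the study's data.

```python
# Minimal sketch: EER from simulated genuine and twin-impostor match scores.
import numpy as np

rng = np.random.default_rng(2)
genuine = rng.normal(0.80, 0.10, 1000)        # same-finger comparison scores (hypothetical)
twin_impostor = rng.normal(0.45, 0.12, 1000)  # twin-to-twin comparison scores (hypothetical)

thresholds = np.linspace(0.0, 1.0, 501)
far = np.array([(twin_impostor >= t).mean() for t in thresholds])  # false accept rate
frr = np.array([(genuine < t).mean() for t in thresholds])         # false reject rate

i = np.argmin(np.abs(far - frr))   # threshold where the two error rates cross
print(f"EER ~ {(far[i] + frr[i]) / 2:.3f} at threshold {thresholds[i]:.2f}")
```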

  1. Comprehensive Care

    Science.gov (United States)

    ... Comprehensive Care Understand the importance of comprehensive MS care ... In this article A complex disease requires a comprehensive approach Today multiple sclerosis (MS) is not a ...

  2. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  3. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  4. Comprehensive curation and analysis of global interaction networks in Saccharomyces cerevisiae

    Science.gov (United States)

    Reguly, Teresa; Breitkreutz, Ashton; Boucher, Lorrie; Breitkreutz, Bobby-Joe; Hon, Gary C; Myers, Chad L; Parsons, Ainslie; Friesen, Helena; Oughtred, Rose; Tong, Amy; Stark, Chris; Ho, Yuen; Botstein, David; Andrews, Brenda; Boone, Charles; Troyanskya, Olga G; Ideker, Trey; Dolinski, Kara; Batada, Nizar N; Tyers, Mike

    2006-01-01

    Background The study of complex biological networks and prediction of gene function has been enabled by high-throughput (HTP) methods for detection of genetic and protein interactions. Sparse coverage in HTP datasets may, however, distort network properties and confound predictions. Although a vast number of well substantiated interactions are recorded in the scientific literature, these data have not yet been distilled into networks that enable system-level inference. Results We describe here a comprehensive database of genetic and protein interactions, and associated experimental evidence, for the budding yeast Saccharomyces cerevisiae, as manually curated from over 31,793 abstracts and online publications. This literature-curated (LC) dataset contains 33,311 interactions, on the order of all extant HTP datasets combined. Surprisingly, HTP protein-interaction datasets currently achieve only around 14% coverage of the interactions in the literature. The LC network nevertheless shares attributes with HTP networks, including scale-free connectivity and correlations between interactions, abundance, localization, and expression. We find that essential genes or proteins are enriched for interactions with other essential genes or proteins, suggesting that the global network may be functionally unified. This interconnectivity is supported by a substantial overlap of protein and genetic interactions in the LC dataset. We show that the LC dataset considerably improves the predictive power of network-analysis approaches. The full LC dataset is available at the BioGRID () and SGD () databases. Conclusion Comprehensive datasets of biological interactions derived from the primary literature provide critical benchmarks for HTP methods, augment functional prediction, and reveal system-level attributes of biological networks. PMID:16762047

  5. Differentiation of endosperm transfer cells of barley: a comprehensive analysis at the micro-scale.

    Science.gov (United States)

    Thiel, Johannes; Riewe, David; Rutten, Twan; Melzer, Michael; Friedel, Swetlana; Bollenbeck, Felix; Weschke, Winfriede; Weber, Hans

    2012-08-01

    Barley endosperm cells differentiate into transfer cells (ETCs) opposite the nucellar projection. To comprehensively analyse ETC differentiation, laser microdissection-based transcript and metabolite profiles were obtained from laser microdissected tissues and cell morphology was analysed. Flange-like secondary-wall ingrowths appeared between 5 and 7 days after pollination within the three outermost cell layers. Gene expression analysis indicated that ethylene-signalling pathways initiate ETC morphology. This is accompanied by gene activity related to cell shape control and vesicle transport, with abundant mitochondria and endomembrane structures. Gene expression analyses indicate predominant formation of hemicelluloses, glucuronoxylans and arabinoxylans, and transient formation of callose, together with proline and 4-hydroxyproline biosynthesis. Activation of the methylation cycle is probably required for biosynthesis of phospholipids, pectins and ethylene. Membrane microdomains involving sterols/sphingolipids and remorins are potentially involved in ETC development. The transcriptional activity of assimilate and micronutrient transporters suggests ETCs as the main uptake organs of solutes into the endosperm. Accordingly, the endosperm grows maximally after ETCs are fully developed. Up-regulated gene expression related to amino acid catabolism, C:N balances, carbohydrate oxidation, mitochondrial activity and starch degradation meets high demands for respiratory energy and carbohydrates, required for cell proliferation and wall synthesis. At 10 days after pollination, ETCs undergo further differentiation, potentially initiated by abscisic acid, and metabolism is reprogrammed as shown by activated storage and stress-related processes. Overall, the data provide a comprehensive view of barley ETC differentiation and development, and identify candidate genes and associated pathways. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.

  6. Comprehensive analysis of the mutation spectrum in 301 German ALS families.

    Science.gov (United States)

    Müller, Kathrin; Brenner, David; Weydt, Patrick; Meyer, Thomas; Grehl, Torsten; Petri, Susanne; Grosskreutz, Julian; Schuster, Joachim; Volk, Alexander E; Borck, Guntram; Kubisch, Christian; Klopstock, Thomas; Zeller, Daniel; Jablonka, Sibylle; Sendtner, Michael; Klebe, Stephan; Knehr, Antje; Günther, Kornelia; Weis, Joachim; Claeys, Kristl G; Schrank, Berthold; Sperfeld, Anne-Dorte; Hübers, Annemarie; Otto, Markus; Dorst, Johannes; Meitinger, Thomas; Strom, Tim M; Andersen, Peter M; Ludolph, Albert C; Weishaupt, Jochen H

    2018-04-12

    Recent advances in amyotrophic lateral sclerosis (ALS) genetics have revealed that mutations in any of more than 25 genes can cause ALS, mostly as an autosomal-dominant Mendelian trait. Detailed knowledge about the genetic architecture of ALS in a specific population will be important for genetic counselling but also for genotype-specific therapeutic interventions. Here we combined fragment length analysis, repeat-primed PCR, Southern blotting, Sanger sequencing and whole exome sequencing to obtain a comprehensive profile of genetic variants in ALS disease genes in 301 German pedigrees with familial ALS. We report C9orf72 mutations as well as variants in consensus splice sites and non-synonymous variants in protein-coding regions of ALS genes. We furthermore estimate their pathogenicity by taking into account type and frequency of the respective variant as well as segregation within the families. 49% of our German ALS families carried a likely pathogenic variant in at least one of the earlier identified ALS genes. In 45% of the ALS families, likely pathogenic variants were detected in C9orf72, SOD1, FUS, TARDBP or TBK1 , whereas the relative contribution of the other ALS genes in this familial ALS cohort was 4%. We identified several previously unreported rare variants and demonstrated the absence of likely pathogenic variants in some of the recently described ALS disease genes. We here present a comprehensive genetic characterisation of German familial ALS. The present findings are of importance for genetic counselling in clinical practice, for molecular research and for the design of diagnostic gene panels or genotype-specific therapeutic interventions in Europe. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T& D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  8. Discerning molecular interactions: A comprehensive review on biomolecular interaction databases and network analysis tools.

    Science.gov (United States)

    Miryala, Sravan Kumar; Anbarasu, Anand; Ramaiah, Sudha

    2018-02-05

    Computational analysis of biomolecular interaction networks is gaining importance for understanding the functions of novel genes and proteins. Gene interaction (GI) network analysis and protein-protein interaction (PPI) network analysis play a major role in predicting the functionality of interacting genes or proteins and give insight into the functional relationships and evolutionary conservation of interactions among genes. An interaction network is a graphical representation of the gene/protein interactome, where each gene/protein is a node and each interaction between genes/proteins is an edge. In this review, we discuss the popular open source databases that serve as data repositories to search and collect protein/gene interaction data, as well as tools available for the generation, visualization and analysis of interaction networks. Various network analysis approaches, such as topological and clustering approaches to study network properties, and functional enrichment servers that illustrate the functions and pathways of genes and proteins, are also discussed. Hence, the distinctive aim of this review is not only to provide an overview of tools and web servers for gene and protein-protein interaction (PPI) network analysis, but also to show how to extract useful and meaningful information from interaction networks. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
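
    The article's examples use MATLAB and R; as an analogous sketch in Python, an embarrassingly parallel Monte Carlo simulation can be distributed over worker processes with the standard-library multiprocessing module, as shown below.

```python
# Analogous sketch in Python (the article itself uses MATLAB and R examples):
# an embarrassingly parallel Monte Carlo simulation spread over worker processes.
import multiprocessing as mp
import random

def one_replication(seed: int) -> float:
    """One independent simulation run; replications share no state."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(10_000))

if __name__ == "__main__":
    seeds = range(100)
    with mp.Pool() as pool:                 # one worker per available core by default
        results = pool.map(one_replication, seeds)
    print(sum(results) / len(results))
```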

  10. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analyses (QDA). Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to have first experiences with computer-aided analysis before investing time and money in a specialized QDA Program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225

  11. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.

  12. Computer programs for analysis of geophysical data

    International Nuclear Information System (INIS)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution

  13. A COMPREHENSIVE ANALYSIS OF ISOLATED INFRANUCLEAR ABDUCENS NERVE PALSY IN A TERTIARY EYE CARE CENTRE

    Directory of Open Access Journals (Sweden)

    G. Dhamodara

    2017-01-01

    Full Text Available BACKGROUND A comprehensive analysis of the aetiology and clinical profile of isolated infranuclear abducens nerve palsy in a tertiary eye care centre. MATERIALS AND METHODS A hospital-based retrospective case series analysis of 90 isolated infranuclear neurogenic abducens nerve palsies. Documentation included age, gender, presenting complaints, history of diabetes mellitus, hypertension, mode of onset, progression of the disease, treatment given and recovery rate. Detailed ophthalmic evaluation of both eyes included anterior segment examination, extraocular movements, diplopia charting and Hess charting. Thorough central nervous system examination and systemic examination were done. Inclusion Criteria: all isolated infranuclear neurogenic lesions of abducens nerve palsy. Exclusion Criteria: conditions like supranuclear lesions, myasthenia, orbital inflammation and myopathies, and false localising signs of abducens nerve palsy were excluded by appropriate testing and investigations. RESULTS A total of 90 patients were included. Mean age at presentation was between the 3rd and 5th decades, with male preponderance. The commonest presenting symptom was diplopia (71.1%), with the commonest causes being idiopathic neuritis (48%), diabetes mellitus (20%), hypertension (15%), trauma (10%) and others (7%). CONCLUSION In our study, isolated infranuclear abducens nerve palsy of nonspecific aetiology, predominantly affecting males in the 3rd to 5th decade, with variable recovery rates was seen. Hence, careful clinical examination in all cases is essential, with close follow-up on a long-term basis.

  14. A comprehensive segmentation analysis of crude oil market based on time irreversibility

    Science.gov (United States)

    Xia, Jianan; Shang, Pengjian; Lu, Dan; Yin, Yi

    2016-05-01

    In this paper, we perform a comprehensive entropic segmentation analysis of crude oil future prices from 1983 to 2014, using the Jensen-Shannon divergence as the statistical distance between segments, and analyze the results from the original series S and a series beginning in 1986 (marked as S∗) to find common segments which have the same boundaries. We then apply time irreversibility analysis to each segment to divide all segments into two groups according to their degree of asymmetry. Based on the temporal distribution of the common segments and high-asymmetry segments, we find that these two types of segments appear alternately and basically do not overlap in the daily group, while the common portions are also high-asymmetry segments in the weekly group. In addition, the temporal distribution of the common segments is fairly close to the time of crises, wars or other events, because the impact of severe events on the oil price makes these common segments quite different from their adjacent segments. The common segments can be confirmed in the daily group series or the weekly group series due to the large divergence between common segments and their neighbors, while the identification of high-asymmetry segments is helpful for recognising the segments which are not badly affected by the events and can recover to steady states automatically. Finally, we rearrange the segments by merging connected common segments or high-asymmetry segments into a single segment, and conjoin the connected segments which are neither common nor highly asymmetric.
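
    As a small illustration of the distance measure named above, the sketch below computes the Jensen-Shannon divergence between the empirical distributions of two halves of a simulated return series; the data and binning are assumptions for demonstration only.

```python
# Minimal sketch: Jensen-Shannon divergence between two segments of a simulated series.
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions (natural log)."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(3)
returns = rng.normal(0.0, 1.0, 2000)            # simulated daily price returns
left, right = returns[:1000], returns[1000:]    # two candidate segments

bins = np.histogram_bin_edges(returns, bins=30)
p, _ = np.histogram(left, bins=bins)
q, _ = np.histogram(right, bins=bins)
print(f"JS divergence between segments: {js_divergence(p.astype(float), q.astype(float)):.4f}")
```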

  15. Comprehensive techno-economic analysis of wastewater-based algal biofuel production: A case study.

    Science.gov (United States)

    Xin, Chunhua; Addy, Min M; Zhao, Jinyu; Cheng, Yanling; Cheng, Sibo; Mu, Dongyan; Liu, Yuhuan; Ding, Rijia; Chen, Paul; Ruan, Roger

    2016-07-01

    Combining algae cultivation and wastewater treatment for biofuel production is considered a feasible route to resource utilization. An updated comprehensive techno-economic analysis method that integrates resource availability into techno-economic analysis was employed to evaluate wastewater-based algal biofuel production, with consideration of wastewater treatment improvement, greenhouse gas emissions, biofuel production costs, and coproduct utilization. An innovative approach consisting of microalgae cultivation on centrate wastewater, microalgae harvest through flocculation, solar drying of biomass, pyrolysis of biomass to bio-oil, and utilization of co-products was analyzed and shown to yield profoundly positive results in comparison with others. The estimated break-even selling price of the biofuel ($2.23/gallon) is very close to the acceptable level. The approach would have better overall benefits, and the internal rate of return would increase up to 18.7%, if three critical components, namely cultivation, harvest, and downstream conversion, could achieve breakthroughs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A Comprehensive Analysis on Wearable Acceleration Sensors in Human Activity Recognition

    Directory of Open Access Journals (Sweden)

    Majid Janidarmian

    2017-03-01

    Full Text Available Sensor-based motion recognition integrates the emerging area of wearable sensors with novel machine learning techniques to make sense of low-level sensor data and provide rich contextual information in a real-life application. Although Human Activity Recognition (HAR) problem has been drawing the attention of researchers, it is still a subject of much debate due to the diverse nature of human activities and their tracking methods. Finding the best predictive model in this problem while considering different sources of heterogeneities can be very difficult to analyze theoretically, which stresses the need of an experimental study. Therefore, in this paper, we first create the most complete dataset, focusing on accelerometer sensors, with various sources of heterogeneities. We then conduct an extensive analysis on feature representations and classification techniques (the most comprehensive comparison yet with 293 classifiers) for activity recognition. Principal component analysis is applied to reduce the feature vector dimension while keeping essential information. The average classification accuracy of eight sensor positions is reported to be 96.44% ± 1.62% with 10-fold evaluation, whereas accuracy of 79.92% ± 9.68% is reached in the subject-independent evaluation. This study presents significant evidence that we can build predictive models for HAR problem under more realistic conditions, and still achieve highly accurate results.

  17. A Comprehensive Analysis on Wearable Acceleration Sensors in Human Activity Recognition.

    Science.gov (United States)

    Janidarmian, Majid; Roshan Fekr, Atena; Radecka, Katarzyna; Zilic, Zeljko

    2017-03-07

    Sensor-based motion recognition integrates the emerging area of wearable sensors with novel machine learning techniques to make sense of low-level sensor data and provide rich contextual information in a real-life application. Although Human Activity Recognition (HAR) problem has been drawing the attention of researchers, it is still a subject of much debate due to the diverse nature of human activities and their tracking methods. Finding the best predictive model in this problem while considering different sources of heterogeneities can be very difficult to analyze theoretically, which stresses the need of an experimental study. Therefore, in this paper, we first create the most complete dataset, focusing on accelerometer sensors, with various sources of heterogeneities. We then conduct an extensive analysis on feature representations and classification techniques (the most comprehensive comparison yet with 293 classifiers) for activity recognition. Principal component analysis is applied to reduce the feature vector dimension while keeping essential information. The average classification accuracy of eight sensor positions is reported to be 96.44% ± 1.62% with 10-fold evaluation, whereas accuracy of 79.92% ± 9.68% is reached in the subject-independent evaluation. This study presents significant evidence that we can build predictive models for HAR problem under more realistic conditions, and still achieve highly accurate results.
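
    A minimal scikit-learn sketch of the same general pipeline (PCA for dimensionality reduction followed by a classifier under 10-fold cross-validation) is shown below; the synthetic features, class count, and choice of random-forest classifier are assumptions, not the study's setup.

```python
# Not the study's pipeline: a minimal PCA + classifier sketch on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 60))        # hypothetical windowed accelerometer features
y = rng.integers(0, 6, size=500)      # six hypothetical activity classes

# Keep enough principal components to explain 95% of the variance, then classify.
pipeline = make_pipeline(PCA(n_components=0.95), RandomForestClassifier(random_state=0))
scores = cross_val_score(pipeline, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")
```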

  18. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    Science.gov (United States)

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.
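
    To make the watershed idea concrete, the sketch below seeds a watershed at local maxima of a small synthetic two-dimensional chromatogram; it is an illustration using scikit-image, not the implementation evaluated in the paper.

```python
# Rough sketch with scikit-image: watershed-style peak detection on a synthetic 2-D chromatogram.
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Two overlapping Gaussian "peaks" on a retention-time grid.
x, y = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
signal = (np.exp(-((x - 0.4) ** 2 + (y - 0.5) ** 2) / 0.005)
          + np.exp(-((x - 0.6) ** 2 + (y - 0.5) ** 2) / 0.005))

# Seed the watershed at local maxima, then flood the inverted signal within a mask.
peaks = peak_local_max(signal, min_distance=5)
markers = np.zeros(signal.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = watershed(-signal, markers, mask=signal > 0.05)
print("detected peak regions:", labels.max())
```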

  19. IRSN global process for leading a comprehensive fire safety analysis for nuclear installations

    International Nuclear Information System (INIS)

    Ormieres, Yannick; Lacoue, Jocelyne

    2013-01-01

    A fire safety analysis (FSA) is requested to justify the adequacy of the fire protection measures set by the operator. A recent document written by IRSN outlines a global process for such a comprehensive fire safety analysis. Thanks to the evolution of French nuclear fire safety regulation, from prescriptive requirements to objective requirements, the proposed fire safety justification process focuses on compliance with performance criteria for fire protection measures. These performance criteria are related to the vulnerability of targets to the effects of fire, and are not only based upon the radiological consequences outside the installation caused by a fire. In its FSA, the operator has to define the safety functions that should continue to fulfil their mission even in the case of fire, in order to be in compliance with nuclear safety objectives. Then, in order to maintain these safety functions, the operator has to justify the adequacy of fire protection measures, defined according to defence-in-depth principles. To reach this objective, the analysis process is based on the identification of targets to be protected in order to maintain safety functions, taking into account facility characteristics. These targets include structures, systems, components and personnel important to safety. Facility characteristics include, for all operating conditions, potential ignition sources and fire protection systems. One of the key points of the fire analysis is the assessment of possible fire scenarios in the facility. Given the large number of possible fire scenarios, it is then necessary to evaluate 'reference fires', which are the worst-case scenarios of all possible fire scenarios and which are used by the operator for the design of fire protection measures. (authors)

  20. A comprehensive association analysis of homocysteine metabolic pathway genes in Singaporean Chinese with ischemic stroke.

    Directory of Open Access Journals (Sweden)

    Hui-Qi Low

    Full Text Available BACKGROUND: The effect of genetic factors, apart from 5,10-methylenetetrahydrofolate reductase (MTHFR) polymorphisms, on elevated plasma homocysteine levels and increasing ischemic stroke risk has not been fully elucidated. We conducted a comprehensive analysis of 25 genes involved in homocysteine metabolism to investigate association of common variants within these genes with ischemic stroke risk. METHODOLOGY/PRINCIPAL FINDINGS: The study was done in two stages. In the initial study, SNP and haplotype-based association analyses were performed using 147 tagging Single Nucleotide Polymorphisms (SNPs) in 360 stroke patients and 354 non-stroke controls of Singaporean Chinese ethnicity. Joint association analysis of significant SNPs was then performed to assess the cumulative effect of these variants on ischemic stroke risk. In the replication study, 8 SNPs were selected for validation in an independent set of 420 matched case-control pairs of Singaporean Chinese ethnicity. SNP analysis from the initial study suggested 3 risk variants in the MTRR, SHMT1 and TCN2 genes which were moderately associated with ischemic stroke risk, independent of known stroke risk factors. Although the replication study failed to support single-SNP associations observed in the initial study, joint association analysis of the 3 variants in combined initial and replication samples revealed a trend of elevated risk with an increased number of risk alleles (joint P(trend) = 1.2×10(-6)). CONCLUSIONS: Our study did not find direct evidence of associations between any single polymorphisms of homocysteine metabolic pathway genes and ischemic stroke, but suggests that the cumulative effect of several small to moderate risk variants from genes involved in homocysteine metabolism may jointly confer a significant impact on ischemic stroke risk.

  1. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques for studying the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. This report therefore primarily serves as an introduction to probabilistic methods for the qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de

  2. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  3. Strategic Analysis of Autodesk and the Move to Cloud Computing

    OpenAIRE

    Kewley, Kathleen

    2012-01-01

    This paper provides an analysis of the opportunity for Autodesk to move its core technology to a cloud delivery model. Cloud computing offers clients a number of advantages, such as lower costs for computer hardware, increased access to technology and greater flexibility. With the IT industry embracing this transition, software companies need to plan for future change and lead with innovative solutions. Autodesk is in a unique position to capitalize on this market shift, as it is the leader i...

  4. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  5. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for computer forensics to perform such analysis. That is why performing forensic analysis of documents within a limited period of time requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link, and average link, in accordance...

  6. MetaComp: comprehensive analysis software for comparative meta-omics including comparative metagenomics.

    Science.gov (United States)

    Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu

    2017-10-02

    During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight the significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed a comprehensive graphical analysis software package named MetaComp, comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. The software is capable of reading files generated by a variety of upstream programs. After data loading, analyses such as multivariate statistics, hypothesis testing for two-sample, multi-sample and two-group sample designs, and a novel function-regression analysis of environmental factors are offered. Here, the regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp automatically chooses an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of this choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, an integrative software package applicable to all meta-omics data, originally distills the influence of the living environment on the microbial community by regression analysis

  7. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    Science.gov (United States)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to the significant idle time of computational resources, and, in turn, to the decrease in speed of scientific research. This paper presents three approaches to study the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which allows to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are being detected. For each approach, the results obtained in practice in the Supercomputer Center of Moscow State University are demonstrated.
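
    As a simplified illustration of the third approach (details assumed), the sketch below flags jobs whose monitored CPU utilisation deviates strongly from the overall job flow, using a plain z-score rule on synthetic monitoring data.

```python
# Simplified sketch, not the Supercomputer Center's tooling: flag abnormal jobs
# whose CPU utilisation deviates strongly from the overall job flow.
import numpy as np

rng = np.random.default_rng(5)
cpu_load = rng.normal(0.70, 0.10, 1000)          # hypothetical per-job CPU utilisation
cpu_load[:15] = rng.normal(0.05, 0.02, 15)       # inject a few inefficient jobs

z = (cpu_load - cpu_load.mean()) / cpu_load.std()
abnormal = np.where(np.abs(z) > 3)[0]            # simple threshold on the z-score
print(f"{len(abnormal)} abnormal jobs out of {len(cpu_load)}")
```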

  8. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  9. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  10. Language and Cognitive Predictors of Text Comprehension: Evidence from Multivariate Analysis

    Science.gov (United States)

    Kim, Young-Suk

    2015-01-01

    Using data from children in South Korea (N = 145, M[subscript age] = 6.08), it was determined how low-level language and cognitive skills (vocabulary, syntactic knowledge, and working memory) and high-level cognitive skills (comprehension monitoring and theory of mind [ToM]) are related to listening comprehension and whether listening…

  11. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in the computed outputs of the codes as a function of known uncertainties in the input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
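
    The general idea of propagating known input uncertainties through a model to the uncertainty of computed outputs can be sketched with a simple Monte Carlo example; the surrogate model and input distributions below are toy assumptions and are not FRAP correlations.

```python
# Illustrative sketch of Monte Carlo uncertainty propagation with a toy surrogate model.
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Hypothetical uncertain inputs (means and standard deviations are assumptions).
gap_conductance = rng.normal(5000.0, 500.0, n)   # W/m^2-K
linear_power = rng.normal(20.0, 1.0, n)          # kW/m

def toy_fuel_temperature(h_gap, q_lin):
    """Toy stand-in for a fuel rod response, not a FRAP correlation."""
    return 600.0 + 40.0 * q_lin + 2.0e5 / h_gap

temps = toy_fuel_temperature(gap_conductance, linear_power)
print(f"output mean {temps.mean():.0f} K, std {temps.std():.0f} K")
```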

  12. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either formation of plastic conditions or closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system, and the analysis is re-started using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)

  13. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain guidance on and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for integration of CAD subsystems in plant engineering should be a central database, which is described here by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  14. Comprehensive Analysis of the Gas- and Particle-Phase Products of VOC Oxidation

    Science.gov (United States)

    Bakker-Arkema, J.; Ziemann, P. J.

    2017-12-01

    Controlled environmental chamber studies are important for determining atmospheric reaction mechanisms and gas and aerosol products formed in the oxidation of volatile organic compounds (VOCs). Such information is necessary for developing detailed chemical models for use in predicting the atmospheric fate of VOCs and also secondary organic aerosol (SOA) formation. However, complete characterization of atmospheric oxidation reactions, including gas- and particle-phase product yields, and reaction branching ratios, are difficult to achieve. In this work, we investigated the reactions of terminal and internal alkenes with OH radicals in the presence of NOx in an attempt to fully characterize the chemistry of these systems while minimizing and accounting for the inherent uncertainties associated with environmental chamber experiments. Gas-phase products (aldehydes formed by alkoxy radical decomposition) and particle-phase products (alkyl nitrates, β-hydroxynitrates, dihydroxynitrates, 1,4-hydroxynitrates, 1,4-hydroxycarbonyls, and dihydroxycarbonyls) formed through pathways involving addition of OH to the C=C double bond as well as H-atom abstraction were identified and quantified using a suite of analytical techniques. Particle-phase products were analyzed in real time with a thermal desorption particle beam mass spectrometer; and off-line by collection onto filters, extraction, and subsequent analysis of functional groups by derivatization-spectrophotometric methods developed in our lab. Derivatized products were also separated by liquid chromatography for molecular quantitation by UV absorbance and identification using chemical ionization-ion trap mass spectrometry. Gas phase aldehydes were analyzed off-line by collection onto Tenax and a 5-channel denuder with subsequent analysis by gas chromatography, or by collection onto DNPH-coated cartridges and subsequent analysis by liquid chromatography. The full product identification and quantitation, with careful

  15. Exergetic performance analysis of an ice-cream manufacturing plant: A comprehensive survey

    International Nuclear Information System (INIS)

    Dowlati, Majid; Aghbashlo, Mortaza; Mojarab Soufiyan, Mohamad

    2017-01-01

    In this study, a comprehensive exergetic performance analysis of an ice-cream manufacturing plant was conducted in order to pinpoint the locations of thermodynamic inefficiencies. Exergetic performance parameters of each subunit of the plant were determined and illustrated individually by writing and solving energy and exergy balance equations on the basis of real operational data. The required data were acquired from a local ice-cream factory located in Tehran, Iran. The plant comprised three main subsystems: the water steam generator, the refrigeration system, and the ice-cream production line. An attempt was also made to quantify the specific exergy destruction of the ice-cream manufacturing process. The functional exergetic efficiency of the water steam generator, refrigeration system, and ice-cream production line was determined to be 17.45%, 25.52%, and 5.71%, respectively. The overall functional exergetic efficiency of the process was found to be 2.15%, while the specific exergy destruction was calculated as 719.80 kJ/kg. In general, exergy analysis and its derivatives can provide invaluable information beyond conventional energy analysis, suggesting potential locations for plant performance improvement. - Highlights: • An ice-cream manufacturing plant was exergetically analyzed using actual data. • The water steaming unit had the highest irreversibility rate among the plant subunits. • The specific exergy destruction of the ice-cream manufacturing process was 719.80 kJ/kg. • The overall exergetic efficiency of the process was found to be 2.15%.
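
    As a minimal numerical sketch of the exergy bookkeeping used in such an analysis, the snippet below computes a functional exergetic efficiency and the corresponding exergy destruction for one subunit; the stream values are placeholders, not the plant's measured data.

```python
# Minimal sketch of exergy bookkeeping for one subunit; stream values are placeholders.
def functional_exergy_efficiency(exergy_product_kw: float, exergy_fuel_kw: float) -> float:
    """Functional exergetic efficiency = exergy of the product / exergy of the fuel."""
    return exergy_product_kw / exergy_fuel_kw

exergy_fuel = 1200.0      # hypothetical exergy rate supplied to a subunit (kW)
exergy_product = 210.0    # hypothetical exergy rate of the useful product (kW)
destruction = exergy_fuel - exergy_product   # exergy destroyed by irreversibilities (kW)

print(f"functional exergetic efficiency: {functional_exergy_efficiency(exergy_product, exergy_fuel):.1%}")
print(f"exergy destruction: {destruction:.0f} kW")
```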

  16. Comprehensive analysis of the soybean (Glycine max GmLAX auxin transporter gene family

    Directory of Open Access Journals (Sweden)

    Chenglin eChai

    2016-03-01

    Full Text Available The phytohormone auxin plays a critical role in the regulation of plant growth and development as well as plant responses to abiotic stresses. This is mainly achieved through its uneven distribution in plants via a polar auxin transport process. Auxin transporters are major players in polar auxin transport. The AUXIN RESISTANT 1 ⁄ LIKE AUX1 (AUX⁄LAX) auxin influx carriers belong to the amino acid permease family of proton-driven transporters and function in the uptake of indole-3-acetic acid (IAA). In this study, a genome-wide comprehensive analysis of the soybean AUX⁄LAX (GmLAX) gene family, including phylogenic relationships, chromosome localization, and gene structure, was carried out. A total of 15 GmLAX genes, including seven duplicated gene pairs, were identified in the soybean genome. They were distributed on 10 chromosomes. Despite their high percentage identities at the protein level, GmLAXs exhibited versatile tissue-specific expression patterns, indicating coordinated functioning during plant growth and development. Most GmLAXs were responsive to drought and dehydration stresses and to auxin and abscisic acid (ABA) stimuli, in a tissue- and/or time point-sensitive mode. Several GmLAX members were involved in responding to salt stress. Sequence analysis revealed that promoters of GmLAXs contained different combinations of stress-related cis-regulatory elements. These studies suggest that the soybean GmLAXs are under the control of a very complex regulatory network, responding to various internal and external signals. This study helps to identify candidate GmLAXs for further analysis of their roles in soybean development and adaptation to adverse environments.

  17. The Effects of Visual Attention Span and Phonological Decoding in Reading Comprehension in Dyslexia: A Path Analysis.

    Science.gov (United States)

    Chen, Chen; Schneps, Matthew H; Masyn, Katherine E; Thomson, Jennifer M

    2016-11-01

    Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension while controlling for other factors such as phonological awareness, letter identification, short-term memory, IQ and age. Integrating phonemic decoding efficiency skills in the analytic model, this study aimed to disentangle how visual attention span and phonological skills work together in reading comprehension for readers with dyslexia. We found visual attention span to have a significant direct effect on more difficult reading comprehension but not on an easier level. It also had a significant direct effect on pseudo-word identification but not on word identification. In addition, we found that visual attention span indirectly explains reading comprehension through pseudo-word reading and word reading skills. This study supports the hypothesis that at least part of the dyslexic profile can be explained by visual attention abilities. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Microprocessors & their operating systems a comprehensive guide to 8, 16 & 32 bit hardware, assembly language & computer architecture

    CERN Document Server

    Holland, R C

    1989-01-01

    Provides a comprehensive guide to all of the major microprocessor families (8, 16 and 32 bit). The hardware aspects and software implications are described, giving the reader an overall understanding of microcomputer architectures. The internal processor operation of each microprocessor device is presented, followed by descriptions of the instruction set and applications for the device. Software considerations are expanded with descriptions and examples of the main high level programming languages (BASIC, Pascal and C). The book also includes detailed descriptions of the three main operatin

  19. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
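
    As an illustration of two of the listed techniques (Fourier-transform feature extraction followed by clustering), the hypothetical Python sketch below separates synthetic "defect" scan signals from clean ones; the eddy current tester, scanner and Adaptive Learning Network of the study are not reproduced.

        # Minimal sketch: FFT features + k-means clustering on synthetic scan signals.
        # All signals and parameters are illustrative, not the study's data.
        import numpy as np
        from scipy.fft import rfft
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        t = np.linspace(0.0, 1.0, 256, endpoint=False)
        clean = np.tile(np.sin(2 * np.pi * 5 * t), (40, 1))
        defect = clean.copy()
        defect[:, 100:130] += 0.8 * np.hanning(30)            # localized flaw signature
        signals = np.vstack([clean, defect]) + 0.05 * rng.normal(size=(80, 256))

        features = np.abs(rfft(signals, axis=1))[:, :20]      # low-frequency spectrum as features
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print("Cluster sizes:", np.bincount(labels))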

  20. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  1. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  2. Comprehensive safety analysis code system for nuclear fusion reactors II: Thermal analysis during plasma disruptions for international thermonuclear experimental reactor

    International Nuclear Information System (INIS)

    Honda, T.; Maki, K.; Okazaki, T.

    1994-01-01

    Thermal characteristics of a fusion reactor [International Thermonuclear Experimental Reactor (ITER) Conceptual Design Activity] during plasma disruptions have been analyzed by using a comprehensive safety analysis code for nuclear fusion reactors. The erosion depth due to disruptions for the armor of the first wall depends on the current quench time of disruptions occurring in normal operation. If it is possible to extend the time up to ∼50 ms, the erosion depth is considerably reduced. On the other hand, the erosion depth of the divertor is ∼570 μm for only one disruption, which is determined only by the thermal flux during the thermal quench. This means that the divertor plate should be exchanged after about nine disruptions. Counter-measures are necessary for the divertor to relieve disruption influences. As other scenarios of disruptions, beta-limit disruptions and vertical displacement events were also investigated quantitatively. 13 refs., 5 figs

  3. Comprehensive analysis of the mechanical system “fixture – workpiece”

    Directory of Open Access Journals (Sweden)

    V. Ivanov

    2017-05-01

    Full Text Available The present work is devoted to the locating and clamping of prismatic parts in fixtures during machining on drilling-milling-boring machines. The special feature of the locating schemes for parts of this type is that one of the flat surfaces is used as a datum (3 contact points), ensuring stability, convenience of locating and vibration elimination during the machining process. Within the scope of the presented work, a comprehensive mathematical model of the system “fixture – workpiece” has been developed, taking into account the technological features of locating and clamping the workpiece in the fixture. A methodology for determining the rigidity of the cutting tool and functional elements has been proposed. On this basis, the static and dynamic analysis of the workpiece in the fixture under a spatial system of cutting and clamping forces is carried out. In addition, the natural frequencies of the system “fixture – workpiece”, the forms of free and forced vibrations, and the frequency-response characteristics are determined. Keywords: locating chart, functional element, rigidity, free vibrations, free frequency, forced vibrations, cutting force, frequency-response characteristic
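
    To make the modal part of such an analysis concrete, the Python sketch below computes natural frequencies of a small lumped-parameter “fixture – workpiece” model from mass and stiffness matrices via the generalized eigenvalue problem; the matrices are hypothetical placeholders, not values from the paper's model.

        # Minimal sketch: natural frequencies from K v = w^2 M v for a lumped model.
        # M and K below are hypothetical, not taken from the study.
        import numpy as np
        from scipy.linalg import eigh

        M = np.diag([2.0, 1.5, 1.0])                  # kg, lumped masses
        K = np.array([[ 8e6, -3e6,  0.0],
                      [-3e6,  6e6, -2e6],
                      [ 0.0, -2e6,  2e6]])            # N/m, assembled stiffness

        omega_sq, modes = eigh(K, M)                  # eigenvalues are omega^2
        freqs_hz = np.sqrt(omega_sq) / (2.0 * np.pi)
        print("Natural frequencies [Hz]:", np.round(freqs_hz, 1))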

  4. Comprehensive analysis of tropomyosin isoforms in skeletal muscles by top-down proteomics.

    Science.gov (United States)

    Jin, Yutong; Peng, Ying; Lin, Ziqing; Chen, Yi-Chen; Wei, Liming; Hacker, Timothy A; Larsson, Lars; Ge, Ying

    2016-04-01

    Mammalian skeletal muscles are heterogeneous in nature and are capable of performing various functions. Tropomyosin (Tpm) is a major component of the thin filament in skeletal muscles and plays an important role in controlling muscle contraction and relaxation. Tpm is known to consist of multiple isoforms resulting from different encoding genes and alternative splicing, along with post-translational modifications. However, a systematic characterization of Tpm isoforms in skeletal muscles is still lacking. Therefore, we employed top-down mass spectrometry (MS) to identify and characterize Tpm isoforms present in different skeletal muscles from multiple species, including swine, rat, and human. Our study revealed that Tpm1.1 and Tpm2.2 are the two major Tpm isoforms in swine and rat skeletal muscles, whereas Tpm1.1, Tpm2.2, and Tpm3.12 are present in human skeletal muscles. Tandem MS was utilized to identify the sequences of the major Tpm isoforms. Furthermore, quantitative analysis revealed muscle-type specific differences in the abundance of un-modified and modified Tpm isoforms in rat and human skeletal muscles. This study represents the first systematic investigation of Tpm isoforms in skeletal muscles, which not only demonstrates the capabilities of top-down MS for the comprehensive characterization of skeletal myofilament proteins but also provides the basis for further studies on these Tpm isoforms in muscle-related diseases.

  5. Comprehensive drought characteristics analysis based on a nonlinear multivariate drought index

    Science.gov (United States)

    Yang, Jie; Chang, Jianxia; Wang, Yimin; Li, Yunyun; Hu, Hui; Chen, Yutong; Huang, Qiang; Yao, Jun

    2018-02-01

    It is vital to identify drought events and to evaluate multivariate drought characteristics based on a composite drought index for better drought risk assessment and sustainable development of water resources. However, most composite drought indices are constructed by linear combination, principal component analysis or the entropy weight method, assuming a linear relationship among different drought indices. In this study, a multidimensional copula function was applied to construct a nonlinear multivariate drought index (NMDI) to capture the complicated and nonlinear relationships, owing to its dependence structure and flexibility. The NMDI was constructed by combining meteorological, hydrological, and agricultural variables (precipitation, runoff, and soil moisture) to better reflect the multivariate variables simultaneously. Based on the constructed NMDI and runs theory, drought events for a particular area were identified in terms of three drought characteristics: duration, peak, and severity. Finally, multivariate drought risk was analyzed as a tool for providing reliable support in drought decision-making. The results indicate that: (1) multidimensional copulas can effectively capture the complicated and nonlinear relationships among multivariate variables; (2) compared with single and other composite drought indices, the NMDI is slightly more sensitive in capturing recorded drought events; and (3) drought risk shows a spatial variation; out of the five partitions studied, the Jing River Basin as well as the upstream and midstream of the Wei River Basin are characterized by a higher multivariate drought risk. In general, multidimensional copulas provide a reliable way to handle the nonlinear relationships involved when constructing a comprehensive drought index and evaluating multivariate drought characteristics.
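
    As a loose illustration of the copula idea, the Python sketch below transforms three synthetic series (precipitation, runoff, soil moisture) to uniform margins, evaluates their joint non-exceedance probability under a Gaussian copula, and standardizes it into a drought-like index; the copula family, data and exact NMDI definition of the study are not reproduced here.

        # Minimal sketch of a copula-style multivariate drought indicator.
        # All series are synthetic and the Gaussian copula is an illustrative choice.
        import numpy as np
        from scipy.stats import norm, multivariate_normal, rankdata

        rng = np.random.default_rng(0)
        precip = rng.gamma(2.0, 30.0, 240)                       # hypothetical monthly series
        runoff = 0.6 * precip + rng.normal(0.0, 10.0, 240)
        soil_m = 0.4 * precip + rng.normal(0.0, 8.0, 240)
        X = np.column_stack([precip, runoff, soil_m])

        U = (rankdata(X, axis=0) - 0.5) / len(X)                 # empirical uniform margins
        Z = norm.ppf(U)                                          # Gaussian scores
        R = np.corrcoef(Z, rowvar=False)                         # copula correlation matrix

        # Joint non-exceedance probability; small values indicate joint deficits (drought)
        joint_p = multivariate_normal(mean=np.zeros(3), cov=R).cdf(Z)
        index = norm.ppf(np.clip(joint_p, 1e-6, 1.0 - 1e-6))     # standardized, SPI-like scale
        print("Most severe month:", int(np.argmin(index)), "index:", round(float(index.min()), 2))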

  6. Comprehensive analysis of the transcriptional profile of the Mediator complex across human cancer types.

    Science.gov (United States)

    Syring, Isabella; Klümper, Niklas; Offermann, Anne; Braun, Martin; Deng, Mario; Boehm, Diana; Queisser, Angela; von Mässenhausen, Anne; Brägelmann, Johannes; Vogel, Wenzel; Schmidt, Doris; Majores, Michael; Schindler, Anne; Kristiansen, Glen; Müller, Stefan C; Ellinger, Jörg; Shaikhibrahim, Zaki; Perner, Sven

    2016-04-26

    The Mediator complex is a key regulator of gene transcription and several studies have demonstrated altered expression of particular subunits in diverse human diseases, especially cancer. However, a systematic study deciphering the transcriptional expression of the Mediator across different cancer entities is still lacking. We therefore performed a comprehensive in silico cancer vs. benign analysis of the Mediator complex subunits (MEDs) for 20 tumor entities using Oncomine datasets. The transcriptional expression profiles across almost all cancer entities showed differentially expressed MEDs as compared to benign tissue. Differential expression of MED8 in renal cell carcinoma (RCC) and MED12 in lung cancer (LCa) was validated and further investigated by immunohistochemical staining on tissue microarrays containing large numbers of specimens. MED8 in clear cell RCC (ccRCC) was associated with shorter survival and advanced TNM stage and showed higher expression in metastatic than primary tumors. In vitro, siRNA-mediated MED8 knockdown significantly impaired proliferation and motility in ccRCC cell lines, hinting at a role for MED8 as a novel therapeutic target in ccRCC. Taken together, our Mediator complex transcriptome proved to be a valid tool for identifying cancer-related shifts in Mediator complex composition, revealing that MEDs do exhibit cancer-specific transcriptional expression profiles.

  7. A Comprehensive Analysis of Multiscale Field-Aligned Currents: Characteristics, Controlling Parameters, and Relationships

    Science.gov (United States)

    McGranaghan, Ryan M.; Mannucci, Anthony J.; Forsyth, Colin

    2017-12-01

    We explore the characteristics, controlling parameters, and relationships of multiscale field-aligned currents (FACs) using a rigorous, comprehensive, and cross-platform analysis. Our unique approach combines FAC data from the Swarm satellites and the Advanced Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) to create a database of small-scale (˜10-150 km) and large-scale (>250 km) FACs. We examine these data for the repeatable behavior of FACs across scales (i.e., the characteristics), the dependence on the interplanetary magnetic field orientation, and the degree to which each scale "departs" from nominal large-scale specification. We retrieve new information by utilizing magnetic latitude and local time dependence, correlation analyses, and quantification of the departure of smaller from larger scales. We find that (1) FAC characteristics and dependence on controlling parameters do not map between scales in a straightforward manner, (2) relationships between FAC scales exhibit local time dependence, and (3) the dayside high-latitude region is characterized by remarkably distinct FAC behavior when analyzed at different scales, and the locations of distinction correspond to "anomalous" ionosphere-thermosphere behavior. Comparing with nominal large-scale FACs, we find that differences are characterized by a horseshoe shape, maximizing across dayside local times, and that difference magnitudes increase when smaller-scale observed FACs are considered. We suggest that both new physics and increased resolution of models are required to address the multiscale complexities. We include a summary table of our findings to provide a quick reference for differences between multiscale FACs.

  8. Comprehensive sequence analysis of nine Usher syndrome genes in the UK National Collaborative Usher Study.

    Science.gov (United States)

    Le Quesne Stabej, Polona; Saihan, Zubin; Rangesh, Nell; Steele-Stallard, Heather B; Ambrose, John; Coffey, Alison; Emmerson, Jenny; Haralambous, Elene; Hughes, Yasmin; Steel, Karen P; Luxon, Linda M; Webster, Andrew R; Bitner-Glindzicz, Maria

    2012-01-01

    Usher syndrome (USH) is an autosomal recessive disorder comprising retinitis pigmentosa, hearing loss and, in some cases, vestibular dysfunction. It is clinically and genetically heterogeneous with three distinctive clinical types (I-III) and nine Usher genes identified. This study is a comprehensive clinical and genetic analysis of 172 Usher patients and evaluates the contribution of digenic inheritance. The genes MYO7A, USH1C, CDH23, PCDH15, USH1G, USH2A, GPR98, WHRN, CLRN1 and the candidate gene SLC4A7 were sequenced in 172 UK Usher patients, regardless of clinical type. No subject had definite mutations (nonsense, frameshift or consensus splice site mutations) in two different USH genes. Novel missense variants were classified UV1-4 (unclassified variant): UV4 is 'probably pathogenic', based on control frequency A being the most common USH1 mutation in the cohort). USH2A was responsible for 79.3% of USH2 families and GPR98 for only 6.6%. No mutations were found in USH1G, WHRN or SLC4A7. One or two pathogenic/likely pathogenic variants were identified in 86% of cases. No convincing cases of digenic inheritance were found. It is concluded that digenic inheritance does not make a significant contribution to Usher syndrome; the observation of multiple variants in different genes is likely to reflect polymorphic variation, rather than digenic effects.

  9. Towards comprehensive hydrocarbons analysis of middle distillates by LC-GCxGC.

    Science.gov (United States)

    Adam, Frédérick; Bertoncini, Fabrice; Thiébaut, Didier; Esnault, Sébastien; Espinat, Didier; Hennion, M C

    2007-01-01

    The detailed characterization of middle distillates is essential for a better understanding of reactions involved in refining processes. Owing to a higher resolution power and an enhanced sensitivity, but especially to a group-type ordering in the chromatographic plane, comprehensive two-dimensional gas chromatography (GCxGC) offers unsurpassed characterization possibilities for petroleum samples. However, GCxGC fails to totally discriminate naphthenes from unsaturates occurring in hydrotreated diesel samples. This article aims at promoting the implementation of LC-GCxGC for the quantitative determination of hydrocarbon distribution in middle distillates, including naphthenes. In this configuration, liquid chromatography (LC) enables the separation of hydrocarbons into two fractions (viz., saturated and unsaturated) before the subsequent analysis of each fraction by GCxGC. In this paper, the choice of GCxGC conditions in order to achieve the separation and identification of hydrocarbons by chemical class is discussed; under these conditions, naphthenes are separated according to the number of saturated rings. For the first time, the presence of di-, tri-, and tetra-naphthenes resulting from the hydroconversion of aromatics can clearly be evidenced. A quantitative procedure for the determination of the distribution of hydrocarbons, including the distribution of naphthenes according to the number of saturated rings, is also proposed and discussed in detail. LC-GCxGC is found to provide an unequalled degree of information that will widely contribute to a better understanding of hydroconversion processes.

  10. A COMPREHENSIVE ANALYSIS OF DIABETES RISK FACTORS IN PATIENTS WITH CHRONIC PANCREATITIS

    Directory of Open Access Journals (Sweden)

    M. A. Kunitsyna

    2012-01-01

    Full Text Available Aim – to perform a comprehensive analysis of the risk factors for diabetes mellitus (DM) in patients with chronic pancreatitis (CP). Materials and methods. All examined patients with chronic pancreatitis (CP) were divided into 2 groups. The first group – 38 patients who developed diabetes in the first 3 years of CP; the second – 44 people with no diabetes after 10 years of disease. Formation of the CP etiologic groups was based on the mandatory presence of one of two factors: alcohol abuse or the presence of disease of the biliary system. The combination of the two etiological factors was an exclusion criterion. Results and conclusion. Among the examined patients, the biliary form of CP occurred in 52 patients and the alcoholic form in 30 patients. It was found that in the first three years of CP the risk of diabetes is not associated with family history, obesity or the number of exacerbations, but in the alcoholic form of CP, DM developed almost twice as often as in the biliary form. In patients with CP combined with DM, deformation of the main pancreatic duct and structural changes in the tail of the pancreas were significantly more common.

  11. Generation and Comprehensive Analysis of an Influenza Virus Polymerase Cellular Interaction Network

    Science.gov (United States)

    Tafforeau, Lionel; Chantier, Thibault; Pradezynski, Fabrine; Pellet, Johann; Mangeot, Philippe E.; Vidalain, Pierre-Olivier; Andre, Patrice; Rabourdin-Combe, Chantal; Lotteau, Vincent

    2011-01-01

    The influenza virus transcribes and replicates its genome inside the nucleus of infected cells. Both activities are performed by the viral RNA-dependent RNA polymerase that is composed of the three subunits PA, PB1, and PB2, and recent studies have shown that it requires host cell factors to transcribe and replicate the viral genome. To identify these cellular partners, we generated a comprehensive physical interaction map between each polymerase subunit and the host cellular proteome. A total of 109 human interactors were identified by yeast two-hybrid screens, whereas 90 were retrieved by literature mining. We built the FluPol interactome network composed of the influenza virus polymerase (PA, PB1, and PB2) and the nucleoprotein NP and 234 human proteins that are connected through 279 viral-cellular protein interactions. Analysis of this interactome map revealed enriched cellular functions associated with the influenza virus polymerase, including host factors involved in RNA polymerase II-dependent transcription and mRNA processing. We confirmed that eight influenza virus polymerase-interacting proteins are required for virus replication and transcriptional activity of the viral polymerase. These are involved in cellular transcription (C14orf166, COPS5, MNAT1, NMI, and POLR2A), translation (EIF3S6IP), nuclear transport (NUP54), and DNA repair (FANCG). Conversely, we identified PRKRA, which acts as an inhibitor of the viral polymerase transcriptional activity and thus is required for the cellular antiviral response. PMID:21994455

  13. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study of patients with a diagnosis of acute abdomen. A total of 105 cases of acute abdomen were obtained and, after application of the exclusion criteria, 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). The rate of accurate diagnosis by computed tomography, when compared with the anatomopathologic examination and the final diagnosis, was 82.14% (p = 0.013). When the analysis was done dividing the patients into surgical and non-surgical groups, an accuracy of 89.28% was obtained (p 0.0001). A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates with the anatomopathology and has great accuracy in the surgical indication, is associated with an increase in the physicians' confidence, reduces the hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  14. A novel registration-based methodology for prediction of trabecular bone fabric from clinical QCT: A comprehensive analysis.

    Directory of Open Access Journals (Sweden)

    Vimal Chandran

    Full Text Available Osteoporosis leads to hip fractures in aging populations and is diagnosed by modern medical imaging techniques such as quantitative computed tomography (QCT). Hip fracture sites involve trabecular bone, whose strength is determined by volume fraction and orientation, known as fabric. However, bone fabric cannot be reliably assessed in clinical QCT images of proximal femur. Accordingly, we propose a novel registration-based estimation of bone fabric designed to preserve tensor properties of bone fabric and to map bone fabric by a global and local decomposition of the gradient of a non-rigid image registration transformation. Furthermore, no comprehensive analysis on the critical components of this methodology has been previously conducted. Hence, the aim of this work was to identify the best registration-based strategy to assign bone fabric to the QCT image of a patient's proximal femur. The normalized correlation coefficient and curvature-based regularization were used for image-based registration and the Frobenius norm of the stretch tensor of the local gradient was selected to quantify the distance among the proximal femora in the population. Based on this distance, closest, farthest and mean femora with a distinction of sex were chosen as alternative atlases to evaluate their influence on bone fabric prediction. Second, we analyzed different tensor mapping schemes for bone fabric prediction: identity, rotation-only, rotation and stretch tensor. Third, we investigated the use of a population average fabric atlas. A leave-one-out (LOO) evaluation study was performed with a dual QCT and HR-pQCT database of 36 pairs of human femora. The quality of the fabric prediction was assessed with three metrics, the tensor norm (TN) error, the degree of anisotropy (DA) error and the angular deviation of the principal tensor direction (PTD). The closest femur atlas (CTP) with a full rotation (CR) for fabric mapping delivered the best results with a TN error of 7
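
    To illustrate the tensor-mapping step in isolation, the Python sketch below polar-decomposes a local registration gradient into rotation and stretch and applies the rotation-only and full-gradient mappings to a template fabric tensor; the matrices are hypothetical and the sketch does not reproduce the study's registration pipeline.

        # Minimal sketch of fabric tensor mapping under a local warp gradient F = R U.
        # F and the template fabric tensor M are hypothetical values.
        import numpy as np
        from scipy.linalg import polar

        F = np.array([[1.05, 0.08, 0.00],
                      [0.02, 0.95, 0.03],
                      [0.00, 0.01, 1.10]])       # local gradient of the registration warp
        M = np.diag([1.3, 1.0, 0.7])             # template fabric tensor (principal values)

        R, U = polar(F)                          # rotation R and right stretch U, F = R @ U

        M_rotation_only = R @ M @ R.T            # rotation-only mapping
        M_full_gradient = F @ M @ F.T            # rotation + stretch mapping
        stretch_distance = np.linalg.norm(U - np.eye(3), "fro")   # one possible distance measure

        print("Rotation-only mapping:\n", np.round(M_rotation_only, 3))
        print("Frobenius distance of stretch from identity:", round(float(stretch_distance), 4))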

  15. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  16. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  17. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices connected to the Internet become commonplace in all age groups. In this research, in order to find the relevance of behavioral activity and its associated biosignals, biosignal changes before, during and after computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis of pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activity during computer game play, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in heart rate variability as compared to the normal group. The results can be valuable for studying internet gaming disorder.
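
    As a small illustration of the kind of heart-rate-variability metrics such an analysis typically relies on, the Python sketch below computes SDNN and RMSSD from a short synthetic series of R-R intervals; the values are made up and the study's actual processing chain (pulse transit time, device calibration) is not reproduced.

        # Minimal sketch of two standard HRV metrics from R-R intervals (milliseconds).
        # The interval values are synthetic, not data from the study.
        import numpy as np

        rr_ms = np.array([812, 798, 820, 805, 790, 815, 802, 808], dtype=float)

        sdnn = rr_ms.std(ddof=1)                          # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))     # beat-to-beat (vagally mediated) variability
        mean_hr = 60000.0 / rr_ms.mean()                  # beats per minute

        print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms, mean HR = {mean_hr:.1f} bpm")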

  18. Comprehensive protocol of traceability during IVF: the result of a multicentre failure mode and effect analysis.

    Science.gov (United States)

    Rienzi, L; Bariani, F; Dalla Zorza, M; Albani, E; Benini, F; Chamayou, S; Minasi, M G; Parmegiani, L; Restelli, L; Vizziello, G; Costa, A Nanni

    2017-08-01

    Can traceability of gametes and embryos be ensured during IVF? The use of a simple and comprehensive traceability system that includes the most susceptible phases during the IVF process minimizes the risk of mismatches. Mismatches in IVF are very rare but unfortunately possible, with dramatic consequences for both patients and health care professionals. Traceability is thus a fundamental aspect of the treatment. A clear process of patient and cell identification involving witnessing protocols has to be in place in every unit. To identify potential failures in the traceability process and to develop strategies to mitigate the risk of mismatches, failure mode and effects analysis (FMEA) has previously been used effectively. The FMEA approach is, however, a subjective analysis, strictly related to specific protocols, and thus the results are not always widely applicable. To reduce subjectivity and to obtain a widespread comprehensive protocol of traceability, a multicentre, centrally coordinated FMEA was performed. Seven representative Italian centres (three public and four private) were selected. The study had a duration of 21 months (from April 2015 to December 2016) and was centrally coordinated by a team of experts: a risk analysis specialist, an expert embryologist and a specialist in human factors. Principal investigators of each centre were first instructed about proactive risk assessment and FMEA methodology. A multidisciplinary team to perform the FMEA analysis was then formed in each centre. After mapping the traceability process, each team identified the possible causes of mistakes in their protocol. A risk priority number (RPN) for each identified potential failure mode was calculated. The results of the FMEA analyses were centrally investigated and consistent corrective measures suggested. The teams performed new FMEA analyses after the recommended implementations. In each centre, this study involved: the laboratory director, the Quality Control & Quality
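
    As a toy illustration of the scoring step mentioned above, the Python sketch below ranks a few hypothetical failure modes by risk priority number (RPN = severity × occurrence × detectability); the failure modes and scores are invented and do not come from the study.

        # Minimal sketch of RPN scoring in an FMEA; entries are hypothetical examples.
        failure_modes = [
            # (description, severity, occurrence, detectability) on 1-10 scales
            ("Sample tube mislabelled at collection", 10, 2, 4),
            ("Witnessing step skipped before insemination", 9, 3, 5),
            ("Dish swap at embryo transfer", 10, 1, 3),
        ]

        scored = sorted(
            ((sev * occ * det, name) for name, sev, occ, det in failure_modes),
            reverse=True,
        )
        for rpn, name in scored:
            print(f"RPN {rpn:4d}  {name}")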

  19. Reading comprehension skills in English in the professional training of educators in the third year of the Labour Education and Computing degree programme

    Directory of Open Access Journals (Sweden)

    Agnes Aydely Leal

    2016-03-01

    Full Text Available This article presents in detail the different stages through which the teaching of Labor Education and Information Technology has passed, an analysis of the documents of the different curricula with which we have worked and are currently working, and the historical development of reading comprehension skills in English, as well as English for specific purposes, in the training of professionals in both profiles.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB of data. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  1. Comprehensive Safety Analysis of Venetoclax Monotherapy for Patients with Relapsed/Refractory Chronic Lymphocytic Leukemia.

    Science.gov (United States)

    Davids, Matthew S; Hallek, Michael; Wierda, William; Roberts, Andrew W; Stilgenbauer, Stephan; Jones, Jeffrey A; Gerecitano, John F; Kim, Su Young; Potluri, Jalaja; Busman, Todd; Best, Andrea; Verdugo, Maria E; Cerri, Elisa; Desai, Monali; Hillmen, Peter; Seymour, John F

    2018-06-12

    The oral BCL-2 inhibitor venetoclax is an effective therapy for patients with relapsed/refractory (R/R) chronic lymphocytic leukemia (CLL), including disease with high-risk genomic features such as chromosome 17p deletion (del[17p]) or progressive disease following B-cell receptor pathway inhibitors. We conducted a comprehensive analysis of the safety of 400mg daily venetoclax monotherapy in 350 patients with CLL using an integrated dataset from three phase-I/II studies. Median age was 66 years and 60% had del(17p). Patients had received a median of three prior therapies (range: 0-15); 42% previously received ibrutinib or idelalisib. Median duration of exposure to venetoclax was 16 months (0-56). In the pooled analysis, the most common adverse events (AEs) of any grade were diarrhea (41%), neutropenia (40%), nausea (39%), anemia (31%), fatigue (28%), and upper respiratory tract infection (25%). The most common grade 3/4 AEs were neutropenia (37%), anemia (17%), and thrombocytopenia (14%). With the current 5-week ramp-up dosing, the incidence of laboratory TLS was 1.4% (2/166), none had clinical sequelae, and all of these patients were able to ramp-up to a daily dose of 400mg. Grade 3/4 neutropenia was manageable with growth-factor support and dose adjustments; the incidence of serious infections in these patients was 15%. Ten percent of patients discontinued venetoclax due to AEs and 8% died while on study, with the majority of deaths in the setting of disease progression. Venetoclax as a long-term continuous therapy is generally well-tolerated in patients with R/R CLL when initiated with the current treatment algorithm. Copyright ©2018, American Association for Cancer Research.

  2. Graph theoretical analysis of functional network for comprehension of sign language.

    Science.gov (United States)

    Liu, Lanfang; Yan, Xin; Liu, Jin; Xia, Mingrui; Lu, Chunming; Emmorey, Karen; Chu, Mingyuan; Ding, Guosheng

    2017-09-15

    Signed languages are natural human languages using the visual-motor modality. Previous neuroimaging studies based on univariate activation analysis show that a widely overlapping cortical network is recruited regardless of whether the sign language is comprehended (for signers) or not (for non-signers). Here we move beyond previous studies by examining whether the functional connectivity profiles and the underlying organizational structure of the overlapping neural network may differ between signers and non-signers when watching sign language. Using graph theoretical analysis (GTA) and fMRI, we compared the large-scale functional network organization in hearing signers with non-signers during the observation of sentences in Chinese Sign Language. We found that signed sentences elicited highly similar cortical activations in the two groups of participants, with slightly larger responses within the left frontal and left temporal gyrus in signers than in non-signers. Crucially, further GTA revealed substantial group differences in the topologies of this activation network. Globally, the network engaged by signers showed higher local efficiency (t(24) = 2.379, p = 0.026), small-worldness (t(24) = 2.604, p = 0.016) and modularity (t(24) = 3.513, p = 0.002), and exhibited different modular structures, compared to the network engaged by non-signers. Locally, the left ventral pars opercularis served as a network hub in the signer group but not in the non-signer group. These findings suggest that, despite overlap in cortical activation, the neural substrates underlying sign language comprehension are distinguishable at the network level from those for the processing of gestural action. Copyright © 2017 Elsevier B.V. All rights reserved.
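
    For readers unfamiliar with the graph metrics named here, the Python sketch below computes local efficiency, modularity and a small-worldness index with networkx on a small synthetic graph standing in for an activation network; the parcellation, thresholding and group statistics of the study are not reproduced.

        # Minimal sketch of the graph-theoretic metrics on a synthetic connected graph.
        # The graph is a stand-in; nothing here reproduces the study's fMRI networks.
        import networkx as nx
        from networkx.algorithms import community

        G = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.2, seed=1)

        local_eff = nx.local_efficiency(G)
        partition = community.greedy_modularity_communities(G)
        modularity_q = community.modularity(G, partition)
        sigma = nx.sigma(G, niter=2, nrand=3, seed=1)   # >1 suggests small-world topology

        print(f"local efficiency = {local_eff:.3f}, modularity Q = {modularity_q:.3f}, "
              f"small-worldness sigma = {sigma:.2f}")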

  3. A comprehensive analysis on preservation patterns of gene co-expression networks during Alzheimer's disease progression.

    Science.gov (United States)

    Ray, Sumanta; Hossain, Sk Md Mosaddek; Khatun, Lutfunnesa; Mukhopadhyay, Anirban

    2017-12-20

    Alzheimer's disease (AD) is a chronic neuro-degenerative disorder of the brain that involves large-scale transcriptomic variation. The disease does not impact every region of the brain at the same time; instead it progresses slowly, involving somewhat sequential interaction with different regions. Analysis of the expression patterns of genes in the different brain regions affected in AD surely contributes to an enhanced comprehension of AD pathogenesis and sheds light on the early characterization of the disease. Here, we have proposed a framework to identify perturbation and preservation characteristics of gene expression patterns across six distinct regions of the brain ("EC", "HIP", "PC", "MTG", "SFG", and "VCX") affected in AD. Co-expression modules were discovered considering a pair of regions at a time. These were then analyzed to determine the preservation and perturbation characteristics. Different module preservation statistics and a rank aggregation mechanism were adopted to detect the changes of expression patterns across brain regions. Gene ontology (GO) and pathway-based analyses were also carried out to determine the biological meaning of preserved and perturbed modules. In this article, we have extensively studied the preservation patterns of co-expressed modules in six distinct brain regions affected in AD. Some modules emerged as the most preserved, while some others were detected as perturbed between a pair of brain regions. Further investigation of the topological properties of preserved and non-preserved modules reveals a substantial association between "betweenness centrality" and "degree" of the involved genes. Our findings may provide a deeper understanding of the preservation characteristics of gene expression patterns in discrete brain regions affected by AD.

  4. Comprehensive two-dimensional gas chromatography applied to illicit drug analysis.

    Science.gov (United States)

    Mitrevski, Blagoj; Wynne, Paul; Marriott, Philip J

    2011-11-01

    Multidimensional gas chromatography (MDGC), and especially its latest incarnation--comprehensive two-dimensional gas chromatography (GC × GC)--have proved advantageous over and above classic one-dimensional gas chromatography (1D GC) in many areas of analysis by offering improved peak capacity, often enhanced sensitivity and, especially in the case of GC × GC, the unique feature of 'structured' chromatograms. This article reviews recent advances in MDGC and GC × GC in drug analysis with special focus on ecstasy, heroin and cocaine profiling. Although 1D GC is still the method of choice for drug profiling in most laboratories because of its simplicity and instrument availability, GC × GC is a tempting proposition for this purpose because of its ability to generate a higher net information content. Effluent refocusing due to the modulation (compression) process, combined with the separation on two 'orthogonal' columns, results in more components being well resolved and therefore being analytically and statistically useful to the profile. The spread of the components in the two-dimensional plots is strongly dependent on the extent of retention 'orthogonality' (i.e. the extent to which the two phases possess different or independent retention mechanisms towards sample constituents) between the two columns. The benefits of 'information-driven' drug profiling, where more points of reference are usually required for sample differentiation, are discussed. In addition, several limitations in application of MDGC in drug profiling, including data acquisition rate, column temperature limit, column phase orthogonality and chiral separation, are considered and discussed. Although the review focuses on the articles published in the last decade, a brief chronological preview of the profiling methods used throughout the last three decades is given.

  5. A comprehensive sensitivity analysis of microarray breast cancer classification under feature variability

    Directory of Open Access Journals (Sweden)

    Reinders Marcel JT

    2009-11-01

    Full Text Available Abstract Background: Large discrepancies in signature composition and outcome concordance have been observed between different microarray breast cancer expression profiling studies. This is often ascribed to differences in array platform as well as biological variability. We conjecture that other reasons for the observed discrepancies are the measurement error associated with each feature and the choice of preprocessing method. Microarray data are known to be subject to technical variation and the confidence intervals around individual point estimates of expression levels can be wide. Furthermore, the estimated expression values also vary depending on the selected preprocessing scheme. In microarray breast cancer classification studies, however, these two forms of feature variability are almost always ignored and hence their exact role is unclear. Results: We have performed a comprehensive sensitivity analysis of microarray breast cancer classification under the two types of feature variability mentioned above. We used data from six state-of-the-art preprocessing methods, using a compendium consisting of eight different datasets, involving 1131 hybridizations, containing data from both one- and two-color array technology. For a wide range of classifiers, we performed a joint study on performance, concordance and stability. In the stability analysis we explicitly tested classifiers for their noise tolerance by using perturbed expression profiles that are based on uncertainty information directly related to the preprocessing methods. Our results indicate that signature composition is strongly influenced by feature variability, even if the array platform and the stratification of patient samples are identical. In addition, we show that there is often a high level of discordance between individual class assignments for signatures constructed on data coming from different preprocessing schemes, even if the actual signature composition is identical
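
    The stability analysis described above can be mimicked in a few lines: perturb the training expressions within an assumed per-feature noise level, refit, and measure how often test-set class assignments agree with the unperturbed classifier. The Python sketch below does this on synthetic data; the datasets, preprocessing-derived uncertainties and classifiers of the study are not reproduced.

        # Minimal sketch of a perturbation-based stability check on synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        X = rng.normal(size=(120, 500))                              # samples x genes (synthetic)
        y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 120) > 0).astype(int)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        reference = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict(X_te)

        noise_sd = 0.3                                               # assumed per-feature measurement error
        concordance = []
        for _ in range(20):
            X_pert = X_tr + rng.normal(0.0, noise_sd, X_tr.shape)
            perturbed = LogisticRegression(max_iter=1000).fit(X_pert, y_tr).predict(X_te)
            concordance.append(np.mean(perturbed == reference))
        print(f"Mean concordance under perturbation: {np.mean(concordance):.2%}")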

  6. A comprehensive analysis of coherent rainfall patterns in China and potential drivers. Part I: Interannual variability

    Science.gov (United States)

    Stephan, Claudia Christine; Klingaman, Nicholas Pappas; Vidale, Pier Luigi; Turner, Andrew George; Demory, Marie-Estelle; Guo, Liang

    2018-06-01

    Interannual rainfall variability in China affects agriculture, infrastructure and water resource management. To improve its understanding and prediction, many studies have associated precipitation variability with particular causes for specific seasons and regions. Here, a consistent and objective method, Empirical Orthogonal Teleconnection (EOT) analysis, is applied to 1951-2007 high-resolution precipitation observations over China in all seasons. Instead of maximizing the explained space-time variance, the method identifies regions in China that best explain the temporal variability in domain-averaged rainfall. The EOT method is validated by the reproduction of known relationships to the El Niño Southern Oscillation (ENSO): high positive correlations with ENSO are found in eastern China in winter, along the Yangtze River in summer, and in southeast China during spring. New findings include that wintertime rainfall variability along the southeast coast is associated with anomalous convection over the tropical eastern Atlantic and communicated to China through a zonal wavenumber-three Rossby wave. Furthermore, spring rainfall variability in the Yangtze valley is related to upper-tropospheric midlatitude perturbations that are part of a Rossby wave pattern with its origin in the North Atlantic. A circumglobal wave pattern in the northern hemisphere is also associated with autumn precipitation variability in eastern areas. The analysis is objective, comprehensive, and produces timeseries that are tied to specific locations in China. This facilitates the interpretation of associated dynamical processes, is useful for understanding the regional hydrological cycle, and allows the results to serve as a benchmark for assessing general circulation models.
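
    The core of the EOT procedure lends itself to a compact sketch: find the grid point whose time series best explains the domain-mean series, regress it out everywhere, and repeat. The Python code below does this on a synthetic rainfall field; the gridding, area weighting and significance testing of the study are not reproduced.

        # Minimal sketch of the Empirical Orthogonal Teleconnection (EOT) iteration
        # on a synthetic anomaly field (years x grid points).
        import numpy as np

        rng = np.random.default_rng(0)
        rain = rng.gamma(2.0, 1.0, size=(57, 100))

        def next_eot(field):
            """One EOT step: pick the best-explaining grid point and regress it out."""
            domain_mean = field.mean(axis=1)
            dm = domain_mean - domain_mean.mean()
            cov = field.T @ dm / len(dm)                       # covariance of each point with domain mean
            var = field.var(axis=0)
            score = np.divide(cov ** 2, var, out=np.zeros_like(cov), where=var > 0)
            base = int(np.argmax(score))                       # point explaining most domain-mean variance
            t = field[:, base]
            beta = field.T @ t / (t @ t)                       # regression of every point on the base series
            return base, t, field - np.outer(t, beta)          # residual field for the next iteration

        field = rain - rain.mean(axis=0)                       # anomalies
        for i in range(3):
            base, series, field = next_eot(field)
            print(f"EOT {i + 1}: base point {base}, base series std {series.std():.2f}")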

  7. Life Cycle Assessment, ExternE and Comprehensive Analysis for an integrated evaluation of the environmental impact of anthropogenic activities

    Energy Technology Data Exchange (ETDEWEB)

    Pietrapertosa, F.; Cosmi, C. [National Research Council, Institute of Methodologies for Environmental Analysis C.N.R.-I.M.A.A. C.da S.Loja, I-85050 Tito Scalo (PZ) (Italy); National Research Council, National Institute for the Physics of Matter, C.N.R.-I.N.F.M. Via Cinthia, I-80126 Naples (Italy); Macchiato, M. [Federico II University, Department of Physical Sciences, Via Cinthia, I-80126 Naples (Italy); National Research Council, National Institute for the Physics of Matter, C.N.R.-I.N.F.M. Via Cinthia, I-80126 Naples (Italy); Salvia, M.; Cuomo, V. [National Research Council, Institute of Methodologies for Environmental Analysis C.N.R.-I.M.A.A. C.da S.Loja, I-85050 Tito Scalo (PZ) (Italy)

    2009-06-15

    The implementation of resource management strategies aimed at reducing the impacts of the system of anthropogenic activities requires a comprehensive approach to evaluate the environmental burdens of productive processes as a whole and to identify the best recovery strategies from both an environmental and an economic point of view. In this framework, an analytical methodology based on the integration of Life Cycle Assessment (LCA), ExternE and Comprehensive Analysis was developed to perform an in-depth investigation of energy systems. The LCA methodology, largely utilised by the international scientific community for the assessment of the environmental performance of technologies, combined with Comprehensive Analysis allows modelling of the overall system of anthropogenic activities, as well as of sub-systems, and of the economic consequences of the whole set of environmental damages. Moreover, internalising external costs into partial equilibrium models, such as those utilised by Comprehensive Analysis, can be useful to identify the best paths for implementing technology innovation and strategies aimed at a more sustainable energy supply and use. This paper presents an integrated application of these three methodologies to a local-scale case study (the Val D'Agri area in Basilicata, Southern Italy), aimed at better characterising the environmental impacts of the energy system, with particular reference to extraction activities. The innovative methodological approach takes advantage of the strengths of each methodology, with an added value coming from their integration, as emphasised by the main results obtained from the scenario analysis. (author)

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. Two-dimensional chromatographic analysis using three second-dimension columns for continuous comprehensive analysis of intact proteins.

    Science.gov (United States)

    Zhu, Zaifang; Chen, Huang; Ren, Jiangtao; Lu, Juan J; Gu, Congying; Lynch, Kyle B; Wu, Si; Wang, Zhe; Cao, Chengxi; Liu, Shaorong

    2018-03-01

    We develop a new two-dimensional (2D) high performance liquid chromatography (HPLC) approach for intact protein analysis. Development of 2D HPLC has a bottleneck problem - limited second-dimension (second-D) separation speed. We solve this problem by incorporating multiple second-D columns to allow several second-D separations to proceed in parallel. To demonstrate the feasibility of using this approach for comprehensive protein analysis, we select ion-exchange chromatography as the first dimension and reverse-phase chromatography as the second-D. We incorporate three second-D columns in an innovative way so that three reverse-phase separations can be performed simultaneously. We test this system for separating both standard proteins and E. coli lysates and achieve baseline resolution for eleven standard proteins and obtain more than 500 peaks for E. coli lysates. This is an indication that the sample complexities are greatly reduced. We see fewer than 10 bands when each fraction of the second-D effluent is analyzed by sodium dodecyl sulfate - polyacrylamide gel electrophoresis (SDS-PAGE), compared to hundreds of SDS-PAGE bands when the original sample is analyzed. This approach could potentially be an excellent and general tool for protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages, for instance the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, makes the SALP-PC code a powerful tool for risk assessment. (orig.)
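
    As an illustration of how the listed gate types combine basic-event probabilities (assuming independent events), the hypothetical Python sketch below evaluates AND, OR and K/N gates and a small top event; the SALP-PC algorithms themselves (minimal cut set generation, NOT/XOR/INH handling, boundary conditions) are not reproduced.

        # Minimal sketch of fault tree gate evaluation for independent basic events.
        # Probabilities and the tree structure are hypothetical examples.
        from itertools import combinations
        from math import prod

        def p_and(ps):                # all inputs fail
            return prod(ps)

        def p_or(ps):                 # at least one input fails
            return 1.0 - prod(1.0 - p for p in ps)

        def p_k_of_n(ps, k):          # at least k of the n inputs fail (exact enumeration)
            n = len(ps)
            total = 0.0
            for m in range(k, n + 1):
                for failed in combinations(range(n), m):
                    total += prod(ps[i] if i in failed else 1.0 - ps[i] for i in range(n))
            return total

        basic = [1e-3, 2e-3, 5e-4]    # hypothetical basic-event probabilities
        top = p_or([p_and(basic[:2]), p_k_of_n(basic, 2)])
        print(f"Top event probability ~ {top:.2e}")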

  11. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  12. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
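
    As a rough illustration of the kind of result such a code produces, the sketch below estimates a failure probability and reliability index for a simple stress-versus-strength limit state by plain Monte Carlo sampling. It is not the advanced mean value or adaptive importance sampling algorithm of NESSUS, and the distributions and limit state are invented stand-ins for the finite element response the code would actually evaluate.

        # Monte Carlo estimate of a failure probability for the limit state
        # g = strength - stress(load, area); distributions are hypothetical and
        # stand in for the structural response a code like NESSUS would compute.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n = 200_000

        strength = rng.normal(300.0, 20.0, n)                            # MPa
        load = rng.lognormal(mean=np.log(50.0), sigma=0.25, size=n)      # kN
        area = rng.normal(250.0, 5.0, n)                                 # mm^2

        stress = 1_000.0 * load / area        # MPa
        g = strength - stress                 # failure when g < 0

        pf = np.mean(g < 0.0)
        beta = -norm.ppf(pf) if 0.0 < pf < 1.0 else float("inf")
        print(f"estimated P_f = {pf:.2e}, reliability index beta = {beta:.2f}")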

  13. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  14. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time-consuming process. A number of general-purpose algorithms available for the various phases of the analysis can be implemented, provided they are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure errors and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is achieved by letting the user scan the data on an interactive graphics terminal, make the necessary changes, and then return control to the computer for completion of the analysis

  15. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  16. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  17. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
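
    A minimal sketch of the stepwise-deletion idea (backward elimination) follows; it is written in Python against synthetic data and is not a port of the MULGRES source deck, and the 0.05 significance cut-off is an arbitrary choice for the example.

        # Backward elimination: start with all predictors and repeatedly drop the
        # least significant one until every remaining term passes the cut-off.
        # Synthetic data; the 0.05 threshold is an arbitrary illustrative choice.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 200
        X = rng.normal(size=(n, 5))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)   # only x1 and x3 matter

        names = [f"x{i + 1}" for i in range(X.shape[1])]
        cols = list(range(X.shape[1]))

        while cols:
            fit = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
            pvals = fit.pvalues[1:]                   # skip the intercept
            worst = int(np.argmax(pvals))
            if pvals[worst] <= 0.05:
                break                                 # all remaining terms significant
            print(f"dropping {names[cols[worst]]} (p = {pvals[worst]:.3f})")
            cols.pop(worst)

        print("retained predictors:", [names[c] for c in cols])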

  18. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  19. Computational content analysis of European Central Bank statements

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.

    2012-01-01

    In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement
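
    The record does not spell out the prediction approach, but a common minimal baseline for this kind of computational content analysis is a bag-of-words classifier that maps statement text to a direction label. The sketch below illustrates that baseline only; the statement snippets and labels are fabricated placeholders, not real ECB data or the authors' framework.

        # Bag-of-words baseline: predict a direction label from statement text.
        # Texts and labels are fabricated placeholders, not real ECB statements.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        statements = [
            "inflationary pressures have increased and risks to price stability are on the upside",
            "the economic outlook remains subdued and underlying price pressures are muted",
            "rates will remain at present levels for an extended period amid weak growth",
            "strong monetary dynamics warrant close monitoring of upside inflation risks",
        ]
        direction = ["up", "down", "down", "up"]      # hypothetical movement labels

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                            LogisticRegression(max_iter=1000))
        clf.fit(statements, direction)
        print(clf.predict(["price pressures remain contained and growth is weak"]))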

  20. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  1. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values
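
    The characteristics method referred to here turns the water-hammer equations into compatibility relations along C+ and C- lines. A minimal textbook-style sketch for a reservoir-pipe-valve system with instantaneous valve closure is given below; the pipe data are invented and the column-separation logic mentioned in the abstract is deliberately omitted.

        # Method-of-characteristics water-hammer sketch: reservoir-pipe-valve system
        # with instantaneous downstream valve closure.  Pipe data are hypothetical and
        # the column-separation (vapour cavity) logic of HAMOC is omitted.
        import numpy as np

        L, a, D, f = 600.0, 1200.0, 0.5, 0.018   # length [m], wave speed [m/s], bore [m], friction factor
        g = 9.81
        A = np.pi * D**2 / 4.0
        N = 20                                   # number of reaches
        dx = L / N
        dt = dx / a                              # Courant condition dt = dx/a
        B = a / (g * A)
        R = f * dx / (2.0 * g * D * A**2)

        H0, Q0 = 50.0, 0.2                       # reservoir head [m], initial flow [m^3/s]
        H = H0 - np.arange(N + 1) * R * Q0 * abs(Q0)    # steady-state grade line
        Q = np.full(N + 1, Q0)
        peak = H[-1]

        for _ in range(4 * N):                   # march one full wave period (4L/a)
            CP = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])   # C+ from node i-1
            CM = H[2:] - B * Q[2:] + R * Q[2:] * np.abs(Q[2:])       # C- from node i+1
            Hn, Qn = H.copy(), Q.copy()
            Hn[1:-1] = 0.5 * (CP + CM)
            Qn[1:-1] = (CP - CM) / (2.0 * B)
            Hn[0] = H0                           # upstream reservoir: fixed head
            Qn[0] = (H0 - (H[1] - B * Q[1] + R * Q[1] * np.abs(Q[1]))) / B
            Qn[-1] = 0.0                         # downstream valve closed at t = 0
            Hn[-1] = H[-2] + B * Q[-2] - R * Q[-2] * np.abs(Q[-2])
            H, Q = Hn, Qn
            peak = max(peak, H[-1])

        print(f"peak head at valve = {peak:.1f} m "
              f"(Joukowsky estimate {H0 + a * Q0 / (g * A):.1f} m)")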

  2. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
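
    The core idea, re-estimating a focal coefficient under every sensible combination of control variables and summarizing the resulting modeling distribution, can be sketched as follows. The data are synthetic and the code is only an illustration of the approach, not the authors' published software.

        # Multimodel analysis sketch: estimate the focal coefficient under every
        # subset of candidate controls and summarize the modeling distribution.
        # Synthetic data; not the authors' published software.
        from itertools import combinations
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 500
        controls = rng.normal(size=(n, 4))
        focal = 0.5 * controls[:, 0] + rng.normal(size=n)
        y = 1.0 * focal + 0.8 * controls[:, 0] - 0.3 * controls[:, 3] + rng.normal(size=n)

        estimates = []
        k = controls.shape[1]
        for r in range(k + 1):
            for subset in combinations(range(k), r):
                X = np.column_stack([focal] + [controls[:, j] for j in subset])
                fit = sm.OLS(y, sm.add_constant(X)).fit()
                estimates.append(fit.params[1])       # coefficient on the focal variable

        estimates = np.array(estimates)
        print(f"{len(estimates)} specifications; focal coefficient "
              f"mean = {estimates.mean():.2f}, min = {estimates.min():.2f}, max = {estimates.max():.2f}")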

  3. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    This article presents the basic principles underlying the construction of an informational-computer system for neutron spectra analysis on the basis of measured reaction rates. The basic data files of the system and the software and hardware needed for the system's operation are described

  4. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
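
    The symmetrical-components step that underlies such a program is a fixed linear transformation between phase and sequence quantities. A minimal sketch follows; the unbalanced phase currents are invented for the example.

        # Symmetrical components: decompose unbalanced phase phasors into zero-,
        # positive- and negative-sequence components.  Example currents are invented.
        import numpy as np

        a = np.exp(1j * 2 * np.pi / 3)                # the operator 1 at 120 degrees
        A = np.array([[1, 1, 1],
                      [1, a**2, a],
                      [1, a, a**2]])                  # [Ia, Ib, Ic] = A @ [I0, I1, I2]

        # hypothetical unbalanced phase currents (per unit)
        I_abc = np.array([1.6 * np.exp(1j * np.deg2rad(-25)),
                          0.4 * np.exp(1j * np.deg2rad(-160)),
                          0.5 * np.exp(1j * np.deg2rad(110))])

        I_012 = np.linalg.inv(A) @ I_abc              # sequence components
        for name, phasor in zip(("zero", "positive", "negative"), I_012):
            print(f"{name:>8}-sequence: {abs(phasor):.3f} pu at {np.degrees(np.angle(phasor)):6.1f} deg")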

  5. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  6. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  7. New Mexico’s comprehensive impaired-driving program : crash data analysis.

    Science.gov (United States)

    2014-03-01

    In late 2004, the National Highway Traffic Safety Administration provided funds through a Cooperative Agreement to the New Mexico Department of Transportation to demonstrate a process for implementing a comprehensive State impaired-driving system. NH...

  8. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  9. Comprehensive geriatric assessment for older adults admitted to hospital: meta-analysis of randomised controlled trials

    OpenAIRE

    Ellis, G.; Whitehead, M.A.; Robinson, D.; O'Neill, D.; Langhorne, P.

    2011-01-01

    Objective - To evaluate the effectiveness of comprehensive geriatric assessment in hospital for older adults admitted as an emergency. Search strategy - We searched the EPOC Register, Cochrane’s Controlled Trials Register, the Database of Abstracts of Reviews of Effects (DARE), Medline, Embase, CINAHL, AARP Ageline, and handsearched high yield journals. Selection criteria - Randomised controlled trials of comprehensive geriatric assessment (whether by mobile teams or in designat...

  10. Risk Analysis Based on AHP and Fuzzy Comprehensive Evaluation for Maglev Train Bogie

    OpenAIRE

    Fan, Chengxin; Dou, Fengshan; Tong, Baiming; Long, Zhiqiang

    2016-01-01

    The maglev bogie is the key subsystem for maglev train security. To ensure life and property security, it is essential to evaluate its risk level before its operation. In this paper, a combinational method of analytic hierarchy process and fuzzy comprehensive evaluation is proposed to assess hazards in a complex maglev bogie system associated with multiple subsystems’ failures. The very comprehensive identification of risk sources has been done by analyzing the structure of maglev bogie. Furt...
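
    The two building blocks named in the title can be illustrated briefly: AHP derives criterion weights from a pairwise-comparison matrix (principal eigenvector plus a consistency check), and fuzzy comprehensive evaluation combines those weights with a membership matrix. In the sketch below the comparison matrix and membership grades are invented placeholders, not values from the maglev bogie study.

        # AHP weights from a pairwise-comparison matrix, then a fuzzy comprehensive
        # evaluation B = w . R.  All numbers are hypothetical placeholders.
        import numpy as np

        # pairwise comparisons of three risk criteria (Saaty 1-9 scale)
        C = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(C)
        k = int(np.argmax(eigvals.real))
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()                               # AHP weight vector

        n = C.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)          # consistency index
        CR = CI / 0.58                                # random index RI = 0.58 for n = 3
        print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 3))

        # membership of each criterion in the risk grades (low, medium, high)
        R = np.array([[0.1, 0.3, 0.6],
                      [0.4, 0.4, 0.2],
                      [0.6, 0.3, 0.1]])

        B = w @ R                                     # comprehensive evaluation vector
        print("grade memberships (low, medium, high):", np.round(B, 3))
        print("overall grade:", ["low", "medium", "high"][int(np.argmax(B))])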

  11. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with storage techniques that allow remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow exhaustive monitoring of users' processes at the site and an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the Consortium.

  12. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with storage techniques that allow remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow exhaustive monitoring of users' processes at the site and an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the Consortium.

  13. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. Practical considerations for optimizing cardiac computed tomography protocols for comprehensive acquisition prior to transcatheter aortic valve replacement.

    Science.gov (United States)

    Khalique, Omar K; Pulerwitz, Todd C; Halliburton, Sandra S; Kodali, Susheel K; Hahn, Rebecca T; Nazif, Tamim M; Vahl, Torsten P; George, Isaac; Leon, Martin B; D'Souza, Belinda; Einstein, Andrew J

    2016-01-01

    Transcatheter aortic valve replacement (TAVR) is performed frequently in patients with severe, symptomatic aortic stenosis who are at high risk or inoperable for open surgical aortic valve replacement. Computed tomography angiography (CTA) has become the gold standard imaging modality for pre-TAVR cardiac anatomic and vascular access assessment. Traditionally, cardiac CTA has been most frequently used for assessment of coronary artery stenosis, and scanning protocols have generally been tailored for this purpose. Pre-TAVR CTA has different goals than coronary CTA and the high prevalence of chronic kidney disease in the TAVR patient population creates a particular need to optimize protocols for a reduction in iodinated contrast volume. This document reviews details which allow the physician to tailor CTA examinations to maximize image quality and minimize harm, while factoring in multiple patient and scanner variables which must be considered in customizing a pre-TAVR protocol. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  16. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  17. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

    In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply the models to complex realistic phenomena; we call such a situation an ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that they had learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support MAC/FAC's theoretical assumption - different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognitions.
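
    In the MAC stage of MAC/FAC, each description is reduced to a content vector of predicate counts and candidate memories are scored by a dot product; the FAC stage then performs structural mapping. A toy sketch of the MAC-stage scoring follows; the predicate lists are invented and do not come from the experiment's materials.

        # MAC-stage sketch: score memory items against a probe by the dot product of
        # predicate-count "content vectors" (the cheap, non-structural filter).
        # Predicate lists are toy examples, not the experiment's stories.
        from collections import Counter

        def content_vector(predicates):
            return Counter(predicates)

        def dot(u, v):
            return sum(u[key] * v[key] for key in u if key in v)

        probe = content_vector(["cause", "flow", "pressure", "greater", "flow"])
        memory = {
            "water-flow story": content_vector(["cause", "flow", "pressure", "greater"]),
            "heat-flow story": content_vector(["cause", "flow", "temperature", "greater"]),
            "picnic story": content_vector(["eat", "sit", "sun"]),
        }

        scores = {name: dot(probe, vec) for name, vec in memory.items()}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name:>18}: MAC score = {score}")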

  18. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  19. Computational Text Analysis: A More Comprehensive Approach to Determine Readability of Reading Materials

    Science.gov (United States)

    Aziz, Anealka; Fook, Chan Yuen; Alsree, Zubaida

    2010-01-01

    Reading materials are considered having high readability if readers are interested to read the materials, understand the content of the materials and able to read the materials fluently. In contrast, reading materials with low readability discourage readers from reading the materials, create difficulties for readers to understand the content of…
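
    The abstract is truncated before it names the measures used, so as a generic illustration of computational readability scoring the sketch below computes the standard Flesch Reading Ease index with a crude syllable heuristic; this is not necessarily the metric the authors propose.

        # Flesch Reading Ease as one example of a computational readability measure.
        # The syllable counter is a rough heuristic, not the paper's own metric.
        import re

        def count_syllables(word):
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def flesch_reading_ease(text):
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z]+", text)
            syllables = sum(count_syllables(w) for w in words)
            return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

        sample = ("Reading materials are considered to have high readability if readers "
                  "understand the content and can read the materials fluently.")
        print(f"Flesch Reading Ease = {flesch_reading_ease(sample):.1f}")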

  20. Comprehensive two-dimensional gas chromatography for biogas and biomethane analysis.

    Science.gov (United States)

    Hilaire, F; Basset, E; Bayard, R; Gallardo, M; Thiebaut, D; Vial, J

    2017-11-17

    The gas industry is going to be revolutionized by being able to generate bioenergy from biomass. The production of biomethane - a green substitute for natural gas - is growing in Europe and the United States of America. Biomethane can be injected into the gas grid or used as fuel for vehicles after compression. Due to various biomass inputs (e.g. agricultural wastes, sludges from sewage treatment plants, etc.), production processes (e.g. anaerobic digestion, municipal solid waste (MSW) landfills), seasonal effects and purification processes (e.g. gas scrubbers, pressure swing adsorption, membranes for biogas upgrading), the composition and quality of the biogas and biomethane produced are difficult to assess. All previous publications dealing with biogas analysis reported that hundreds of chemicals from ten chemical families exist in trace amounts in biogas. However, to the best of our knowledge, no study has reported a detailed analysis or the implementation of comprehensive two-dimensional gas chromatography (GC×GC) for biogas matrices. This is the reason why the benefit of implementing two-dimensional gas chromatography for the characterization of biogas and biomethane samples was evaluated. In a first step, a standard mixture of 89 compounds belonging to 10 chemical families, representative of those likely to be found, was used to optimize the analytical method. A set consisting of a non-polar and a polar column, respectively in the first and the second dimension, was used with a modulation period of six seconds. Applied to ten samples of raw biogas, treated biogas and biomethane collected on 4 industrial sites (two MSW landfills, one anaerobic digester on a wastewater treatment plant and one agricultural biogas plant), this analytical method provided a "fingerprint" of the gas composition at the molecular level in all biogas and biomethane samples. Estimated limits of detection (far below the μg Nm-3 level) coupled with the resolution of GC×GC allowed the comparison

  1. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics
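
    The three-grid verification procedure referenced here is commonly implemented as Richardson extrapolation together with a grid convergence index (GCI). A minimal sketch under that assumption follows; the solution values, refinement ratio and safety factor are illustrative, not results from this dissertation.

        # Three-grid Richardson extrapolation / grid convergence index (GCI) sketch.
        # f1, f2, f3 are a scalar result (e.g. a peak airflow speed) on the fine,
        # medium and coarse grids; the values and refinement ratio are hypothetical.
        import math

        f1, f2, f3 = 10.82, 11.05, 11.60      # fine, medium, coarse solutions
        r = 2.0                               # constant grid refinement ratio
        Fs = 1.25                             # safety factor for a three-grid study

        p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)    # observed order
        f_extrap = f1 + (f1 - f2) / (r**p - 1.0)                   # extrapolated value
        gci_fine = Fs * abs((f2 - f1) / f1) / (r**p - 1.0) * 100.0 # uncertainty band [%]

        print(f"observed order p = {p:.2f}")
        print(f"extrapolated value = {f_extrap:.3f}")
        print(f"GCI (fine grid) = {gci_fine:.2f}%")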

  2. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions

  3. Material flow analysis of NdFeB magnets for Denmark: a comprehensive waste flow sampling and analysis approach.

    Science.gov (United States)

    Habib, Komal; Schibye, Peter Klausen; Vestbø, Andreas Peter; Dall, Ole; Wenzel, Henrik

    2014-10-21

    Neodymium-iron-boron (NdFeB) magnets have become highly desirable for modern hi-tech applications. These magnets, in general, contain two key rare earth elements (REEs), i.e., neodymium (Nd) and dysprosium (Dy), which are responsible for the very high strength of these magnets, allowing for considerable size and weight reduction in modern applications. This study aims to explore the current and future potential of a secondary supply of neodymium and dysprosium from recycling of NdFeB magnets. For this purpose, material flow analysis (MFA) has been carried out to perform the detailed mapping of stocks and flows of NdFeB magnets in Denmark. A novel element of this study is the value added to the traditionally practiced MFAs at national and/or global levels by complementing them with a comprehensive sampling and elemental analysis of NdFeB magnets, taken out from a sample of 157 different products representing 18 various product types. The results show that the current amount of neodymium and dysprosium in NdFeB magnets present in the Danish waste stream is only 3 and 0.2 Mg, respectively. However, this number is estimated to increase to 175 Mg of neodymium and 11.4 Mg of dysprosium by 2035. Nevertheless, efficient recovery of these elements from a very diverse electronic waste stream remains a logistic and economic challenge.

  4. Comprehensive analysis of the renal transcriptional response to acute uranyl nitrate exposure

    Directory of Open Access Journals (Sweden)

    Argiles Angel

    2006-01-01

    Full Text Available Abstract Background Chemical and radiological toxicities related to acute uranium exposure have been widely studied in nuclear fuel workers and military personnel. It is well known that uranyl nitrate induces acute renal failure (ARF). However, the mechanisms of this metal-induced injury are not well defined at the molecular level. Results Renal function and histology were assessed in mice receiving uranyl nitrate (UN+) and controls (UN-). To identify the genomic response to uranium exposure, serial analysis of gene expression (SAGE) of the kidney was performed in both groups. Over 43,000 mRNA SAGE tags were sequenced. A selection of the differentially expressed transcripts was confirmed by real-time quantitative PCR and Western blotting. UN+ animals developed renal failure and displayed the characteristic histological lesions of UN nephropathy. Of the >14,500 unique tags identified in both libraries, 224 had a modified expression level; they are known to participate in inflammation, ion transport, signal transduction, oxidative stress, apoptosis, metabolism, and catabolism. Several genes that were identified had not previously been evaluated within the context of toxic ARF, such as translationally controlled tumor protein, insulin-like growth factor binding protein 7 and ribosomal protein S29, all apoptosis-related genes. Conclusion We report a comprehensive description of the UN-induced modifications in gene expression levels, including the identification of genes previously unrelated to ARF. The study of these genes and the metabolisms they control should improve our understanding of toxic ARF and shed light on the molecular targets for potential therapeutic interventions.

  5. A comprehensive analysis of factors influencing the injury severity of large-truck crashes.

    Science.gov (United States)

    Zhu, Xiaoyu; Srinivasan, Sivaramakrishnan

    2011-01-01

    Given the importance of trucking to the economic well being of a country and the safety concerns posed by the trucks, a study of large-truck crashes is critical. This paper contributes by undertaking an extensive analysis of the empirical factors affecting injury severity of large-truck crashes. Data from a recent, nationally representative sample of large-truck crashes are examined to determine the factors affecting the overall injury severity of these crashes. The explanatory factors include the characteristics of the crash, vehicle(s), and the driver(s). The injury severity was modeled using two measures. Several similarities and some differences were observed across the two models which underscore the need for improved accuracy in the assessment of injury severity of crashes. The estimated models capture the marginal effects of a variety of explanatory factors simultaneously. In particular, the models indicate the impacts of several driver behavior variables on the severity of the crashes, after controlling for a variety of other factors. For example, driver distraction (truck drivers), alcohol use (car drivers), and emotional factors (car drivers) are found to be associated with higher severity crashes. A further interesting finding is the strong statistical significance of several dummy variables that indicate missing data - these reflect how the nature of the crash itself could affect the completeness of the data. Future efforts should seek to collect such data more comprehensively so that the true effects of these aspects on the crash severity can be determined. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Work Experiences of Patients Receiving Palliative Care at a Comprehensive Cancer Center: Exploratory Analysis.

    Science.gov (United States)

    Glare, Paul A; Nikolova, Tanya; Alickaj, Alberta; Patil, Sujata; Blinder, Victoria

    2017-07-01

    Employment-related issues have been largely overlooked in cancer patients needing palliative care. These issues may become more relevant as cancer evolves into more of a chronic illness and palliative care is provided independent of stage or prognosis. To characterize the employment situations of working-age palliative care patients. Cross-sectional survey. Setting/subjects: consecutive sample of 112 patients followed in palliative care outpatient clinics at a comprehensive cancer center. Thirty-seven-item self-report questionnaire covering demographics, clinical status, and work experiences since diagnosis. The commonest cancer diagnoses were breast, colorectal, gynecological, and lung. Eighty-one percent had active disease. Seventy-four percent were on treatment. Eighty percent recalled being employed at the time of diagnosis, with 65% working full time. At the time of the survey, 44% were employed and 26% were working full time. Most participants said work was important, made them feel normal, and helped them feel they were "beating the cancer". Factors associated with being employed included male gender, being self-employed, and taking less than three months off work. Respondents with pain and/or other symptoms were significantly less likely to be working. On multivariate analysis, only pain (odds ratio [OR] 8.16) remained statistically significant; gender (OR 2.07), self-employment (OR 3.07), and current chemotherapy (OR 1.81) were included in the model but were not statistically significant in this small sample. Work may be an important issue for some palliative care patients. Additional research is needed to facilitate ongoing employment for those who wish or need to continue working.

  7. Interpersonal Psychotherapy for Mental Health Problems: A Comprehensive Meta-Analysis.

    Science.gov (United States)

    Cuijpers, Pim; Donker, Tara; Weissman, Myrna M; Ravitz, Paula; Cristea, Ioana A

    2016-07-01

    Interpersonal psychotherapy (IPT) has been developed for the treatment of depression but has been examined for several other mental disorders. A comprehensive meta-analysis of all randomized trials examining the effects of IPT for all mental health problems was conducted. Searches in PubMed, PsycInfo, Embase, and Cochrane were conducted to identify all trials examining IPT for any mental health problem. Ninety studies with 11,434 participants were included. IPT for acute-phase depression had moderate-to-large effects compared with control groups (g=0.60; 95% CI=0.45-0.75). No significant difference was found with other therapies (differential g=0.06) and pharmacotherapy (g=-0.13). Combined treatment was more effective than IPT alone (g=0.24). IPT in subthreshold depression significantly prevented the onset of major depression, and maintenance IPT significantly reduced relapse. IPT had significant effects on eating disorders, but the effects are probably slightly smaller than those of cognitive-behavioral therapy (CBT) in the acute phase of treatment. In anxiety disorders, IPT had large effects compared with control groups, and there is no evidence that IPT was less effective than CBT. There was risk of bias as defined by the Cochrane Collaboration in the majority of studies. There was little indication that the presence of bias influenced outcome. IPT is effective in the acute treatment of depression and may be effective in the prevention of new depressive disorders and in preventing relapse. IPT may also be effective in the treatment of eating disorders and anxiety disorders and has shown promising effects in some other mental health disorders.

  8. A comprehensive analysis of gene expression changes provoked by bacterial and fungal infection in C. elegans.

    Directory of Open Access Journals (Sweden)

    Ilka Engelmann

    Full Text Available While Caenorhabditis elegans specifically responds to infection by the up-regulation of certain genes, distinct pathogens trigger the expression of a common set of genes. We applied new methods to conduct a comprehensive and comparative study of the transcriptional response of C. elegans to bacterial and fungal infection. Using tiling arrays and/or RNA-sequencing, we have characterized the genome-wide transcriptional changes that underlie the host's response to infection by three bacterial (Serratia marcescens, Enterococcus faecalis and Photorhabdus luminescens) and two fungal pathogens (Drechmeria coniospora and Harposporium sp.). We developed a flexible tool, the WormBase Converter (available at http://wormbasemanager.sourceforge.net/), to allow cross-study comparisons. The new data sets provided more extensive lists of differentially regulated genes than previous studies. Annotation analysis confirmed that genes commonly up-regulated by bacterial infections are related to stress responses. We found substantial overlaps between the genes regulated upon intestinal infection by the bacterial pathogens and Harposporium, and between those regulated by Harposporium and D. coniospora, which infects the epidermis. Among the fungus-regulated genes, there was a significant bias towards genes that are evolving rapidly and potentially encode small proteins. The results obtained using new methods reveal that the response to infection in C. elegans is determined by the nature of the pathogen, the site of infection and the physiological imbalance provoked by infection. They form the basis for future functional dissection of innate immune signaling. Finally, we also propose alternative methods to identify differentially regulated genes that take into account the greater variability in lowly expressed genes.

  9. Comprehensive analysis of LANA interacting proteins essential for viral genome tethering and persistence.

    Directory of Open Access Journals (Sweden)

    Subhash C Verma

    Full Text Available Kaposi's sarcoma associated herpesvirus (KSHV) is tightly linked to multiple human malignancies including Kaposi's sarcoma (KS), Primary Effusion Lymphoma (PEL) and Multicentric Castleman's Disease (MCD). KSHV, like other herpesviruses, establishes life-long latency in the infected host by persisting as chromatin and tethering to host chromatin through the virally encoded protein Latency Associated Nuclear Antigen (LANA). LANA, a multifunctional protein, is capable of binding to a large number of cellular proteins responsible for transcriptional regulation of various cellular and viral pathways involved in blocking cell death and promoting cell proliferation. This leads to enhanced cell division and replication of the viral genome, which segregates faithfully in the dividing tumor cells. The mechanism of genome segregation is well known, and the binding of LANA to nucleosomal proteins throughout the cell cycle suggests that these interactions play an important role in efficient segregation. Various biochemical methods have identified a large number of LANA binding proteins, including histone H2A/H2B, histone H1, MeCP2, DEK, CENP-F, NuMA, Bub1, HP-1, and Brd4. These nucleosomal proteins may have various functions in tethering of the viral genome during specific phases of the viral life cycle. Therefore, we performed a comprehensive analysis of their interaction with LANA using a number of different assays. We show that LANA binds to core nucleosomal histones and also associates with other host chromatin proteins, including histone H1 and high mobility group proteins (HMGs). We used various biochemical assays, including co-immunoprecipitation and in-vivo localization by split GFP and fluorescence resonance energy transfer (FRET), to demonstrate their association.

  10. Deep Insight into the Ganoderma lucidum by Comprehensive Analysis of Its Transcriptome

    Science.gov (United States)

    Yu, Guo-Jun; Wang, Man; Huang, Jie; Yin, Ya-Lin; Chen, Yi-Jie; Jiang, Shuai; Jin, Yan-Xia; Lan, Xian-Qing; Wong, Barry Hon Cheung; Liang, Yi; Sun, Hui

    2012-01-01

    Background Ganoderma lucidum is a basidiomycete white rot fungus and is of medicinal importance in China, Japan and other countries in the Asiatic region. To date, much research has been performed in identifying the medicinal ingredients in Ganoderma lucidum. Despite its important therapeutic effects in disease, little is known about Ganoderma lucidum at the genomic level. In order to gain a molecular understanding of this fungus, we utilized Illumina high-throughput technology to sequence and analyze the transcriptome of Ganoderma lucidum. Methodology/Principal Findings We obtained 6,439,690 and 6,416,670 high-quality reads from the mycelium and fruiting body of Ganoderma lucidum, and these were assembled to form 18,892 and 27,408 unigenes, respectively. A similarity search was performed against the NCBI non-redundant nucleotide database and a customized database composed of five fungal genomes. In total, 11,098 and 8,775 unigenes were matched to the NCBI non-redundant nucleotide database and our customized database, respectively. All unigenes were subjected to annotation by Gene Ontology, Eukaryotic Orthologous Group terms and Kyoto Encyclopedia of Genes and Genomes. Differentially expressed genes from the Ganoderma lucidum mycelium and fruiting body stages were analyzed, resulting in the identification of 13 unigenes which are involved in the terpenoid backbone biosynthesis pathway. Quantitative real-time PCR was used to confirm the expression levels of these unigenes. Ganoderma lucidum was also studied for wood-degrading activity, and a total of 22 putative FOLymes (fungal oxidative lignin enzymes) and 120 CAZymes (carbohydrate-active enzymes) were predicted from our Ganoderma lucidum transcriptome. Conclusions Our study provides comprehensive gene expression information on Ganoderma lucidum at the transcriptional level, which will form the foundation for functional genomics studies in this fungus. The use of Illumina sequencing technology has made de novo transcriptome

  11. Comprehensive analysis of alternative splicing and functionality in neuronal differentiation of P19 cells.

    Directory of Open Access Journals (Sweden)

    Hitoshi Suzuki

    Full Text Available BACKGROUND: Alternative splicing, which produces multiple mRNAs from a single gene, occurs in most human genes and contributes to protein diversity. Many alternative isoforms are expressed in a spatio-temporal manner, and function in diverse processes, including in the neural system. METHODOLOGY/PRINCIPAL FINDINGS: The purpose of the present study was to comprehensively investigate neural splicing using P19 cells. GeneChip Exon Array analysis was performed using total RNAs purified from cells during neuronal cell differentiation. To efficiently and readily extract the alternative exon candidates, 9 filtering conditions were prepared, yielding 262 candidate exons (236 genes). Semiquantitative RT-PCR results for 30 randomly selected candidates suggested that 87% of the candidates were differentially alternatively spliced in neuronal cells compared to undifferentiated cells. Gene ontology and pathway analyses suggested that many of the candidate genes were associated with neural events. Together with 66 genes whose functions in neural cells or organs were reported previously, 47 candidate genes were found to be linked to 189 events in the gene-level profile of neural differentiation. By text-mining for alternative isoforms, the distinct functions of the isoforms of 9 candidate genes indicated by the Exon Array results were confirmed. CONCLUSIONS/SIGNIFICANCE: Alternative exons were successfully extracted. Results from the informatics analyses suggested that neural events were primarily governed by genes whose expression was increased and whose transcripts were differentially alternatively spliced in the neuronal cells. In addition to known functions in neural cells or organs, the uninvestigated alternative splicing events of 11 genes among the 47 candidate genes suggested that cell cycle events are also potentially important. These genes may help researchers to differentiate the roles of alternative splicing in cell differentiation and cell

  12. Comprehensive analysis of pipeline transportation systems for CO2 sequestration. Thermodynamics and safety problems

    International Nuclear Information System (INIS)

    Witkowski, Andrzej; Rusin, Andrzej; Majkut, Mirosław; Rulik, Sebastian; Stolecka, Katarzyna

    2013-01-01

    Highlights: • Comprehensive analysis of the efficiency and safety strategies of transporting CO2. • Selection of safety zones around pipelines transporting CO2. • Optimization of CO2 pipeline transportation conditions. - Abstract: The aim of this paper is to analyze CO2 compression and transportation processes, together with safety issues, for post-combustion CO2 capture applications for basic technological concepts of a 900 MW pulverized coal-fired power plant. Four different types of compressors, including a conventional multistage centrifugal compressor, an integrally geared centrifugal compressor, a supersonic shock wave compressor, and pump machines, were used. This study emphasizes that total compression power is a strong function of the thermodynamic process and is not determined by the compressor efficiency alone. The compressor increases the CO2 pressure from normal pressure to the critical pressure, and the boosting pump continues to increase the pressure to the level required at the pipeline inlet. Another problem analyzed in this study is the transport of CO2 by pipeline from the compressor outlet site to the disposal site under heat transfer conditions. Simulations were made to determine the maximum safe pipeline distance to subsequent booster stations depending on inlet pressure, environmental temperature, thermal insulation thickness and the ground-level heat transfer conditions. From the point of view of environmental protection, the most important problem is to identify, in a strict and reliable manner, the hazards which indirectly affect CO2 transportation. This identification is essential for effective hazard management. A failure of pipelines is usually caused by corrosion, material defects, ground movement or third-party interference. After the rupture of a pipeline transporting liquid CO2, a large pressure drop will occur. The pressure will continue to fall until the liquid becomes a mixture of saturated vapour/liquid. In the vicinity of the

  13. A comprehensive analysis of the evaporation of a liquid spherical drop.

    Science.gov (United States)

    Sobac, B; Talbot, P; Haut, B; Rednikov, A; Colinet, P

    2015-01-15

    In this paper, a new comprehensive analysis of a suspended drop of a pure liquid evaporating into air is presented. Based on mass and energy conservation equations, a quasi-steady model is developed that includes diffusive and convective transport and considers the non-isothermal nature of the gas phase. The main original feature of this simple analytical model lies in the consideration of the local dependence of the physico-chemical properties of the gas on the gas temperature, which has a significant influence on the evaporation process at high temperatures. The influence of the atmospheric conditions on the interfacial evaporation flux, molar fraction and temperature is investigated. Simplified versions of the model are developed to highlight the key mechanisms governing the evaporation process. For the conditions considered in this work, the convective transport appears to be opposed to the evaporation process, leading to a decrease of the evaporation flux. However, this effect is relatively limited, as the Péclet numbers happen to be small. In addition, the assumption of an isothermal gas never appears to be valid here, even at room temperature, due to the large temperature gradient that develops in the gas phase. These two conclusions are explained by the fact that heat transfer from the gas to the liquid appears to be the step limiting the evaporation process. Regardless of the complexity of the developed model, and excluding extremely small droplets, the square of the drop radius decreases linearly over time (the R² law). The assumptions of the model are rigorously discussed and general criteria are established, independently of the liquid-gas couple considered. Copyright © 2014 Elsevier Inc. All rights reserved.
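
    The R² law mentioned at the end states that, under quasi-steady diffusion-limited evaporation, the square of the drop radius decays linearly in time. A short sketch of that behaviour is given below; the evaporation constant is an arbitrary placeholder rather than a value computed from the paper's model.

        # R^2 law sketch: under quasi-steady evaporation the squared radius decreases
        # linearly, R(t)^2 = R0^2 - K*t.  K is an arbitrary placeholder value here.
        import numpy as np

        R0 = 50e-6          # initial drop radius [m]
        K = 1.0e-9          # evaporation constant [m^2/s], hypothetical

        t_life = R0**2 / K  # drop lifetime implied by the R^2 law
        for t in np.linspace(0.0, t_life, 6):
            R = np.sqrt(max(R0**2 - K * t, 0.0))
            print(f"t = {t:5.2f} s   R = {R * 1e6:5.2f} um   (R/R0)^2 = {(R / R0)**2:4.2f}")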

  14. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear between calculational cells over the wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area of the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile, which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when a thermal-hydraulics systems analysis computer code such as COBRA-TF is utilized. Utilizing COBRA-TF with this flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
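
    In its simplest form the hydraulic diameter used above is D_h = 4*A_flow/P_wetted. The sketch below evaluates it for a square-pitch rod-bundle subchannel; the pitch and rod diameter are invented, and the velocity-profile weighting that defines the "equivalent" diameter in the abstract is not reproduced.

        # Conventional hydraulic diameter D_h = 4*A_flow/P_wetted for a square-pitch
        # rod-bundle subchannel.  Pitch and rod diameter are hypothetical, and the
        # velocity-profile-based equivalent diameter described above is not reproduced.
        import math

        pitch = 12.6e-3     # rod pitch [m]
        d_rod = 9.5e-3      # rod outer diameter [m]

        A_flow = pitch**2 - math.pi * d_rod**2 / 4.0   # interior subchannel flow area
        P_wet = math.pi * d_rod                        # wetted perimeter (rod surfaces)
        D_h = 4.0 * A_flow / P_wet

        print(f"flow area = {A_flow * 1e6:.1f} mm^2, D_h = {D_h * 1e3:.2f} mm")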

  15. Available computer codes and data for radiation transport analysis

    International Nuclear Information System (INIS)

    Trubey, D.K.; Maskewitz, B.F.; Roussin, R.W.

    1975-01-01

    The Radiation Shielding Information Center (RSIC), sponsored and supported by the Energy Research and Development Administration (ERDA) and the Defense Nuclear Agency (DNA), is a technical institute serving the radiation transport and shielding community. It acquires, selects, stores, retrieves, evaluates, analyzes, synthesizes, and disseminates information on shielding and ionizing radiation transport. The major activities include: (1) operating a computer-based information system and answering inquiries on radiation analysis, and (2) collecting, checking out, packaging, and distributing large computer codes and evaluated and processed data libraries. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  16. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.
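
    The mass-averaged Mach number reported for the inlet is a mass-flux-weighted average over the survey plane. The sketch below shows one way to evaluate it from discrete profile data; the boundary-layer-like profile is invented for illustration and is not the CDE or SSTO Flowpath data.

```python
import numpy as np

def mass_averaged_mach(rho, u, mach, dA):
    """Mass-flux-weighted Mach number over a survey plane.

    rho, u, mach, dA: 1-D arrays of density, axial velocity, local Mach
    number and cell area sampled across the plane.
    """
    mdot = rho * u * dA                       # local mass flux per cell
    return float(np.sum(mdot * mach) / np.sum(mdot))

# Hypothetical boundary-layer-like profile, for illustration only.
y = np.linspace(1e-3, 1.0, 100)               # wall-normal coordinate
u = y ** (1.0 / 7.0)                          # 1/7th-power-law velocity shape
rho = np.ones_like(y)                         # uniform density (crude)
mach = 5.0 * u                                # Mach number scaled with velocity
dA = np.full_like(y, 1.0 / y.size)            # equal-area cells
print(round(mass_averaged_mach(rho, u, mach, dA), 3))
```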

  17. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  18. Computation system for nuclear reactor core analysis. [LMFBR]

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  19. Comprehensive geriatric assessment for older adults admitted to hospital: meta-analysis of randomised controlled trials.

    Science.gov (United States)

    Ellis, Graham; Whitehead, Martin A; Robinson, David; O'Neill, Desmond; Langhorne, Peter

    2011-10-27

    To evaluate the effectiveness of comprehensive geriatric assessment in hospital for older adults admitted as an emergency. We searched the EPOC Register, Cochrane's Controlled Trials Register, the Database of Abstracts of Reviews of Effects (DARE), Medline, Embase, CINAHL, AARP Ageline, and handsearched high yield journals. Randomised controlled trials of comprehensive geriatric assessment (whether by mobile teams or in designated wards) compared with usual care. Comprehensive geriatric assessment is a multidimensional interdisciplinary diagnostic process used to determine the medical, psychological, and functional capabilities of a frail elderly person to develop a coordinated and integrated plan for treatment and long term follow-up. Three independent reviewers assessed eligibility and trial quality and extracted published data. Two additional reviewers moderated. Twenty two trials evaluating 10,315 participants in six countries were identified. For the primary outcome "living at home," patients who underwent comprehensive geriatric assessment were more likely to be alive and in their own homes at the end of scheduled follow-up (odds ratio 1.16 (95% confidence interval 1.05 to 1.28; P = 0.003; number needed to treat 33) at a median follow-up of 12 months versus 1.25 (1.11 to 1.42; P P P = 0.001) and were more likely to experience improved cognition (standardised mean difference 0.08, 0.01 to 0.15; P = 0.02) in the comprehensive geriatric assessment group. Comprehensive geriatric assessment increases patients' likelihood of being alive and in their own homes after an emergency admission to hospital. This seems to be especially true for trials of wards designated for comprehensive geriatric assessment and is associated with a potential cost reduction compared with general medical care.

  20. First Comprehensive In Silico Analysis of the Functional and Structural Consequences of SNPs in Human GalNAc-T1 Gene

    Directory of Open Access Journals (Sweden)

    Hussein Sheikh Ali Mohamoud

    2014-01-01

    Full Text Available GalNAc-T1, a key candidate of GalNac-transferases genes family that is involved in mucin-type O-linked glycosylation pathway, is expressed in most biological tissues and cell types. Despite the reported association of GalNAc-T1 gene mutations with human disease susceptibility, the comprehensive computational analysis of coding, noncoding and regulatory SNPs, and their functional impacts on protein level, still remains unknown. Therefore, sequence- and structure-based computational tools were employed to screen the entire listed coding SNPs of GalNAc-T1 gene in order to identify and characterize them. Our concordant in silico analysis by SIFT, PolyPhen-2, PANTHER-cSNP, and SNPeffect tools, identified the potential nsSNPs (S143P, G258V, and Y414D) variants from 18 nsSNPs of GalNAc-T1. Additionally, 2 regulatory SNPs (rs72964406 & rs34304568) were also identified in GalNAc-T1 by using FastSNP tool. Using multiple computational approaches, we have systematically classified the functional mutations in regulatory and coding regions that can modify expression and function of GalNAc-T1 enzyme. These genetic variants can further assist in better understanding the wide range of disease susceptibility associated with the mucin-based cell signalling and pathogenic binding, and may help to develop novel therapeutic elements for associated diseases.

  1. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  2. The Reliability of Classifications of Proximal Femoral Fractures with 3-Dimensional Computed Tomography: The New Concept of Comprehensive Classification

    Directory of Open Access Journals (Sweden)

    Hiroaki Kijima

    2014-01-01

    Full Text Available The reliability of proximal femoral fracture classifications using 3DCT was evaluated, and a comprehensive “area classification” was developed. Eleven orthopedists (5–26 years from graduation) classified 27 proximal femoral fractures at one hospital from June 2013 to July 2014 based on preoperative images. Various classifications were compared to “area classification.” In “area classification,” the proximal femur is divided into 4 areas with 3 boundary lines: Line-1 is the center of the neck, Line-2 is the border between the neck and the trochanteric zone, and Line-3 links the inferior borders of the greater and lesser trochanters. A fracture only in the first area was classified as a pure first area fracture; one in the first and second area was classified as a 1-2 type fracture. In the same way, fractures were classified as pure 2, 3-4, 1-2-3, and so on. “Area classification” reliability was highest when orthopedists with varying experience classified proximal femoral fractures using 3DCT. Other classifications cannot classify proximal femoral fractures if they exceed each classification’s particular zones. However, fractures that exceed the target zones are “dangerous” fractures. “Area classification” can classify such fractures, and it is therefore useful for selecting osteosynthesis methods.

  3. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer aided tolerancing (CAT) method, compares the strengths and weaknesses of the three tolerance analysis methods (Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis), and outlines the basic procedure and relevant details of CAT. As the study objects, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the tolerance distribution scheme is optimized based on the analysis results. (authors)
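
    To make the Monte-Carlo Simulation Analysis named above concrete, here is a minimal sketch of a statistical stack-up for a simple linear assembly gap. The three dimensions, their tolerances and the normal-distribution assumption are illustrative only and are not taken from the reactor pressure vessel model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear stack of three toleranced dimensions (mm);
# nominals and +/- tolerances are illustrative, not from the paper.
nominals = np.array([120.0, 45.0, 8.5])
tolerances = np.array([0.20, 0.10, 0.05])     # interpreted here as +/-3-sigma

n = 100_000
samples = rng.normal(nominals, tolerances / 3.0, size=(n, len(nominals)))
gap = samples[:, 0] - samples[:, 1] - samples[:, 2]   # assembly gap

# Worst-case bounds versus the statistical (Monte Carlo) spread
print("worst case :", nominals[0] - nominals[1] - nominals[2],
      "+/-", tolerances.sum())
print("monte carlo: mean=%.3f  3*std=%.3f" % (gap.mean(), 3 * gap.std()))
```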

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  5. Textbook Evaluation: An Analysis of Listening Comprehension Parts in Top Notch 2A & 2B

    Directory of Open Access Journals (Sweden)

    Afshin Soori

    2015-10-01

    Full Text Available Textbooks are instruments that assist both teachers and learners in the process of second language learning. With respect to the importance of textbooks in a language course, the evaluation of course books is a significant issue for most researchers. The present study investigated and analyzed the Listening Comprehension parts in Top Notch 2A & 2B, 2nd edition. Top Notch 2A & 2B have 10 units, each containing between 2 and 4 listening comprehension parts, so the listening comprehension parts are not equally distributed throughout the book. The participants of this study were 10 EFL teachers from two English language institutes in Jahrom. Strong and weak aspects of the Listening Comprehension parts are identified in this research. The weaknesses are that the pictures and visuals are not clear enough to enhance students' motivation and interest, the audio is not completely suitable for the students' English level, and the Discussion parts do not stimulate students' talking. Furthermore, this study revealed the crucial role of teachers in students' listening achievement. Keywords: textbook evaluation, listening comprehension, ELT

  6. Comprehensive analysis of gene expression patterns of hedgehog-related genes

    Directory of Open Access Journals (Sweden)

    Baillie David

    2006-10-01

    Full Text Available Abstract Background The Caenorhabditis elegans genome encodes ten proteins that share sequence similarity with the Hedgehog signaling molecule through their C-terminal autoprocessing Hint/Hog domain. These proteins contain novel N-terminal domains, and C. elegans encodes dozens of additional proteins containing only these N-terminal domains. These gene families are called warthog, groundhog, ground-like and quahog, collectively called hedgehog (hh)-related genes. Previously, the expression pattern of seventeen genes was examined, which showed that they are primarily expressed in the ectoderm. Results With the completion of the C. elegans genome sequence in November 2002, we reexamined and identified 61 hh-related ORFs. Further, we identified 49 hh-related ORFs in C. briggsae. ORF analysis revealed that 30% of the genes still had errors in their predictions and we improved these predictions here. We performed a comprehensive expression analysis using GFP fusions of the putative intergenic regulatory sequence with one or two transgenic lines for most genes. The hh-related genes are expressed in one or a few of the following tissues: hypodermis, seam cells, excretory duct and pore cells, vulval epithelial cells, rectal epithelial cells, pharyngeal muscle or marginal cells, arcade cells, support cells of sensory organs, and neuronal cells. Using time-lapse recordings, we discovered that some hh-related genes are expressed in a cyclical fashion in phase with molting during larval development. We also generated several translational GFP fusions, but they did not show any subcellular localization. In addition, we also studied the expression patterns of two genes with similarity to Drosophila frizzled, T23D8.1 and F27E11.3A, and the ortholog of the Drosophila gene dally-like, gpn-1, which is a heparan sulfate proteoglycan. The two frizzled homologs are expressed in a few neurons in the head, and gpn-1 is expressed in the pharynx. Finally, we compare the

  7. Fluid-Induced Vibration Analysis for Reactor Internals Using Computational FSI Method

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Jong Sung; Yi, Kun Woo; Sung, Ki Kwang; Im, In Young; Choi, Taek Sang [KEPCO E and C, Daejeon (Korea, Republic of)

    2013-10-15

    This paper introduces a fluid-induced vibration analysis method which calculates the response of the RVI to both deterministic and random loads at once and utilizes more realistic pressure distribution using the computational Fluid Structure Interaction (FSI) method. As addressed above, the FIV analysis for the RVI was carried out using the computational FSI method. This method calculates the response to deterministic and random turbulence loads at once. This method is also a simple and integrative method to get structural dynamic responses of reactor internals to various flow-induced loads. Because the analysis of this paper omitted the bypass flow region and Inner Barrel Assembly (IBA) due to the limitation of computer resources, it is necessary to find an effective way to consider all regions in the RV for the FIV analysis in the future. Reactor coolant flow makes Reactor Vessel Internals (RVI) vibrate and may affect the structural integrity of them. U. S. NRC Regulatory Guide 1.20 requires the Comprehensive Vibration Assessment Program (CVAP) to verify the structural integrity of the RVI for Fluid-Induced Vibration (FIV). The hydraulic forces on the RVI of OPR1000 and APR1400 were computed from the hydraulic formulas and the CVAP measurements in Palo Verde Unit 1 and Yonggwang Unit 4 for the structural vibration analyses. In this method, the hydraulic forces were divided into deterministic and random turbulence loads and were used for the excitation forces of the separate structural analyses. These forces are applied to the finite element model and the responses to them were combined into the resultant stresses.

  8. Comprehensive comparative-genomic analysis of Type 2 toxin-antitoxin systems and related mobile stress response systems in prokaryotes

    Directory of Open Access Journals (Sweden)

    Makarova Kira S

    2009-06-01

    extensive horizontal mobility, make the task of comprehensive identification of these systems particularly challenging. However, these same properties can be exploited to develop context-based computational approaches which, combined with exhaustive analysis of subtle sequence similarities were employed in this work to substantially expand the current collection of TAS by predicting both previously unnoticed, derived versions of known toxins and antitoxins, and putative novel TAS-like systems. In a broader context, the TAS belong to the resistome domain of the prokaryotic mobilome which includes partially selfish, addictive gene cassettes involved in various aspects of stress response and organized under the same general principles as the TAS. The "selfish altruism", or "responsible selfishness", of TAS-like systems appears to be a defining feature of the resistome and an important characteristic of the entire prokaryotic pan-genome given that in the prokaryotic world the mobilome and the "stable" chromosomes form a dynamic continuum. Reviewers This paper was reviewed by Kenn Gerdes (nominated by Arcady Mushegian), Daniel Haft, Arcady Mushegian, and Andrei Osterman. For full reviews, go to the Reviewers' Reports section.

  9. Comprehensive comparative-genomic analysis of type 2 toxin-antitoxin systems and related mobile stress response systems in prokaryotes.

    Science.gov (United States)

    Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V

    2009-06-03

    comprehensive identification of these systems particularly challenging. However, these same properties can be exploited to develop context-based computational approaches which, combined with exhaustive analysis of subtle sequence similarities were employed in this work to substantially expand the current collection of TAS by predicting both previously unnoticed, derived versions of known toxins and antitoxins, and putative novel TAS-like systems. In a broader context, the TAS belong to the resistome domain of the prokaryotic mobilome which includes partially selfish, addictive gene cassettes involved in various aspects of stress response and organized under the same general principles as the TAS. The "selfish altruism", or "responsible selfishness", of TAS-like systems appears to be a defining feature of the resistome and an important characteristic of the entire prokaryotic pan-genome given that in the prokaryotic world the mobilome and the "stable" chromosomes form a dynamic continuum. This paper was reviewed by Kenn Gerdes (nominated by Arcady Mushegian), Daniel Haft, Arcady Mushegian, and Andrei Osterman. For full reviews, go to the Reviewers' Reports section.

  10. Deep learning for digital pathology image analysis: A comprehensive tutorial with selected use cases.

    Science.gov (United States)

    Janowczyk, Andrew; Madabhushi, Anant

    2016-01-01

    analysis, we show how an open source framework (Caffe), with a singular network architecture, can be used to address: (a) nuclei segmentation (F-score of 0.83 across 12,000 nuclei), (b) epithelium segmentation (F-score of 0.84 across 1735 regions), (c) tubule segmentation (F-score of 0.83 from 795 tubules), (d) lymphocyte detection (F-score of 0.90 across 3064 lymphocytes), (e) mitosis detection (F-score of 0.53 across 550 mitotic events), (f) invasive ductal carcinoma detection (F-score of 0.7648 on 50 k testing patches), and (g) lymphoma classification (classification accuracy of 0.97 across 374 images). This paper represents the largest comprehensive study of DL approaches in DP to date, with over 1200 DP images used during evaluation. The supplemental online material that accompanies this paper consists of step-by-step instructions for the usage of the supplied source code, trained models, and input data.
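
    The results above are reported as F-scores. For reference, the F-beta score combines precision and recall as in this minimal sketch; the counts used in the example are invented and are not taken from the study.

```python
def f_score(tp, fp, fn, beta=1.0):
    """F-beta score from true positives, false positives and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# e.g. a detector with 900 TP, 150 FP and 220 FN (illustrative counts)
print(round(f_score(900, 150, 220), 3))
```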

  11. Deep learning for digital pathology image analysis: A comprehensive tutorial with selected use cases

    Directory of Open Access Journals (Sweden)

    Andrew Janowczyk

    2016-01-01

    this tutorial on DL for DP image analysis, we show how an open source framework (Caffe), with a singular network architecture, can be used to address: (a) nuclei segmentation (F-score of 0.83 across 12,000 nuclei), (b) epithelium segmentation (F-score of 0.84 across 1735 regions), (c) tubule segmentation (F-score of 0.83 from 795 tubules), (d) lymphocyte detection (F-score of 0.90 across 3064 lymphocytes), (e) mitosis detection (F-score of 0.53 across 550 mitotic events), (f) invasive ductal carcinoma detection (F-score of 0.7648 on 50 k testing patches), and (g) lymphoma classification (classification accuracy of 0.97 across 374 images). Conclusion: This paper represents the largest comprehensive study of DL approaches in DP to date, with over 1200 DP images used during evaluation. The supplemental online material that accompanies this paper consists of step-by-step instructions for the usage of the supplied source code, trained models, and input data.

  12. Computational image analysis of Suspension Plasma Sprayed YSZ coatings

    Directory of Open Access Journals (Sweden)

    Michalak Monika

    2017-01-01

    Full Text Available The paper presents computational studies of microstructure- and topography-related features of suspension plasma sprayed (SPS) coatings of yttria-stabilized zirconia (YSZ). The study mainly covers the porosity assessment, performed with ImageJ software. The influence of the boundary conditions, defined by (i) circularity and (ii) size limits, on the computed values of porosity is also investigated. Additionally, a digital topography evaluation is performed: a confocal laser scanning microscope (CLSM) and a scanning electron microscope (SEM) operating in Shape from Shading (SFS) mode measure the surface roughness of the deposited coatings. The computed values of porosity and roughness are related to the spraying process variables, which influence the morphology of the coatings and determine their possible fields of application.
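
    A minimal sketch of the kind of threshold-and-size-filter porosity measurement described above, assuming pores appear darker than the coating. The threshold, the size limits and the synthetic image are illustrative stand-ins for the ImageJ workflow, and the circularity criterion is omitted for brevity.

```python
import numpy as np
from scipy import ndimage

def porosity(gray, threshold=90, min_area=5, max_area=5000):
    """Area fraction of pores in a grayscale cross-section image.

    Pores are assumed darker than the coating; the threshold and size
    limits play the role of the boundary conditions discussed above.
    """
    pores = gray < threshold                               # binary pore mask
    labels, n = ndimage.label(pores)                       # connected components
    areas = ndimage.sum(pores, labels, index=np.arange(1, n + 1))
    keep = (areas >= min_area) & (areas <= max_area)       # size filter
    return float(areas[keep].sum() / gray.size)

# Synthetic example image: bright matrix with a few dark blobs.
rng = np.random.default_rng(1)
img = rng.normal(170, 10, size=(256, 256))
img[40:60, 40:60] = 50
img[150:160, 100:115] = 55
print(round(porosity(img), 4))
```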

  13. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Comprehensive proteome analysis of the response of Pseudomonas putida KT2440 to the flavor compound vanillin.

    Science.gov (United States)

    Simon, Oliver; Klaiber, Iris; Huber, Armin; Pfannstiel, Jens

    2014-09-23

    Understanding of the molecular response of bacteria to precursors, products and environmental conditions applied in bioconversions is essential for optimizing whole-cell biocatalysis. To investigate the molecular response of the potential biocatalyst Pseudomonas putida KT2440 to the flavor compound vanillin we applied complementary gel- and LC-MS-based quantitative proteomics approaches. Our comprehensive proteomics survey included cytoplasmic and membrane proteins and led to the identification and quantification of 1614 proteins, corresponding to 30% of the total KT2440 proteome. 662 proteins were altered in abundance during growth on vanillin as sole carbon source as compared to growth on glucose. The proteome response entailed an increased abundance of enzymes involved in vanillin degradation, significant changes in central energy metabolism and an activation of solvent tolerance mechanisms. With respect to vanillin metabolism, particularly enzymes belonging to the β-ketoadipate pathway including a transcriptional regulator and porins specific for vanillin uptake increased in abundance. However, catabolism of vanillin was not dependent on vanillin dehydrogenase (Vdh), as shown by quantitative proteome analysis of a Vdh-deficient KT2440 mutant (GN235). Other aldehyde dehydrogenases that were significantly increased in abundance in response to vanillin may replace Vdh and thus may represent interesting targets for improving vanillin production in P. putida KT2440. The high demand for the flavor compound vanillin by the food and fragrance industry makes natural vanillin from vanilla pods a scarce and expensive resource rendering its biotechnological production economically attractive. Pseudomonas bacteria are metabolically very versatile and accept a broad range of hydrocarbons as carbon source making them suitable candidates for bioconversion processes. This work describes the impact of vanillin on the metabolism of the reference strain P. putida KT2440 on a

  15. Risk Analysis Based on AHP and Fuzzy Comprehensive Evaluation for Maglev Train Bogie

    Directory of Open Access Journals (Sweden)

    Chengxin Fan

    2016-01-01

    Full Text Available The maglev bogie is the key subsystem for maglev train security. To ensure the safety of life and property, it is essential to evaluate its risk level before operation. In this paper, a method combining the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation is proposed to assess hazards in a complex maglev bogie system associated with multiple subsystems' failures. A comprehensive identification of risk sources was carried out by analyzing the structure of the maglev bogie. Furthermore, based on fuzzy theory, the linguistic evaluation set is classified according to risk tolerance. The score of each risk factor is obtained as a weighted sum of the results of the fuzzy comprehensive evaluation. Our results show that the degree of the maglev bogie's risk is within the range of acceptability. The merits of this work are that it facilitates finding the weak links and determining the maintenance of the maglev bogie system.
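
    A minimal sketch of the AHP-plus-fuzzy-comprehensive-evaluation pipeline named above: priority weights are taken from the principal eigenvector of a pairwise-comparison matrix and combined with a fuzzy membership matrix. The matrices and the three-grade risk scale are invented for illustration and do not come from the paper.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix
    (principal eigenvector, normalised to sum to 1)."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Hypothetical 3-factor comparison matrix and fuzzy evaluation matrix;
# the three grades could be read as {low, medium, high} risk.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
R = np.array([[0.6, 0.3, 0.1],    # membership of factor 1 in each grade
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

w = ahp_weights(A)
B = w @ R                              # fuzzy comprehensive evaluation vector
score = B @ np.array([1.0, 2.0, 3.0])  # weighted score on an assumed 1-3 scale
print(w.round(3), B.round(3), round(score, 3))
```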

  16. Analysis of students’ science motivation and nature of science comprehension in middle school

    Directory of Open Access Journals (Sweden)

    Azizul Ghofar Candra Wicaksono

    2018-03-01

    Full Text Available The purpose of this study was to explore the pattern of science motivation and nature of science (NoS) comprehension, and the relationship between the two, in middle school students located in Semarang, Central Java, Indonesia. The design of this study was a survey followed by a correlation study to discover the relationship between the level of science motivation and nature of science comprehension. The research included 113 students as a sample. The instruments used for data collection were the SMQ and seven essay tests based on NoS indicators. This study revealed that the students had a median score in science motivation and a low score in nature of science comprehension. Both students' science motivation and their nature of science comprehension urgently need to be improved. This can be done by developing the learning process and through support from school or family.

  17. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates the nuclear fuel irradiation.
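
    For readers unfamiliar with variance-based indices, the sketch below estimates first-order Sobol indices for the classical Ishigami test function using a generic pick-and-freeze Monte Carlo estimator; it illustrates the indices themselves and is not the joint GLM/GAM metamodeling approach proposed in the paper.

```python
import numpy as np

def sobol_first_order(model, d, n=2**14, rng=np.random.default_rng(0)):
    """Crude Monte Carlo (pick-and-freeze) estimate of first-order Sobol
    indices for a scalar model with d independent inputs uniform on [-pi, pi]."""
    A = rng.uniform(-np.pi, np.pi, size=(n, d))
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # freeze every column except i
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

print(sobol_first_order(ishigami, d=3).round(2))   # roughly [0.31, 0.44, 0.00]
```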

  18. Identification of physicians providing comprehensive primary care in Ontario: a retrospective analysis using linked administrative data.

    Science.gov (United States)

    Schultz, Susan E; Glazier, Richard H

    2017-12-19

    Given the changing landscape of primary care, there may be fewer primary care physicians available to provide a broad range of services to patients of all age groups and health conditions. We sought to identify physicians with comprehensive primary care practices in Ontario using administrative data, investigating how many and what proportion of primary care physicians provided comprehensive primary care and how this changed over time. We identified the pool of active primary care physicians in linked population-based databases for Ontario from 1992/93 to 2014/15. After excluding those who saw patients fewer than 44 days per year, we identified physicians as providing comprehensive care if more than half of their services were for core primary care and if these services fell into at least 7 of 22 activity areas. Physicians with 50% or less of their services for core primary care but with more than 50% in a single location or type of service were identified as being in focused practice. In 2014/15, there were 12 891 physicians in the primary care pool: 1254 (9.7%) worked fewer than 44 days per year, 1619 (12.6%) were in focused practice, and 1009 (7.8%) could not be classified. The proportion in comprehensive practice ranged from 67.5% to 74.9% between 1992/93 and 2014/15, with a peak in 2002/03 and relative stability from 2009/10 to 2014/15. Over this period, there was an increase of 8.8% in population per comprehensive primary care physician. We found that just over two-thirds of primary care physicians provided comprehensive care in 2014/15, which indicates that traditional estimates of the primary care physician workforce may be too high. Although implementation will vary by setting and available data, this approach is likely applicable elsewhere. Copyright 2017, Joule Inc. or its licensors.
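
    The screening rule described above lends itself to a simple decision function. The sketch below encodes the published thresholds (44 days per year, a 50% core-service share, and activity in at least 7 of 22 areas); the data structures and field names are assumptions made for illustration, not the study's actual database schema.

```python
def classify_physician(services, activity_areas, days_worked, min_days=44):
    """Illustrative version of the screening rule described above.

    services       : dict mapping service category -> share of all services,
                     including a 'core_primary_care' share (assumed structure)
    activity_areas : number of the 22 defined activity areas with core activity
    Returns one of: 'low_volume', 'comprehensive', 'focused', 'unclassified'.
    """
    if days_worked < min_days:
        return "low_volume"
    core = services.get("core_primary_care", 0.0)
    if core > 0.5 and activity_areas >= 7:
        return "comprehensive"
    others = [v for k, v in services.items() if k != "core_primary_care"]
    if core <= 0.5 and others and max(others) > 0.5:
        return "focused"
    return "unclassified"

print(classify_physician({"core_primary_care": 0.7, "emergency": 0.2},
                         activity_areas=9, days_worked=200))
```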

  19. Analysis of the experience of using a pedagogic conversational agent for reading comprehension in the school

    Directory of Open Access Journals (Sweden)

    Silvia Tamayo

    2013-08-01

    Full Text Available Technologies are key in society, and they are in continuous progress. In the educational context, an increase in the use of new technologies for education can be identified in recent years. One of these new technologies is Pedagogic Conversational Agents, that is, computer systems that interact with the students in spoken or written natural language. They can adopt the role of teacher, student or pal. In the 2011/2012 academic year, we used a Pedagogic Conversational Agent called Dr. Roland in class to solve Maths problems. 38 students and 3 teachers were involved in the experience. In this paper, an analysis of the experience and how to integrate agents in the school is presented. Ten recommendations as a guide for teachers, researchers and developers interested in using this type of educational software are proposed from the analysis. Some of these recommendations are to combine the use of computers with other devices such as tablets that make the use of the system by several students easier, to involve the parents to foster the use of the agent at home, and to validate each step of the advances made with teachers, parents and students so that they are adapted to their needs.

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  2. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system often comprises a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual analysis approach is effective in identifying trends and anomalies of the systems.
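
    A minimal sketch of the kind of similarity measure such behavioral lines rely on: each node's profile history is a multivariate time series, and nodes whose z-scored metric traces are close are treated as behaving similarly. The Euclidean choice and the synthetic data are assumptions for illustration, not the paper's exact measure.

```python
import numpy as np

def behavior_distance(series_a, series_b):
    """Distance between two nodes' profile histories.

    Each series is a (timesteps x metrics) array (e.g. CPU load, memory,
    network). Metrics are z-scored per column so no single unit dominates.
    """
    stacked = np.vstack([series_a, series_b])
    mu, sd = stacked.mean(axis=0), stacked.std(axis=0) + 1e-12
    a, b = (series_a - mu) / sd, (series_b - mu) / sd
    return float(np.linalg.norm(a - b) / np.sqrt(a.size))

rng = np.random.default_rng(2)
node1 = rng.normal(size=(100, 4))
node2 = node1 + rng.normal(scale=0.1, size=(100, 4))   # similar behaviour
node3 = rng.normal(size=(100, 4))                       # unrelated behaviour
print(round(behavior_distance(node1, node2), 3),
      round(behavior_distance(node1, node3), 3))
```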

  3. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend: the development of smart and intelligent systems for a wide range of applications, from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  4. Comprehensive benefits analysis of steel structure modular residence based on the entropy evaluation

    Science.gov (United States)

    Zhang, Xiaoxiao; Wang, Li; Jiang, Pengming

    2017-04-01

    Steel structure modular residences are an outstanding example of residential industrialization. They have many advantages, such as low overall cost, high resource recovery, and a high degree of industrialization. This paper compares the comprehensive benefits of steel structure modular buildings with those of prefabricated reinforced concrete residences in terms of economic, environmental, social and technical benefits, using the entropy evaluation method. It is concluded that the comprehensive benefits of steel structure modular buildings are better than those of prefabricated reinforced concrete residences. The conclusion of this study provides a useful reference for the development of steel structure modular buildings in China.
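
    A minimal sketch of the entropy evaluation (entropy-weight) method referred to above: criteria whose values differ more across alternatives receive larger weights, and the weighted scores are then compared. The two alternatives, four benefit criteria and all scores are invented for illustration.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method for a decision matrix X
    (rows = alternatives, columns = benefit-type criteria)."""
    P = X / X.sum(axis=0)                         # share of each alternative
    n = X.shape[0]
    ent = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)
    d = 1.0 - ent                                 # degree of divergence
    return d / d.sum()

# Hypothetical scores of two residence types on four benefit criteria
# (economic, environmental, social, technical); numbers are illustrative.
X = np.array([[0.82, 0.90, 0.75, 0.85],   # steel-structure modular
              [0.82, 0.70, 0.72, 0.80]])  # prefabricated reinforced concrete
w = entropy_weights(X)
print(w.round(3), (X @ w).round(3))        # criterion weights, composite scores
```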

  5. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  6. Gas analysis by computer-controlled microwave rotational spectrometry

    International Nuclear Information System (INIS)

    Hrubesh, L.W.

    1978-01-01

    Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for analysis of mixtures of small molecules.

  7. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko for the first three codes were prepared. Because of the high computing cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been performed with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been performed with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  8. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    Science.gov (United States)

    Adib, M. A. H. M.; Adnan, F.; Ismail, A. R.; Kardigama, K.; Salaam, H. A.; Ahmad, Z.; Johari, N. H.; Anuar, Z.; Azmi, N. S. N.

    2012-09-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in a medium tank with a height-to-diameter ratio of 4.0 and a double ring octagonal diffuser with 48 holes (9 mm opening ~ 60%), which is acceptable compared to diffusers with 6 mm (~ 40%) and 12 mm (~ 80%) openings. The conclusion is that computational analysis methods are very useful in studying the performance of thermal energy storage (TES) systems.
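
    One common way to quantify thermocline thickness from a simulated vertical temperature profile is sketched below, taking the thickness as the height band where the dimensionless temperature lies between 0.1 and 0.9. This convention and the synthetic profile are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def thermocline_thickness(z, T, T_cold, T_hot, lo=0.1, hi=0.9):
    """Thermocline thickness from a vertical temperature profile.

    Defined here (a common convention, assumed rather than taken from the
    paper) as the height band where theta = (T - T_cold)/(T_hot - T_cold)
    lies between `lo` and `hi`.
    """
    theta = (T - T_cold) / (T_hot - T_cold)
    band = z[(theta > lo) & (theta < hi)]
    return float(band.max() - band.min()) if band.size else 0.0

# Synthetic charging profile in a 4 m tall tank (20 C cold, 60 C hot).
z = np.linspace(0.0, 4.0, 400)
T = 20 + 40 / (1 + np.exp(-(z - 2.0) / 0.15))
print(round(thermocline_thickness(z, T, 20.0, 60.0), 3), "m")
```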

  9. Computational Analysis on Performance of Thermal Energy Storage (TES) Diffuser

    International Nuclear Information System (INIS)

    Adib, M A H M; Ismail, A R; Kardigama, K; Salaam, H A; Ahmad, Z; Johari, N H; Anuar, Z; Azmi, N S N; Adnan, F

    2012-01-01

    Application of a thermal energy storage (TES) system reduces cost and energy consumption. The performance of the overall operation is affected by the diffuser design. In this study, computational analysis is used to determine the thermocline thickness. Three-dimensional simulations with different tank height-to-diameter ratios (HD), diffuser openings and numbers of diffuser holes are investigated. Simulations of medium-HD tanks with a double ring octagonal diffuser show good thermocline behavior and a clear distinction between warm and cold water. The results show that the best thermocline thickness at 50% charging time occurs in a medium tank with a height-to-diameter ratio of 4.0 and a double ring octagonal diffuser with 48 holes (9 mm opening ∼ 60%), which is acceptable compared to diffusers with 6 mm (∼ 40%) and 12 mm (∼ 80%) openings. The conclusion is that computational analysis methods are very useful in studying the performance of thermal energy storage (TES) systems.

  10. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated, and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
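
    A minimal sketch of the two principles mentioned above: a simple primitive (an upward threshold crossing detector) applied iteratively to a sampled signal. The sampling rate, threshold and synthetic signal are illustrative and unrelated to the polysomnogram algorithm itself.

```python
import numpy as np

fs = 256                                    # sampling rate [Hz], illustrative
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.normal(size=t.size)

def detect_events(x, fs, threshold):
    """Times (in seconds) where the signal crosses `threshold` upward --
    the kind of simple, iterated primitive used for feature extraction."""
    above = x > threshold
    onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return onsets / fs

print(detect_events(signal, fs, threshold=0.8)[:5])
```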

  11. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    Science.gov (United States)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  12. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper depicts a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for man-machine interfaces (MMI) of a control room. It employs a computer to simulate the operation procedures of operations on man-machine interfaces in a control room, provides quantified assessment, and at the same time carries out analysis on operational error rate of operators by means of techniques for human error rate prediction. The problems of placing man-machine interfaces in a control room and of arranging instruments can be detected from simulation results. DIAS system can provide good technical supports to the design and improvement of man-machine interfaces of the main control room of a nuclear power plant

  13. Pathobiological features of breast tumours in the State of Kuwait: a comprehensive analysis

    Directory of Open Access Journals (Sweden)

    Abdeen Suad

    2007-01-01

    Full Text Available Abstract Background Breast cancer accounts for 30.3% of all cancer types in Kuwaiti women. Death occurs in approximately 43% of these patients. Our goal was to conduct a comprehensive analysis of the pathobiological characteristics of the tumours in an attempt to determine any particular trend that could be present. Methods One hundred and sixty-six cases were included in this study. All the pathology reports and paraffin blocks pertaining to these cases were collected. Four micrometer sections were taken from each block, and immunostaining against Her-2, ER, and PgR was performed. Both the proportion and intensity of immunostaining were scored according to Allred's method, and typing of the tumour was done according to the WHO criteria regarding tumour classification. Grading of invasive carcinomas was done according to the modified Bloom-Richardson-Elston's method, and tumour stage was determined according to the criteria set by the American Joint Committee on Cancer. Results The mean age of the patients below 55 years was 40, as compared to 68 for those above 55 (p < 0.0001). More than half of the cases were in the right breast, and were surgically treated by total mastectomy with axillary clearance. The majority of the tumours had irregular (stellate) margins, were invasive, and had surrounding breast tissue of adenosis or fibrocystic type. Their mitotic index was 10–20 or >20 with a marked to moderate nuclear pleomorphism. They were mostly grade II or III, sized 2–5 or > 5 cm, had absent or scanty tumour lymphocytes, and were stage II or III. The in situ tumours were mainly ductal carcinoma (DCIS), of which comedo and cribriform were the major histological subtypes. The major histological subtypes of the invasive tumours were ductal-not otherwise specified, lobular, and tubular/cribriform. In this study, we also found a significant (p < 0.05) association between overexpression of Her-2, lack of expression of ER and some of the

  14. Comprehensive Protein Interactome Analysis of a Key RNA Helicase: Detection of Novel Stress Granule Proteins

    Directory of Open Access Journals (Sweden)

    Rebecca Bish

    2015-07-01

    Full Text Available DDX6 (p54/RCK) is a human RNA helicase with central roles in mRNA decay and translation repression. To help our understanding of how DDX6 performs these multiple functions, we conducted the first unbiased, large-scale study to map the DDX6-centric protein-protein interactome using immunoprecipitation and mass spectrometry. Using DDX6 as bait, we identify a high-confidence and high-quality set of protein interaction partners which are enriched for functions in RNA metabolism and ribosomal proteins. The screen is highly specific, maximizing the number of true positives, as demonstrated by the validation of 81% (47/58) of the RNA-independent interactors through known functions and interactions. Importantly, we minimize the number of indirect interaction partners through use of a nuclease-based digestion to eliminate RNA. We describe eleven new interactors, including proteins involved in splicing which is an as-yet unknown role for DDX6. We validated and characterized in more detail the interaction of DDX6 with Nuclear fragile X mental retardation-interacting protein 2 (NUFIP2) and with two previously uncharacterized proteins, FAM195A and FAM195B (here referred to as granulin-1 and granulin-2, or GRAN1 and GRAN2). We show that NUFIP2, GRAN1, and GRAN2 are not P-body components, but re-localize to stress granules upon exposure to stress, suggesting a function in translation repression in the cellular stress response. Using a complementary analysis that resolved DDX6’s multiple complex memberships, we further validated these interaction partners and the presence of splicing factors. As DDX6 also interacts with the E3 SUMO ligase TIF1β, we tested for and observed a significant enrichment of sumoylation amongst DDX6’s interaction partners. Our results represent the most comprehensive screen for direct interaction partners of a key regulator of RNA life cycle and localization, highlighting new stress granule components and possible DDX6 functions

  15. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.) 891 RW/orig. 892 RKD [de]

  16. A Computable OLG Model for Gender and Growth Policy Analysis

    OpenAIRE

    Pierre-Richard Agénor

    2012-01-01

    This paper develops a computable Overlapping Generations (OLG) model for gender and growth policy analysis. The model accounts for human and physical capital accumulation (both public and private), intra- and inter-generational health persistence, fertility choices, and women's time allocation between market work, child rearing, and home production. Bargaining between spouses and gender bias, in the form of discrimination in the work place and mothers' time allocation between daughters and so...

  17. Computers in activation analysis and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, B. S.; D'Agostino, M. D.; Yule, H. P. [eds.]

    1979-01-01

    Seventy-three papers are included under the following session headings: analytical and mathematical methods for data analysis; software systems for gamma-ray and x-ray spectrometry; gamma-ray spectra treatment, peak evaluation; least squares; IAEA intercomparison of methods for processing spectra; computer and calculator utilization in spectrometer systems; and applications in safeguards, fuel scanning, and environmental monitoring. Separate abstracts were prepared for 72 of those papers. (DLC)

  18. Learner Variables in Second Language Listening Comprehension: An Exploratory Path Analysis

    Science.gov (United States)

    Vandergrift, Larry; Baker, Susan

    2015-01-01

    Listening comprehension plays a key role in language acquisition, yet little is known about the variables that contribute to the development of second language (L2) listening ability. This study sought to obtain empirical evidence for the impact of some of the learner variables and the degree to which they might predict success in L2 listening.…

  19. Textbook Evaluation: An Analysis of Listening Comprehension Parts in Top Notch 2A & 2B

    Science.gov (United States)

    Soori, Afshin; Haghani, Elham

    2015-01-01

    Textbooks are the instruments that assist both teachers and learners in process of second language learning. With respect to the importance of textbooks in a language course, evaluation of course books is a significant issue for most researchers. The present study investigated and analyzed Listening Comprehension parts in Top Notch 2A & 2B 2nd…

  20. An Analysis and Allocation System for Library Collections Budgets: The Comprehensive Allocation Process (CAP)

    Science.gov (United States)

    Lyons, Lucy Eleonore; Blosser, John

    2012-01-01

    The "Comprehensive Allocation Process" (CAP) is a reproducible decision-making structure for the allocation of new collections funds, for the reallocation of funds within stagnant budgets, and for budget cuts in the face of reduced funding levels. This system was designed to overcome common shortcomings of current methods. Its philosophical…