WorldWideScience

Sample records for comprehensive computational analysis

  1. Computed microtomography and X-ray fluorescence analysis for comprehensive analysis of structural changes in bone.

    Science.gov (United States)

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor

    2013-01-01

    This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computed microtomography and X-ray fluorescence analysis. We present the algorithms used for the reconstruction of tomographic images, which allow working with the high-noise projections that are typical of conditions dictated by the nature of the samples. Reptiles, owing to their hardiness, small size, membership in the amniotes and a number of other valuable features, are attractive model organisms for long-duration orbital experiments on unmanned spacecraft. Possible changes in their bone tissue under the influence of spaceflight are the subject of discussion among biologists from different laboratories around the world.
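
    To give a concrete flavour of reconstruction from noisy projections, the sketch below compares filtered back-projection with an iterative SART reconstruction on a synthetic phantom. The phantom, noise level, iteration count and the use of scikit-image are illustrative assumptions; this is not the algorithm described in the paper.

    ```python
    # Sketch: tomographic reconstruction from noisy projections using scikit-image.
    # SART-type iterative reconstruction is often more robust to noise than plain
    # filtered back-projection; the phantom, noise level and iteration count are
    # illustrative choices, not the paper's actual algorithm.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, iradon_sart, rescale

    image = rescale(shepp_logan_phantom(), 0.25)          # small test object
    angles = np.linspace(0.0, 180.0, 90, endpoint=False)
    sinogram = radon(image, theta=angles)
    sinogram += np.random.normal(0, 0.05 * sinogram.max(), sinogram.shape)  # noisy projections

    fbp = iradon(sinogram, theta=angles)                  # filtered back-projection
    sart = iradon_sart(sinogram, theta=angles)
    sart = iradon_sart(sinogram, theta=angles, image=sart)  # second SART iteration

    for name, rec in (("FBP", fbp), ("SART x2", sart)):
        rmse = np.sqrt(np.mean((rec - image) ** 2))
        print(f"{name}: RMSE = {rmse:.4f}")
    ```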

  2. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    Science.gov (United States)

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
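
    As an illustration of the kind of high-dimensional testing that platforms such as Perseus automate, the sketch below runs a per-protein Welch's t-test with Benjamini-Hochberg FDR control on simulated abundance data. The data, group sizes and SciPy/statsmodels calls are assumptions for the example and do not reproduce Perseus's own implementation.

    ```python
    # Minimal sketch: Welch's t-test per protein with Benjamini-Hochberg correction,
    # illustrating the multiple-hypothesis-testing step that platforms such as
    # Perseus automate.  Data, group sizes and thresholds are made up for the example.
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(0)
    n_proteins = 1000
    control = rng.normal(20, 1, size=(n_proteins, 3))    # log2 abundances, 3 replicates
    treated = rng.normal(20, 1, size=(n_proteins, 3))
    treated[:50] += 2.0                                   # spike in 50 truly regulated proteins

    # Welch's t-test for each protein (row-wise)
    t_stat, p_values = stats.ttest_ind(treated, control, axis=1, equal_var=False)

    # Benjamini-Hochberg FDR control across all proteins
    reject, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
    print(f"{reject.sum()} proteins significant at 5% FDR")
    ```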

  4. Learning to reason about speakers' alternatives in sentence comprehension : A computational account

    NARCIS (Netherlands)

    Hendriks, Petra; van Rijn, Hedderik; Valkenier, Bea

    2007-01-01

    We present a computational simulation study of the acquisition of pronouns and reflexives. The computational simulation is based on an Optimality Theory analysis, and is shown to account for the well-known observation that in English and many other languages the correct comprehension of pronouns…

  5. Incorporating Computers into Classroom: Effects on Learners’ Reading Comprehension in EFL Context

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ansarin

    2017-10-01

    Owing to the importance of computer-assisted reading and considering the prominent role of learners in this respect, the present study investigated: (1) the effects of the computer as a supplemental tool to support and improve Iranian EFL learners’ reading comprehension in comparison with equivalent non-technological or traditional print-based treatments, and (2) EFL learners’ attitudes and perceptions towards the computer-assisted reading course. To this purpose, 111 EFL learners in randomly selected groups participated in the study. The subjects were divided into a control group and an experimental group. Both groups received 10 reading lessons, either through computers or through an instructor-led method. The statistical analysis revealed no significant difference between the learners who had access to reading supports on a computer screen and their counterparts in the traditional reading classes. Learners were also allowed to express their ideas on a 5-point Likert scale. The purpose of the attitude questionnaire was to find out more about the participants and their experiences with computer-assisted reading. Results of the attitude questionnaire supported the conclusion that computers may enhance EFL learners’ motivation and interest towards learning but do not enhance comprehension. The findings of this study support the view that technology should supplement, not supplant, teachers and that people read less accurately and less comprehensively on screens than on paper.

  6. On the Computation of Comprehensive Boolean Gröbner Bases

    Science.gov (United States)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal $I$ in a Boolean polynomial ring $\mathbb{B}(\bar{A}, \bar{X})$ with main variables $\bar{X}$ and parameters $\bar{A}$ can be obtained by simply computing a usual Boolean Gröbner basis of $I$, regarding both $\bar{X}$ and $\bar{A}$ as variables under a certain block term order such that $\bar{X} \gg \bar{A}$. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field $\mathbb{GF}_2$, enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over $\mathbb{GF}_2$. Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
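
    As a rough illustration of the reduction described above, the sketch below computes a Gröbner basis over $\mathbb{GF}_2$ with the main variables ordered before the parameter under a lexicographic (elimination-style) order and with Boolean field equations added. The example ideal and the use of SymPy are assumptions for illustration; they do not reproduce the Risa/Asir implementation.

    ```python
    # Sketch: Groebner basis computation over GF(2) in which the main variables
    # x1, x2 are ordered before the parameter a (lex order acts as a block /
    # elimination order).  Field equations v**2 + v enforce the Boolean ring.
    # The ideal itself is a made-up example, not taken from the paper.
    from sympy import symbols, groebner

    x1, x2, a = symbols("x1 x2 a")
    field_eqs = [x1**2 + x1, x2**2 + x2, a**2 + a]   # Boolean (idempotency) relations
    ideal = [x1*x2 + a*x1 + 1, x2 + a]               # example generators

    G = groebner(ideal + field_eqs, x1, x2, a, order="lex", modulus=2)
    for g in G.exprs:
        print(g)
    ```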

  7. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    Science.gov (United States)

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based foreign language listening comprehension tests, designed in MOODLE (a dynamic e-learning environment) with or without immediate feedback, were administered together with the State-Trait Anxiety Inventory (STAI) and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which received no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners increase attention in foreign language listening comprehension.

  8. COMAN: a web server for comprehensive metatranscriptomics analysis.

    Science.gov (United States)

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  9. Comprehensive metabolic panel

    Science.gov (United States)

    Metabolic panel - comprehensive; Chem-20; SMA20; Sequential multi-channel analysis with computer-20; SMAC20; Metabolic panel 20 ... Chernecky CC, Berger BJ. Comprehensive metabolic panel (CMP) - blood. In: ... Tests and Diagnostic Procedures. 6th ed. St Louis, MO: ...

  10. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    Science.gov (United States)

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  11. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
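
    The computer-calculus idea behind systems such as GRESS and ADGEN, propagating exact derivatives through the model equations in a single run, can be conveyed with a toy forward-mode automatic-differentiation class. The class, the model function and the parameter values below are invented for illustration and are unrelated to the actual GRESS/ADGEN code.

    ```python
    # Toy forward-mode automatic differentiation: each value carries its derivative
    # with respect to a chosen input parameter, so model sensitivities are obtained
    # analytically in a single run instead of by repeated perturbed runs.
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.value * other.deriv + self.deriv * other.value)
        __rmul__ = __mul__

    def model(k, t):
        # made-up model response: y = k*t + 0.5*k*k
        return k * t + 0.5 * k * k

    k = Dual(2.0, deriv=1.0)          # seed derivative d/dk = 1
    y = model(k, 3.0)
    print(y.value, y.deriv)           # dy/dk = t + k = 5.0
    ```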

  12. The Comprehension Problems of Children with Poor Reading Comprehension despite Adequate Decoding: A Meta-Analysis.

    Science.gov (United States)

    Spencer, Mercedes; Wagner, Richard K

    2018-06-01

    The purpose of this meta-analysis was to examine the comprehension problems of children who have a specific reading comprehension deficit (SCD), which is characterized by poor reading comprehension despite adequate decoding. The meta-analysis included 86 studies of children with SCD who were assessed in reading comprehension and oral language (vocabulary, listening comprehension, storytelling ability, and semantic and syntactic knowledge). Results indicated that children with SCD had deficits in oral language (d = -0.78, 95% CI [-0.89, -0.68]), but these deficits were not as severe as their deficit in reading comprehension (d = -2.78, 95% CI [-3.01, -2.54]). When compared to reading comprehension age-matched normal readers, the oral language skills of the two groups were comparable (d = 0.32, 95% CI [-0.49, 1.14]), which suggests that the oral language weaknesses of children with SCD represent a developmental delay rather than developmental deviance. Theoretical and practical implications of these findings are discussed.
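
    For readers less familiar with the effect-size metric reported above, the sketch below computes Cohen's d for individual studies and a fixed-effect, inverse-variance pooled estimate with a 95% CI. The study inputs are invented and the formulas are the standard ones; they are not taken from this meta-analysis.

    ```python
    # Illustrative computation of Cohen's d per study and an inverse-variance
    # (fixed-effect) pooled estimate with a 95% CI.  All inputs are made up.
    import math

    def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
        sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (mean1 - mean2) / sd_pooled
        var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))  # approximate variance of d
        return d, var_d

    studies = [cohens_d(92, 15, 40, 104, 15, 40),
               cohens_d(88, 14, 60, 99, 16, 55),
               cohens_d(95, 13, 30, 103, 12, 35)]

    weights = [1 / v for _, v in studies]
    pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    print(f"pooled d = {pooled:.2f}, 95% CI [{pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}]")
    ```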

  13. Comprehensive analysis of transport aircraft flight performance

    Science.gov (United States)

    Filippone, Antonio

    2008-04-01

    This paper reviews the state-of-the-art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines and intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise. A variety of results is shown, including specific air range charts, take-off weight-altitude charts and payload-range performance…

  14. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate the transition phase of core disruptive accidents in liquid-metal fast reactors. Analysis of the transition phase requires treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began when development of the SIMMER series of computer codes was initiated in the late 1970s in the USA. Successful application of the latest SIMMER-II in the USA, western Europe and Japan has proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through the 1980s, a new project for SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described, with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)

  15. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    Science.gov (United States)

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  16. Current topics in pure and computational complex analysis

    CERN Document Server

    Dorff, Michael; Lahiri, Indrajit

    2014-01-01

    The book contains 13 articles, some of which are survey articles and others research papers. Written by eminent mathematicians, these articles were presented at the International Workshop on Complex Analysis and Its Applications held at Walchand College of Engineering, Sangli. All the contributing authors are actively engaged in research fields related to the topic of the book. The workshop offered a comprehensive exposition of the recent developments in geometric function theory, planar harmonic mappings, entire and meromorphic functions and their applications, both theoretical and computational. The recent developments in complex analysis and its applications play a crucial role in research in many disciplines.

  17. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  18. Advanced complex analysis a comprehensive course in analysis, part 2b

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2B provides a comprehensive look at a number of subjects of complex analysis not included in Part 2A. Presented in this volume are the theory of conformal metrics (includ

  19. Comprehensive Analysis of Trends and Emerging Technologies in All Types of Fuel Cells Based on a Computational Method

    Directory of Open Access Journals (Sweden)

    Takaya Ogawa

    2018-02-01

    Fuel cells have been attracting significant attention recently as highly efficient and eco-friendly energy generators. Here, we have comprehensively reviewed all types of fuel cells using computational analysis based on a citation network that detects emerging technologies objectively and provides interdisciplinary data to compare trends. This comparison shows that the technologies of solid oxide fuel cells (SOFCs) and electrolytes in polymer electrolyte fuel cells (PEFCs) are at the mature stage, whereas those of biofuel cells (BFCs) and catalysts in PEFCs are currently garnering attention. This does not mean, however, that the challenges of SOFCs and PEFC electrolytes have been overcome. SOFCs need to be operated at lower temperatures, approximately 500 °C. Electrolytes in PEFCs still suffer from a severe decrease in proton conductivity at low relative humidity and from their high cost. Catalysts in PEFCs are becoming attractive as a means to reduce the platinum catalyst cost. The emerging technologies in PEFC catalysts are mainly heteroatom-doped graphene/carbon nanotubes for metal-free catalysts and supports for iron- or cobalt-based catalysts. BFCs have also received attention for wastewater treatment and as miniaturized energy sources. Of particular interest in BFCs are membrane reactors in microbial fuel cells and membrane-less enzymatic biofuel cells.
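
    The citation-network approach mentioned above can be sketched as building a directed citation graph, clustering it into communities, and inspecting the resulting clusters. The tiny edge list and the NetworkX community-detection call below are assumptions made for illustration, not the authors' actual pipeline.

    ```python
    # Sketch of citation-network clustering for trend detection: papers are nodes,
    # citations are directed edges, and communities approximate research topics.
    # The edge list here is a made-up toy example.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    citations = [("paper_SOFC_2015", "paper_SOFC_2009"),
                 ("paper_PEFC_cat_2017", "paper_PEFC_cat_2014"),
                 ("paper_PEFC_cat_2017", "paper_graphene_2013"),
                 ("paper_BFC_2016", "paper_BFC_2012")]

    G = nx.DiGraph(citations)
    # Community detection is run on the undirected projection here.
    communities = greedy_modularity_communities(G.to_undirected())
    for i, c in enumerate(communities):
        print(f"cluster {i}: {sorted(c)}")
    ```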

  20. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i...

  1. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    Science.gov (United States)

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable, as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or the potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures are such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.) with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database. ClassyFire has been used to…
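
    The idea of "unambiguous, computable structural rules" can be illustrated with a toy substructure-based classifier. The SMARTS patterns, category names and use of RDKit below are assumptions for the example and bear no relation to the actual ChemOnt rule set.

    ```python
    # Toy structure-based classification: assign a compound to the first category
    # whose SMARTS rule matches.  Patterns and category names are invented examples,
    # not ChemOnt rules.  Requires RDKit.
    from rdkit import Chem

    RULES = [
        ("Carboxylic acids", "C(=O)[OX2H1]"),
        ("Primary amines",   "[NX3;H2][#6]"),
        ("Benzenoids",       "c1ccccc1"),
    ]

    def classify(smiles):
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return "invalid SMILES"
        for name, smarts in RULES:
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts)):
                return name
        return "unclassified"

    print(classify("CC(=O)O"))     # acetic acid -> Carboxylic acids
    print(classify("c1ccccc1N"))   # aniline     -> Primary amines (first matching rule wins)
    ```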

  2. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  3. A Comprehensive, Open-source Platform for Mass Spectrometry-based Glycoproteomics Data Analysis.

    Science.gov (United States)

    Liu, Gang; Cheng, Kai; Lo, Chi Y; Li, Jun; Qu, Jun; Neelamegham, Sriram

    2017-11-01

    Glycosylation is among the most abundant and diverse protein post-translational modifications (PTMs) identified to date. The structural analysis of this PTM is challenging because of the diverse monosaccharides, which are not conserved among organisms, the branched nature of glycans, their isomeric structures, and heterogeneity in the glycan distribution at a given site. Glycoproteomics experiments have adopted the traditional high-throughput LC-MS^n proteomics workflow to analyze site-specific glycosylation. However, comprehensive computational platforms for data analysis are scarce. To address this limitation, we present a comprehensive, open-source, modular software for glycoproteomics data analysis called GlycoPAT (GlycoProteomics Analysis Toolbox; freely available from www.VirtualGlycome.org/glycopat). The program includes three major advances: (1) "SmallGlyPep," a minimal linear representation of glycopeptides for MS^n data analysis. This format allows facile serial fragmentation of both the peptide backbone and PTM at one or more locations. (2) A novel scoring scheme based on calculation of the "Ensemble Score (ES)," a measure that scores and rank-orders MS/MS spectra for N- and O-linked glycopeptides using cross-correlation and probability based analyses. (3) A false discovery rate (FDR) calculation scheme where decoy glycopeptides are created by simultaneously scrambling the amino acid sequence and by introducing artificial monosaccharides by perturbing the original sugar mass. Parallel computing facilities and user-friendly GUIs (Graphical User Interfaces) are also provided. GlycoPAT is used to catalogue site-specific glycosylation on simple glycoproteins, standard protein mixtures and human plasma cryoprecipitate samples in three common MS/MS fragmentation modes: CID, HCD and ETD. It is also used to identify 960 unique glycopeptides in cell lysates from prostate cancer cells. The results show that the simultaneous consideration of peptide and glycan…
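
    A minimal sketch of the target-decoy FDR idea behind advance (3) is given below: the FDR at a score threshold is estimated from the number of decoy matches that survive it relative to the number of target matches. The score distributions and the 1% cutoff are invented, and this generic estimate is not GlycoPAT's exact scheme.

    ```python
    # Generic target-decoy FDR estimate: at a score threshold t, FDR is approximated
    # by the number of decoy matches above t divided by the number of target matches
    # above t.  Scores are random placeholders, not real glycopeptide data.
    import numpy as np

    rng = np.random.default_rng(1)
    target_scores = np.concatenate([rng.normal(6, 1, 300), rng.normal(2, 1, 700)])
    decoy_scores = rng.normal(2, 1, 1000)

    def fdr_at(threshold):
        n_target = (target_scores >= threshold).sum()
        n_decoy = (decoy_scores >= threshold).sum()
        return n_decoy / max(n_target, 1)

    # Find the lowest threshold giving <= 1% estimated FDR
    for t in np.arange(0, 10, 0.1):
        if fdr_at(t) <= 0.01:
            print(f"score cutoff {t:.1f}: {int((target_scores >= t).sum())} IDs at ~1% FDR")
            break
    ```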

  4. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mapping are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data, we are able to choose and characterize the tissue movement map that better matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.

  5. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Science.gov (United States)

    Ma, Jianhua

    2018-01-01

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., on automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject’s persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual’s facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model. PMID:29495343

  6. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    Directory of Open Access Journals (Sweden)

    Ao Guo

    2018-02-01

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., on automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject’s persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual’s facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  7. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.

    Science.gov (United States)

    Guo, Ao; Ma, Jianhua

    2018-02-25

    A model describing the wide variety of human behaviours, called personality, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description for an individual to mesh with various personalized services and applications. An extensive research literature on or related to psychological modelling exists, i.e., on automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that could provide plenty of personal data, i.e., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research is focused on designing an archetype-based modeling of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  8. Real analysis a comprehensive course in analysis, part 1

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 1 is devoted to real analysis. From one point of view, it presents the infinitesimal calculus of the twentieth century with the ultimate integral calculus (measure theory)

  9. The Comprehension Problems for Second-Language Learners with Poor Reading Comprehension Despite Adequate Decoding: A Meta-Analysis

    Science.gov (United States)

    Spencer, Mercedes; Wagner, Richard K.

    2017-01-01

    We conducted a meta-analysis of 16 existing studies to examine the nature of the comprehension problems for children who were second-language learners with poor reading comprehension despite adequate decoding. Results indicated that these children had deficits in oral language (d = -0.80), but these deficits were not as severe as their reading…

  10. Harmonic analysis a comprehensive course in analysis, part 3

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 3 returns to the themes of Part 1 by discussing pointwise limits (going beyond the usual focus on the Hardy-Littlewood maximal function by including ergodic theorems and m

  11. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    Science.gov (United States)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  12. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). · Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics · Emphasis on algorithmic advances that will allow re-application in other...
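
    One of the entities listed above, symmetric positive-definite matrices, can be compared with the affine-invariant Riemannian metric. The sketch below uses SciPy matrix functions and arbitrary 2x2 SPD matrices purely to illustrate the kind of computation the book covers.

    ```python
    # Affine-invariant Riemannian distance between two symmetric positive-definite
    # matrices A and B: d(A, B) = || logm(A^{-1/2} B A^{-1/2}) ||_F.
    # The matrices here are arbitrary SPD examples.
    import numpy as np
    from scipy.linalg import sqrtm, logm, inv

    A = np.array([[2.0, 0.3], [0.3, 1.0]])
    B = np.array([[1.5, -0.2], [-0.2, 2.5]])

    A_inv_sqrt = inv(sqrtm(A))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    distance = np.linalg.norm(logm(M), "fro")
    print(f"affine-invariant distance: {distance:.4f}")
    ```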

  13. Comprehensive analysis of a Radiology Operations Management computer system.

    Science.gov (United States)

    Arenson, R L; London, J W

    1979-11-01

    The Radiology Operations Management computer system at the Hospital of the University of Pennsylvania is discussed. The scheduling and file room modules are based on the system at Massachusetts General Hospital. Patient delays are indicated by the patient tracking module. A reporting module allows CRT/keyboard entry by transcriptionists, entry of standard reports by radiologists using bar code labels, and entry by radiologists using a specially designed diagnostic reporting terminal. Time-flow analyses demonstrate a significant improvement in scheduling, patient waiting, retrieval of radiographs, and report delivery. Recovery of previously lost billing contributes to the proven cost-effectiveness of this system.

  14. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  15. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images such as radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body.

  16. Comprehensive adaptive mesh refinement in wrinkling prediction analysis

    NARCIS (Netherlands)

    Selman, A.; Meinders, Vincent T.; Huetink, Han; van den Boogaard, Antonius H.

    2002-01-01

    A discretisation error indicator and indicators for contact-free wrinkling and wrinkling with contact are, in a challenging task, brought together and used in a comprehensive approach to wrinkling prediction analysis in thin sheet metal forming processes.

  17. Basic complex analysis a comprehensive course in analysis, part 2a

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2A is devoted to basic complex analysis. It interweaves three analytic threads associated with Cauchy, Riemann, and Weierstrass, respectively. Cauchy's view focuses on th

  18. Fluid-Induced Vibration Analysis for Reactor Internals Using Computational FSI Method

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Jong Sung; Yi, Kun Woo; Sung, Ki Kwang; Im, In Young; Choi, Taek Sang [KEPCO E and C, Daejeon (Korea, Republic of)

    2013-10-15

    This paper introduces a fluid-induced vibration analysis method which calculates the response of the RVI to both deterministic and random loads at once and utilizes a more realistic pressure distribution obtained with the computational Fluid-Structure Interaction (FSI) method. As addressed above, the FIV analysis for the RVI was carried out using the computational FSI method. This method calculates the response to deterministic and random turbulence loads at once. It is also a simple and integrative method for obtaining the structural dynamic responses of reactor internals to various flow-induced loads. Because the analysis in this paper omitted the bypass flow region and the Inner Barrel Assembly (IBA) due to limited computer resources, it is necessary to find an effective way to consider all regions in the RV for FIV analysis in the future. Reactor coolant flow makes the Reactor Vessel Internals (RVI) vibrate and may affect their structural integrity. U.S. NRC Regulatory Guide 1.20 requires the Comprehensive Vibration Assessment Program (CVAP) to verify the structural integrity of the RVI against Fluid-Induced Vibration (FIV). The hydraulic forces on the RVI of OPR1000 and APR1400 were computed from hydraulic formulas and the CVAP measurements in Palo Verde Unit 1 and Yonggwang Unit 4 for the structural vibration analyses. In this method, the hydraulic forces were divided into deterministic and random turbulence loads and were used as the excitation forces of the separate structural analyses. These forces were applied to the finite element model, and the responses to them were combined into the resultant stresses.

  19. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  20. Using Primary Language Support via Computer to Improve Reading Comprehension Skills of First-Grade English Language Learners

    Science.gov (United States)

    Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle

    2012-01-01

    Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 first-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…

  1. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
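
    A back-of-envelope calculation clarifies the quoted 10 to 120 hour figure: a yearlong run at 1-second resolution requires roughly 31.5 million sequential power flows. The per-solve times assumed below are illustrative only.

    ```python
    # Back-of-envelope check on the QSTS computational burden: a yearlong
    # simulation at 1-second resolution requires one power flow per time step.
    # The per-solve times below are assumptions for illustration only.
    seconds_per_year = 365 * 24 * 3600          # 31,536,000 time steps
    for ms_per_solve in (1, 5, 15):             # assumed unbalanced power-flow solve times
        hours = seconds_per_year * ms_per_solve / 1000 / 3600
        print(f"{ms_per_solve} ms/solve -> {hours:.0f} hours per yearlong QSTS run")
    ```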

  2. Using miscue analysis to assess comprehension in deaf college readers.

    Science.gov (United States)

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  3. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today… on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering… music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  4. TranslatomeDB: a comprehensive database and cloud-based analysis platform for translatome sequencing data.

    Science.gov (United States)

    Liu, Wanting; Xiang, Lunping; Zheng, Tingkai; Jin, Jingjie; Zhang, Gong

    2018-01-04

    Translation is a key regulatory step linking the transcriptome and the proteome. Two major methods of translatome investigation are RNC-seq (sequencing of translating mRNA) and Ribo-seq (ribosome profiling). To facilitate the investigation of translation, we built a comprehensive database, TranslatomeDB (http://www.translatomedb.net/), which provides collection and integrated analysis of published and user-generated translatome sequencing data. The current version includes 2453 Ribo-seq, 10 RNC-seq and their 1394 corresponding mRNA-seq datasets in 13 species. The database emphasizes analysis functions in addition to the dataset collections. Differential gene expression (DGE) analysis can be performed between any two datasets of the same species and type, on both the transcriptome and translatome levels. The translation indices (translation ratios, elongation velocity index and translational efficiency) can be calculated to quantitatively evaluate translation initiation efficiency and elongation velocity. All datasets were analyzed using a unified, robust, accurate and experimentally verifiable pipeline based on the FANSe3 mapping algorithm and edgeR for DGE analyses. TranslatomeDB also allows users to upload their own datasets and utilize the identical unified pipeline to analyze their data. We believe that TranslatomeDB is a comprehensive platform and knowledgebase for translatome and proteome research, freeing biologists from complex searching, analysis and comparison of huge sequencing datasets without the need for local computational power. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
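
    The translational efficiency index mentioned above is commonly computed as the ratio of normalized ribosome-footprint (or RNC) read density to normalized mRNA read density per gene. The count table and TPM-style normalization below are invented for illustration and are not the FANSe3/edgeR pipeline used by TranslatomeDB.

    ```python
    # Sketch: translational efficiency (TE) as normalized Ribo-seq density divided
    # by normalized mRNA-seq density per gene.  Counts and gene lengths are invented.
    import numpy as np
    import pandas as pd

    df = pd.DataFrame({
        "length_kb":  [1.2, 0.8, 2.5],
        "ribo_count": [300, 150, 90],
        "mrna_count": [600, 100, 400],
    }, index=["geneA", "geneB", "geneC"])

    def tpm(counts, length_kb):
        rpk = counts / length_kb
        return rpk / rpk.sum() * 1e6

    df["ribo_tpm"] = tpm(df["ribo_count"], df["length_kb"])
    df["mrna_tpm"] = tpm(df["mrna_count"], df["length_kb"])
    df["TE"] = df["ribo_tpm"] / df["mrna_tpm"]
    df["log2_TE"] = np.log2(df["TE"])
    print(df[["TE", "log2_TE"]])
    ```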

  5. A Comprehensive Sensitivity Analysis of a Data Center Network with Server Virtualization for Business Continuity

    Directory of Open Access Journals (Sweden)

    Tuan Anh Nguyen

    2015-01-01

    Sensitivity assessment of availability for data center networks (DCNs) is of paramount importance in the design and management of cloud-computing-based businesses. Previous work has presented a performance modeling and analysis of a fat-tree based DCN using queuing theory. In this paper, we present a comprehensive availability modeling and sensitivity analysis of a DCell-based DCN with server virtualization for business continuity using stochastic reward nets (SRN). We use SRN in modeling to capture complex behaviors and dependencies of the system in detail. The models take into account (i) two DCell configurations, composed of two and three physical hosts in a DCell0 unit, respectively; (ii) failure modes and corresponding recovery behaviors of hosts, switches, and VMs, and the VM live migration mechanism within and between DCell0s; and (iii) dependencies between subsystems (e.g., between a host and VMs and between switches and VMs in the same DCell0). The constructed SRN models are analyzed in detail with regard to various metrics of interest to investigate the system’s characteristics. A comprehensive sensitivity analysis of system availability is carried out in consideration of the major impacting parameters in order to observe the system’s complicated behaviors and find the bottlenecks of system availability. The analysis results show the availability improvement, capability of fault tolerance, and business continuity of the DCNs complying with the DCell network topology. This study provides a basis for the design and management of DCNs for business continuity.
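
    The flavour of such an availability sensitivity analysis can be conveyed with a much simpler analytic model than an SRN: a switch in series with a redundant two-host pool, with the sensitivity to host repair time estimated by a finite difference. All rates and the structure below are invented for illustration.

    ```python
    # Toy availability model (not an SRN): a switch in series with a 2-host
    # redundant pool, each component with steady-state availability
    # A = mttf / (mttf + mttr).  Sensitivity to host repair time is estimated
    # by a finite difference.  All parameter values are made up.
    def availability(mttf, mttr):
        return mttf / (mttf + mttr)

    def system_availability(host_mttr):
        a_host = availability(mttf=2000.0, mttr=host_mttr)      # hours
        a_pool = 1 - (1 - a_host) ** 2                           # 2 hosts in parallel
        a_switch = availability(mttf=5000.0, mttr=4.0)
        return a_switch * a_pool                                 # series composition

    base, eps = 8.0, 0.01
    sens = (system_availability(base + eps) - system_availability(base - eps)) / (2 * eps)
    print(f"A_sys = {system_availability(base):.6f}, dA/d(mttr_host) = {sens:.2e} per hour")
    ```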

  6. Impact of comprehensive two-dimensional gas chromatography with mass spectrometry on food analysis.

    Science.gov (United States)

    Tranchida, Peter Q; Purcaro, Giorgia; Maimone, Mariarosa; Mondello, Luigi

    2016-01-01

    Comprehensive two-dimensional gas chromatography with mass spectrometry has been on the separation-science scene for about 15 years. This three-dimensional method has made a great positive impact on various fields of research, and among these that related to food analysis is certainly at the forefront. The present critical review is based on the use of comprehensive two-dimensional gas chromatography with mass spectrometry in the untargeted (general qualitative profiling and fingerprinting) and targeted analysis of food volatiles; attention is focused not only on its potential in such applications, but also on how recent advances in comprehensive two-dimensional gas chromatography with mass spectrometry will potentially be important for food analysis. Additionally, emphasis is devoted to the many instances in which straightforward gas chromatography with mass spectrometry is a sufficiently-powerful analytical tool. Finally, possible future scenarios in the comprehensive two-dimensional gas chromatography with mass spectrometry food analysis field are discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases and clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated systems) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome-editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
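
    A minimal sketch of the computation underlying off-target prediction, scanning a sequence for sites within a mismatch budget of a guide, is shown below. The guide, the target sequence and the plain Hamming-distance criterion are simplifications for illustration, not any specific tool's algorithm.

    ```python
    # Naive off-target scan: slide a 20-nt guide along a genomic sequence and report
    # sites whose Hamming distance to the guide is within a mismatch budget.
    # Real tools also model PAM requirements, bulges and position-dependent weights.
    guide = "GACGTTACCGGATAACTGAT"                 # made-up 20-nt guide
    genome = "TTGACGTTACCGTATAACTGATCCGACGTAACCGGATAACTGATAA"  # made-up sequence

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    max_mismatches = 3
    for i in range(len(genome) - len(guide) + 1):
        site = genome[i:i + len(guide)]
        mm = hamming(guide, site)
        if mm <= max_mismatches:
            print(f"pos {i}: {site} ({mm} mismatches)")
    ```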

  8. Comprehensive analysis of a medication dosing error related to CPOE.

    Science.gov (United States)

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  9. Development validation and use of computer codes for inelastic analysis

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    A finite element scheme is a system which provides routines to carry out the operations that are common to all finite element programs. The list of items that can be provided as standard by a finite element scheme is surprisingly large, and the list provided by the UNCLE finite element scheme is unusually comprehensive. This presentation covers the following: construction of the program, setting up a finite element mesh, generation of coordinates, and incorporating boundary and load conditions. Program validation was done by creep calculations performed using the CAUSE code. Program use is illustrated by calculating a typical inelastic analysis problem, which includes a computer model of the PFR intermediate heat exchanger.
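
    As a generic illustration of the mesh, coordinate, boundary and load steps listed above (not the UNCLE scheme or the CAUSE code), the sketch below assembles and solves a one-dimensional linear-elastic bar with an end load; the element count, stiffness and load values are arbitrary.

```python
import numpy as np

def solve_bar(n_elem=4, length=1.0, EA=1.0, end_load=1.0):
    """1D bar fixed at x=0 with an axial point load at x=L (illustrative values)."""
    n_nodes = n_elem + 1
    x = np.linspace(0.0, length, n_nodes)          # coordinate generation
    K = np.zeros((n_nodes, n_nodes))               # global stiffness matrix
    for e in range(n_elem):                        # element loop / assembly
        le = x[e + 1] - x[e]
        ke = (EA / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        K[e:e + 2, e:e + 2] += ke
    f = np.zeros(n_nodes)
    f[-1] = end_load                               # load condition
    free = np.arange(1, n_nodes)                   # boundary condition: u(0) = 0
    u = np.zeros(n_nodes)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    return x, u                                    # analytical answer: u = load * x / EA

print(solve_bar())
```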

  10. Comprehensive School Reform and Achievement: A Meta-Analysis

    Science.gov (United States)

    Borman, Geoffrey D.; Hewes, Gina M.; Overman, Laura T.; Brown, Shelly

    2003-01-01

    This meta-analysis reviews research on the achievement effects of comprehensive school reform (CSR) and summarizes the specific effects of 29 widely implemented models. There are limitations on the overall quantity and quality of the research base, but the overall effects of CSR appear promising. The combined quantity, quality, and statistical…

  11. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    Science.gov (United States)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of comprehensive benefit analysis of water resources is to maximize the combined social, economic and ecological-environmental benefits. To address the shortcomings of the traditional analytic hierarchy process (AHP) in the evaluation of water resources, this study proposed a comprehensive benefit evaluation index covering social, economic and environmental benefits, viewed from the perspective of the social, economic and environmental systems; determined the index weights by an improved fuzzy AHP; calculated the relative index of comprehensive water resources benefit; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on the water resources data of Xiangshui County, 20 main comprehensive benefit assessment factors for 5 districts belonging to Xiangshui County were evaluated. The results showed that the comprehensive benefit index of Xiangshui County was 0.7317, and that the social economy still has room for further development under the current water resources situation.
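
    As a hedged sketch of the general idea only (the paper's improved fuzzy AHP and its actual indicators are not reproduced here), the code below derives group weights from an assumed pairwise-comparison matrix by the principal-eigenvector method and combines assumed normalized indicator scores into a single comprehensive benefit index.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Indicator weights from a reciprocal pairwise-comparison matrix (principal eigenvector)."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Assumed pairwise judgements for three benefit groups: social, economic, environmental
pairwise = np.array([[1.0, 1/2, 2.0],
                     [2.0, 1.0, 3.0],
                     [1/2, 1/3, 1.0]])
weights = ahp_weights(pairwise)

# Assumed indicator scores already normalized to [0, 1] for one district
scores = np.array([0.70, 0.85, 0.60])
benefit_index = float(scores @ weights)   # weighted comprehensive benefit index
print(weights.round(3), round(benefit_index, 4))
```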

  12. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlating of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  13. Comprehensive automatic assessment of retinal vascular abnormalities for computer-assisted retinopathy grading.

    Science.gov (United States)

    Joshi, Vinayak; Agurto, Carla; VanNess, Richard; Nemeth, Sheila; Soliz, Peter; Barriga, Simon

    2014-01-01

    One of the most important signs of systemic disease that presents on the retina is vascular abnormality, such as in hypertensive retinopathy. Manual analysis of fundus images by human readers is qualitative and lacks accuracy, consistency and repeatability. Present semi-automatic methods for vascular evaluation are reported to increase accuracy and reduce reader variability, but require extensive reader interaction, thus limiting software-aided efficiency. Automation therefore holds a twofold promise: first, decreasing variability while increasing accuracy, and second, increasing efficiency. In this paper we propose fully automated software as a second-reader system for comprehensive assessment of the retinal vasculature, which aids readers in the quantitative characterization of vessel abnormalities in fundus images. This system provides the reader with objective measures of vascular morphology such as tortuosity and branching angles, as well as highlights of areas with abnormalities such as artery-venous nicking, copper and silver wiring, and retinal emboli, in order for the reader to make a final screening decision. To test the efficacy of our system, we evaluated the change in performance of a newly certified retinal reader when grading a set of 40 color fundus images with and without the assistance of the software. The results demonstrated an improvement in the reader's performance with the software assistance, in terms of accuracy of detection of vessel abnormalities, determination of retinopathy, and reading time. This system enables the reader to make a computer-assisted vasculature assessment with high accuracy and consistency, at a reduced reading time.
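
    One of the morphology measures mentioned, tortuosity, is commonly defined as the ratio of a vessel segment's arc length to its chord length; the sketch below computes that generic definition from centerline points (it is not the paper's software, and the sample points are invented).

```python
import numpy as np

def tortuosity(centerline: np.ndarray) -> float:
    """Arc length / chord length for an (N, 2) array of vessel centerline coordinates."""
    seg = np.diff(centerline, axis=0)
    arc = np.sum(np.linalg.norm(seg, axis=1))
    chord = np.linalg.norm(centerline[-1] - centerline[0])
    return arc / chord  # 1.0 for a straight vessel, larger for tortuous ones

# Assumed example: a gently curved vessel segment (pixel coordinates)
pts = np.array([[0, 0], [1, 0.2], [2, 0.5], [3, 0.2], [4, 0.0]], dtype=float)
print(tortuosity(pts))
```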

  14. Dose tracking and dose auditing in a comprehensive computed tomography dose-reduction program.

    Science.gov (United States)

    Duong, Phuong-Anh; Little, Brent P

    2014-08-01

    Implementation of a comprehensive computed tomography (CT) radiation dose-reduction program is a complex undertaking, requiring an assessment of baseline doses, an understanding of dose-saving techniques, and an ongoing appraisal of results. We describe the role of dose tracking in planning and executing a dose-reduction program and discuss the use of the American College of Radiology CT Dose Index Registry at our institution. We review the basics of dose-related CT scan parameters, the components of the dose report, and the dose-reduction techniques, showing how an understanding of each technique is important in effective auditing of "outlier" doses identified by dose tracking. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. iDC: A comprehensive toolkit for the analysis of residual dipolar couplings for macromolecular structure determination

    International Nuclear Information System (INIS)

    Wei Yufeng; Werner, Milton H.

    2006-01-01

    Measurement of residual dipolar couplings (RDCs) has become an important method for the determination and validation of protein or nucleic acid structures by NMR spectroscopy. A number of toolkits have been devised for the handling of RDC data which run in the Linux/Unix operating environment and require specifically formatted input files. The outputs from these programs, while informative, require format modification prior to the incorporation of the data into commonly used personal computer programs for manuscript preparation. To bridge the gap between analysis and publication, an easy-to-use, comprehensive toolkit for RDC analysis has been created: iDC. iDC is written for the WaveMetrics Igor Pro mathematics program, a widely used graphing and data analysis software package that runs on both Windows PC and Mac OS X computers. Experimental RDC values can be loaded into iDC using simple data formats accessible to Igor's tabular data function. The program can perform most useful RDC analyses, including alignment tensor estimation from a histogram of RDC occurrence versus value and order tensor analysis by singular value decomposition (SVD). SVD analysis can be performed on an entire structure family at once, a feature missing in other applications of this kind. iDC can also import from and export to several different commonly used programs for the analysis of RDC data (DC, PALES, REDCAT) and can prepare formatted files for RDC-based refinement of macromolecular structures using XPLOR-NIH, CNS and ARIA. The graphical user interface provides easy-to-use I/O for data, structures and formatted outputs.
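
    As a minimal sketch of the SVD-based order-tensor (alignment-tensor) fit referred to above, in its generic formulation rather than the iDC code itself, the code below solves for the five independent Saupe matrix elements from assumed unit bond vectors and RDC values, then back-calculates the couplings.

```python
import numpy as np

def fit_order_tensor(bond_vectors: np.ndarray, rdc: np.ndarray):
    """Least-squares (SVD) fit of the Saupe order tensor.

    bond_vectors: (N, 3) unit internuclear vectors; rdc: (N,) couplings already
    divided by the dipolar interaction constant D_max (assumed preprocessing).
    """
    x, y, z = bond_vectors.T
    # D/Dmax = Sxx(x^2 - z^2) + Syy(y^2 - z^2) + 2Sxy xy + 2Sxz xz + 2Syz yz
    A = np.column_stack([x**2 - z**2, y**2 - z**2, 2*x*y, 2*x*z, 2*y*z])
    s, *_ = np.linalg.lstsq(A, rdc, rcond=None)   # uses SVD internally
    Sxx, Syy, Sxy, Sxz, Syz = s
    S = np.array([[Sxx, Sxy, Sxz],
                  [Sxy, Syy, Syz],
                  [Sxz, Syz, -Sxx - Syy]])        # traceless, symmetric order tensor
    return S, A @ s                                # tensor and back-calculated RDCs

# Assumed toy data: random unit vectors and a known diagonal order tensor
rng = np.random.default_rng(0)
v = rng.normal(size=(20, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
S_true = np.diag([1e-3, 2e-3, -3e-3])
d = np.einsum('ni,ij,nj->n', v, S_true, v)
S_fit, d_calc = fit_order_tensor(v, d)
print(np.allclose(S_fit, S_true, atol=1e-10))
```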

  16. A research on comprehension differences between print and screen reading

    Directory of Open Access Journals (Sweden)

    Szu-Yuan Sun

    2013-12-01

    Full Text Available Since the 1980s, extensive research has been conducted comparing reading comprehension from printed text and computer screens. The conclusions, however, are not very consistent. As reading from computer screens requires a certain degree of individual technical skill, such variables should be objectively taken into consideration when conducting an experiment comparing print and screen reading. This study analyses the difference in the level of understanding of the two presentational formats (text on printed pages and hypertext on computer screens) for people between 45 and 54 years of age (i.e. "middle-aged" adults). In our experimental findings there were no significant differences between the levels of comprehension for print and screen presentations. With regard to individual differences in gender, age group and educational level, the findings are as follows: gender and education effects on print reading comprehension performance were significant, while those on screen reading comprehension performance were not. For middle-aged computer learners, the main effect of age group on both print and screen reading comprehension performance was insignificant. In contrast, linear texts of traditional paper-based material are better for middle-aged readers' literal text comprehension, while hypertext is beneficial to their inferential text comprehension. It is also suggested that hypermedia could be used as a cognitive tool for improving middle-aged adults' inferential reading comprehension abilities, provided that they are trained adequately to use the available computers.

  17. Are There Gender Differences in Emotion Comprehension? Analysis of the Test of Emotion Comprehension.

    Science.gov (United States)

    Fidalgo, Angel M; Tenenbaum, Harriet R; Aznar, Ana

    2018-01-01

    This article examines whether there are gender differences in understanding the emotions evaluated by the Test of Emotion Comprehension (TEC). The TEC provides a global index of emotion comprehension in children 3-11 years of age, which is the sum of the nine components that constitute emotion comprehension: (1) recognition of facial expressions, (2) understanding of external causes of emotions, (3) understanding of desire-based emotions, (4) understanding of belief-based emotions, (5) understanding of the influence of a reminder on present emotional states, (6) understanding of the possibility to regulate emotional states, (7) understanding of the possibility of hiding emotional states, (8) understanding of mixed emotions, and (9) understanding of moral emotions. We used the answers to the TEC given by 172 English girls and 181 boys from 3 to 8 years of age. First, the nine components into which the TEC is subdivided were analysed for differential item functioning (DIF), taking gender as the grouping variable. To evaluate DIF, the Mantel-Haenszel method and logistic regression analysis were used applying the Educational Testing Service DIF classification criteria. The results show that the TEC did not display gender DIF. Second, when absence of DIF had been corroborated, it was analysed for differences between boys and girls in the total TEC score and its components controlling for age. Our data are compatible with the hypothesis of independence between gender and level of comprehension in 8 of the 9 components of the TEC. Several hypotheses are discussed that could explain the differences found between boys and girls in the belief component. Given that the Belief component is basically a false belief task, the differences found seem to support findings in the literature indicating that girls perform better on this task.
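
    The logistic-regression DIF screen described above is often implemented as a likelihood-ratio test comparing a model with the total score only against one that adds group membership and a group-by-score interaction. The sketch below shows that generic procedure on simulated data; it is an illustration under assumed data, not the authors' analysis, and the Mantel-Haenszel step is not shown.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

def lr_dif_test(item: np.ndarray, total: np.ndarray, group: np.ndarray) -> float:
    """Likelihood-ratio DIF test for one dichotomous item.

    item: 0/1 responses; total: total test score; group: 0/1 (e.g. girls/boys).
    Returns the p-value for joint uniform + non-uniform DIF (2 degrees of freedom).
    """
    X0 = sm.add_constant(np.column_stack([total]))
    X1 = sm.add_constant(np.column_stack([total, group, total * group]))
    m0 = sm.Logit(item, X0).fit(disp=0)
    m1 = sm.Logit(item, X1).fit(disp=0)
    lr = 2 * (m1.llf - m0.llf)
    return stats.chi2.sf(lr, df=2)

# Assumed simulated data with no DIF, so the p-value should usually be non-significant
rng = np.random.default_rng(1)
n = 400
group = rng.integers(0, 2, n)
ability = rng.normal(size=n)
total = np.clip(np.round(5 + 2 * ability + rng.normal(size=n)), 0, 9)
item = rng.binomial(1, 1 / (1 + np.exp(-(ability - 0.2))))
print(lr_dif_test(item.astype(float), total.astype(float), group.astype(float)))
```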

  18. Piping stress analysis with personal computers

    International Nuclear Information System (INIS)

    Revesz, Z.

    1987-01-01

    The growing market for personal computers is providing an increasing number of professionals with unprecedented and surprisingly inexpensive computing capacity which, when used with powerful software, can immensely enhance an engineer's capabilities. This paper focuses on the possibilities opened up in piping stress analysis by the widespread distribution of personal computers, on the necessary changes in the software, and on the limitations of using personal computers for engineering design and analysis. Reliability and quality assurance aspects of using personal computers for nuclear applications are also mentioned. The paper concludes with personal views of the author and experiences gained during the development of interactive graphic piping software for personal computers. (orig./GL)

  19. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    Science.gov (United States)

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full-TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily
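
    The cross-shear- and contact-stress-dependent wear coefficient mentioned above is typically applied inside an Archard-type wear law. The sketch below shows that generic form with made-up coefficient values and a purely illustrative dependence; it is not the validated framework of the study, whose coefficients come from pin-on-plate testing.

```python
def wear_volume(load_n, sliding_m, contact_stress_mpa, cross_shear):
    """Archard-type wear increment: V = k(CS, stress) * load * sliding distance.

    The dependence of k on cross-shear (CS) and contact stress is purely
    illustrative; real coefficients must be measured experimentally.
    """
    # Assumed wear factor in mm^3 / (N * m)
    k = 1e-9 * (0.5 + 4.0 * cross_shear) / (1.0 + 0.05 * contact_stress_mpa)
    return k * load_n * sliding_m

# Example increment for one loading step, assumed values
print(wear_volume(load_n=2000.0, sliding_m=0.02, contact_stress_mpa=20.0, cross_shear=0.1))
```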

  20. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681, May 2016, US Army Research Laboratory: Batch Computed Tomography Analysis of Projectiles, by Michael C Golt and Matthew S Bratcher (Weapons and Materials Research...). ...values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles.

  1. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data only. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
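
    A hedged sketch of the final statistical step described above: given heat-transfer coefficients from CFD runs with perturbed inputs, estimate a Student-t confidence interval. The sample values are invented, and the ranking of input importance is not shown.

```python
import numpy as np
from scipy import stats

def t_uncertainty(samples, confidence=0.95):
    """Mean and half-width of a Student-t confidence interval for a set of CFD results."""
    h = np.asarray(samples, dtype=float)
    n = h.size
    mean = h.mean()
    half_width = stats.t.ppf(0.5 + confidence / 2, df=n - 1) * h.std(ddof=1) / np.sqrt(n)
    return mean, half_width

# Assumed heat-transfer coefficients (W/m^2/K) from runs with perturbed inputs
h_runs = [102.0, 98.5, 105.2, 99.8, 101.1, 103.4]
mean, hw = t_uncertainty(h_runs)
print(f"h = {mean:.1f} +/- {hw:.1f} W/m^2/K (95% CI)")
```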

  2. Revealing and Quantifying the Impaired Phonological Analysis Underpinning Impaired Comprehension in Wernicke's Aphasia

    Science.gov (United States)

    Robson, Holly; Keidel, James L.; Lambon Ralph, Matthew A.; Sage, Karen

    2012-01-01

    Wernicke's aphasia is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in Wernicke's aphasia, a view consistent with current functional neuroimaging which finds…

  3. Operator theory a comprehensive course in analysis, part 4

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 4 focuses on operator theory, especially on a Hilbert space. Central topics are the spectral theorem, the theory of trace class and Fredholm determinants, and the study of

  4. The fifth generation computer project state of the art report 111

    CERN Document Server

    Scarrott

    1983-01-01

    The Fifth Generation Computer Project is a two-part book consisting of the invited papers and the analysis. The invited papers examine various aspects of The Fifth Generation Computer Project. The analysis part assesses the major advances of the Fifth Generation Computer Project and provides a balanced analysis of the state of the art in The Fifth Generation. This part provides a balanced and comprehensive view of the development in Fifth Generation Computer technology. The Bibliography compiles the most important published material on the subject of The Fifth Generation.

  5. Acute myocardial ischemia after aortic valve replacement: A comprehensive diagnostic evaluation using dynamic multislice spiral computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lembcke, Alexander [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany)]. E-mail: alexander.lembcke@gmx.de; Hein, Patrick A. [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Enzweiler, Christian N.H. [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Hoffmann, Udo [Department of Radiology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Klessen, Christian [Department of Radiology, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany); Dohmen, Pascal M. [Department of Cardiovascular Surgery, Charite-Universitaetsmedizin Berlin, Freie Universitaet Berlin and Humboldt-Universitaet zu Berlin, Berlin (Germany)

    2006-03-15

    We describe the case of a 72-year-old man presenting with endocarditis and clinical signs of acute myocardial ischemia after biological aortic valve replacement. A comprehensive cardiac dynamic multislice spiral computed tomography demonstrated: (1) an endocarditic vegetation of the aortic valve; (2) a subvalvular leakage feeding a paravalvular pseudoaneurysm based on an aortic root abscess with subsequent compromise of the systolic blood flow in the left main coronary artery and the resulting myocardial perfusion deficit.

  6. Acute myocardial ischemia after aortic valve replacement: A comprehensive diagnostic evaluation using dynamic multislice spiral computed tomography

    International Nuclear Information System (INIS)

    Lembcke, Alexander; Hein, Patrick A.; Enzweiler, Christian N.H.; Hoffmann, Udo; Klessen, Christian; Dohmen, Pascal M.

    2006-01-01

    We describe the case of a 72-year-old man presenting with endocarditis and clinical signs of acute myocardial ischemia after biological aortic valve replacement. A comprehensive cardiac dynamic multislice spiral computed tomography demonstrated: (1) an endocarditic vegetation of the aortic valve; (2) a subvalvular leakage feeding a paravalvular pseudoaneurysm based on an aortic root abscess with subsequent compromise of the systolic blood flow in the left main coronary artery and the resulting myocardial perfusion deficit

  7. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue for comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  8. iSmaRT: a toolkit for a comprehensive analysis of small RNA-Seq data.

    Science.gov (United States)

    Panero, Riccardo; Rinaldi, Antonio; Memoli, Domenico; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Milanesi, Luciano; Weisz, Alessandro; Giurato, Giorgio

    2017-03-15

    The interest in investigating the biological roles of small non-coding RNAs (sncRNAs) is increasing, due to the pleiotropic effects these molecules exert in many biological contexts. While several methods and tools are available to study microRNAs (miRNAs), only a few focus on novel classes of sncRNAs, in particular PIWI-interacting RNAs (piRNAs). To overcome these limitations, we implemented iSmaRT (integrative Small RNA Tool-kit), an automated pipeline to analyze smallRNA-Seq data. iSmaRT is a collection of bioinformatics tools and our own algorithms, interconnected through a Graphical User Interface (GUI). In addition to performing comprehensive analyses on miRNAs, it implements specific computational modules to analyze piRNAs, predicting novel ones and identifying their RNA targets. A smallRNA-Seq dataset generated from brain samples of Huntington's Disease patients was used here to illustrate iSmaRT's performance, demonstrating how the pipeline can provide, in a rapid and user friendly way, a comprehensive analysis of different classes of sncRNAs. iSmaRT is freely available on the web at ftp://labmedmolge-1.unisa.it (User: iSmart - Password: password). aweisz@unisa.it or ggiurato@unisa.it. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  9. Comprehensive physical analysis of bond wire interfaces in power modules

    DEFF Research Database (Denmark)

    Popok, Vladimir; Pedersen, Kristian Bonderup; Kristensen, Peter Kjær

    2016-01-01

    causing failures. In this paper we present a review on the set of our experimental and theoretical studies allowing comprehensive physical analysis of changes in materials under active power cycling with focus on bond wire interfaces and thin metallisation layers. The developed electro-thermal and thermo...

  10. An introduction to computational micromechanics

    CERN Document Server

    Zohdi, Tarek I; Zohdi, Tarek I

    2004-01-01

    In this, its second corrected printing, Zohdi and Wriggers' illuminating text presents a comprehensive introduction to the subject. The authors include in their scope basic homogenization theory, microstructural optimization and multifield analysis of heterogeneous materials. This volume is ideal for researchers and engineers, and can be used in a first-year course for graduate students with an interest in the computational micromechanical analysis of new materials.

  11. National cyber defense high performance computing and analysis : concepts, planning and roadmap.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

    There is a national cyber dilemma that threatens the very fabric of government, commercial and private use operations worldwide. Much is written about 'what' the problem is, and though the basis for this paper is an assessment of the problem space, we target the 'how' solution space of the wide-area national information infrastructure through the advancement of science, technology, evaluation and analysis with actionable results intended to produce a more secure national information infrastructure and a comprehensive national cyber defense capability. This cybersecurity High Performance Computing (HPC) analysis concepts, planning and roadmap activity was conducted as an assessment of cybersecurity analysis as a fertile area of research and investment for high value cybersecurity wide-area solutions. This report and a related SAND2010-4765 Assessment of Current Cybersecurity Practices in the Public Domain: Cyber Indications and Warnings Domain report are intended to provoke discussion throughout a broad audience about developing a cohesive HPC centric solution to wide-area cybersecurity problems.

  12. Comprehensive analysis of RNA-Seq data reveals extensive RNA editing in a human transcriptome

    DEFF Research Database (Denmark)

    Peng, Zhiyu; Cheng, Yanbing; Tan, Bertrand Chin-Ming

    2012-01-01

    RNA editing is a post-transcriptional event that recodes hereditary information. Here we describe a comprehensive profile of the RNA editome of a male Han Chinese individual based on analysis of ∼767 million sequencing reads from poly(A)(+), poly(A)(-) and small RNA samples. We developed a computational pipeline that carefully controls for false positives while calling RNA editing events from genome and whole-transcriptome data of the same individual. We identified 22,688 RNA editing events in noncoding genes and introns, untranslated regions and coding sequences of protein-coding genes. Most changes (∼93%) converted A to I(G), consistent with known editing mechanisms based on adenosine deaminase acting on RNA (ADAR). We also found evidence of other types of nucleotide changes; however, these were validated at lower rates. We found 44 editing sites in microRNAs (miRNAs), suggesting a potential...

  13. Spectrum of physics comprehension

    International Nuclear Information System (INIS)

    Blasiak, W; Godlewska, M; Rosiek, R; Wcislo, D

    2012-01-01

    The paper presents the results of research on the relationship between self-assessed comprehension of physics lectures and final grades of junior high school students (aged 13-15), high school students (aged 16-18) and physics students at the Pedagogical University of Cracow, Poland (aged 21). Students' declared level of comprehension was measured during a physics lecture on a prearranged scale of 1-10 with the use of a personal response system designed for the purpose of this experiment. Through the use of this tool, we obtained about 2000 computer records of students' declared comprehension of a 45 min lecture, which we named ‘the spectrum of comprehension’. In this paper, we present and analyse the correlation between students' declared comprehension of the content presented in the lecture and their final learning results. (paper)

  14. Comprehensive proteomic analysis of human pancreatic juice

    DEFF Research Database (Denmark)

    Grønborg, Mads; Bunkenborg, Jakob; Kristiansen, Troels Zakarias

    2004-01-01

    Proteomic technologies provide an excellent means for analysis of body fluids for cataloging protein constituents and identifying biomarkers for early detection of cancers. The biomarkers currently available for pancreatic cancer, such as CA19-9, lack adequate sensitivity and specificity...... contributing to late diagnosis of this deadly disease. In this study, we carried out a comprehensive characterization of the "pancreatic juice proteome" in patients with pancreatic adenocarcinoma. Pancreatic juice was first fractionated by 1-dimensional gel electrophoresis and subsequently analyzed by liquid...... in this study could be directly assessed for their potential as biomarkers for pancreatic cancer by quantitative proteomics methods or immunoassays....

  15. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while setting aside the computer hardware considerations which influence the choice of programming language and of the methods of analysis and compilation. In the first part, the author proposes attempts at formalizing Chomsky grammar languages. In the second part, he studies analytical grammars, and then a compiler, or analytic grammar, for the Fortran language.

  16. A practice course to cultivate students' comprehensive ability of photoelectricity

    Science.gov (United States)

    Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang

    2017-08-01

    After studying many theoretical courses, it is important and urgent for students in the specialty of optoelectronic information science and engineering to cultivate their comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics and computer programming to design, install and debug an optoelectronic system with independent functions. Eight years of practice shows that this course can train students' abilities in the analysis, design/development and debugging of photoelectric systems, and improve their abilities in document retrieval, design proposal and summary report writing, teamwork, innovation consciousness and skill.

  17. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, the analysis of statistical variations in a subthreshold MOSFET's high-frequency characteristics, defined in terms of gate capacitance and transition frequency, is presented, and comprehensive analytical models of such variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET's manufacturing process, have been taken into account in the proposed analysis and modeling. The up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models have been found to be both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm BSIM4-based benchmarks and have been found to be very accurate, with average percentage errors smaller than 5%. Hence, the performed analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET-based VHF circuits, systems and applications.

  18. Computational Cognitive Color Perception

    NARCIS (Netherlands)

    Ciftcioglu, O.; Bittermann, M.S.

    2016-01-01

    Comprehension of aesthetical color characteristics based on a computational model of visual perception and color cognition is presented. The computational comprehension is manifested by the machine's capability of instantly assigning appropriate colors to the objects perceived. They form a scene

  19. Analysis of a comprehensive quality assurance program with computer-enhanced monitors

    International Nuclear Information System (INIS)

    Arenson, R.L.; Mintz, M.C.; Goldstein, E.; Stevens, J.F.; Jovais, C.

    1987-01-01

    The authors' quality assurance (QA) program provides communication pathways among its constituent committees, which include patient care, professional review, medical staff, missed case, quality control, safety, and management committees. The QA monitors are based on data from these committees but also include data from the information management system, such as patient delays, contrast reactions, incidents, complications, time-flow analyses, film library retrieval, cancellations, missing reports, and missing clinical data. Committee data include complaints, missed diagnoses, patient identification problems, and equipment failure. The QA monitors have now been incorporated into summary reports as part of their computer networks. A systematic method for follow-up ensures corrective action and documentation. Examples of improved quality of care resulting from this approach include reductions in delays for report signature and in repeat films

  20. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed at JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling which occurs in the analysis of multivariable experimental data, and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; series-messages to the code are shown on the terminal, so man-machine communication is possible; and, further, data can be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)
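
    For readers unfamiliar with the underlying computation, the multiple-correlation and spectrum analysis that MLCOSP performed on a hybrid computer can be sketched today with standard routines; the example below (a modern analogue under assumed signals, not the JAERI code) estimates auto-spectra, the cross-spectrum and coherence between two measured channels.

```python
import numpy as np
from scipy import signal

# Two assumed noise-like signals sharing a 5 Hz component
fs = 100.0                                     # sampling frequency, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
common = np.sin(2 * np.pi * 5.0 * t)
x = common + 0.5 * rng.normal(size=t.size)
y = 0.8 * common + 0.5 * rng.normal(size=t.size)

f, Pxx = signal.welch(x, fs=fs, nperseg=1024)       # auto power spectrum of x
_, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)      # cross power spectrum
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=1024)

idx = np.argmax(Cxy)                                # should sit near 5 Hz
print(f"coherence peaks at {f[idx]:.2f} Hz (|Pxy| = {abs(Pxy[idx]):.3e}, Pxx = {Pxx[idx]:.3e})")
```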

  1. L2 Reading Comprehension and Its Correlates: A Meta-Analysis

    Science.gov (United States)

    Jeon, Eun Hee; Yamashita, Junko

    2014-01-01

    The present meta-analysis examined the overall average correlation (weighted for sample size and corrected for measurement error) between passage-level second language (L2) reading comprehension and 10 key reading component variables investigated in the research domain. Four high-evidence correlates (with 18 or more accumulated effect sizes: L2…

  2. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  3. Comprehensive two-dimensional liquid chromatographic analysis of poloxamers.

    Science.gov (United States)

    Malik, Muhammad Imran; Lee, Sanghoon; Chang, Taihyun

    2016-04-15

    Poloxamers are low molar mass triblock copolymers of poly(ethylene oxide) (PEO) and poly(propylene oxide) (PPO), having a number of applications as non-ionic surfactants. Comprehensive one- and two-dimensional liquid chromatographic (LC) analysis of these materials is proposed in this study. The separation of oligomers of both types (PEO and PPO) is demonstrated for several commercial poloxamers. This is accomplished at the critical adsorption point (CAP) of one of the blocks, while the other block remains under interaction conditions. Reversed-phase LC at the CAP of PEO allowed for oligomeric separation of the triblock copolymers with regard to the PPO block, whereas normal-phase LC at the CAP of PPO renders oligomeric separation with respect to the PEO block. The oligomeric separations with regard to PEO and PPO are coupled online (comprehensive 2D-LC) to reveal two-dimensional contour plots by unconventional 2D IC×IC (interaction chromatography) coupling. The study provides chemical composition mapping of both PEO and PPO, equivalent to combined molar mass and chemical composition mapping, for several commercial poloxamers. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  5. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  6. Impact analysis on a massively parallel computer

    International Nuclear Information System (INIS)

    Zacharia, T.; Aramayo, G.A.

    1994-01-01

    Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper

  7. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
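
    As a hedged illustration of the kind of compartment model being analysed (the paper's exact equations are not given in the abstract), the sketch below integrates a simple SIS-type model with recruitment of new computers, removal of old ones and antivirus-driven recovery, and prints the basic reproduction number that decides whether the virus dies out; all parameter values are assumed.

```python
import numpy as np
from scipy.integrate import odeint

# Assumed parameters: recruitment b, removal d, infection rate beta, cure rate gamma
b, d, beta, gamma = 5.0, 0.01, 0.0002, 0.04

def model(y, t):
    S, I = y
    dS = b - beta * S * I - d * S + gamma * I   # susceptible computers
    dI = beta * S * I - (d + gamma) * I         # infected computers
    return [dS, dI]

S0 = b / d                                      # disease-free equilibrium population
R0 = beta * S0 / (d + gamma)                    # basic reproduction number
t = np.linspace(0, 1000, 2001)
sol = odeint(model, [S0 - 1.0, 1.0], t)
print(f"R0 = {R0:.2f}, infected computers at t = 1000: {sol[-1, 1]:.1f}")
```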

  8. CPSS: a computational platform for the analysis of small RNA deep sequencing data.

    Science.gov (United States)

    Zhang, Yuanwei; Xu, Bo; Yang, Yifan; Ban, Rongjun; Zhang, Huan; Jiang, Xiaohua; Cooke, Howard J; Xue, Yu; Shi, Qinghua

    2012-07-15

    Next generation sequencing (NGS) techniques have been widely used to document the small ribonucleic acids (RNAs) implicated in a variety of biological, physiological and pathological processes. An integrated computational tool is needed for handling and analysing the enormous datasets from small RNA deep sequencing approach. Herein, we present a novel web server, CPSS (a computational platform for the analysis of small RNA deep sequencing data), designed to completely annotate and functionally analyse microRNAs (miRNAs) from NGS data on one platform with a single data submission. Small RNA NGS data can be submitted to this server with analysis results being returned in two parts: (i) annotation analysis, which provides the most comprehensive analysis for small RNA transcriptome, including length distribution and genome mapping of sequencing reads, small RNA quantification, prediction of novel miRNAs, identification of differentially expressed miRNAs, piwi-interacting RNAs and other non-coding small RNAs between paired samples and detection of miRNA editing and modifications and (ii) functional analysis, including prediction of miRNA targeted genes by multiple tools, enrichment of gene ontology terms, signalling pathway involvement and protein-protein interaction analysis for the predicted genes. CPSS, a ready-to-use web server that integrates most functions of currently available bioinformatics tools, provides all the information wanted by the majority of users from small RNA deep sequencing datasets. CPSS is implemented in PHP/PERL+MySQL+R and can be freely accessed at http://mcg.ustc.edu.cn/db/cpss/index.html or http://mcg.ustc.edu.cn/sdap1/cpss/index.html.

  9. A comprehensive computational model of sound transmission through the porcine lung.

    Science.gov (United States)

    Dai, Zoujun; Peng, Ying; Henry, Brian M; Mansy, Hansen A; Sandler, Richard H; Royston, Thomas J

    2014-09-01

    A comprehensive computational simulation model of sound transmission through the porcine lung is introduced and experimentally evaluated. This "subject-specific" model utilizes parenchymal and major airway geometry derived from x-ray CT images. The lung parenchyma is modeled as a poroviscoelastic material using Biot theory. A finite element (FE) mesh of the lung that includes airway detail is created and used in comsol FE software to simulate the vibroacoustic response of the lung to sound input at the trachea. The FE simulation model is validated by comparing simulation results to experimental measurements using scanning laser Doppler vibrometry on the surface of an excised, preserved lung. The FE model can also be used to calculate and visualize vibroacoustic pressure and motion inside the lung and its airways caused by the acoustic input. The effect of diffuse lung fibrosis and of a local tumor on the lung acoustic response is simulated and visualized using the FE model. In the future, this type of visualization can be compared and matched with experimentally obtained elastographic images to better quantify regional lung material properties to noninvasively diagnose and stage disease and response to treatment.

  10. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  11. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbation but is demanding of computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
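
    The sampling-based approach named first can be sketched generically: draw perturbed input parameters, run the criticality calculation for each draw, and summarize the spread of keff. In the sketch below the transport step is replaced by a stand-in function, since the real step would be a Monte Carlo code run; all sensitivities and perturbation sizes are invented.

```python
import numpy as np

def run_keff(params: np.ndarray) -> float:
    """Stand-in for one criticality calculation with perturbed inputs.

    In practice this would launch the transport code with perturbed nuclear
    data or material properties and parse keff from its output.
    """
    nominal = 0.95
    sensitivities = np.array([0.02, -0.01, 0.015])   # assumed d(keff)/d(param)
    return nominal + float(sensitivities @ params)

rng = np.random.default_rng(42)
n_samples = 200
# Relative perturbations of three uncertain inputs (assumed 1-sigma values)
samples = rng.normal(0.0, [0.5, 0.3, 0.2], size=(n_samples, 3))
keff = np.array([run_keff(p) for p in samples])
print(f"keff = {keff.mean():.5f} +/- {keff.std(ddof=1):.5f} (1 sigma, sampling-based)")
```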

  12. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  13. Use of the Comprehensive Inversion method for Swarm satellite data analysis

    DEFF Research Database (Denmark)

    Sabaka, T. J.; Tøffner-Clausen, Lars; Olsen, Nils

    2013-01-01

    An advanced algorithm, known as the “Comprehensive Inversion” (CI), is presented for the analysis of Swarm measurements to generate a consistent set of Level-2 data products to be delivered by the Swarm “Satellite Constellation Application and Research Facility” (SCARF) to the European Space Agency...

  14. Implementation of the Principal Component Analysis onto High-Performance Computer Facilities for Hyperspectral Dimensionality Reduction: Results and Comparisons

    Directory of Open Access Journals (Sweden)

    Ernestina Martel

    2018-06-01

    Full Text Available Dimensionality reduction represents a critical preprocessing step for increasing the efficiency and the performance of many hyperspectral imaging algorithms. However, dimensionality reduction algorithms such as Principal Component Analysis (PCA) suffer from their computationally demanding nature, making it advisable to implement them on high-performance computer architectures for applications under strict latency constraints. This work presents the implementation of the PCA algorithm on two different high-performance devices, namely an NVIDIA Graphics Processing Unit (GPU) and a Kalray manycore, uncovering a highly valuable set of tips and tricks for taking full advantage of the inherent parallelism of these high-performance computing platforms, and hence reducing the time required to process a given hyperspectral image. Moreover, the results achieved with different hyperspectral images have been compared with those obtained with a field-programmable gate array (FPGA)-based implementation of the PCA algorithm that has been recently published, providing, for the first time in the literature, a comprehensive analysis that highlights the pros and cons of each option.
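
    For reference, the core computation being accelerated is standard PCA over the pixel-by-band matrix of a hyperspectral cube; a plain NumPy sketch (CPU-only, with a random cube standing in for real data) is shown below. On a GPU the same lines map naturally onto array libraries such as CuPy, but that port is not shown here.

```python
import numpy as np

def pca_reduce(cube: np.ndarray, n_components: int) -> np.ndarray:
    """Project a (rows, cols, bands) hyperspectral cube onto its leading principal components."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    X -= X.mean(axis=0)                      # band-wise mean centering
    # SVD of the centered data; right singular vectors are the principal axes
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T         # projection onto leading components
    return scores.reshape(rows, cols, n_components)

# Assumed toy cube: 100 x 100 pixels, 50 spectral bands
cube = np.random.default_rng(0).normal(size=(100, 100, 50))
print(pca_reduce(cube, n_components=5).shape)   # (100, 100, 5)
```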

  15. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words systems analysis are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  16. Turbo Pascal Computer Code for PIXE Analysis

    International Nuclear Information System (INIS)

    Darsono

    2002-01-01

    For optimal utilization of the 150 kV ion accelerator facilities, and to master the analysis techniques that use the ion accelerator, research and development of low-energy PIXE technology has been carried out. The R and D on the hardware of the low-energy PIXE installation at P3TM has been carried out since the year 2000. To support the R and D of the PIXE accelerator facilities in harmony with the R and D of the PIXE hardware, the development of PIXE analysis software is also needed. The development of the database part of the PIXE analysis software, written in Turbo Pascal, is reported in this paper. This computer code computes the ionization cross-section, the fluorescence yield, and the stopping power of elements; it also computes the attenuation coefficient of X-rays as a function of energy. The computer code is named PIXEDASIS and is part of a larger computer code planned for PIXE analysis that will be constructed in the near future. PIXEDASIS is designed to be communicative with the user: it takes input from the keyboard, and the output is shown on the PC monitor and can also be printed. The performance test of PIXEDASIS shows that it can be operated well and that it provides data in agreement with data from other literature. (author)

  17. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

    Full Text Available Simple mediation analysis is an increasingly popular statistical analysis in psychology and in other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. Thus, the purpose of the current paper is to introduce the computational modelling within simple mediation analysis, accompanied by examples in R. First, mediation analysis is described. Then, the method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations presented is offered, as well as a script to carry out a power analysis and a complete example.
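
    Although the article works in R, the same steps can be sketched in Python to make them concrete: simulate data with assumed standardized path coefficients, compute the Sobel test, and run a percentile bootstrap of the indirect effect. This is an illustrative analogue, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
n, a, b, cp = 200, 0.4, 0.35, 0.1          # assumed standardized path coefficients

# Simulate X -> M -> Y with direct effect cp
X = rng.normal(size=n)
M = a * X + rng.normal(scale=np.sqrt(1 - a**2), size=n)
Y = b * M + cp * X + rng.normal(size=n)

def slope_and_se(x, y):
    """OLS slope of y on x (with intercept) and its standard error."""
    Xd = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    cov = (resid @ resid / (len(y) - 2)) * np.linalg.inv(Xd.T @ Xd)
    return beta[1], np.sqrt(cov[1, 1])

a_hat, sa = slope_and_se(X, M)
# b path: regress Y on M controlling for X
Xd = np.column_stack([np.ones(n), M, X])
beta, *_ = np.linalg.lstsq(Xd, Y, rcond=None)
resid = Y - Xd @ beta
cov = (resid @ resid / (n - 3)) * np.linalg.inv(Xd.T @ Xd)
b_hat, sb = beta[1], np.sqrt(cov[1, 1])

# Sobel z for the indirect effect a*b
z = a_hat * b_hat / np.sqrt(b_hat**2 * sa**2 + a_hat**2 * sb**2)

# Percentile bootstrap of the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    ab, _ = slope_and_se(X[idx], M[idx])
    Bd = np.column_stack([np.ones(n), M[idx], X[idx]])
    bb = np.linalg.lstsq(Bd, Y[idx], rcond=None)[0][1]
    boot.append(ab * bb)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {a_hat * b_hat:.3f}, Sobel z = {z:.2f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```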

  18. Computer-Assisted Linguistic Analysis of the Peshitta

    NARCIS (Netherlands)

    Roorda, D.; Talstra, Eep; Dyk, Janet; van Keulen, Percy; Sikkel, Constantijn; Bosman, H.J.; Jenner, K.D.; Bakker, Dirk; Volkmer, J.A.; Gutman, Ariel; van Peursen, Wido Th.

    2014-01-01

    CALAP (Computer-Assisted Linguistic Analysis of the Peshitta), a joint research project of the Peshitta Institute Leiden and the Werkgroep Informatica at the Vrije Universiteit Amsterdam (1999-2005) CALAP concerned the computer-assisted analysis of the Peshitta to Kings (Janet Dyk and Percy van

  19. Computer aided safety analysis 1989

    International Nuclear Information System (INIS)

    1990-04-01

    The meeting was conducted in a workshop style, to encourage involvement of all participants during the discussions. Forty-five (45) experts from 19 countries, plus 22 experts from the GDR participated in the meeting. A list of participants can be found at the end of this volume. Forty-two (42) papers were presented and discussed during the meeting. Additionally an open discussion was held on the possible directions of the IAEA programme on Computer Aided Safety Analysis. A summary of the conclusions of these discussions is presented in the publication. The remainder of this proceedings volume comprises the transcript of selected technical papers (22) presented in the meeting. It is the intention of the IAEA that the publication of these proceedings will extend the benefits of the discussions held during the meeting to a larger audience throughout the world. The Technical Committee/Workshop on Computer Aided Safety Analysis was organized by the IAEA in cooperation with the National Board for Safety and Radiological Protection (SAAS) of the German Democratic Republic in Berlin. The purpose of the meeting was to provide an opportunity for discussions on experiences in the use of computer codes used for safety analysis of nuclear power plants. In particular it was intended to provide a forum for exchange of information among experts using computer codes for safety analysis under the Technical Cooperation Programme on Safety of WWER Type Reactors (RER/9/004) and other experts throughout the world. A separate abstract was prepared for each of the 22 selected papers. Refs, figs tabs and pictures

  20. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1988-01-01

    This paper gives a collective summary of the studies performed at the JRC, Ispra, on the use of computer codes for complex systems analysis. The computer codes dealt with are: the CAFTS-SALP software package, FRANTIC, FTAP, the computer code package RALLY, and the BOUNDS codes. Two reference study cases were executed by each code. The results obtained, the logic/probabilistic analyses as well as the computation times, are compared
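
    As a reminder of the elementary computation that such reliability codes automate, the sketch below evaluates the top-event probability of a tiny fault tree with independent basic events; the gate structure and failure probabilities are invented for illustration and are not taken from the reference study cases.

```python
def and_gate(probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that at least one independent input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: TOP = (pump A fails AND pump B fails) OR control failure.
pump_a, pump_b, control = 1e-3, 1e-3, 5e-5
top = or_gate([and_gate([pump_a, pump_b]), control])
print(f"Top event probability: {top:.2e}")
```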

  1. Identification Of Protein Vaccine Candidates Using Comprehensive Proteomic Analysis Strategies

    Science.gov (United States)

    2007-12-01

    The record text available for this thesis consists of front-matter fragments (acknowledgements and reference-list excerpts) rather than an abstract; the recoverable information identifies the work as a dissertation by James G. Rohrbough on the identification of protein vaccine candidates from the fungus Coccidioides using comprehensive proteomic analysis strategies.

  2. Molecular and Thermodynamic Properties of Zwitterions versus Ionic Liquids: A Comprehensive Computational Analysis to Develop Advanced Separation Processes.

    Science.gov (United States)

    Moreno, Daniel; Gonzalez-Miquel, Maria; Ferro, Victor R; Palomar, Jose

    2018-04-05

    Zwitterion ionic liquids (ZIs) are compounds in which both counterions are covalently tethered, conferring them with unique characteristics; however, most of their properties are still unknown, representing a bottleneck to exploit their practical applications. Herein, the molecular and fluid properties of ZIs and their mixtures were explored by means of quantum chemical analysis based on the density functional theory (DFT) and COSMO-RS method, and compared against homologous ionic liquids (ILs) to provide a comprehensive overview of the effect of the distinct structures on their physicochemical and thermodynamic behavior. Overall, ZIs were revealed as compounds with higher polarity and stronger hydrogen-bonding capacity, implying higher density, viscosity, melting point, and even lower volatility than structurally similar ILs. The phase equilibrium of binary and ternary systems supports stronger attractive interactions between ZIs and polar compounds, whereas higher liquid-liquid immiscibility with nonpolar compounds may be expected. Ultimately, the performance of ZIs in the wider context of separation processes is illustrated, while providing molecular insights to allow their selection and design for relevant applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. CGHPRO – A comprehensive data analysis tool for array CGH

    Directory of Open Access Journals (Sweden)

    Lenzner Steffen

    2005-04-01

    Full Text Available Abstract Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome-wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the
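
    To make the dosage estimate concrete, the following sketch computes median-centred log2 test/reference ratios per clone and flags gains and losses with a simple threshold; it is only a toy stand-in for the normalization and Circular Binary Segmentation steps that CGHPRO itself provides, and the intensities are invented.

```python
import numpy as np

def log2_ratios(test_intensity, ref_intensity):
    """Median-centred log2(test/reference) ratios per clone."""
    ratios = np.log2(np.asarray(test_intensity) / np.asarray(ref_intensity))
    return ratios - np.median(ratios)

def call_dosage(ratios, threshold=0.3):
    """Naive per-clone calls: +1 gain, -1 loss, 0 balanced."""
    return np.where(ratios > threshold, 1, np.where(ratios < -threshold, -1, 0))

# Hypothetical intensities for six clones along one chromosome.
test = [1200, 1150, 2300, 2250, 1100, 600]
ref  = [1180, 1170, 1140, 1130, 1120, 1150]
r = log2_ratios(test, ref)
print(call_dosage(r))   # e.g. [ 0  0  1  1  0 -1]
```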

  4. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  5. First Comprehensive In Silico Analysis of the Functional and Structural Consequences of SNPs in Human GalNAc-T1 Gene

    Directory of Open Access Journals (Sweden)

    Hussein Sheikh Ali Mohamoud

    2014-01-01

    Full Text Available GalNAc-T1, a key candidate of the GalNAc-transferase gene family that is involved in the mucin-type O-linked glycosylation pathway, is expressed in most biological tissues and cell types. Despite the reported association of GalNAc-T1 gene mutations with human disease susceptibility, a comprehensive computational analysis of coding, noncoding and regulatory SNPs, and of their functional impacts at the protein level, has still not been reported. Therefore, sequence- and structure-based computational tools were employed to screen the entire set of listed coding SNPs of the GalNAc-T1 gene in order to identify and characterize them. Our concordant in silico analysis with the SIFT, PolyPhen-2, PANTHER-cSNP, and SNPeffect tools identified the potential nsSNP variants (S143P, G258V, and Y414D) among the 18 nsSNPs of GalNAc-T1. Additionally, 2 regulatory SNPs (rs72964406 & rs34304568) were also identified in GalNAc-T1 using the FastSNP tool. Using multiple computational approaches, we have systematically classified the functional mutations in regulatory and coding regions that can modify the expression and function of the GalNAc-T1 enzyme. These genetic variants can further assist in better understanding the wide range of disease susceptibility associated with mucin-based cell signalling and pathogenic binding, and may help to develop novel therapeutic elements for the associated diseases.

  6. Comprehensive numerical modelling of tokamaks

    International Nuclear Information System (INIS)

    Cohen, R.H.; Cohen, B.I.; Dubois, P.F.

    1991-01-01

    We outline a plan for the development of a comprehensive numerical model of tokamaks. The model would consist of a suite of independent, communicating packages describing the various aspects of tokamak performance (core and edge transport coefficients and profiles, heating, fueling, magnetic configuration, etc.) as well as extensive diagnostics. These codes, which may run on different computers, would be flexibly linked by a user-friendly shell which would allow run-time specification of packages and generation of pre- and post-processing functions, including workstation-based visualization of output. One package in particular, the calculation of core transport coefficients via gyrokinetic particle simulation, will become practical on the scale required for comprehensive modelling only with the advent of teraFLOP computers. Incremental effort at LLNL would be focused on gyrokinetic simulation and development of the shell

  7. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  8. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect -- including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in looking at news

  9. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  10. Computational intelligence in biomedical imaging

    CERN Document Server

    2014-01-01

    This book provides a comprehensive overview of the state-of-the-art computational intelligence research and technologies in biomedical images with emphasis on biomedical decision making. Biomedical imaging offers useful information on patients’ medical conditions and clues to causes of their symptoms and diseases. Biomedical images, however, provide a large number of images which physicians must interpret. Therefore, computer aids are demanded and become indispensable in physicians’ decision making. This book discusses major technical advancements and research findings in the field of computational intelligence in biomedical imaging, for example, computational intelligence in computer-aided diagnosis for breast cancer, prostate cancer, and brain disease, in lung function analysis, and in radiation therapy. The book examines technologies and studies that have reached the practical level, and those technologies that are becoming available in clinical practices in hospitals rapidly such as computational inte...

  11. Pixel-based analysis of comprehensive two-dimensional gas chromatograms (color plots) of petroleum

    DEFF Research Database (Denmark)

    Furbo, Søren; Hansen, Asger B.; Skov, Thomas

    2014-01-01

    We demonstrate how to process comprehensive two-dimensional gas chromatograms (GC × GC chromatograms) to remove nonsample information (artifacts), including background and retention time shifts. We also demonstrate how this, combined with further reduction of the influence of irrelevant information, allows for data analysis without integration or peak deconvolution (pixel-based analysis).

  12. [Semantic Network Analysis of Online News and Social Media Text Related to Comprehensive Nursing Care Service].

    Science.gov (United States)

    Kim, Minji; Choi, Mona; Youm, Yoosik

    2017-12-01

    As comprehensive nursing care service has gradually expanded, it has become necessary to explore the various opinions about it. The purpose of this study is to explore the large amount of text data regarding comprehensive nursing care service extracted from online news and social media by applying a semantic network analysis. The web pages of the Korean Nurses Association (KNA) News, major daily newspapers, and Twitter were crawled by searching the keyword 'comprehensive nursing care service' using Python. A morphological analysis was performed using KoNLPy. Nodes on a 'comprehensive nursing care service' cluster were selected, and frequency, edge weight, and degree centrality were calculated and visualized with Gephi for the semantic network. A total of 536 news pages and 464 tweets were analyzed. In the KNA News and major daily newspapers, 'nursing workforce' and 'nursing service' were highly rated in frequency, edge weight, and degree centrality. On Twitter, the most frequent nodes were 'National Health Insurance Service' and 'comprehensive nursing care service hospital.' The nodes with the highest edge weight were 'national health insurance,' 'wards without caregiver presence,' and 'caregiving costs.' 'National Health Insurance Service' was highest in degree centrality. This study provides an example of how to use atypical big data for a nursing issue through semantic network analysis to explore diverse perspectives surrounding the nursing community through various media sources. Applying semantic network analysis to online big data to gather information regarding various nursing issues would help to explore opinions for formulating and implementing nursing policies. © 2017 Korean Society of Nursing Science
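
    The network measures named above are straightforward to reproduce; the sketch below builds a small keyword co-occurrence network with the networkx library and reports frequency, document co-occurrence edge weights, and degree centrality for a few placeholder terms (not the study's Korean corpus or its Gephi workflow).

```python
import networkx as nx
from collections import Counter
from itertools import combinations

# Hypothetical keyword lists extracted from a few documents (placeholders only).
documents = [
    ["nursing workforce", "nursing service", "hospital"],
    ["nursing workforce", "caregiving costs", "national health insurance"],
    ["nursing service", "hospital", "national health insurance"],
]

freq = Counter(term for doc in documents for term in doc)

G = nx.Graph()
for doc in documents:
    for u, v in combinations(sorted(set(doc)), 2):
        # Edge weight = number of documents in which the two terms co-occur.
        w = G.get_edge_data(u, v, {}).get("weight", 0) + 1
        G.add_edge(u, v, weight=w)

centrality = nx.degree_centrality(G)
for term in sorted(centrality, key=centrality.get, reverse=True):
    print(f"{term:28s} freq={freq[term]}  degree_centrality={centrality[term]:.2f}")
```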

  13. A directory of computer codes suitable for stress analysis of HLW containers - Compas project

    International Nuclear Information System (INIS)

    1989-01-01

    This document reports the work carried out for the Compas project, which looked at the capabilities of various computer codes for the stress analysis of high-level nuclear-waste containers and overpacks. The report concentrates on codes used by the project partners, but also includes a number of the major commercial finite element codes. The report falls into two parts. The first part describes the capabilities of the codes. This includes details of the solution methods used in the codes, the types of analysis which they can carry out and the interfacing with pre- and post-processing packages. This is the more comprehensive section of the report. The second part looks at the performance of a selection of the codes (those used by the project partners). It looks at how the codes perform in a number of test problems which require calculations typical of those encountered in the design and analysis of high-level waste containers and overpacks

  14. Introducing People – Genre Analysis and Oral Comprehension and Oral Production Tasks

    Directory of Open Access Journals (Sweden)

    Keila Rocha Reis de Carvalho

    2012-02-01

    Full Text Available This paper aims at presenting an analysis of the genre introducing people and at suggesting listening comprehension and oral production tasks. This work was developed according to the characterization of the rhetorical organization of situations taken from seventeen films that contain the genre under analysis. Although several studies carried out recently in the ESP area (Andrade, 2003; Cardoso, 2003; Shergue, 2003; Belmonte, 2003; Serafini, 2003) have identified listening comprehension and oral production as the abilities that should be prioritized in an English course, much needs to be done, especially concerning the oral genres that take into account the language the learners of English as a second language need in their target situation. This work is based on Hutchinson & Waters' (1987) theoretical background on ESP, Swales' (1990) genre analysis, Ramos' (2004) pedagogical proposal, and also on Ellis' (2003) task concept. The familiarization of learners of English as a second language with this genre will provide them with the opportunity to better understand and use the English language in their academic and professional life.

  15. Safety analysis of control rod drive computers

    International Nuclear Information System (INIS)

    Ehrenberger, W.; Rauch, G.; Schmeil, U.; Maertz, J.; Mainka, E.U.; Nordland, O.; Gloee, G.

    1985-01-01

    The analysis of the most significant user programmes revealed no errors in these programmes. The evaluation of approximately 82 cumulated years of operation demonstrated that the operating system of the control rod positioning processor has a reliability that is sufficiently good for the tasks this computer has to fulfil. Computers can be used for safety relevant tasks. The experience gained with the control rod positioning processor confirms that computers are not less reliable than conventional instrumentation and control system for comparable tasks. The examination and evaluation of computers for safety relevant tasks can be done with programme analysis or statistical evaluation of the operating experience. Programme analysis is recommended for seldom used and well structured programmes. For programmes with a long, cumulated operating time a statistical evaluation is more advisable. The effort for examination and evaluation is not greater than the corresponding effort for conventional instrumentation and control systems. This project has also revealed that, where it is technologically sensible, process controlling computers or microprocessors can be qualified for safety relevant tasks without undue effort. (orig./HP) [de

  16. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain guidance on, and boundary conditions for, computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for the integration of CAD subsystems in plant engineering should be a central database, which is described here by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for the manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  17. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  18. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    Science.gov (United States)

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. (1)H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 l-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  19. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  20. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real time, military, banking, and wearable health care systems.   §  Describes design solutions for new computer system - evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models §  Pursues simplicity, reliability, scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  1. Comprehensive transportation risk assessment system based on unit-consequence factors

    International Nuclear Information System (INIS)

    Biwer, B.M.; Monette, F.A.; LePoire, D.J.; Chen, S.Y.

    1994-01-01

    The U.S. Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement requires a comprehensive transportation risk analysis of radioactive waste shipments for large shipping campaigns. Thousands of unique shipments involving truck and rail transport must be analyzed; a comprehensive risk analysis is impossible with currently available methods. Argonne National Laboratory developed a modular transportation model that can handle the demands imposed by such an analysis. The modular design of the model facilitates the simple addition/updating of transportation routes and waste inventories, as required, and reduces the overhead associated with file maintenance and quality assurance. The model incorporates unit-consequence factors generated with the RADTRAN 4 transportation risk analysis code that are combined with an easy-to-use, menu-driven interface on IBM-compatible computers running under DOS. User selection of multiple origin/destination site pairs for the shipment of multiple radioactive waste inventories is permitted from pop-up lists. Over 800 predefined routes are available among more than 30 DOE sites and waste inventories that include high-level waste, spent nuclear fuel, transuranic waste, low-level waste, low-level mixed waste, and greater-than-Class C waste
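
    The arithmetic behind a unit-consequence-factor approach is simple scaling; the sketch below multiplies per-shipment unit factors by shipment counts for a few hypothetical origin/destination pairs, with factor values invented for illustration rather than generated by RADTRAN 4.

```python
# Hypothetical unit-consequence factors (e.g., person-rem per shipment) and
# shipment counts for a few origin/destination/waste-type combinations.
routes = [
    # (origin,  destination, waste type, unit factor, shipments)
    ("Site A", "Site B", "low-level waste",   2.4e-3, 120),
    ("Site A", "Site C", "transuranic waste", 5.1e-3,  40),
    ("Site D", "Site B", "spent fuel",        1.8e-2,  15),
]

total = 0.0
for origin, dest, waste, unit_factor, n_shipments in routes:
    risk = unit_factor * n_shipments
    total += risk
    print(f"{origin} -> {dest} ({waste}): {risk:.3f}")

print(f"Campaign total collective dose (person-rem, illustrative): {total:.3f}")
```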

  2. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  3. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  4. Comprehensive two-dimensional gas chromatography for the analysis of organohalogenated micro-contaminants

    NARCIS (Netherlands)

    Korytar, P.; Haglund, P.; Boer, de J.; Brinkman, U.A.Th.

    2006-01-01

    We explain the principles of comprehensive two-dimensional gas chromatography (GC × GC), and discuss key instrumental aspects - with emphasis on column combinations and mass spectrometric detection. As the main item of interest, we review the potential of GC × GC for the analysis of

  5. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  6. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    Science.gov (United States)

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.

  7. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is the main challenging task in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. The statistical frequency and topical graph-based ranking are the two kinds of potentially powerful and leading unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, to identify the major sources of error in these approaches. For experimental analysis, we have selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactical errors and semantical errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
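
    Of the approaches compared above, the frequency-based TF-IDF ranking is the easiest to sketch; the toy example below scores the candidate terms of one document against a few invented documents, purely to illustrate the weighting scheme rather than the evaluated implementations.

```python
import math
from collections import Counter

# Toy corpus (placeholder documents, not the study's datasets).
docs = [
    "key concept extraction from text supports digital libraries",
    "graph based ranking of concepts supports ontology learning",
    "frequency based ranking scores key terms in text",
]
tokenized = [d.split() for d in docs]

def tf_idf(term: str, doc_tokens: list, corpus: list) -> float:
    tf = doc_tokens.count(term) / len(doc_tokens)
    df = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / df)   # df > 0 because the term comes from a doc
    return tf * idf

# Rank the terms of the first document by TF-IDF.
scores = {t: tf_idf(t, tokenized[0], tokenized) for t in set(tokenized[0])}
for term, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{term:12s} {score:.3f}")
```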

  8. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-31

    The record text available for this report consists of front-matter fragments rather than a complete abstract; the recoverable content indicates that the work addresses fuel thermal performance, as indicated by the physical and chemical effects in the cooling passages of hydrocarbon-fueled propulsion systems, modeled on the basis of systematic measurement and comprehensive chromatographic analysis, and that the selection and acquisition of a set of chemically diverse fuels is pivotal for a successful outcome, since test method validation depends on it.

  9. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  10. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there are a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab to a published microbial genome; empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  11. Diagnostic and therapeutic implications of genetic heterogeneity in myeloid neoplasms uncovered by comprehensive mutational analysis

    Directory of Open Access Journals (Sweden)

    Sarah M. Choi

    2017-01-01

    Full Text Available While growing use of comprehensive mutational analysis has led to the discovery of innumerable genetic alterations associated with various myeloid neoplasms, the under-recognized phenomenon of genetic heterogeneity within such neoplasms creates a potential for diagnostic confusion. Here, we describe two cases where expanded mutational testing led to amendment of an initial diagnosis of chronic myelogenous leukemia with subsequent altered treatment of each patient. We demonstrate the power of comprehensive testing in ensuring appropriate classification of genetically heterogeneous neoplasms, and emphasize thoughtful analysis of molecular and genetic data as an essential component of diagnosis and management.

  12. Comprehensive Analysis Competence and Innovative Approaches for Sustainable Chemical Production.

    Science.gov (United States)

    Appel, Joerg; Colombo, Corrado; Dätwyler, Urs; Chen, Yun; Kerimoglu, Nimet

    2016-01-01

    Humanity currently sees itself facing enormous economic, ecological, and social challenges. Sustainable products and production in specialty chemistry are an important strategic element to address these megatrends. In addition, digitalization and global connectivity will create new opportunities for the industry. One aspect is examined in this paper: the development of comprehensive analysis of production networks for more sustainable production, from which the need for innovative solutions arises. Examples from data analysis, advanced process control and automated performance monitoring are shown. These efforts have a significant impact on improved yields, reduced energy and water consumption, and better product performance in the application of the products.

  13. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    Science.gov (United States)

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate, multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.
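
    For readers who want a feel for the kind of processing such a platform automates, the sketch below log-transforms, autoscales and runs PCA on a small invented concentration table using scikit-learn; it mirrors common metabolomics preprocessing in general rather than MetaboAnalyst's exact implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

# Invented concentration matrix: rows = samples, columns = metabolites.
rng = np.random.default_rng(1)
concentrations = rng.lognormal(mean=2.0, sigma=0.5, size=(10, 6))

# Common metabolomics preprocessing: log transform, then autoscale
# (mean-centre and divide by the standard deviation of each metabolite).
logged = np.log(concentrations)
scaled = (logged - logged.mean(axis=0)) / logged.std(axis=0, ddof=1)

pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)
print("Explained variance ratio:", pca.explained_variance_ratio_.round(3))
print("First two PC scores of sample 0:", scores[0].round(2))
```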

  14. Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones

    Science.gov (United States)

    2015-07-01

    The record text available for this report consists of documentation fragments rather than a complete abstract; the recoverable content indicates that the work, carried out at Iowa State University and reported in July 2015, applies comprehension-driven program analysis to Android source code for malware detection, using analyzers that conform to specifications defined by a Security Toolbox.

  15. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.

  16. Normal-Gamma-Bernoulli Peak Detection for Analysis of Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry Data.

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang

    2017-01-01

    Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much increased separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide applications of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized by gamma distribution and a new peak detection algorithm using the normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches, which are fast Fourier transform (FFT), the first-order and the second-order delta methods (D1 and D2), are introduced. The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.

  17. Interface between computational fluid dynamics (CFD) and plant analysis computer codes

    International Nuclear Information System (INIS)

    Coffield, R.D.; Dunckhorst, F.F.; Tomlinson, E.T.; Welch, J.W.

    1993-01-01

    Computational fluid dynamics (CFD) can provide valuable input to the development of advanced plant analysis computer codes. The types of interfacing discussed in this paper will directly contribute to modeling and accuracy improvements throughout the plant system and should result in significant reduction of design conservatisms that have been applied to such analyses in the past

  18. Computer-Aided Qualitative Data Analysis with Word

    Directory of Open Access Journals (Sweden)

    Bruno Nideröst

    2002-05-01

    Full Text Available Despite some fragmentary references in the literature about qualitative methods, it is fairly unknown that Word can be successfully used for computer-aided Qualitative Data Analyses (QDA. Based on several Word standard operations, elementary QDA functions such as sorting data, code-and-retrieve and frequency counts can be realized. Word is particularly interesting for those users who wish to have first experiences with computer-aided analysis before investing time and money in a specialized QDA Program. The well-known standard software could also be an option for those qualitative researchers who usually work with word processing but have certain reservations towards computer-aided analysis. The following article deals with the most important requirements and options of Word for computer-aided QDA. URN: urn:nbn:de:0114-fqs0202225
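
    The elementary QDA functions named above (code-and-retrieve and frequency counts) can be illustrated independently of any particular software; the Python sketch below tags invented interview excerpts with codes and retrieves them, as an analogy to, not a reproduction of, the Word-based procedures the article describes.

```python
from collections import defaultdict, Counter

# Invented coded segments: (code, text excerpt) pairs, standing in for
# passages marked up in a word processor.
segments = [
    ("work-life balance", "I usually answer e-mails late at night."),
    ("autonomy",          "Nobody tells me how to organise my day."),
    ("work-life balance", "Weekends are the only time I switch off."),
]

# Code-and-retrieve: collect all excerpts filed under each code.
by_code = defaultdict(list)
for code, text in segments:
    by_code[code].append(text)

# Frequency counts per code.
counts = Counter(code for code, _ in segments)

for code, excerpts in by_code.items():
    print(f"{code} ({counts[code]} segments):")
    for e in excerpts:
        print("  -", e)
```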

  19. Computer Program for Analysis, Design and Optimization of Propulsion, Dynamics, and Kinematics of Multistage Rockets

    Science.gov (United States)

    Lali, Mehdi

    2009-03-01

    A comprehensive computer program is designed in MATLAB to analyze, design and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-staging rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of the rocket that is previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations which are done through sophisticated algorithms using numerical methods and incremental integrations. Innovative techniques are applied to calculate the optimal parameters for the engine and designing the "optimal pitch program." This computer program is stand-alone in such a way that it calculates almost every design parameter in regards to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as educational and research purposes.
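
    A core ingredient of multistage performance analysis is the staged ideal rocket equation; the sketch below sums Tsiolkovsky delta-v contributions for a hypothetical two-stage vehicle, with invented masses and specific impulses, and it ignores the drag, gravity losses and pitch programming that the full MATLAB program accounts for.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_delta_v(isp_s: float, m_initial_kg: float, m_final_kg: float) -> float:
    """Ideal (Tsiolkovsky) delta-v of one stage: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * G0 * math.log(m_initial_kg / m_final_kg)

# Hypothetical two-stage vehicle: (Isp, ignition mass, burnout mass) per stage.
stages = [
    (280.0, 120_000.0, 45_000.0),   # first stage (includes upper stage + payload)
    (340.0,  35_000.0, 12_000.0),   # second stage after first-stage separation
]

total_dv = sum(stage_delta_v(isp, m0, mf) for isp, m0, mf in stages)
print(f"Ideal total delta-v: {total_dv:.0f} m/s")
```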

  20. The Effects of Visual Attention Span and Phonological Decoding in Reading Comprehension in Dyslexia: A Path Analysis

    OpenAIRE

    Chen, C.; Schneps, M.; Masyn, K.; Thomson, J.

    2016-01-01

    Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension whil...

  1. Computational analysis of a multistage axial compressor

    Science.gov (United States)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in Aerospace, Power Generation, and Oil & Gas Industries. Efficiency of these machines is often an important factor and has led to the continuous effort to improve the design to achieve better efficiency. The axial flow compressor is a major component in a gas turbine with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, Quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15 stage axial compressor is analyzed using a 3-D Navier Stokes CFD solver in a parallel computing environment. Methodology is described for steady state (frozen rotor stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, tip clearance and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  2. Surface computing and collaborative analysis work

    CERN Document Server

    Brown, Judith; Gossage, Stevenson; Hack, Chris

    2013-01-01

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personne...

  3. Computational contact and impact mechanics fundamentals of modeling interfacial phenomena in nonlinear finite element analysis

    CERN Document Server

    Laursen, Tod A

    2003-01-01

    This book comprehensively treats the formulation and finite element approximation of contact and impact problems in nonlinear mechanics. Intended for students, researchers and practitioners interested in numerical solid and structural analysis, as well as for engineers and scientists dealing with technologies in which tribological response must be characterized, the book includes an introductory but detailed overview of nonlinear finite element formulations before dealing with contact and impact specifically. Topics encompassed include the continuum mechanics, mathematical structure, variational framework, and finite element implementations associated with contact/impact interaction. Additionally, important and currently emerging research topics in computational contact mechanics are introduced, encompassing such topics as tribological complexity, conservative treatment of inelastic impact interaction, and novel spatial discretization strategies.

  4. Comprehensive analysis of a straw-fired power plant in Vojvodina

    Directory of Open Access Journals (Sweden)

    Urošević Dragan M.

    2012-01-01

    Full Text Available In recent years, renewable energy sources have played an increasingly important role in potential energy production. The integration of renewable energy technologies into existing national energy systems has therefore become a major challenge for many countries. Due to the importance of this matter, this paper deals with a comprehensive analysis for the implementation of a biomass (straw) power plant. The analysis is conducted regarding several key indicators: availability of biomass, regulation, reduction of greenhouse gas emissions, location, land use, electricity price and social impacts. The analysis also includes the favorable price for electricity produced from biomass according to national feed-in tariffs. In order to demonstrate all the above mentioned indicators, the region in Serbia (Province of Vojvodina) with significant potential in biomass, especially in straw, is selected. The results of the analysis are validated through environmental and social aspects. Special attention is given to identifying risks for this application.

  5. DataView: a computational visualisation system for multidisciplinary design and analysis

    Science.gov (United States)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting underlining information from huge volumes of multivariate data become essential to all decision-making processes in sectors like finance, government, medical care, climate analysis, industries, science, etc. Remarkably, visualisation is recognised as a fundamental technology that props up human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of multiphysics problem-solvers widely used in engineering fields. The DataView is functionally composed of techniques for table/diagram representation, and graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapts to disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate effectiveness of the visualisation techniques.

  6. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  7. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    Science.gov (United States)

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  8. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  9. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in the computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
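
    The idea of propagating known input uncertainties through to code outputs can be illustrated with a generic Monte Carlo sketch; the 'fuel model' below is a made-up algebraic stand-in rather than FRAPCON or FRAP-T, and the input distributions are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_fuel_model(gap_conductance, power, coolant_temp):
    """Made-up stand-in for a fuel-rod code output (e.g., a clad temperature)."""
    return coolant_temp + power / gap_conductance * 0.8

n = 10_000
# Assumed input uncertainties (normal distributions around nominal values).
gap = rng.normal(5.0, 0.5, n)        # kW/m^2-K
power = rng.normal(40.0, 2.0, n)     # kW/m
t_cool = rng.normal(560.0, 5.0, n)   # K

outputs = toy_fuel_model(gap, power, t_cool)
print(f"mean = {outputs.mean():.1f} K, std = {outputs.std(ddof=1):.1f} K")
print("95% interval:", np.percentile(outputs, [2.5, 97.5]).round(1))
```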

  10. Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture

    Science.gov (United States)

    To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...

  11. Experience with a distributed computing system for magnetic field analysis

    International Nuclear Information System (INIS)

    Newman, M.J.

    1978-08-01

    The development of a general purpose computer system, THESEUS, is described; its initial use has been magnetic field analysis. The system involves several computers connected by data links. Some are small computers with interactive graphics facilities and limited analysis capabilities, and others are large computers for batch execution of analysis programs with heavy processor demands. The system is highly modular for easy extension and highly portable for transfer to different computers. It can easily be adapted for a completely different application. It provides a highly efficient and flexible interface between magnet designers and specialised analysis programs. Both the advantages and problems experienced are highlighted, together with a mention of possible future developments. (U.K.)

  12. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  13. SANDPIPER I (A comprehensive analysis programme for liquid moderated UO2 lattices)

    International Nuclear Information System (INIS)

    Alpiar, R.A.

    1962-04-01

    Methods of calculation for light water moderated reactors have recently been reviewed in AEEW R64. Calculation schemes for lattice parameters were presented which depended on the use of a number of IBM 704 and Ferranti MERCURY computer programmes. SANDPIPER I is a comprehensive MERCURY programme designed to cover all the operations with a degree of accuracy adequate for survey calculations. The present version is restricted to regular or near-regular UO2 pin type lattices moderated by H2O, D2O, or organic liquids; it is planned to allow for greater flexibility in later versions of the programme. The present version is written in Autocode and requires a 4 drum machine. (author)

  15. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions.
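
    The three-grid methodology cited above is commonly implemented via Richardson extrapolation and a grid convergence index (GCI); the sketch below assumes that interpretation and uses made-up airflow speeds rather than results from the cited work.

```python
import math

def grid_convergence_index(f_fine, f_medium, f_coarse, r=2.0, safety_factor=1.25):
    """Estimate the observed order of accuracy and GCI from three systematically refined grids.

    f_fine, f_medium, f_coarse: the same quantity (e.g. peak airflow speed) computed
    on fine, medium, and coarse grids; r is the grid refinement ratio.
    """
    # Observed order of accuracy from Richardson extrapolation.
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    # Relative error between the two finest solutions.
    e21 = abs((f_medium - f_fine) / f_fine)
    # Grid convergence index for the fine-grid solution (fractional uncertainty).
    gci_fine = safety_factor * e21 / (r**p - 1.0)
    return p, gci_fine

# Hypothetical airflow speeds (m/s) from three grids.
p, gci = grid_convergence_index(10.2, 10.8, 12.1)
print(f"observed order p = {p:.2f}, GCI(fine) = {100 * gci:.1f}%")
```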

  16. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  17. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
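
    A toy version of the two-part model (stochastic demand per request type plus a fixed pool of computing resources) can be written as a short discrete event simulation; the arrival rate, service-time means, and host counts below are illustrative assumptions, not the authors' calibrated model.

```python
import heapq, random

random.seed(1)

def simulate(n_servers, arrival_rate, service_means, horizon=10_000.0):
    """Single-queue DES: Poisson arrivals of mixed request types, n_servers shared hosts."""
    t, busy, queue, waits = 0.0, 0, [], []
    events = [(random.expovariate(arrival_rate), "arrival")]
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            # Each request type has its own mean service time; pick one at random.
            service = random.expovariate(1.0 / random.choice(service_means))
            if busy < n_servers:
                busy += 1
                waits.append(0.0)
                heapq.heappush(events, (t + service, "departure"))
            else:
                queue.append((t, service))
        else:  # departure frees a host or starts the next queued request
            if queue:
                arrived, service = queue.pop(0)
                waits.append(t - arrived)
                heapq.heappush(events, (t + service, "departure"))
            else:
                busy -= 1
    return sum(waits) / len(waits)

# Compare a small fixed pool with a larger (more "elastic") pool - illustrative numbers.
print("avg wait, 4 hosts:", simulate(4, arrival_rate=0.5, service_means=[2.0, 6.0]))
print("avg wait, 8 hosts:", simulate(8, arrival_rate=0.5, service_means=[2.0, 6.0]))
```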

  18. SALP-PC, a computer program for fault tree analysis on personal computers

    International Nuclear Information System (INIS)

    Contini, S.; Poucet, A.

    1987-01-01

    The paper presents the main characteristics of the SALP-PC computer code for fault tree analysis. The program has been developed in Fortran 77 on an Olivetti M24 personal computer (IBM compatible) in order to reach a high degree of portability. It is composed of six processors implementing the different phases of the analysis procedure. This particular structure presents some advantages like, for instance, the restart facility and the possibility to develop an event tree analysis code. The set of allowed logical operators, i.e. AND, OR, NOT, K/N, XOR, INH, together with the possibility to define boundary conditions, make the SALP-PC code a powerful tool for risk assessment. (orig.)
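
    For intuition, a fault tree built from part of the gate set listed above (AND, OR, NOT, XOR, K/N) can be evaluated by Monte Carlo sampling of basic-event states; this sketch is a generic illustration with a hypothetical tree and probabilities, and it does not reproduce SALP-PC's algorithms.

```python
import random

random.seed(0)

GATES = {
    "AND": lambda states: all(states),
    "OR":  lambda states: any(states),
    "NOT": lambda states: not states[0],
    "XOR": lambda states: sum(states) == 1,
}

def k_of_n(k):
    """K/N voting gate: true when at least k of the inputs are true."""
    return lambda states: sum(states) >= k

def evaluate(node, sample):
    """Recursively evaluate a gate/basic-event tree for one sampled state vector."""
    if isinstance(node, str):          # basic event name
        return sample[node]
    gate, children = node
    return gate([evaluate(c, sample) for c in children])

# Hypothetical fault tree: TOP = (A AND B) OR 2-of-3(C, D, E)
tree = (GATES["OR"], [
    (GATES["AND"], ["A", "B"]),
    (k_of_n(2), ["C", "D", "E"]),
])
prob = {"A": 0.01, "B": 0.02, "C": 0.05, "D": 0.05, "E": 0.05}   # assumed failure probabilities

trials = 200_000
hits = sum(
    evaluate(tree, {e: random.random() < p for e, p in prob.items()})
    for _ in range(trials)
)
print(f"estimated top-event probability: {hits / trials:.5f}")
```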

  19. Computer aided safety analysis

    International Nuclear Information System (INIS)

    1988-05-01

    The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs

  20. Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites

    Science.gov (United States)

    Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.

    2012-08-01

    With the flourishing development of China's Internet market, demand for map services from all kinds of users is rising continually, and it contains tremendous commercial interests. Many Internet giants have become involved in the field of online map services and define it as an important strategic product of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model, and to analyse the problems according to the evaluation results. Then some corresponding solving measures are proposed, which provides theoretical and practical guidance for the future development of fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation index system for online map service websites is established from the viewpoint of functions, layout, interaction design, color, position and so on, combined with data indexes such as time efficiency, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy evaluation mathematical model, which solves the difficulty of measuring the map websites quantitatively.
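
    The fuzzy comprehensive evaluation step mentioned above can be sketched as a weighted composition of a membership matrix followed by the maximum-membership principle; the criteria, weights, and membership degrees below are invented for illustration and are not the paper's survey data.

```python
import numpy as np

# Rows: criteria (functions, layout, interaction design, time efficiency, accuracy).
# Columns: judgement grades (excellent, good, fair, poor). Values are assumed
# membership degrees, e.g. the fraction of evaluators choosing each grade.
R = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.4, 0.4, 0.1, 0.1],
    [0.3, 0.4, 0.2, 0.1],
    [0.6, 0.3, 0.1, 0.0],
    [0.5, 0.4, 0.1, 0.0],
])
w = np.array([0.30, 0.15, 0.15, 0.20, 0.20])   # criterion weights, sum to 1

b = w @ R                    # weighted-average composition operator M(*, +)
b /= b.sum()                 # normalize the fuzzy evaluation vector
grades = ["excellent", "good", "fair", "poor"]
print(dict(zip(grades, b.round(3))))
print("overall grade:", grades[int(b.argmax())])   # maximum-membership principle
```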

  1. Workstation computer systems for in-core fuel management

    International Nuclear Information System (INIS)

    Ciccone, L.; Casadei, A.L.

    1992-01-01

    The advancement of powerful engineering workstations has made it possible to have thermal-hydraulics and accident analysis computer programs operating efficiently with a significant performance/cost ratio compared to large mainframe computers. Today, nuclear utilities are acquiring independent engineering analysis capability for fuel management and safety analyses. Computer systems currently available to utility organizations vary widely, thus requiring that this software be operational on a number of computer platforms. Recognizing these trends, Westinghouse adopted a software development life cycle process for the software development activities which strictly controls the development, testing and qualification of design computer codes. In addition, software standards to ensure maximum portability were developed and implemented, including adherence to FORTRAN 77, and use of uniform system interface and auxiliary routines. A comprehensive test matrix was developed for each computer program to ensure that evolution of code versions preserves the licensing basis. In addition, the results of such test matrices establish the Quality Assurance basis and consistency for the same software operating on different computer platforms. (author). 4 figs

  2. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)

  3. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review.

    Science.gov (United States)

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.

  4. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti

  5. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  6. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  7. The application of parameters for comprehensive smile esthetics by digital smile design programs: A review of literature

    Directory of Open Access Journals (Sweden)

    Doya Omar

    2018-01-01

    Full Text Available Cosmetic dentistry is increasingly becoming an issue of concern to patients who hope to improve their smile. A systematic and comprehensive dentofacial analysis must be performed before commencing esthetic treatment. Several computer software programs have been developed for digital smile design (DSD) to assist clinicians in this process. This article compares DSD programs commonly used in cosmetic dentistry and their ability to assess esthetic parameters. A literature review was performed of current dentofacial aesthetic parameters and clinical applications of computer technology to assess facial, dentogingival and dental esthetics. Eight DSD programs (Photoshop CS6, Keynote, Planmeca Romexis Smile Design, Cerec SW 4.2, Aesthetic Digital Smile Design, Smile Designer Pro, DSD App and VisagiSMile) were compared. Photoshop, Keynote and Aesthetic Digital Smile Design included the largest number of esthetic analysis parameters. Other studied DSD programs presented deficiencies in their ability to analyze facial esthetic parameters but included comprehensive dentogingival and dental esthetic functions. The DSD App, Planmeca Romexis Smile Design, and Cerec SW 4.2 were able to perform 3D analysis; furthermore, Cerec SW 4.2 and PRSD could be used jointly with CAD/CAM. The DSD App and Smile Designer Pro are available as mobile phone applications. It can be concluded that despite the fact that they were not specifically designed for dental diagnosis, Photoshop CS6 and Keynote provide a more comprehensive smile analysis than most specialized DSD programs. However, other program functions should also be considered when deciding which DSD program is applicable to individual clinical setups.

  8. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and its applications in handling real life problems. The applications are mostly undertaken from real life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  9. Computer use and carpal tunnel syndrome: A meta-analysis.

    Science.gov (United States)

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use was not associated with an increased risk of CTS, either overall (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90) or at higher levels of use (≥1 or ≥4 vs. less). In studies that compared computer workers with each other, CTS was associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for either type of study. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse usage, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
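
    Pooled odds ratios of the kind reported above are commonly obtained with a DerSimonian-Laird random-effects model; the sketch below assumes that approach and pools hypothetical study estimates, not the data from this meta-analysis.

```python
import numpy as np

def pooled_or_random_effects(ors, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of odds ratios given their 95% CIs."""
    y = np.log(ors)                                   # log odds ratios
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)   # SE recovered from the CI width
    w = 1.0 / se**2                                   # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                # Cochran's Q heterogeneity statistic
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))   # between-study variance
    w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)

# Hypothetical per-study odds ratios and 95% CIs.
or_pooled, lo, hi = pooled_or_random_effects(
    ors=[1.5, 2.1, 1.2, 1.8], ci_lows=[0.9, 1.3, 0.7, 1.1], ci_highs=[2.5, 3.4, 2.1, 2.9]
)
print(f"pooled OR = {or_pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```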

  10. Transition towards a low carbon economy: A computable general equilibrium analysis for Poland

    International Nuclear Information System (INIS)

    Böhringer, Christoph; Rutherford, Thomas F.

    2013-01-01

    In the transition to sustainable economic structures the European Union assumes a leading role with its climate and energy package, which sets ambitious greenhouse gas emission reduction targets for 2020. Among EU Member States, Poland, with its heavy energy system reliance on coal, is particularly worried about the pending trade-offs between emission regulation and economic growth. In our computable general equilibrium analysis of the EU climate and energy package we show that economic adjustment costs for Poland hinge crucially on restrictions to where-flexibility of emission abatement, revenue recycling, and technological options in the power system. We conclude that more comprehensive flexibility provisions at the EU level and diligent policy implementation at the national level could achieve the transition towards a low carbon economy at little cost, thereby broadening societal support. - Highlights: ► Economic impact assessment of the EU climate and energy package for Poland. ► Sensitivity analysis on where-flexibility, revenue recycling and technology choice. ► Application of a hybrid bottom-up, top-down CGE model

  11. Analyse Factorielle d'une Batterie de Tests de Comprehension Orale et Ecrite (Factor Analysis of a Battery of Tests of Listening and Reading Comprehension). Melanges Pedagogiques, 1971.

    Science.gov (United States)

    Lonchamp, F.

    This is a presentation of the results of a factor analysis of a battery of tests intended to measure listening and reading comprehension in English as a second language. The analysis sought to answer the following questions: (1) whether the factor analysis method yields results when applied to tests which are not specifically designed for this…
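
    As a generic illustration of the technique (not the original test battery or data), a factor analysis of simulated listening and reading scores can be run with scikit-learn; the loadings, sample size, and noise level are assumptions.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 300
# Simulate two latent abilities ("listening", "reading") driving six test scores.
latent = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.3],   # listening-heavy tests
                     [0.2, 0.9], [0.3, 0.8], [0.1, 0.7]])  # reading-heavy tests
scores = latent @ loadings.T + 0.3 * rng.normal(size=(n, 6))

fa = FactorAnalysis(n_components=2, random_state=0).fit(scores)
# Estimated loadings of each test on the two extracted factors.
print(np.round(fa.components_, 2))
```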

  12. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  13. Outlier analysis

    CERN Document Server

    Aggarwal, Charu C

    2013-01-01

    With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions - the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and

  14. Use of cloud computing in biomedicine.

    Science.gov (United States)

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  15. ANCON: A code for the evaluation of complex fault trees in personal computers

    International Nuclear Information System (INIS)

    Napoles, J.G.; Salomon, J.; Rivero, J.

    1990-01-01

    Performing probabilistic safety analysis has been recognized worldwide as one of the more effective ways of further enhancing the safety of Nuclear Power Plants. The evaluation of fault trees plays a fundamental role in these analyses. Some existing limitations in the RAM and execution speed of personal computers (PCs) have so far restricted their use in the analysis of complex fault trees. Starting from new approaches in the data structure and other possibilities, the ANCON code can evaluate complex fault trees on a PC, allowing the user to do a more comprehensive analysis of the considered system in reduced computing time

  16. Computational surgery and dual training computing, robotics and imaging

    CERN Document Server

    Bass, Barbara; Berceli, Scott; Collet, Christophe; Cerveri, Pietro

    2014-01-01

    This critical volume focuses on the use of medical imaging, medical robotics, simulation, and information technology in surgery. It offers a road map for computational surgery success,  discusses the computer-assisted management of disease and surgery, and provides a rational for image processing and diagnostic. This book also presents some advances on image-driven intervention and robotics, as well as evaluates models and simulations for a broad spectrum of cancers as well as cardiovascular, neurological, and bone diseases. Training and performance analysis in surgery assisted by robotic systems is also covered. This book also: provides a comprehensive overview of the use of computational surgery and disease management; discusses the design and use of medical robotic tools for orthopedic surgery, endoscopic surgery, and prostate surgery; and provides practical examples and case studies in the areas of image processing, virtual surgery, and simulation traini...

  17. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described

  18. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  19. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  20. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  1. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
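
    GRESS works by instrumenting FORTRAN source; the underlying idea of carrying derivatives along with values can be illustrated by forward-mode automatic differentiation with dual numbers. The sketch below is a minimal Python analogue, not GRESS itself, and the toy model is hypothetical.

```python
class Dual:
    """Value paired with its derivative; arithmetic propagates d/dx automatically."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)   # product rule
    __rmul__ = __mul__

def model(k, t):
    """Toy response whose sensitivity dR/dk we want: R = k*t + 3*k*k."""
    return k * t + 3 * k * k

k = Dual(2.0, 1.0)      # seed the derivative with respect to k
result = model(k, 5.0)
print("R =", result.val, " dR/dk =", result.der)   # expect R = 22.0, dR/dk = 17.0
```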

  2. A comprehensive computing initiative for MFE. Revision 1

    International Nuclear Information System (INIS)

    Cohen, R.H.; Crotinger, J.A.; Baldwin, D.E.

    1996-01-01

    The authors propose that a national initiative be launched to develop a comprehensive simulation facility for MFE. The facility would consist of physics codes developed by the national MFE community, tightly but flexibly coupled through a programmable shell, enabling effectively simultaneous solution of the models in the various codes. The word "facility" is chosen to convey the notion that this is where one would go to conduct numerical experiments, using a full set of modules to describe an entire device, a coupled subset to describe particular aspects of a device, or a combination of the facility's modules plus the user's own physics

  3. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  4. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. The graphics output used in actual safety analysis are used to illustrate the capabilities of each code. 5 refs., 10 figs

  5. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis

  6. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
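
    Several of the measures listed above (mean CT value of the whole lungs, density histogram, density mask) reduce to simple operations on the Hounsfield-unit values inside a lung mask. The sketch below uses synthetic data and an assumed attenuation range for the density mask, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic HU volume and lung mask stand in for a segmented chest CT.
hu = rng.normal(-800, 120, size=(64, 64, 32))
lung_mask = np.ones_like(hu, dtype=bool)

lung_hu = hu[lung_mask]
mean_ct = lung_hu.mean()                                         # mean CT value of the lungs
hist, edges = np.histogram(lung_hu, bins=64, range=(-1024, 0))   # density histogram
# Density-mask style measure: fraction of voxels in an assumed high-attenuation band.
high_attenuation = np.mean((lung_hu > -700) & (lung_hu < -250))

print(f"mean lung attenuation: {mean_ct:.0f} HU")
print(f"fraction of voxels in -700..-250 HU: {100 * high_attenuation:.1f}%")
```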

  7. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review

    Science.gov (United States)

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to inter-observer variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast (DIC), fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation. PMID:26742143

  8. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  9. An updated comprehensive techno-economic analysis of algae biodiesel.

    Science.gov (United States)

    Nagarajan, Sanjay; Chou, Siaw Kiang; Cao, Shenyan; Wu, Chen; Zhou, Zhi

    2013-10-01

    Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53-0.85/L (2012 USD values). However, the cost of land and transesterification were just roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, quotes from vendors, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42-0.97/L. Additional improvements for cost-effective algae cultivation and biodiesel production around the globe were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  11. Effect of education on listening comprehension of sentences on healthy elderly: analysis of number of correct responses and task execution time.

    Science.gov (United States)

    Silagi, Marcela Lima; Rabelo, Camila Maia; Schochat, Eliane; Mansur, Letícia Lessa

    2017-11-13

    To analyze the effect of education on sentence listening comprehension in cognitively healthy elderly. A total of 111 healthy elderly, aged 60-80 years and of both genders, were divided into two groups according to educational level: low education (0-8 years of formal education) and high education (≥9 years of formal education). The participants were assessed using the Revised Token Test, an instrument that supports the evaluation of auditory comprehension of orders with different working memory and syntactic complexity demands. The indicators used for performance analysis were the number of correct responses (accuracy analysis) and task execution time (temporal analysis) in the different blocks. The low-education group had a lower number of correct responses than the high-education group on all blocks of the test. In the temporal analysis, participants with low education had longer execution times for commands on the first four blocks, which are more related to working memory. However, the two groups had similar execution times for the blocks more related to syntactic comprehension. Education influenced sentence listening comprehension in the elderly. Temporal analysis allowed us to infer the relationship between comprehension and other cognitive abilities, and to observe that the low-educated elderly did not use effective compensation strategies to improve their performance on the task. Therefore, low educational level, associated with aging, may increase the risk of language decline.

  12. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  13. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    Science.gov (United States)

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
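
    The Lagrangian coupling step — turning many simulated light-exposure histories into a global biomass specific growth rate — can be sketched as follows. The light field, growth kinetics, and particle trajectories below are placeholder assumptions, not the authors' CFD or radiation-transport results.

```python
import numpy as np

rng = np.random.default_rng(0)

def light_at(r, I0=500.0, k=8.0):
    """Assumed exponential light attenuation with normalized depth r into the annulus."""
    return I0 * np.exp(-k * r)

def specific_growth(I, mu_max=0.12, K_I=100.0):
    """Monod-type light-limited growth kinetics (per hour); constants are illustrative."""
    return mu_max * I / (K_I + I)

n_particles, n_steps = 2000, 500
# Placeholder trajectories: radial positions that would normally come from CFD particle tracking.
r = rng.beta(2, 2, size=(n_particles, n_steps))

I_hist = light_at(r)                               # light seen by each particle over time
mu_each = specific_growth(I_hist).mean(axis=1)     # time-averaged growth per particle
mu_global = mu_each.mean()                         # ensemble average = global specific growth rate
print(f"estimated global specific growth rate: {mu_global:.3f} 1/h")
```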

  14. Computer analysis and comparison of chess players' game-playing styles

    OpenAIRE

    Krevs, Urša

    2015-01-01

    Today's computer chess programs are very good at evaluating chess positions. Research has shown that we can rank chess players by the quality of their game play, using a computer chess program. In the master's thesis Computer analysis and comparison of chess players' game-playing styles, we focus on the content analysis of chess games using a computer chess program's evaluation and attributes we determined for each individual position. We defined meaningful attributes that can be used for com...

  15. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  16. Research on a Frame-Based Model of Reading Comprehension. Final Report.

    Science.gov (United States)

    Goldstein, Ira

    This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artifical intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…

  17. Numerical methods in matrix computations

    CERN Document Server

    Björck, Åke

    2015-01-01

    Matrix algorithms are at the core of scientific computing and are indispensable tools in most applications in engineering. This book offers a comprehensive and up-to-date treatment of modern methods in matrix computation. It uses a unified approach to direct and iterative methods for linear systems, least squares and eigenvalue problems. A thorough analysis of the stability, accuracy, and complexity of the treated methods is given. Numerical Methods in Matrix Computations is suitable for use in courses on scientific computing and applied technical areas at advanced undergraduate and graduate level. A large bibliography is provided, which includes both historical and review papers as well as recent research papers. This makes the book useful also as a reference and guide to further study and research work. Åke Björck is a professor emeritus at the Department of Mathematics, Linköping University. He is a Fellow of the Society of Industrial and Applied Mathematics.

  18. ATLAS Distributed Computing: Experience and Evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centers around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics program including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2014 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  19. ATLAS distributed computing: experience and evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  20. Practical guidelines for the comprehensive analysis of ChIP-seq data.

    Directory of Open Access Journals (Sweden)

    Timothy Bailey

    Full Text Available Mapping the chromosomal locations of transcription factors, nucleosomes, histone modifications, chromatin remodeling enzymes, chaperones, and polymerases is one of the key tasks of modern biology, as evidenced by the Encyclopedia of DNA Elements (ENCODE) Project. To this end, chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) is the standard methodology. Mapping such protein-DNA interactions in vivo using ChIP-seq presents multiple challenges not only in sample preparation and sequencing but also for computational analysis. Here, we present step-by-step guidelines for the computational analysis of ChIP-seq data. We address all the major steps in the analysis of ChIP-seq data: sequencing depth selection, quality checking, mapping, data normalization, assessment of reproducibility, peak calling, differential binding analysis, controlling the false discovery rate, peak annotation, visualization, and motif analysis. At each step in our guidelines we discuss some of the software tools most frequently used. We also highlight the challenges and problems associated with each step in ChIP-seq data analysis. We present a concise workflow for the analysis of ChIP-seq data in Figure 1 that complements and expands on the recommendations of the ENCODE and modENCODE projects. Each step in the workflow is described in detail in the following sections.
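
    As a toy illustration of the peak-calling step only (not a substitute for the tools recommended in the guidelines), enriched windows can be flagged by comparing ChIP read counts with the input control using a Poisson test and a Benjamini-Hochberg correction; all counts below are synthetic.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
n_windows = 10_000
# Synthetic per-window read counts; a few windows get extra ChIP signal ("true" peaks).
control = rng.poisson(5, n_windows)
chip = rng.poisson(5, n_windows)
true_peaks = rng.choice(n_windows, 50, replace=False)
chip[true_peaks] += rng.poisson(30, 50)

lam = np.maximum(control, control.mean())        # background rate: max of local and global
pvals = poisson.sf(chip - 1, lam)                # P(X >= observed ChIP count) under background

# Benjamini-Hochberg control of the false discovery rate at 5%.
order = np.argsort(pvals)
thresh = 0.05 * np.arange(1, n_windows + 1) / n_windows
passed = pvals[order] <= thresh
n_called = passed.nonzero()[0].max() + 1 if passed.any() else 0
print(f"windows called as peaks: {n_called}")
```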

  1. COMPREHENSIVE EVALUATION AND ANALYSIS OF CHINA’S MAINSTREAM ONLINE MAP SERVICE WEBSITES

    Directory of Open Access Journals (Sweden)

    H. Zhang

    2012-08-01

    Full Text Available With the flourishing development of China's Internet market, demand for map services from all kinds of users is rising continually, and it contains tremendous commercial interests. Many Internet giants have become involved in the field of online map services and define it as an important strategic product of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model, and to analyse the problems according to the evaluation results. Then some corresponding solving measures are proposed, which provides theoretical and practical guidance for the future development of fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation index system for online map service websites is established from the viewpoint of functions, layout, interaction design, color, position and so on, combined with data indexes such as time efficiency, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy evaluation mathematical model, which solves the difficulty of measuring the map websites quantitatively.

  2. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types is considered.

  3. Correlates of Early Reading Comprehension Skills: A Componential Analysis

    Science.gov (United States)

    Babayigit, Selma; Stainthorp, Rhona

    2014-01-01

    This study had three main aims. First, we examined to what extent listening comprehension, vocabulary, grammatical skills and verbal short-term memory (VSTM) assessed prior to formal reading instruction explained individual differences in early reading comprehension levels. Second, we examined to what extent the three common component skills,…

  4. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  5. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two units of personal computer were successfully networked together to form a small-scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processing cores. The cluster runs the Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test verified that the computers could exchange the required information without any problems and was carried out using a simple MPI Hello program written in C. Additionally, a performance test was conducted to show that the cluster's computational performance is much better than that of a single-CPU computer. In this performance test, the same code was run using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer from common hardware that is capable of higher computing power than a single-CPU machine, which can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics.
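
    The communication test described above can also be reproduced in Python with mpi4py instead of C. The sketch below assumes an MPI runtime (e.g. MPICH) and mpi4py are installed, and would be launched with something like `mpiexec -n 8 python hello_mpi.py` (the file name is illustrative).

```python
# Minimal MPI "hello" and message round-trip, analogous to the communication
# test described above (mpi4py over MPICH assumed to be installed).
from mpi4py import MPI

comm = MPI.COMM_WORLD               # communicator spanning all launched processes
rank = comm.Get_rank()              # this process's id
size = comm.Get_size()              # total number of processes in the job
name = MPI.Get_processor_name()

print(f"Hello from rank {rank} of {size} on node {name}")

# Simple ping between ranks 0 and 1 to confirm nodes can exchange data.
if size > 1:
    if rank == 0:
        comm.send("ping", dest=1, tag=0)
    elif rank == 1:
        msg = comm.recv(source=0, tag=0)
        print(f"rank 1 received: {msg}")
```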

  6. Computerized comprehensive data analysis of Lung Imaging Database Consortium (LIDC)

    International Nuclear Information System (INIS)

    Tan Jun; Pu Jiantao; Zheng Bin; Wang Xingwei; Leader, Joseph K.

    2010-01-01

    Purpose: Lung Image Database Consortium (LIDC) is the largest public CT image database of lung nodules. In this study, the authors present a comprehensive and up-to-date analysis of this dynamically growing database with the help of a computerized tool, aiming to assist researchers to optimally use this database for lung cancer related investigations. Methods: The authors developed a computer scheme to automatically match the nodule outlines marked manually by radiologists on CT images. A large variety of characteristics regarding the annotated nodules in the database including volume, spiculation level, elongation, interobserver variability, as well as the intersection of delineated nodule voxels and overlapping ratio between the same nodules marked by different radiologists are automatically calculated and summarized. The scheme was applied to analyze all 157 examinations with complete annotation data currently available in the LIDC dataset. Results: The scheme summarizes the statistical distributions of the abovementioned geometric and diagnosis features. Among the 391 nodules, (1) 365 (93.35%) have principal axis length ≤20 mm; (2) 120, 75, 76, and 120 were marked by one, two, three, and four radiologists, respectively; and (3) 122 (32.48%) have maximum volume overlapping ratios ≥80% for the delineations of two radiologists, while 198 (50.64%) have maximum volume overlapping ratios <60%. The results also showed that 72.89% of the nodules were assessed with a malignancy score between 2 and 4, and only 7.93% of these nodules were considered severely malignant (malignancy ≥4). Conclusions: This study demonstrates that LIDC contains examinations covering a diverse distribution of nodule characteristics and it can be a useful resource to assess the performance of nodule detection and/or segmentation schemes.
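
    The volume-overlap statistics summarized above can be sketched with two binary delineation masks. The masks below are tiny synthetic examples, not LIDC data; the overlap ratio and Dice coefficient are the standard definitions.

```python
# Overlap ratio and Dice similarity between two radiologists' nodule masks.
import numpy as np

reader1 = np.zeros((10, 10, 10), dtype=bool)
reader2 = np.zeros((10, 10, 10), dtype=bool)
reader1[2:7, 2:7, 2:7] = True          # 125 voxels
reader2[3:8, 3:8, 3:8] = True          # 125 voxels, partially shifted

intersection = np.logical_and(reader1, reader2).sum()
overlap_ratio = intersection / min(reader1.sum(), reader2.sum())
dice = 2 * intersection / (reader1.sum() + reader2.sum())
print(f"intersection={intersection} voxels, "
      f"overlap ratio={overlap_ratio:.2f}, Dice={dice:.2f}")
```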

  7. Run 2 analysis computing for CDF and D0

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    Two large experiments at the Fermilab Tevatron collider will use upgraded detectors for the next period of running. The associated analysis software is also expected to change, both to account for higher data rates and to embrace new computing paradigms. A discussion is given of the problems facing current and future High Energy Physics (HEP) analysis computing, and several issues are explored in detail

  8. Use of computer codes for system reliability analysis

    International Nuclear Information System (INIS)

    Sabek, M.; Gaafar, M.; Poucet, A.

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author)

  9. Use of computer codes for system reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sabek, M.; Gaafar, M. (Nuclear Regulatory and Safety Centre, Atomic Energy Authority, Cairo (Egypt)); Poucet, A. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1989-01-01

    This paper gives a summary of studies performed at the JRC, ISPRA on the use of computer codes for complex systems analysis. The computer codes dealt with are: CAFTS-SALP software package, FRACTIC, FTAP, computer code package RALLY, and BOUNDS. Two reference case studies were executed by each code. The probabilistic results obtained, as well as the computation times are compared. The two cases studied are the auxiliary feedwater system of a 1300 MW PWR reactor and the emergency electrical power supply system. (author).

  10. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
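
    The derivative propagation that such derivative-enhanced compilation automates can be illustrated with a toy forward-mode (dual-number) differentiation in Python. This is a generic sketch of computer calculus, not the GRESS system itself, and the model function is made up.

```python
# Dual numbers carry a value together with its derivative through ordinary
# arithmetic, giving sensitivities without hand-derived derivative code.
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative with respect to the chosen input parameter

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(float(other), 0.0)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

    def __neg__(self):
        return Dual(-self.val, -self.der)

def dexp(x: Dual) -> Dual:
    e = math.exp(x.val)
    return Dual(e, e * x.der)    # chain rule through exp

# Sensitivity of a toy response R(k) = k * exp(-k) to its parameter k, at k = 2:
k = Dual(2.0, 1.0)               # seed derivative dk/dk = 1
R = k * dexp(-k)
print(f"R = {R.val:.4f}, dR/dk = {R.der:.4f}")   # analytic dR/dk = (1 - k) * exp(-k)
```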

  11. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics....

  12. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
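
    The quoted 10^12 figure can be checked with quick order-of-magnitude arithmetic using only the numbers given above:

```python
# Order-of-magnitude check of the ~1e12 efficiency gap quoted in the abstract.
brain_ops, brain_watts, brain_cm3 = 1e16, 20.0, 1200.0
super_ops, super_watts, super_cm3 = 1e15, 3e6, 1500.0 * 1e6   # 1500 m^3 in cm^3

brain_eff = brain_ops / (brain_watts * brain_cm3)   # ops/s per W per cm^3
super_eff = super_ops / (super_watts * super_cm3)

print(f"brain:  {brain_eff:.2e}")              # ~4e11
print(f"super:  {super_eff:.2e}")              # ~2e-1
print(f"ratio:  {brain_eff / super_eff:.1e}")  # ~2e12, i.e. on the order of 10^12
```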

  13. Temporal fringe pattern analysis with parallel computing

    International Nuclear Information System (INIS)

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-01-01

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis
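
    The single-program multiple-data split described above can be sketched with Python's multiprocessing module: the fringe stack is partitioned across worker processes and the pieces are recombined. The per-pixel analysis here is a stand-in for illustration, not the authors' fringe algorithm.

```python
# SPMD-style split of a temporal fringe stack across local worker processes.
import numpy as np
from multiprocessing import Pool

def analyse_chunk(chunk: np.ndarray) -> np.ndarray:
    # Toy temporal analysis: FFT along the time axis, return the dominant
    # non-DC frequency index for every pixel in the chunk.
    spectrum = np.abs(np.fft.rfft(chunk, axis=0))
    return np.argmax(spectrum[1:], axis=0) + 1

if __name__ == "__main__":
    frames = np.random.rand(256, 128, 128)               # (time, rows, cols) test stack
    n_workers = 4
    chunks = np.array_split(frames, n_workers, axis=2)   # split columns across workers
    with Pool(n_workers) as pool:
        parts = pool.map(analyse_chunk, chunks)
    result = np.concatenate(parts, axis=1)               # reassembled per-pixel map
    print(result.shape)
```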

  14. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: ""to present the mathematical analysis underlying probability results"" Special emphases on simulation and discrete decision theory Mathematically-rich, but self-contained text, at a gentle pace Review of calculus and linear algebra in an appendix Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  15. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  16. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  17. Can cloud computing benefit health services? - a SWOT analysis.

    Science.gov (United States)

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  18. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  19. A guide through the computational analysis of isotope-labeled mass spectrometry-based quantitative proteomics data: an application study

    Directory of Open Access Journals (Sweden)

    Haußmann Ute

    2011-06-01

    Full Text Available Abstract. Background: Mass spectrometry-based proteomics has reached a stage where it is possible to comprehensively analyze the whole proteome of a cell in one experiment. Here, the employment of stable isotopes has become a standard technique to yield relative abundance values of proteins. In recent times, more and more experiments are conducted that depict not only a static image of the up- or down-regulated proteins at a distinct time point but instead compare developmental stages of an organism or varying experimental conditions. Results: Although the scientific questions behind these experiments are of course manifold, there are, nevertheless, two questions that commonly arise: (1) which proteins are differentially regulated regarding the selected experimental conditions, and (2) are there groups of proteins that show similar abundance ratios, indicating that they have a similar turnover? We give advice on how these two questions can be answered and comprehensively compare a variety of commonly applied computational methods and their outcomes. Conclusions: This work provides guidance through the jungle of computational methods to analyze mass spectrometry-based isotope-labeled datasets and recommends an effective and easy-to-use evaluation strategy. We demonstrate our approach with three recently published datasets on Bacillus subtilis [1, 2] and Corynebacterium glutamicum [3]. Special focus is placed on the application and validation of cluster analysis methods. All applied methods were implemented within the rich internet application QuPE [4]. Results can be found at http://qupe.cebitec.uni-bielefeld.de.
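
    Question (2) above, grouping proteins with similar abundance ratios, can be sketched as a clustering of log2 ratios across conditions. The values below are invented; real input would come from the quantitative proteomics pipeline, and the choice of three clusters is arbitrary.

```python
# k-means grouping of proteins by their log2 abundance-ratio profiles.
import numpy as np
from sklearn.cluster import KMeans

proteins = ["ProtA", "ProtB", "ProtC", "ProtD", "ProtE", "ProtF"]
# columns: log2(heavy/light) ratios at three conditions / time points (made up)
log2_ratios = np.array([
    [ 2.1,  2.4,  2.0],
    [ 1.9,  2.2,  2.3],
    [-1.5, -1.8, -2.0],
    [-1.7, -1.6, -1.9],
    [ 0.1, -0.2,  0.0],
    [ 0.2,  0.1, -0.1],
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(log2_ratios)
for cluster in range(3):
    members = [p for p, l in zip(proteins, labels) if l == cluster]
    print(f"cluster {cluster}: {members}")
```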

  20. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample will be given, followed by a discussion of the present status and future development plans

  1. Computer code for qualitative analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Yule, H.P.

    1979-01-01

    Computer code QLN1 provides complete analysis of gamma-ray spectra observed with Ge(Li) detectors and is used at both the National Bureau of Standards and the Environmental Protection Agency. It locates peaks, resolves multiplets, identifies component radioisotopes, and computes quantitative results. The qualitative-analysis (or component identification) algorithms feature thorough, self-correcting steps which provide accurate isotope identification in spite of errors in peak centroids, energy calibration, and other typical problems. The qualitative-analysis algorithm is described in this paper
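
    A generic peak-location step of the kind described above can be sketched with SciPy's peak finder on a synthetic spectrum. This only illustrates the idea and is not the QLN1 algorithm; the line positions, areas and prominence threshold are made up.

```python
# Locate photopeaks in a synthetic gamma spectrum: two Gaussian lines over an
# exponential background, with Poisson counting noise.
import numpy as np
from scipy.signal import find_peaks

channels = np.arange(2048)
background = 200 * np.exp(-channels / 800)
spectrum = background.copy()
for centroid, area, sigma in [(662, 5000, 3.0), (1173, 2500, 3.5)]:
    spectrum += area / (sigma * np.sqrt(2 * np.pi)) * \
        np.exp(-0.5 * ((channels - centroid) / sigma) ** 2)
spectrum = np.random.default_rng(0).poisson(spectrum).astype(float)

peaks, props = find_peaks(spectrum, prominence=100, width=2)
print("peak centroids (channels):", peaks)
```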

  2. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis; Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading; Computer Imaging Systems; Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis; Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  3. Comprehensive experimental analysis of nonlinear dynamics in an optically-injected semiconductor laser

    Directory of Open Access Journals (Sweden)

    Kevin Schires

    2011-09-01

    Full Text Available We present the first comprehensive experimental study, to our knowledge, of the routes between nonlinear dynamics induced in a semiconductor laser under external optical injection based on an analysis of time-averaged measurements of the optical and RF spectra and phasors of real-time series of the laser output. The different means of analysis are compared for several types of routes and the benefits of each are discussed in terms of the identification and mapping of the nonlinear dynamics. Finally, the results are presented in a novel audio/video format that describes the evolution of the dynamics with the injection parameters.

  4. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
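
    The kind of embarrassingly parallel resampling discussed above can be sketched locally with Python's process pool; the same worker function could be dispatched to cluster or cloud nodes instead. The data and test statistic below are illustrative only.

```python
# Parallel permutation test for a difference in group means.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, 50)     # e.g. expression values, condition A
group_b = rng.normal(0.5, 1.0, 50)     # condition B
observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])

def permuted_stat(seed: int) -> float:
    r = np.random.default_rng(seed)
    shuffled = r.permutation(pooled)
    return shuffled[50:].mean() - shuffled[:50].mean()

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        null = list(pool.map(permuted_stat, range(10_000), chunksize=500))
    p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
    print(f"permutation p-value: {p_value:.4f}")
```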

  5. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    Science.gov (United States)

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  6. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  7. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing the computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from the Scopus database from Elsevier covering the time period 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
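
    A co-authorship network of the kind analyzed above can be sketched with NetworkX: each paper contributes edges between all pairs of its authors, and a simple weighted-degree ranking orders the authors. The paper data below are made up for illustration.

```python
# Build a small co-authorship graph and rank authors by weighted degree.
from itertools import combinations
import networkx as nx

papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Choi"],
    ["Lee", "Park", "Choi", "Jung"],
]

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):
        # accumulate a weight for repeated collaborations
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

ranking = sorted(G.degree(weight="weight"), key=lambda kv: kv[1], reverse=True)
print(ranking)   # authors ordered by weighted co-authorship degree
```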

  8. A comprehensive risk analysis of coastal zones in China

    Science.gov (United States)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in the world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis-response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighing all of the above indicators by applying the Analytic Hierarchy Process (AHP) and Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGis10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
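
    A toy version of the risk-matrix calculation described above combines indicator scores with AHP-style weights into a probability score and a severity score, whose product gives a composite risk index. All weights and scores below are hypothetical, not values from the study.

```python
# Composite risk index = weighted probability-phase score x weighted severity-phase score.
import numpy as np

# columns: economy, society, living standards   (probability phase)
prob_scores = np.array([[0.7, 0.5, 0.6],    # city A
                        [0.4, 0.8, 0.5]])   # city B
prob_weights = np.array([0.5, 0.3, 0.2])    # AHP-style weights (sum to 1)

# columns: geographic exposure, natural disasters   (severity phase)
sev_scores = np.array([[0.8, 0.6],
                       [0.3, 0.9]])
sev_weights = np.array([0.6, 0.4])

probability = prob_scores @ prob_weights
severity = sev_scores @ sev_weights
risk_index = probability * severity          # risk matrix: likelihood x consequence
print(dict(zip(["city A", "city B"], np.round(risk_index, 3))))
```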

  9. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments that more recently has been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analyses is that the organizational aspects of the courses are found to be related to the way in which the input elements for the teaching and learning process are constructed.

  10. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and ComputationData Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  11. Analyzing Comprehensive QoS with Security Constraints for Services Composition Applications in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Naixue Xiong

    2014-12-01

    Full Text Available Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) are confronted with severe challenges due to the open, dynamic, and complex natures of WSNs. Most previous research separated various QoS indices into different fields and studied them individually due to the computational complexity. This approach ignores the mutual influence between these QoS indices, and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) shows the speediness and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique—vector universal generating function (VUGF)—which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subjected to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs.

  12. Analyzing comprehensive QoS with security constraints for services composition applications in wireless sensor networks.

    Science.gov (United States)

    Xiong, Naixue; Wu, Zhao; Huang, Yannong; Xu, Degang

    2014-12-01

    Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) are confronted with severe challenges due to the open, dynamic, and complex natures of WSNs. Most previous research separated various QoS indices into different fields and studied them individually due to the computational complexity. This approach ignores the mutual influence between these QoS indices, and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) shows the speediness and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique-vector universal generating function (VUGF)-which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subjected to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs.
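
    The vector-UGF idea of propagating several QoS indices at once can be illustrated by composing two services' discrete QoS distributions: serial composition adds latencies, multiplies reliabilities, and keeps the weaker security level. This is a simplified sketch of the general approach, not the paper's exact operators, and all numbers are hypothetical.

```python
# Compose discrete distributions over QoS vectors for two services in sequence.
from itertools import product
from collections import defaultdict

# (probability, (latency_ms, reliability, security_level))
service_a = [(0.8, (20, 0.99, 2)), (0.2, (50, 0.95, 2))]
service_b = [(0.7, (30, 0.98, 3)), (0.3, (60, 0.90, 1))]

def compose(u, v):
    combined = defaultdict(float)
    for (p1, (l1, r1, s1)), (p2, (l2, r2, s2)) in product(u, v):
        qos = (l1 + l2, round(r1 * r2, 4), min(s1, s2))   # add, multiply, take minimum
        combined[qos] += p1 * p2
    return sorted(combined.items())

for qos, prob in compose(service_a, service_b):
    print(f"P={prob:.2f}  latency={qos[0]}ms  reliability={qos[1]}  security={qos[2]}")
```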

  13. Life Cycle Assessment, ExternE and Comprehensive Analysis for an integrated evaluation of the environmental impact of anthropogenic activities

    Energy Technology Data Exchange (ETDEWEB)

    Pietrapertosa, F.; Cosmi, C. [National Research Council, Institute of Methodologies for Environmental Analysis C.N.R.-I.M.A.A. C.da S.Loja, I-85050 Tito Scalo (PZ) (Italy); National Research Council, National Institute for the Physics of Matter, C.N.R.-I.N.F.M. Via Cinthia, I-80126 Naples (Italy); Macchiato, M. [Federico II University, Department of Physical Sciences, Via Cinthia, I-80126 Naples (Italy); National Research Council, National Institute for the Physics of Matter, C.N.R.-I.N.F.M. Via Cinthia, I-80126 Naples (Italy); Salvia, M.; Cuomo, V. [National Research Council, Institute of Methodologies for Environmental Analysis C.N.R.-I.M.A.A. C.da S.Loja, I-85050 Tito Scalo (PZ) (Italy)

    2009-06-15

    The implementation of resource management strategies aimed at reducing the impacts of the anthropogenic activities system requires a comprehensive approach to evaluate the environmental burdens of productive processes as a whole and to identify the best recovery strategies from both an environmental and an economic point of view. In this framework, an analytical methodology based on the integration of Life Cycle Assessment (LCA), ExternE and Comprehensive Analysis was developed to perform an in-depth investigation of energy systems. The LCA methodology, largely utilised by the international scientific community for the assessment of the environmental performance of technologies, combined with Comprehensive Analysis allows modelling of the overall system of anthropogenic activities, as well as of sub-systems, together with the economic consequences of the whole set of environmental damages. Moreover, internalising external costs into partial equilibrium models, such as those utilised by Comprehensive Analysis, can be useful to identify the best paths for implementing technology innovation and strategies aimed at a more sustainable energy supply and use. This paper presents an integrated application of these three methodologies to a local scale case study (the Val D'Agri area in Basilicata, Southern Italy), aimed at better characterising the environmental impacts of the energy system, with particular reference to extraction activities. The innovative methodological approach utilised takes advantage of the strengths of each methodology, with added value coming from their integration, as emphasised by the main results obtained from the scenario analysis. (author)

  14. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was raised in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem to a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, with its subsequent rule-based and network-searching work not only expanding the databases, but also building new approaches to representing reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica changed traditional designing into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, were applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this topic promising and with good prospects.

  15. Summary of research in applied mathematics, numerical analysis, and computer sciences

    Science.gov (United States)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  16. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  17. A single-chip computer analysis system for liquid fluorescence

    International Nuclear Information System (INIS)

    Zhang Yongming; Wu Ruisheng; Li Bin

    1998-01-01

    The single-chip computer analysis system for liquid fluorescence is an intelligent analytic instrument, which is based on the principle that the liquid containing hydrocarbons can give out several characteristic fluorescences when irradiated by strong light. Besides a single-chip computer, the system makes use of the keyboard and the calculation and printing functions of a CASIO printing calculator. It combines optics, mechanism and electronics into one, and is small, light and practical, so it can be used for surface water sample analysis in oil field and impurity analysis of other materials

  18. The Effects of Visual Attention Span and Phonological Decoding in Reading Comprehension in Dyslexia: A Path Analysis.

    Science.gov (United States)

    Chen, Chen; Schneps, Matthew H; Masyn, Katherine E; Thomson, Jennifer M

    2016-11-01

    Increasing evidence has shown visual attention span to be a factor, distinct from phonological skills, that explains single-word identification (pseudo-word/word reading) performance in dyslexia. Yet, little is known about how well visual attention span explains text comprehension. Observing reading comprehension in a sample of 105 high school students with dyslexia, we used a pathway analysis to examine the direct and indirect path between visual attention span and reading comprehension while controlling for other factors such as phonological awareness, letter identification, short-term memory, IQ and age. Integrating phonemic decoding efficiency skills in the analytic model, this study aimed to disentangle how visual attention span and phonological skills work together in reading comprehension for readers with dyslexia. We found visual attention span to have a significant direct effect on more difficult reading comprehension but not on an easier level. It also had a significant direct effect on pseudo-word identification but not on word identification. In addition, we found that visual attention span indirectly explains reading comprehension through pseudo-word reading and word reading skills. This study supports the hypothesis that at least part of the dyslexic profile can be explained by visual attention abilities. Copyright © 2016 John Wiley & Sons, Ltd.
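
    The direct-versus-indirect decomposition used in such a path analysis can be sketched as a simple mediation model: the indirect effect of visual attention span on comprehension via decoding is the product of two path coefficients. The data below are simulated for illustration and the model is far simpler than the study's full path model.

```python
# Toy mediation/path sketch: a = VAS -> decoding, b = decoding -> comprehension
# (controlling for VAS), c' = direct effect of VAS; indirect effect = a * b.
import numpy as np

rng = np.random.default_rng(1)
n = 105
vas = rng.normal(size=n)
decoding = 0.5 * vas + rng.normal(scale=0.8, size=n)
comprehension = 0.3 * vas + 0.6 * decoding + rng.normal(scale=0.7, size=n)

def ols(y, *xs):
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]                                    # drop the intercept

(a,) = ols(decoding, vas)                              # path a
c_prime, b = ols(comprehension, vas, decoding)         # direct effect and path b
print(f"direct effect  c' = {c_prime:.2f}")
print(f"indirect effect a*b = {a * b:.2f}")
```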

  19. The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension

    Science.gov (United States)

    Ertem, Ihsan Seyit

    2010-01-01

    This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned with one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…

  20. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general...... kinds of evolutionary algorithms, have been prudently analyzed. This analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been...... topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most...

  1. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure which is workable with the mechanical model formulation as well as with the method of equation of state proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperature over a long period of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  2. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    Science.gov (United States)

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  3. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    Science.gov (United States)

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last
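
    The attention mechanism at the heart of such a reader can be sketched compactly in PyTorch: a question vector attends over context token encodings and the attended summary scores candidate answers. This is a generic attentive-reader-style model, not the authors' architecture; the vocabulary size, dimensions and random inputs are placeholders.

```python
# Minimal attention-based cloze reader: question-conditioned attention over context.
import torch
import torch.nn as nn

class TinyAttentiveReader(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.ctx_enc = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        self.q_enc = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        self.att = nn.Linear(4 * dim, 1)          # scores each context position
        self.out = nn.Linear(2 * dim, vocab_size)

    def forward(self, context: torch.Tensor, question: torch.Tensor) -> torch.Tensor:
        c, _ = self.ctx_enc(self.embed(context))          # (B, Tc, 2*dim)
        _, qh = self.q_enc(self.embed(question))          # (2, B, dim)
        q = torch.cat([qh[0], qh[1]], dim=-1)             # (B, 2*dim) question vector
        q_tiled = q.unsqueeze(1).expand(-1, c.size(1), -1)
        scores = self.att(torch.cat([c, q_tiled], dim=-1)).squeeze(-1)
        alpha = torch.softmax(scores, dim=1)              # attention weights
        summary = torch.bmm(alpha.unsqueeze(1), c).squeeze(1)
        return self.out(summary)                          # logits over answer vocabulary

# Smoke test with random token ids.
model = TinyAttentiveReader(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 120)), torch.randint(0, 1000, (2, 15)))
print(logits.shape)   # torch.Size([2, 1000])
```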

  4. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise even to novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of lightweight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub

  5. Metasynthetic computing and engineering of complex systems

    CERN Document Server

    Cao, Longbing

    2015-01-01

    Provides a comprehensive overview and introduction to the concepts, methodologies, analysis, design and applications of metasynthetic computing and engineering. The author: Presents an overview of complex systems, especially open complex giant systems such as the Internet, complex behavioural and social problems, and actionable knowledge discovery and delivery in the big data era. Discusses ubiquitous intelligence in complex systems, including human intelligence, domain intelligence, social intelligence, network intelligence, data intelligence and machine intelligence, and their synergy thro

  6. Computer-aided pulmonary image analysis in small animal models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J. [Center for Infectious Disease Imaging (CIDI), Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Bagci, Ulas, E-mail: ulasbagci@gmail.com [Center for Research in Computer Vision (CRCV), University of Central Florida (UCF), Orlando, Florida 32816 (United States); Kramer-Marek, Gabriela [The Institute of Cancer Research, London SW7 3RP (United Kingdom); Luna, Brian [Microfluidic Laboratory Automation, University of California-Irvine, Irvine, California 92697-2715 (United States); Kubler, Andre [Department of Medicine, Imperial College London, London SW7 2AZ (United Kingdom); Dey, Bappaditya; Jain, Sanjay [Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Foster, Brent [Department of Biomedical Engineering, University of California-Davis, Davis, California 95817 (United States); Papadakis, Georgios Z. [Radiology and Imaging Sciences, National Institutes of Health (NIH), Bethesda, Maryland 32892 (United States); Camp, Jeremy V. [Department of Microbiology and Immunology, University of Louisville, Louisville, Kentucky 40202 (United States); Jonsson, Colleen B. [National Institute for Mathematical and Biological Synthesis, University of Tennessee, Knoxville, Tennessee 37996 (United States); Bishai, William R. [Howard Hughes Medical Institute, Chevy Chase, Maryland 20815 and Center for Tuberculosis Research, Johns Hopkins University School of Medicine, Baltimore, Maryland 21231 (United States); Udupa, Jayaram K. [Medical Image Processing Group, Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.

  7. Sensitive and comprehensive analysis of O-glycosylation in biotherapeutics: a case study of novel erythropoiesis stimulating protein.

    Science.gov (United States)

    Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo

    2017-09-01

    Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on the N-glycans due to the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and O-acetyl-modification site were obtained from tandem MS. This method may be applied to QC and batch analysis of not only rhEPOs but also other biotherapeutics bearing multiple O-glycosylations.

  8. Integrated computer-aided forensic case analysis, presentation, and documentation based on multimodal 3D data.

    Science.gov (United States)

    Bornik, Alexander; Urschler, Martin; Schmalstieg, Dieter; Bischof, Horst; Krauskopf, Astrid; Schwark, Thorsten; Scheurer, Eva; Yen, Kathrin

    2018-06-01

    Three-dimensional (3D) crime scene documentation using 3D scanners and medical imaging modalities like computed tomography (CT) and magnetic resonance imaging (MRI) are increasingly applied in forensic casework. Together with digital photography, these modalities enable comprehensive and non-invasive recording of forensically relevant information regarding injuries/pathologies inside the body and on its surface. Furthermore, it is possible to capture traces and items at crime scenes. Such digitally secured evidence has the potential to similarly increase case understanding by forensic experts and non-experts in court. Unlike photographs and 3D surface models, images from CT and MRI are not self-explanatory. Their interpretation and understanding requires radiological knowledge. Findings in tomography data must not only be revealed, but should also be jointly studied with all the 2D and 3D data available in order to clarify spatial interrelations and to optimally exploit the data at hand. This is technically challenging due to the heterogeneous data representations including volumetric data, polygonal 3D models, and images. This paper presents a novel computer-aided forensic toolbox providing tools to support the analysis, documentation, annotation, and illustration of forensic cases using heterogeneous digital data. Conjoint visualization of data from different modalities in their native form and efficient tools to visually extract and emphasize findings help experts to reveal unrecognized correlations and thereby enhance their case understanding. Moreover, the 3D case illustrations created for case analysis represent an efficient means to convey the insights gained from case analysis to forensic non-experts involved in court proceedings like jurists and laymen. The capability of the presented approach in the context of case analysis, its potential to speed up legal procedures and to ultimately enhance legal certainty is demonstrated by introducing a number of

  9. Comprehensive evaluation of garment assembly line with simulation

    Science.gov (United States)

    Xu, Y.; Thomassey, S.; Chen, Y.; Zeng, X.

    2017-10-01

    In this paper, a comprehensive evaluation system is established to assess garment production performance. It is based on performance indicators and supported by the corresponding results obtained through manual calculation or computer simulation. The assembly lines of a typical men’s shirt are taken as the study objects. With the comprehensive evaluation results, garment production arrangement scenarios can be better analysed and the most appropriate one selected for actual production. This provides guidance to companies on quick decision-making and multi-objective optimization of garment production.

  10. PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.

    Science.gov (United States)

    Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier

    2017-11-20

    Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. With its increased popularity, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either due to limited functionality or the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web-service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable to analyze a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
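
    As an illustration of the enrichment/depletion step in such a workflow, the sketch below ranks sgRNAs by the log2 fold change of normalized read counts. It is not PinAPL-Py's actual statistical model; the normalization, pseudocount, and sgRNA names are assumptions made for the example.

```python
import math

def normalize_counts(counts):
    """Scale raw sgRNA read counts to counts per million (one common normalization)."""
    total = sum(counts.values())
    return {sgrna: 1e6 * c / total for sgrna, c in counts.items()}

def enrichment_scores(treatment_counts, control_counts, pseudocount=1.0):
    """Rank sgRNAs by log2 fold change of normalized counts (treatment vs. control)."""
    t, c = normalize_counts(treatment_counts), normalize_counts(control_counts)
    scores = {
        sgrna: math.log2((t.get(sgrna, 0) + pseudocount) / (c.get(sgrna, 0) + pseudocount))
        for sgrna in set(t) | set(c)
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy example with hypothetical sgRNA names and counts
control   = {"sgGeneA_1": 480, "sgGeneA_2": 510, "sgGeneB_1": 495}
treatment = {"sgGeneA_1": 1900, "sgGeneA_2": 1750, "sgGeneB_1": 260}
for sgrna, lfc in enrichment_scores(treatment, control):
    print(sgrna, round(lfc, 2))
```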

  11. Prediction During Natural Language Comprehension.

    Science.gov (United States)

    Willems, Roel M; Frank, Stefan L; Nijhof, Annabel D; Hagoort, Peter; van den Bosch, Antal

    2016-06-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as well as surprisal. A computational model determined entropy and surprisal for each word in 3 literary stories. Twenty-four healthy participants listened to the same 3 stories while their brain activation was measured using fMRI. Reversed speech fragments were presented as a control condition. Brain areas sensitive to entropy were left ventral premotor cortex, left middle frontal gyrus, right inferior frontal gyrus, left inferior parietal lobule, and left supplementary motor area. Areas sensitive to surprisal were left inferior temporal sulcus ("visual word form area"), bilateral superior temporal gyrus, right amygdala, bilateral anterior temporal poles, and right inferior frontal sulcus. We conclude that prediction during language comprehension can occur at several levels of processing, including at the level of word form. Our study exemplifies the power of combining computational linguistics with cognitive neuroscience, and additionally underlines the feasibility of studying continuous spoken language materials with fMRI. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
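
    The two information-theoretic quantities are easy to state concretely. The sketch below computes word entropy and surprisal from a next-word probability distribution; the distribution and context are invented for illustration and do not come from the study's language model.

```python
import math

def surprisal(prob_next_word, word):
    """Surprisal of the word that actually occurred: -log2 P(word | context)."""
    return -math.log2(prob_next_word[word])

def entropy(prob_next_word):
    """Entropy of the next-word distribution before the word is seen: -sum p log2 p."""
    return -sum(p * math.log2(p) for p in prob_next_word.values() if p > 0)

# Hypothetical next-word distribution after the context "the knight drew his ..."
p_next = {"sword": 0.6, "bow": 0.25, "breath": 0.1, "map": 0.05}
print(round(entropy(p_next), 3))              # uncertainty about the upcoming word
print(round(surprisal(p_next, "breath"), 3))  # how unexpected the actual word was
```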

  12. Cloud computing assessing the risks

    CERN Document Server

    Carstensen, Jared; Golden, Bernard

    2012-01-01

    Cloud Computing: Assessing the risks answers these questions and many more. Using jargon-free language and relevant examples, analogies and diagrams, it is an up-to-date, clear and comprehensive guide to the security, governance, risk, and compliance elements of Cloud Computing.

  13. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time-consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure errors and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes, and then returns control to the computer for completion of the analysis

  14. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers for convenience in their daily lives, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  15. Informed consent comprehension in African research settings.

    Science.gov (United States)

    Afolabi, Muhammed O; Okebe, Joseph U; McGrath, Nuala; Larson, Heidi J; Bojang, Kalifa; Chandramohan, Daniel

    2014-06-01

    Previous reviews on participants' comprehension of informed consent information have focused on developed countries. Experience has shown that ethical standards developed on Western values may not be appropriate for African settings where research concepts are unfamiliar. We undertook this review to describe how informed consent comprehension is defined and measured in African research settings. We conducted a comprehensive search involving five electronic databases: Medline, Embase, Global Health, EthxWeb and Bioethics Literature Database (BELIT). We also examined African Index Medicus and Google Scholar for relevant publications on informed consent comprehension in clinical studies conducted in sub-Saharan Africa. 29 studies satisfied the inclusion criteria; meta-analysis was possible in 21 studies. We further conducted a direct comparison of participants' comprehension on domains of informed consent in all eligible studies. Comprehension of key concepts of informed consent varies considerably from country to country and depends on the nature and complexity of the study. Meta-analysis showed that 47% of a total of 1633 participants across four studies demonstrated comprehension about randomisation (95% CI 13.9-80.9%). Similarly, 48% of 3946 participants in six studies had understanding about placebo (95% CI 19.0-77.5%), while only 30% of 753 participants in five studies understood the concept of therapeutic misconception (95% CI 4.6-66.7%). Measurement tools for informed consent comprehension were developed with little or no validation. Assessment of comprehension was carried out at variable times after disclosure of study information. No uniform definition of informed consent comprehension exists to form the basis for development of an appropriate tool to measure comprehension in African participants. Comprehension of key concepts of informed consent is poor among study participants across Africa. There is a vital need to develop a uniform definition for

  16. Improving reading comprehension skills through the SCRATCH program

    Directory of Open Access Journals (Sweden)

    Erdal Papatga

    2016-09-01

    Full Text Available The aim of this study was to reveal how reading comprehension skills of elementary fourth graders who have problems in reading comprehension can be improved by means of the SCRATCH program. The study was designed as a participant action research. It was carried out within a 15-week process at an elementary school with middle socio-economic level in the Eskisehir province in the fall term of the 2015-2016 school year. The participants of the study were eight fourth graders who had problems in reading comprehension and were selected based on the criterion sampling method. Different data gathering tools were employed in different stages of the study. These were the Informal Reading Inventory, readability assessment rubric, participant selection form and identification forms for developmental level in reading comprehension for the quantitative data, and observation notes, a researcher diary, video recordings, teacher and student observation notes, and the projects the students prepared using the SCRATCH program for the qualitative data. In the study, the analysis of the quantitative data was done with correlation analysis, and Kendall W Test that shows inter-rater reliability. In addition, the identification forms for developmental level in reading comprehension were used to reveal the improvement in reading comprehension skills, and the Informal Reading Inventory was employed to score these forms. On the other hand, the qualitative data were analysed through the thematic analysis method, and MAXQDA was used for the analysis. As a result of the analyses, it was found that the reading level of the eight students who had problems in reading comprehension went up from the anxiety level to the instructional level in some forms, and even to the independent reading level in other forms; in other words, there was an improvement in the reading comprehension skills of all eight students.

  17. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  18. Predicting and understanding comprehensive drug-drug interactions via semi-nonnegative matrix factorization.

    Science.gov (United States)

    Yu, Hui; Mao, Kui-Tao; Shi, Jian-Yu; Huang, Hua; Chen, Zhi; Dong, Kai; Yiu, Siu-Ming

    2018-04-11

    Drug-drug interactions (DDIs) always cause unexpected and even adverse drug reactions. It is important to identify DDIs before drugs enter the market. However, preclinical identification of DDIs requires much money and time. Computational approaches have exhibited their abilities to predict potential DDIs on a large scale by utilizing pre-market drug properties (e.g. chemical structure). Nevertheless, none of them can predict the two comprehensive types of DDIs, namely enhancive and degressive DDIs, which increase and decrease the activities of the interacting drugs, respectively. There is also a lack of systematic analysis of the structural relationship among known DDIs. Revealing such a relationship is very important, because it helps to understand how DDIs occur. Both the prediction of comprehensive DDIs and the discovery of the structural relationship among them provide important guidance when making a co-prescription. In this work, treating a set of comprehensive DDIs as a signed network, we design a novel model (DDINMF) for the prediction of enhancive and degressive DDIs based on semi-nonnegative matrix factorization. Encouragingly, DDINMF achieves the conventional DDI prediction (AUROC = 0.872 and AUPR = 0.605) and the comprehensive DDI prediction (AUROC = 0.796 and AUPR = 0.579). Compared with two state-of-the-art approaches, DDINMF shows its superiority. Finally, representing DDIs as a binary network and a signed network respectively, an analysis based on NMF reveals crucial knowledge hidden among DDIs. Our approach is able to predict not only conventional binary DDIs but also comprehensive DDIs. More importantly, it reveals several key points about the DDI network: (1) both binary and signed networks show fairly clear clusters, in which both the drug degree and the difference between positive degree and negative degree show significant distributions; (2) the drugs having large degrees tend to have a larger difference between positive degree
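
    For readers unfamiliar with the factorization itself, the sketch below applies textbook semi-nonnegative matrix factorization (multiplicative updates in the style of Ding et al.) to a toy signed interaction matrix. It is a generic illustration, not the published DDINMF model, and the matrix, rank, and iteration count are assumptions.

```python
import numpy as np

def semi_nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Generic semi-NMF X ~ F @ G.T with G >= 0 and F unconstrained
    (standard multiplicative updates; a sketch, not the published DDINMF model)."""
    rng = np.random.default_rng(seed)
    G = rng.random((X.shape[1], k))
    pos = lambda A: (np.abs(A) + A) / 2            # positive part of a matrix
    neg = lambda A: (np.abs(A) - A) / 2            # negative part of a matrix
    for _ in range(n_iter):
        F = X @ G @ np.linalg.pinv(G.T @ G)        # closed-form update for F
        XtF, FtF = X.T @ F, F.T @ F
        G *= np.sqrt((pos(XtF) + G @ neg(FtF)) / (neg(XtF) + G @ pos(FtF) + eps))
    return F, G

# Toy signed DDI matrix: +1 enhancive, -1 degressive, 0 unknown (hypothetical)
X = np.array([[ 0.,  1., -1.,  1.],
              [ 1.,  0.,  1., -1.],
              [-1.,  1.,  0.,  1.],
              [ 1., -1.,  1.,  0.]])
F, G = semi_nmf(X, k=2)
print(np.round(F @ G.T, 2))   # reconstructed scores; sign/magnitude suggest DDI type
```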

  19. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...

  20. Quantitative characterization of galectin-3-C affinity mass spectrometry measurements: Comprehensive data analysis, obstacles, shortcuts and robustness.

    Science.gov (United States)

    Haramija, Marko; Peter-Katalinić, Jasna

    2017-10-30

    Affinity mass spectrometry (AMS) is an emerging tool in the field of the study of protein•carbohydrate complexes. However, experimental obstacles and data analysis are preventing faster integration of AMS methods into the glycoscience field. Here we show how analysis of direct electrospray ionization mass spectrometry (ESI-MS) AMS data can be simplified for screening purposes, even for complex AMS spectra. A direct ESI-MS assay was tested in this study and binding data for the galectin-3C•lactose complex were analyzed using a comprehensive and simplified data analysis approach. In the comprehensive data analysis approach, noise, all protein charge states, alkali ion adducts and signal overlap were taken into account. In a simplified approach, only the intensities of the fully protonated free protein and the protein•carbohydrate complex for the main protein charge state were taken into account. In our study, for high intensity signals, noise was negligible, sodiated protein and sodiated complex signals cancelled each other out when calculating the Kd value, and signal overlap influenced the Kd value only to a minor extent. Influence of these parameters on low intensity signals was much higher. However, low intensity protein charge states should be avoided in quantitative AMS analyses due to poor ion statistics. The results indicate that noise, alkali ion adducts, signal overlap, as well as low intensity protein charge states, can be neglected for preliminary experiments, as well as in screening assays. One comprehensive data analysis performed as a control should be sufficient to validate this hypothesis for other binding systems as well. Copyright © 2017 John Wiley & Sons, Ltd.
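
    The simplified data-analysis approach, which uses only the free-protein and complex intensities of the main charge state, reduces to a short calculation. The sketch below assumes the standard direct ESI-MS premise that the intensity ratio approximates the bound-to-free concentration ratio; the intensities and concentrations are hypothetical, not values from the study.

```python
def kd_from_intensity_ratio(i_complex, i_free, p0, l0):
    """Estimate Kd (same units as p0, l0) from one direct ESI-MS spectrum.
    Assumes equal response factors so that I(PL)/I(P) ~ [PL]/[P]."""
    r = i_complex / i_free                 # bound-to-free ratio
    bound = r / (1.0 + r) * p0             # [PL] from the protein mass balance
    free_ligand = l0 - bound               # [L] from the ligand mass balance
    return free_ligand / r                 # Kd = [P][L]/[PL] = [L]/R

# Hypothetical run: 10 uM galectin-3C, 50 uM lactose, main-charge-state intensities
print(round(kd_from_intensity_ratio(i_complex=3.2e5, i_free=4.8e5, p0=10.0, l0=50.0), 1))
```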

  1. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, average, cross- and directional semivariograms, simple kriging estimates, and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system was useful for the geostatistical analysis process, avoiding the manipulation of the computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interaction, functionality rarely available in similar programs. Given its capacity for quick prototyping and its simplicity when incorporating related routines, the Delphi environment presents the main advantage of permitting the evolution of this system.
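
    As a pointer to what such a system computes, the sketch below evaluates a classical empirical semivariogram on a toy transect. It mirrors the general approach rather than the Delphi program's own routines; the coordinates and soil-carbon values are invented.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) estimator: gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2
    over pairs whose separation falls within tol of lag h."""
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)            # consider each pair once
    gamma = []
    for h in lags:
        mask = np.abs(d[iu] - h) <= tol
        gamma.append(sq[iu][mask].mean() / 2 if mask.any() else np.nan)
    return np.array(gamma)

# Toy soil-carbon values on a small transect (hypothetical data)
coords = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
carbon = [1.2, 1.3, 1.7, 1.9, 2.4]
print(np.round(empirical_semivariogram(coords, carbon, lags=[1, 2, 3], tol=0.5), 3))
```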

  2. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments. Each of these linear increments is terminated by either formation of plastic conditions or closing/opening of gaps. The stiffness matrix is modified to reflect the changed stiffness characteristics of the system, and the analysis is restarted using the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)
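
    The piecewise-linear idea can be illustrated on a single degree of freedom: a lumped mass driven into a gapped, elastic-perfectly-plastic restraint, with the restraint force switching when the gap closes or a yield hinge forms. This is only a sketch of the concept with hypothetical parameters, not the multi-degree-of-freedom finite element model described above.

```python
import numpy as np

def pipe_whip_sdof(m=50.0, k=2.0e6, f_yield=2.0e5, gap=0.05,
                   f_blowdown=1.0e5, dt=1.0e-4, t_end=0.05):
    """Explicit time stepping of a lumped mass driven by a constant blowdown force
    into a gapped, elastic-perfectly-plastic restraint. A single-degree-of-freedom
    sketch with hypothetical parameters, not the full finite element model."""
    n = int(t_end / dt)
    u = np.zeros(n)
    v, x_plastic = 0.0, 0.0
    for i in range(1, n):
        pen = u[i - 1] - gap - x_plastic              # elastic penetration past the gap
        if pen <= 0.0:
            f_restraint = 0.0                         # gap open: no restraint force
        elif k * pen < f_yield:
            f_restraint = k * pen                     # linear elastic segment
        else:
            f_restraint = f_yield                     # yield hinge formed: force capped
            x_plastic = u[i - 1] - gap - f_yield / k  # accumulate the plastic set
        a = (f_blowdown - f_restraint) / m
        v += a * dt
        u[i] = u[i - 1] + v * dt
    return u

print(round(pipe_whip_sdof().max(), 4))   # peak pipe displacement at the restraint
```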

  3. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat

    2014-01-01

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements conducted for Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by University of Pisa. Based on this methodology, the use of Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper shall describe the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for an Integral Test Facility of LOBI-MOD2 and associated Test A1-83. Data and information from various reports and drawings were referred to in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in a single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and finally assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limited resources. Some possible improvements are suggested to overcome these challenges

  4. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my [Nuclear Energy Department, Tenaga Nasional Berhad, Level 32, Dua Sentral, 50470 Kuala Lumpur (Malaysia); Roslan, Ridha [Nuclear Installation Division, Atomic Energy Licensing Board, Batu 24, Jalan Dengkil, 43800 Dengkil, Selangor (Malaysia); Ibrahim, Mohd Rizal Mamat [Technical Support Division, Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements conducted for Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied for the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by University of Pisa. Based on this methodology, the use of Reference Data Set (RDS) as a pre-requisite reference document for developing input nodalization was proposed. This paper shall describe the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for an Integral Test Facility of LOBI-MOD2 and associated Test A1-83. Data and information from various reports and drawings were referred to in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in a single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and finally assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limited resources. Some possible improvements are suggested to overcome these challenges.

  5. READING COMPREHENSION EXERCISES ONLINE: THE EFFECTS OF FEEDBACK, PROFICIENCY AND INTERACTION

    Directory of Open Access Journals (Sweden)

    Philip Murphy

    2007-02-01

    Full Text Available This paper describes an ongoing project to create an online version of a reading programme, a custom-designed English language proficiency course at a university in Japan. Following an interactionist view of second language acquisition, it was hypothesised that comprehension of a reading passage could be enhanced by online materials promoting interaction between students as they completed a multiple-choice reading comprehension exercise. Interaction was promoted: (a) through pair work at a single computer and (b) by providing Elaborative feedback in the form of hints about incorrect answers as a means of stimulating discussion about corrections. Students were randomly selected from upper and lower levels of English proficiency, as determined by the Kanda English Proficiency Test (Bonk & Ockey, 2003), to receive either Elaborative feedback or Knowledge of Correct Response feedback (which supplies the correct answers). Within these groups, some students worked in pairs and some alone. Quantitative results show that the interaction between Type of feedback and Manner of study (individual or pair work) was statistically significant; students performed best on a follow-up comprehension exercise when in pairs and having been provided with Elaborative feedback. Furthermore, qualitative analysis of transcribed interactions also shows that Elaborative feedback was conducive to quality interaction.

  6. A Comprehensive Evaluation Rubric for Assessing Instructional Apps

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Lee

    2015-01-01

    Full Text Available There is a pressing need for an evaluation rubric that examines all aspects of educational apps designed for instructional purposes. In past decades, many rubrics have been developed for evaluating educational computer-based programs; however, rubrics designed for evaluating the instructional implications of educational apps are scarce. When an Internet search for existing rubrics was conducted, only two such rubrics were found, and the evaluation criteria used in those rubrics were not clearly linked to previously conducted research, nor were their evaluative dimensions clearly defined. These shortcomings result in reviewers being unable to use those rubrics to provide teachers with a precise analysis of an educational app’s instructional potential. In response, this paper presents a comprehensive rubric with 24 evaluative dimensions tailored specifically to analyze the educational potential of instructional apps.

  7. WASTE-ACC: A computer model for analysis of waste management accidents

    International Nuclear Information System (INIS)

    Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

    1996-12-01

    In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives

  8. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  9. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  10. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…

  11. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  12. Instruction of Research-Based Comprehension Strategies in Basal Reading Programs

    Science.gov (United States)

    Pilonieta, Paola

    2010-01-01

    Research supports using research-based comprehension strategies; however, comprehension strategy instruction is not highly visible in basal reading programs or classroom instruction, resulting in many students who struggle with comprehension. A content analysis examined which research-based comprehension strategies were presented in five…

  13. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
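
    The tutorial's examples are given in MATLAB and R; the following is an analogous Python sketch of an embarrassingly parallel Monte Carlo risk simulation, with a toy loss model and threshold chosen purely for illustration.

```python
import multiprocessing as mp
import random

def one_replication(seed):
    """One independent simulation replication (toy risk model: annual loss exceedance)."""
    rng = random.Random(seed)
    losses = sum(rng.expovariate(1 / 10.0) for _ in range(rng.randint(0, 5)))
    return losses > 25.0                      # indicator of exceeding a loss threshold

if __name__ == "__main__":
    n = 100_000
    with mp.Pool() as pool:                   # embarrassingly parallel: no communication
        hits = pool.map(one_replication, range(n), chunksize=1_000)
    print(sum(hits) / n)                      # Monte Carlo estimate of exceedance probability
```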

  14. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  15. Comprehensive evaluation of coal-fired power plants based on grey relational analysis and analytic hierarchy process

    International Nuclear Information System (INIS)

    Xu Gang; Yang Yongping; Lu Shiyuan; Li Le; Song Xiaona

    2011-01-01

    In China, coal-fired power plants are the main supplier of electricity, as well as the largest consumer of coal and water resources and the biggest emitter of SOx, NOx, and greenhouse gases (GHGs). Therefore, it is important to establish a scientific, reasonable, and feasible comprehensive evaluation system for coal-fired power plants to guide them in achieving multi-optimisation of their thermal, environmental, and economic performance. This paper proposes a novel comprehensive evaluation method, which is based on a combination of the grey relational analysis (GRA) and the analytic hierarchy process (AHP), to assess the multi-objective performance of power plants. Unlike the traditional evaluation method that uses coal consumption as a basic indicator, the proposed evaluation method also takes water consumption and pollutant emissions as indicators. On the basis of the proposed evaluation method, a case study on typical 600 MW coal-fired power plants is carried out to determine the relevancy rules among factors including the coal consumption, water consumption, pollutant, and GHG emissions of power plants. This research offers new ideas and methods for the comprehensive performance evaluation of complex energy utilisation systems, and is beneficial to the synthesised consideration of resources, economy, and environment factors in system optimising and policy making. - Research highlights: → We proposed a comprehensive evaluation method for coal-fired power plants. → The method is based on the grey relational analysis (GRA). → The method also introduces the idea of the analytic hierarchy process (AHP). → The method can assess thermal, economic and environmental performance. → The method can play an active role in guiding power plants' improvements.

  16. Comprehensive evaluation of coal-fired power plants based on grey relational analysis and analytic hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Xu Gang, E-mail: xg2008@ncepu.edu.c [Key Lab of Condition Monitoring and Control for Power Plant Equipment of Ministry of Education, School of Energy Power and Mechanical Engineering, North China Electric Power University, Beijing 102206 (China); Yang Yongping, E-mail: yyp@ncepu.edu.c [Key Lab of Condition Monitoring and Control for Power Plant Equipment of Ministry of Education, School of Energy Power and Mechanical Engineering, North China Electric Power University, Beijing 102206 (China); Lu Shiyuan; Li Le [Key Lab of Condition Monitoring and Control for Power Plant Equipment of Ministry of Education, School of Energy Power and Mechanical Engineering, North China Electric Power University, Beijing 102206 (China); Song Xiaona [Electromechanical Practice Center, Beijing Information Science and Technology University, Beijing (China)

    2011-05-15

    In China, coal-fired power plants are the main supplier of electricity, as well as the largest consumer of coal and water resources and the biggest emitter of SOx, NOx, and greenhouse gases (GHGs). Therefore, it is important to establish a scientific, reasonable, and feasible comprehensive evaluation system for coal-fired power plants to guide them in achieving multi-optimisation of their thermal, environmental, and economic performance. This paper proposes a novel comprehensive evaluation method, which is based on a combination of the grey relational analysis (GRA) and the analytic hierarchy process (AHP), to assess the multi-objective performance of power plants. Unlike the traditional evaluation method that uses coal consumption as a basic indicator, the proposed evaluation method also takes water consumption and pollutant emissions as indicators. On the basis of the proposed evaluation method, a case study on typical 600 MW coal-fired power plants is carried out to determine the relevancy rules among factors including the coal consumption, water consumption, pollutant, and GHG emissions of power plants. This research offers new ideas and methods for the comprehensive performance evaluation of complex energy utilisation systems, and is beneficial to the synthesised consideration of resources, economy, and environment factors in system optimising and policy making. - Research highlights: → We proposed a comprehensive evaluation method for coal-fired power plants. → The method is based on the grey relational analysis (GRA). → The method also introduces the idea of the analytic hierarchy process (AHP). → The method can assess thermal, economic and environmental performance. → The method can play an active role in guiding power plants' improvements.
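
    To make the GRA step concrete, the sketch below normalizes cost-type indicators, compares each plant to an ideal reference series, and aggregates the grey relational coefficients with AHP-style weights. The indicator values, weights, and resolution coefficient are hypothetical, and the code is a generic GRA illustration, not the authors' evaluation system.

```python
import numpy as np

def grey_relational_grades(X, weights, benefit, rho=0.5):
    """Grey relational analysis: normalize indicators, compare each plant to an ideal
    reference series (all ones), and aggregate the coefficients with AHP-style weights.
    A generic GRA sketch; the indicator values and weights below are hypothetical."""
    X = np.asarray(X, float)
    # normalize: larger-is-better for benefit indicators, smaller-is-better otherwise
    norm = np.where(benefit,
                    (X - X.min(0)) / (X.max(0) - X.min(0)),
                    (X.max(0) - X) / (X.max(0) - X.min(0)))
    delta = np.abs(1.0 - norm)                          # distance to the ideal series
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return xi @ weights                                 # weighted grey relational grade

# Rows: plants; columns: coal use, water use, SOx+NOx emissions, GHG emissions (lower is better)
plants = [[305, 1.9, 1.1, 820],
          [298, 2.1, 0.9, 805],
          [312, 1.7, 1.3, 840]]
weights = np.array([0.4, 0.2, 0.2, 0.2])                # e.g., from an AHP pairwise comparison
print(np.round(grey_relational_grades(plants, weights, benefit=[False] * 4), 3))
```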

  17. Computer aided analysis of disturbances

    International Nuclear Information System (INIS)

    Baldeweg, F.; Lindner, A.

    1986-01-01

    Computer aided analysis of disturbances and the prevention of failures (diagnosis and therapy control) in technological plants belong to the most important tasks of process control. Research in this field is very intensive due to increasing requirements for the security and economy of process control and due to a remarkable increase in the efficiency of digital electronics. This publication is concerned with the analysis of disturbances in complex technological plants, especially in so-called high-risk processes. The presentation emphasizes the theoretical concept of diagnosis and therapy control, modelling of the disturbance behaviour of the technological process, and man-machine communication integrating artificial intelligence methods, e.g., the expert system approach. An application is given for nuclear power plants. (author)

  18. Computer-aided diagnosis in chest radiology.

    Science.gov (United States)

    MacMahon, H; Doi, K; Chan, H P; Giger, M L; Katsuragawa, S; Nakamori, N

    1990-01-01

    Digital radiography offers several important advantages over conventional systems, including abilities for image manipulation, transmission, and storage. In the long term, however, the unique ability to apply artificial intelligence techniques for automated detection and quantitation of disease may have an even greater impact on radiologic practice. Although CAD is still in its infancy, the results of several recent studies clearly indicate a major potential for the future. The concept of using computers to analyze medical images is not new, but recent advances in computer technology together with progress in implementing practical digital radiography systems have stimulated research efforts in this exciting field. Several facets of CAD are presently being developed at the University of Chicago and elsewhere for application in chest radiology as well as in mammography and vascular imaging. To date, investigators have focused on a limited number of subjects that have been, by their nature, particularly suitable for computer analysis. There is no aspect of radiologic diagnosis that could not potentially benefit from this approach, however. The ultimate goal of these endeavors is to provide a system for comprehensive automated image analysis, the results of which could be accepted or modified at the discretion of the radiologist.

  19. A Comprehensive Analysis of Marketing Journal Rankings

    Science.gov (United States)

    Steward, Michelle D.; Lewis, Bruce R.

    2010-01-01

    The purpose of this study is to offer a comprehensive assessment of journal standings in Marketing from two perspectives. The discipline perspective of rankings is obtained from a collection of published journal ranking studies during the past 15 years. The studies in the published ranking stream are assessed for reliability by examining internal…

  20. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
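
    A worked contrast between the two formats may help. The sketch below computes the same posterior once from normalized probabilities and once as natural-frequency counts, using classic illustrative numbers (1% base rate, 80% hit rate, 9.6% false-alarm rate); these numbers are not taken from the article.

```python
def posterior_from_probabilities(base_rate, hit_rate, false_alarm_rate):
    """Bayes' theorem in the normalized (probability) format."""
    return (base_rate * hit_rate) / (
        base_rate * hit_rate + (1 - base_rate) * false_alarm_rate)

def posterior_from_natural_frequencies(population, base_rate, hit_rate, false_alarm_rate):
    """The same computation phrased as counts of cases, as in natural-frequency problems."""
    sick = round(population * base_rate)
    sick_positive = round(sick * hit_rate)
    healthy_positive = round((population - sick) * false_alarm_rate)
    return sick_positive, sick_positive + healthy_positive   # "8 of the 103 who test positive"

# Classic illustrative numbers (not taken from the article), population of 1000
print(round(posterior_from_probabilities(0.01, 0.80, 0.096), 3))    # ~0.078
print(posterior_from_natural_frequencies(1000, 0.01, 0.80, 0.096))  # (8, 103)
```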

  1. Comprehensive analysis of temporal alterations in cellular proteome of Bacillus subtilis under curcumin treatment.

    Directory of Open Access Journals (Sweden)

    Panga Jaipal Reddy

    Full Text Available Curcumin is a natural dietary compound with antimicrobial activity against various gram-positive and gram-negative bacteria. This study aims to investigate the proteome-level alterations in Bacillus subtilis due to curcumin treatment and identification of its molecular/cellular targets to understand the mechanism of action. We have performed a comprehensive proteomic analysis of B. subtilis AH75 strain at different time intervals of curcumin treatment (20, 60 and 120 min after the drug exposure; three replicates) to compare the protein expression profiles using two complementary quantitative proteomic techniques, 2D-DIGE and iTRAQ. To the best of our knowledge, this is the first comprehensive longitudinal investigation describing the effect of curcumin treatment on the B. subtilis proteome. The proteomics analysis revealed several interesting targets such as UDP-N-acetylglucosamine 1-carboxyvinyltransferase 1, putative septation protein SpoVG and ATP-dependent Clp protease proteolytic subunit. Further, in silico pathway analysis using DAVID and KOBAS has revealed modulation of pathways related to the fatty acid metabolism and cell wall synthesis, which are crucial for cell viability. Our findings revealed that curcumin treatment led to inhibition of the cell wall and fatty acid synthesis in addition to differential expression of many crucial proteins involved in modulation of bacterial metabolism. Findings obtained from proteomics analysis were further validated using the 5-cyano-2,3-ditolyl tetrazolium chloride (CTC) assay for respiratory activity, the resazurin assay for metabolic activity, and a membrane integrity assay based on potassium and inorganic phosphate leakage measurements. The gene expression analysis of selected cell wall biosynthesis enzymes has strengthened the proteomics findings and indicated the major effect of curcumin on cell division.

  2. A Comprehensive Review on Handcrafted and Learning-Based Action Representation Approaches for Human Activity Recognition

    Directory of Open Access Journals (Sweden)

    Allah Bux Sargano

    2017-01-01

    Full Text Available Human activity recognition (HAR) is an important research area in the fields of human perception and computer vision due to its wide range of applications. These applications include: intelligent video surveillance, ambient assisted living, human computer interaction, human-robot interaction, entertainment, and intelligent driving. Recently, with the emergence and successful deployment of deep learning techniques for image classification, researchers have migrated from traditional handcrafted representations to deep learning techniques for HAR. However, handcrafted representation-based approaches are still widely used due to some bottlenecks such as the computational complexity of deep learning techniques for activity recognition. At the same time, approaches based on handcrafted representations are not able to handle complex scenarios due to their limitations and incapability; therefore, resorting to deep learning-based techniques is a natural option. This review paper presents a comprehensive survey of both handcrafted and learning-based action representations, offering comparison, analysis, and discussions on these approaches. In addition to this, the well-known public datasets available for experimentations and important applications of HAR are also presented to provide further insight into the field. This is the first review paper of its kind which presents all these aspects of HAR in a single review article with comprehensive coverage of each part. Finally, the paper is concluded with important discussions and research directions in the domain of HAR.

  3. NGScloud: RNA-seq analysis of non-model species using cloud computing.

    Science.gov (United States)

    Mora-Márquez, Fernando; Vázquez-Poletti, José Luis; López de Heredia, Unai

    2018-05-03

    RNA-seq analysis usually requires large computing infrastructures. NGScloud is a bioinformatic system developed to analyze RNA-seq data using the cloud computing services of Amazon, which permit access to an ad hoc computing infrastructure scaled according to the complexity of the experiment, so that its costs and run times can be optimized. The application provides a user-friendly front-end to operate Amazon's hardware resources, and to control a workflow of RNA-seq analysis oriented to non-model species, incorporating the cluster concept, which allows parallel runs of common RNA-seq analysis programs in several virtual machines for faster analysis. NGScloud is freely available at https://github.com/GGFHF/NGScloud/. A manual detailing installation and how-to-use instructions is available with the distribution. unai.lopezdeheredia@upm.es.

  4. Single neuron computation

    CERN Document Server

    McKenna, Thomas M; Zornetzer, Steven F

    1992-01-01

    This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real n

  5. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolute nothing about computer science that can not be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  6. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb⁻¹ of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  7. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate with computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol's indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
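
    For orientation, the sketch below estimates first-order Sobol' indices with a standard Monte Carlo pick-and-freeze estimator on a toy linear model. It illustrates the indices themselves, not the joint GLM/GAM metamodeling strategy proposed in the paper, and the sample size and model are assumptions.

```python
import numpy as np

def sobol_first_order(model, d, n=20_000, seed=0):
    """Monte Carlo pick-and-freeze estimate of first-order Sobol' indices
    S_i = Var(E[Y|X_i]) / Var(Y) for a model with d independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # freeze all inputs except X_i
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

# Toy model: the first input dominates, the third is inert
model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2]
print(np.round(sobol_first_order(model, d=3), 2))   # roughly [0.94, 0.06, 0.00]
```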

  8. A computational fluid dynamics analysis on stratified scavenging system of medium capacity two-stroke internal combustion engines

    Directory of Open Access Journals (Sweden)

    Pitta Srinivasa Rao

    2008-01-01

    Full Text Available The main objective of the present work is to make a computational study of the stratified scavenging system in medium-capacity two-stroke engines to reduce or curb the emissions from two-stroke engines. The 3-D flows within the cylinder are simulated using computational fluid dynamics and the code Fluent 6. Flow structures in the transfer ports and the exhaust port are predicted without and with stratification, and are well predicted. The total pressure and velocity maps from the computation provide comprehensive information on the scavenging and stratification phenomena. The analysis is carried out for the flow through the transfer ports, the extra port within the transfer port, and the exhaust port as the piston moves from the top dead center to the bottom dead center, with the ports closed, half open, three-fourths open, and fully open. An unstructured mesh is adopted for the geometry created in the CATIA software. The flow is simulated by solving the governing equations, namely conservation of mass, momentum, and energy, using the SIMPLE algorithm. Turbulence is modeled by the high-Reynolds-number version of the k-ε model. Experimental measurements are made to validate the numerical predictions. Good agreement is observed between the predicted results and the experimental data; the stratification significantly reduced the emissions and improved fuel economy.

  9. Computer Programme for the Dynamic Analysis of Tall Regular ...

    African Journals Online (AJOL)

    The traditional method of dynamic analysis of tall rigid frames assumes the shear frame model. Models that allow joint rotations with/without the inclusion of the column axial loads give improved results but pose much more computational difficulty. In this work a computer program Natfrequency that determines the dynamic ...

  10. COALA--A Computational System for Interlanguage Analysis.

    Science.gov (United States)

    Pienemann, Manfred

    1992-01-01

    Describes a linguistic analysis computational system that responds to highly complex queries about morphosyntactic and semantic structures contained in large sets of language acquisition data by identifying, displaying, and analyzing sentences that meet the defined linguistic criteria. (30 references) (Author/CB)

  11. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. At the end, the expected evolution and future plans are outlined.

  12. Computational Aspects of Dam Risk Analysis: Findings and Challenges

    Directory of Open Access Journals (Sweden)

    Ignacio Escuder-Bueno

    2016-09-01

    Full Text Available In recent years, risk analysis techniques have proved to be a useful tool to inform dam safety management. This paper summarizes the outcomes of three themes related to dam risk analysis discussed in the Benchmark Workshops organized by the International Commission on Large Dams Technical Committee on “Computational Aspects of Analysis and Design of Dams.” In the 2011 Benchmark Workshop, estimation of the probability of failure of a gravity dam for the sliding failure mode was discussed. Next, in 2013, the discussion focused on the computational challenges of the estimation of consequences in dam risk analysis. Finally, in 2015, the probability of sliding and overtopping in an embankment was analyzed. These Benchmark Workshops have allowed a complete review of numerical aspects for dam risk analysis, showing that risk analysis methods are a very useful tool to analyze the risk of dam systems, including downstream consequence assessments and the uncertainty of structural models.

  13. Reading comprehension and its underlying components in second-language learners: A meta-analysis of studies comparing first- and second-language learners.

    Science.gov (United States)

    Melby-Lervåg, Monica; Lervåg, Arne

    2014-03-01

    We report a systematic meta-analytic review of studies comparing reading comprehension and its underlying components (language comprehension, decoding, and phonological awareness) in first- and second-language learners. The review included 82 studies, and 576 effect sizes were calculated for reading comprehension and underlying components. Key findings were that, compared to first-language learners, second-language learners display a medium-sized deficit in reading comprehension (pooled effect size d = -0.62), a large deficit in language comprehension (pooled effect size d = -1.12), but only small differences in phonological awareness (pooled effect size d = -0.08) and decoding (pooled effect size d = -0.12). A moderator analysis showed that characteristics related to the type of reading comprehension test reliably explained the variation in the differences in reading comprehension between first- and second-language learners. For language comprehension, studies of samples from low socioeconomic backgrounds and samples where only the first language was used at home generated the largest group differences in favor of first-language learners. Test characteristics and study origin reliably contributed to the variations between the studies of language comprehension. For decoding, Canadian studies showed group differences in favor of second-language learners, whereas the opposite was the case for U.S. studies. Regarding implications, unless specific decoding problems are detected, interventions that aim to ameliorate reading comprehension problems among second-language learners should focus on language comprehension skills.
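
    To make the pooled effect sizes reported above concrete, here is a minimal sketch of how a random-effects pooled Cohen's d can be computed from per-study effect sizes and group sizes, using the DerSimonian-Laird estimator. The numbers in the example call are invented for illustration; this is not the authors' analysis code.

```python
def pooled_effect_size(studies):
    """studies: list of (d, n1, n2) tuples; returns (fixed_d, random_d, tau2)."""
    # Within-study variance of Cohen's d (large-sample approximation).
    v = [(n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2)) for d, n1, n2 in studies]
    w = [1.0 / vi for vi in v]
    d_vals = [d for d, _, _ in studies]

    # Fixed-effect (inverse-variance) pooled estimate.
    fixed = sum(wi * di for wi, di in zip(w, d_vals)) / sum(w)

    # DerSimonian-Laird estimate of between-study variance tau^2.
    q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, d_vals))
    df = len(studies) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0

    # Random-effects pooled estimate with tau^2 added to each study's variance.
    w_star = [1.0 / (vi + tau2) for vi in v]
    random_effects = sum(wi * di for wi, di in zip(w_star, d_vals)) / sum(w_star)
    return fixed, random_effects, tau2

# Hypothetical studies: (Cohen's d, n first-language group, n second-language group)
print(pooled_effect_size([(-0.55, 60, 55), (-0.70, 120, 110), (-0.48, 45, 50)]))
```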

  14. Computational analysis of cerebral cortex

    Energy Technology Data Exchange (ETDEWEB)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Tokyo (Japan)

    2010-08-15

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  15. Computational analysis of cerebral cortex

    International Nuclear Information System (INIS)

    Takao, Hidemasa; Abe, Osamu; Ohtomo, Kuni

    2010-01-01

    Magnetic resonance imaging (MRI) has been used in many in vivo anatomical studies of the brain. Computational neuroanatomy is an expanding field of research, and a number of automated, unbiased, objective techniques have been developed to characterize structural changes in the brain using structural MRI without the need for time-consuming manual measurements. Voxel-based morphometry is one of the most widely used automated techniques to examine patterns of brain changes. Cortical thickness analysis is also becoming increasingly used as a tool for the study of cortical anatomy. Both techniques can be relatively easily used with freely available software packages. MRI data quality is important in order for the processed data to be accurate. In this review, we describe MRI data acquisition and preprocessing for morphometric analysis of the brain and present a brief summary of voxel-based morphometry and cortical thickness analysis. (orig.)

  16. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the availability of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context in computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences among various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, more quantitative approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231

  17. The comprehension of mathematic problems in primary school

    Directory of Open Access Journals (Sweden)

    Karel Pérez Ariza

    2015-05-01

    Full Text Available The paper describes the results of the research project “A study of causes of difficulties in learning comprehension from an interdisciplinary perspective in Camagüey”. The main objective of that study is to propose a methodology for the comprehension of mathematical problems in primary school. In designing the methodology, the characteristics of this text variety and the basic principles of the theory of reading comprehension and problem solving were taken into account. Several theoretical methods were used in this research work (analysis-synthesis, historical-logical, inductive-deductive) to elaborate the theoretical framework, while modeling and the system approach were used in constructing the methodology. Additionally, empirical methods, among them observation and analysis of activity results, were used to assess knowledge about the comprehension of mathematical problems.

  18. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  19. Quality Risk Evaluation of the Food Supply Chain Using a Fuzzy Comprehensive Evaluation Model and Failure Mode, Effects, and Criticality Analysis

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2018-01-01

    Full Text Available Evaluating the quality risk level in the food supply chain can reduce quality information asymmetry and food quality incidents and promote nationally integrated regulations for food quality. In order to evaluate it, a quality risk evaluation indicator system for the food supply chain is constructed based on an extensive literature review in this paper. Furthermore, a mathematical model based on the fuzzy comprehensive evaluation model (FCEM and failure mode, effects, and criticality analysis (FMECA for evaluating the quality risk level in the food supply chain is developed. A computational experiment aimed at verifying the effectiveness and feasibility of this proposed model is conducted on the basis of a questionnaire survey. The results suggest that this model can be used as a general guideline to assess the quality risk level in the food supply chain and achieve the most important objective of providing a reference for the public and private sectors when making decisions on food quality management.
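
    The core step of a fuzzy comprehensive evaluation of the kind described above is the composition of a criteria weight vector with a fuzzy membership matrix, followed by a maximum-membership decision. The sketch below illustrates that step only; the indicator weights, membership degrees, and risk grades are hypothetical and do not come from the paper's questionnaire data.

```python
import numpy as np

# Risk grades for the evaluation (hypothetical labels).
GRADES = ["low", "moderate", "high", "critical"]

def fuzzy_comprehensive_evaluation(weights, membership):
    """weights: (n_indicators,) summing to 1; membership: (n_indicators, n_grades),
    each row giving an indicator's membership degrees over the grades.
    Returns the fuzzy evaluation vector and the grade with maximum membership."""
    w = np.asarray(weights, dtype=float)
    r = np.asarray(membership, dtype=float)
    b = w @ r                      # weighted composition B = W . R
    b = b / b.sum()                # normalize the evaluation vector
    return b, GRADES[int(np.argmax(b))]

# Hypothetical example: three quality-risk indicators scored against four grades.
weights = [0.5, 0.3, 0.2]
membership = [
    [0.1, 0.4, 0.4, 0.1],   # e.g. supplier quality risk
    [0.3, 0.5, 0.2, 0.0],   # e.g. cold-chain/logistics risk
    [0.2, 0.3, 0.3, 0.2],   # e.g. regulatory compliance risk
]
vector, grade = fuzzy_comprehensive_evaluation(weights, membership)
print(vector, grade)
```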

  20. Comprehensive analysis of shielding effectiveness for HDPE, BPE and concrete as candidate materials for neutron shielding

    International Nuclear Information System (INIS)

    Dhang, Prosenjit; Verma, Rishi; Shyam, Anurag

    2015-01-01

    In a compact accelerator-based DD neutron generator, the deuterium ions generated by the ion source are accelerated after extraction and bombard a deuterated titanium target. The emitted neutrons have a typical energy of ∼2.45 MeV. Utilization of these compact accelerator-based neutron generators, with yields of up to 10⁹ neutrons/second (DD), is under active consideration in many research laboratories for conducting active neutron interrogation experiments. An adequately shielded laboratory is mandatory for the effective and safe utilization of these generators for their intended applications. In this context, we report a comprehensive analysis of the shielding effectiveness of High Density Polyethylene (HDPE), Borated Polyethylene (BPE) and concrete as candidate materials for neutron shielding. In the shielding calculations, the neutron-induced scattering and absorption gamma dose has been considered along with the neutron dose. In general, materials with a higher hydrogen concentration are best suited for neutron shielding, but the choice of shielding material is also driven by practical issues such as economic viability and the available space. Our computational analysis shows that using BPE sheets results in the minimum wall thickness required to attain a similar attenuation of the neutron and gamma dose. The added advantage of borated polyethylene is that it reduces both the neutron and gamma dose by absorbing neutrons and producing lithium and alpha particles. It has also been found that, in determining the optimum thickness of any shielding material, three important factors must be considered: the use factor, the occupancy factor and the workload factor. (author)

  1. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  2. A Principal Component Analysis/Fuzzy Comprehensive Evaluation for Rockburst Potential in Kimberlite

    Science.gov (United States)

    Pu, Yuanyuan; Apel, Derek; Xu, Huawei

    2018-02-01

    Kimberlite is an igneous rock which sometimes bears diamonds. Most of the diamonds mined in the world today are found in kimberlite ores. Burst potential in kimberlite has not been investigated, because kimberlite is mostly mined using open-pit mining, which poses very little threat of rock bursting. However, as the mining depth keeps increasing, mines convert to underground mining methods, which can pose a threat of rock bursting in kimberlite. This paper focuses on the burst potential of kimberlite at a diamond mine in northern Canada. A combined model using the methods of principal component analysis (PCA) and fuzzy comprehensive evaluation (FCE) is developed to process data from 12 different locations in kimberlite pipes. Based on the 12 calculated fuzzy evaluation vectors, 8 locations show a moderate burst potential, 2 locations show no burst potential, and 2 locations show strong and violent burst potential, respectively. Using statistical principles, a Mahalanobis distance is adopted to build a comprehensive fuzzy evaluation vector for the whole mine, and the final evaluation of the burst potential is moderate, which is verified by a practical rockbursting situation at the mine site.
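
    A Mahalanobis distance of the kind used above to aggregate the location-level evaluation vectors into a mine-wide vector can be computed as sketched below. The example vectors are invented; only the distance computation itself is illustrated, not the paper's full PCA/FCE model.

```python
import numpy as np

def mahalanobis_distances(vectors):
    """vectors: (n_samples, n_features) array of fuzzy evaluation vectors.
    Returns each sample's Mahalanobis distance from the sample mean."""
    x = np.asarray(vectors, dtype=float)
    diffs = x - x.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))   # pseudo-inverse guards against singular covariance
    quad = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)
    return np.sqrt(np.clip(quad, 0.0, None))

# Hypothetical fuzzy evaluation vectors (rows) over four burst-potential grades.
vectors = [
    [0.10, 0.55, 0.25, 0.10],
    [0.05, 0.60, 0.25, 0.10],
    [0.40, 0.40, 0.15, 0.05],
    [0.02, 0.18, 0.45, 0.35],
]
print(mahalanobis_distances(vectors))
```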

  3. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow exhaustive monitoring of user processes at the site and an efficient support system in case of problems. We report the results of the tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the consortium.

  4. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow exhaustive monitoring of user processes at the site and an efficient support system in case of problems. We report the results of the tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the consortium.

  5. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  6. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  7. Report--COMOLA: A Computer System for the Analysis of Interlanguage Data.

    Science.gov (United States)

    Jagtman, Margriet; Bongaerts, Theo

    1994-01-01

    Discusses the design and use of the Computer Model for Language Acquisition (COMOLA), a computer program designed to analyze syntactic development in second-language learners by examining their oral utterances. Also compares COMOLA to the recently developed Computer-Aided Linguistic Analysis (COALA) program. (MDM)

  8. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  9. Computer System Analysis for Decommissioning Management of Nuclear Reactor

    International Nuclear Information System (INIS)

    Nurokhim; Sumarbagiono

    2008-01-01

    Nuclear reactor decommissioning is a complex activity that should be planned and implemented carefully. A computer-based system needs to be developed to support nuclear reactor decommissioning. Several computer systems for the management of nuclear power reactors have been studied. The software systems COSMARD and DEXUS, developed in Japan, and IDMT, developed in Italy, were used as models for analysis and discussion. It can be concluded that a computer system for nuclear reactor decommissioning management is quite complex, involving computer codes for radioactive inventory database calculation, calculation modules for the stages of the decommissioning phases, and spatial data systems for virtual reality. (author)

  10. Isotopic analysis of plutonium by computer controlled mass spectrometry

    International Nuclear Information System (INIS)

    1974-01-01

    Isotopic analysis of plutonium chemically purified by ion exchange is achieved using a thermal ionization mass spectrometer. Data acquisition from and control of the instrument is done automatically with a dedicated system computer in real time with subsequent automatic data reduction and reporting. Separation of isotopes is achieved by varying the ion accelerating high voltage with accurate computer control

  11. Stochastic biological response to radiation. Comprehensive analysis of gene expression

    International Nuclear Information System (INIS)

    Inoue, Tohru; Hirabayashi, Yoko

    2012-01-01

    The authors explain that the effect of radiation on a biological system is stochastic, following the laws of physics and differing from the effect of a chemical, using Cs-137 gamma-ray (GR) and benzene (BZ) exposures of mice and the resultant comprehensive analyses of gene expression as examples. Single GR irradiation is performed with a Gamma Cell 40 (CSR) on C57BL/6 or C3H/He mice at 0, 0.6 and 3 Gy. BZ is given orally at 150 mg/kg/day for 5 days x 2 weeks. Bone marrow cells are sampled 1 month after the exposure. Comprehensive gene expression is analyzed with the GeneChip Mouse Genome 430 2.0 Array (Affymetrix), and the data are processed by programs for normalization, statistics, network generation, functional analysis, etc. GR irradiation brings about changes in gene expression that can be classified into common genes, which vary consistently with dose, and stochastic genes, which vary stochastically within each dose: e.g., with the Welch t-test, significant differences are found between 0/3 Gy (dose-specific difference, 455 probe sets (pbs); 2113 stochastic pbs), 0/0.6 Gy (267 in 1284 pbs) and 0.6/3 Gy (532 pbs); and with one-way analysis of variance (ANOVA) and hierarchical/dendrographic analyses, 520 pbs are shown to comprise 226 dose-dependent and 294 dose-specific pbs. It is also shown that at 3 Gy the expression of common genes is rather suppressed, including those related to the proliferation/apoptosis of B/T cells, as is that of stochastic genes related to cell division/signaling. A Venn diagram of the common genes among the above 520 pbs, the 2113 stochastic pbs at 3 Gy and the 1284 pbs at 0.6 Gy shows 29, 2 and 4 overlapping genes, respectively, indicating that only 35 pbs overlap in total. Network analysis of the changes induced by GR shows rather high expression of genes around the cAMP response element binding protein (CREB) hub at 0.6 Gy, and rather variable expression around the CREB hub and suppressed expression around the kinesin hub at 3 Gy; in the network for BZ exposure, unchanged or low expression around the p53 hub and suppression

  12. Investigating the computer analysis of eddy current NDT data

    International Nuclear Information System (INIS)

    Brown, R.L.

    1979-01-01

    The objective of this activity was to investigate and develop techniques for computer analysis of eddy current nondestructive testing (NDT) data. A single frequency commercial eddy current tester and a precision mechanical scanner were interfaced with a PDP-11/34 computer to obtain and analyze eddy current data from samples of 316 stainless steel tubing containing known discontinuities. Among the data analysis techniques investigated were: correlation, Fast Fourier Transforms (FFT), clustering, and Adaptive Learning Networks (ALN). The results were considered encouraging. ALN, for example, correctly identified 88% of the defects and non-defects from a group of 153 signal indications
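
    As a rough illustration of the signal-processing side of such an analysis, the sketch below extracts simple Fourier-magnitude features from a digitized eddy current trace and flags indications whose low-frequency energy exceeds a threshold. The signal and the threshold are invented; the original PDP-11/34 analysis chain and the Adaptive Learning Network classifier are not reproduced here.

```python
import numpy as np

def band_energies(signal, n_bands=4):
    """Split the one-sided FFT magnitude spectrum into bands and return the band energies."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.array([band.sum() for band in np.array_split(spectrum, n_bands)])

def flag_indication(signal, threshold=0.5):
    """Crude rule: flag the trace if its low-frequency band dominates the spectrum."""
    energies = band_energies(signal)
    return energies[0] > threshold * energies.sum()

# Hypothetical digitized eddy current trace: a slow defect-like bump plus noise.
t = np.linspace(0.0, 1.0, 512, endpoint=False)
trace = np.exp(-((t - 0.5) ** 2) / 0.01) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(flag_indication(trace))
```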

  13. RDS; A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Mohd Faiz Salim; Ridha Roslan; Mohd Rizal Mamat

    2013-01-01

    Full-text: Deterministic Safety Analysis (DSA) is one of the mandatory requirements of the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with the relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements is applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all analyses required under DSA. To ensure a comprehensive analysis, DSA should follow a systematic approach. One of the proposed methodologies is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a prerequisite reference document for developing the input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed, for the Integral Test Facility LOBIMOD2 and the associated Test A1-83, with data and information from various reports and drawings referred to in their preparation. The results showed that developing an RDS makes it possible to consolidate all relevant information in a single document. This is beneficial as it enables the preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, the language barrier in accessing foreign information, and limitations on resources. Some possible improvements are suggested to overcome these challenges. (author)

  14. A Comprehensive study on Cloud Green Computing: To Reduce Carbon Footprints Using Clouds

    OpenAIRE

    Chiranjeeb Roy Chowdhury, Arindam Chatterjee, Alap Sardar, Shalabh Agarwal, Asoke Nath

    2013-01-01

    Cloud computing and green computing are two of the most emergent areas in information communication technology (ICT), with immense applications across the entire globe. The future trends of ICT will be more towards cloud computing and green computing. Due to tremendous improvements in computer networks, people now prefer network-based computing over in-house computing. In any business sector, daily business and individual computing are now migrating from individual hard drives...

  15. Classical and quantum computing with C++ and Java simulations

    CERN Document Server

    Hardy, Y

    2001-01-01

    Classical and Quantum computing provides a self-contained, systematic and comprehensive introduction to all the subjects and techniques important in scientific computing. The style and presentation are readily accessible to undergraduates and graduates. A large number of examples, accompanied by complete C++ and Java code wherever possible, cover every topic. Features and benefits: - Comprehensive coverage of the theory with many examples - Topics in classical computing include boolean algebra, gates, circuits, latches, error detection and correction, neural networks, Turing machines, cryptography, genetic algorithms - For the first time, genetic expression programming is presented in a textbook - Topics in quantum computing include mathematical foundations, quantum algorithms, quantum information theory, hardware used in quantum computing This book serves as a textbook for courses in scientific computing and is also very suitable for self-study. Students, professionals and practitioners in computer...

  16. RFA Guardian: Comprehensive Simulation of Radiofrequency Ablation Treatment of Liver Tumors.

    Science.gov (United States)

    Voglreiter, Philip; Mariappan, Panchatcharam; Pollari, Mika; Flanagan, Ronan; Blanco Sequeiros, Roberto; Portugaller, Rupert Horst; Fütterer, Jurgen; Schmalstieg, Dieter; Kolesnik, Marina; Moche, Michael

    2018-01-15

    The RFA Guardian is a comprehensive application for high-performance patient-specific simulation of radiofrequency ablation of liver tumors. We address a wide range of usage scenarios. These include pre-interventional planning, sampling of the parameter space for uncertainty estimation, treatment evaluation and, in the worst case, failure analysis. The RFA Guardian is the first of its kind that exhibits sufficient performance for simulating treatment outcomes during the intervention. We achieve this by combining a large number of high-performance image processing, biomechanical simulation and visualization techniques into a generalized technical workflow. Further, we wrap the feature set into a single, integrated application, which exploits all available resources of standard consumer hardware, including massively parallel computing on graphics processing units. This allows us to predict or reproduce treatment outcomes on a single personal computer with high computational performance and high accuracy. The resulting low demand for infrastructure enables easy and cost-efficient integration into the clinical routine. We present a number of evaluation cases from the clinical practice where users performed the whole technical workflow from patient-specific modeling to final validation and highlight the opportunities arising from our fast, accurate prediction techniques.

  17. A Computer-based 21st Century Prototype

    Directory of Open Access Journals (Sweden)

    Pannathon Sangarun

    2015-01-01

    Full Text Available Abstract This paper describes a prototype computer-based reading comprehension program. It begins with a short description, at a general level, of theoretical issues relating to the learning of comprehension skills in foreign/second language learning. These issues cover such areas as personal meaning-making on the basis of individual differences and the need for individualized intervention to maximize the comprehension process. Modern technology facilitates this process and enables simultaneous support of large numbers of students. Specifically, from a learning perspective, the program focuses on students’ personal understandings while, from a reading perspective, the construction of meaning is based on an interactive model where both high-level (global, inferential) structures are elicited/studied as well as low-level structures (e.g. vocabulary, grammar). These principles are strengthened with research findings from studies in awareness and language processing based on eye-movement analysis. As part of its reading comprehension focus, the system also has a strong commitment to the development of critical thinking skills, recognized as one of the most important 21st Century skills. The program is then described in detail, including its ability to store students’ responses and to be administered through standard learning management systems. Finally, an outline of planned future developments and enhancements is presented.

  18. Comprehensive Modeling and Analysis of Rotorcraft Variable Speed Propulsion System With Coupled Engine/Transmission/Rotor Dynamics

    Science.gov (United States)

    DeSmidt, Hans A.; Smith, Edward C.; Bill, Robert C.; Wang, Kon-Well

    2013-01-01

    This project develops comprehensive modeling and simulation tools for analysis of variable rotor speed helicopter propulsion system dynamics. The Comprehensive Variable-Speed Rotorcraft Propulsion Modeling (CVSRPM) tool developed in this research is used to investigate coupled rotor/engine/fuel control/gearbox/shaft/clutch/flight control system dynamic interactions for several variable rotor speed mission scenarios. In this investigation, a prototypical two-speed Dual-Clutch Transmission (DCT) is proposed and designed to achieve 50 percent rotor speed variation. The comprehensive modeling tool developed in this study is utilized to analyze the two-speed shift response of both a conventional single rotor helicopter and a tiltrotor drive system. In the tiltrotor system, both a Parallel Shift Control (PSC) strategy and a Sequential Shift Control (SSC) strategy for constant and variable forward speed mission profiles are analyzed. Under the PSC strategy, selecting clutch shift-rate results in a design tradeoff between transient engine surge margins and clutch frictional power dissipation. In the case of SSC, clutch power dissipation is drastically reduced in exchange for the necessity to disengage one engine at a time which requires a multi-DCT drive system topology. In addition to comprehensive simulations, several sections are dedicated to detailed analysis of driveline subsystem components under variable speed operation. In particular an aeroelastic simulation of a stiff in-plane rotor using nonlinear quasi-steady blade element theory was conducted to investigate variable speed rotor dynamics. It was found that 2/rev and 4/rev flap and lag vibrations were significant during resonance crossings with 4/rev lagwise loads being directly transferred into drive-system torque disturbances. To capture the clutch engagement dynamics, a nonlinear stick-slip clutch torque model is developed. Also, a transient gas-turbine engine model based on first principles mean

  19. Introduction to computer data representation

    CERN Document Server

    Fenwick, Peter

    2014-01-01

    Introduction to Computer Data Representation introduces readers to the representation of data within computers. Starting from basic principles of number representation in computers, the book covers the representation of both integer and floating point numbers, and characters or text. It comprehensively explains the main techniques of computer arithmetic and logical manipulation. The book also features chapters covering the less usual topics of basic checksums and 'universal' or variable length representations for integers, with additional coverage of Gray Codes, BCD codes and logarithmic repre

  20. Is Word-Problem Solving a Form of Text Comprehension?

    Science.gov (United States)

    Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.; Hamlett, Carol L.; Wang, Amber Y.

    2015-01-01

    This study’s hypotheses were that (a) word-problem (WP) solving is a form of text comprehension that involves language comprehension processes, working memory, and reasoning, but (b) WP solving differs from other forms of text comprehension by requiring WP-specific language comprehension as well as general language comprehension. At the start of the 2nd grade, children (n = 206; on average, 7 years, 6 months) were assessed on general language comprehension, working memory, nonlinguistic reasoning, processing speed (a control variable), and foundational skill (arithmetic for WPs; word reading for text comprehension). In spring, they were assessed on WP-specific language comprehension, WPs, and text comprehension. Path analytic mediation analysis indicated that effects of general language comprehension on text comprehension were entirely direct, whereas effects of general language comprehension on WPs were partially mediated by WP-specific language. By contrast, effects of working memory and reasoning operated in parallel ways for both outcomes. PMID:25866461
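
    A minimal sketch of the kind of mediation logic described above is shown below; the specific variables and data are invented, and the published analysis used a full path-analytic model rather than this two-regression shortcut. The indirect effect of a predictor on an outcome through a mediator is approximated as the product of the predictor-to-mediator coefficient and the mediator-to-outcome coefficient.

```python
import numpy as np

def ols_coefs(X, y):
    """Ordinary least squares with an intercept; returns coefficients (intercept first)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

rng = np.random.default_rng(0)
n = 200
general_language = rng.standard_normal(n)                                      # predictor
wp_specific_language = 0.6 * general_language + 0.8 * rng.standard_normal(n)   # mediator
word_problems = (0.3 * general_language + 0.5 * wp_specific_language
                 + 0.7 * rng.standard_normal(n))                               # outcome

# Path a: predictor -> mediator
a = ols_coefs(general_language.reshape(-1, 1), wp_specific_language)[1]
# Paths c' (direct effect) and b (mediator -> outcome), estimated jointly
coefs = ols_coefs(np.column_stack([general_language, wp_specific_language]), word_problems)
c_prime, b = coefs[1], coefs[2]

print(f"direct effect c' = {c_prime:.2f}, indirect effect a*b = {a * b:.2f}")
```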

  1. Wideband and UWB Antennas for Wireless Applications: A Comprehensive Review

    Directory of Open Access Journals (Sweden)

    Renato Cicchetti

    2017-01-01

    Full Text Available A comprehensive review concerning the geometry, the manufacturing technologies, the materials, and the numerical techniques adopted for the analysis and design of wideband and ultrawideband (UWB) antennas for wireless applications is presented. Planar, printed, dielectric, and wearable antennas, achievable on laminate (rigid and flexible) and textile dielectric substrates, are taken into account. The performances of small, low-profile, and dielectric resonator antennas are illustrated, paying particular attention to the application areas concerning portable devices (mobile phones, tablets, glasses, laptops, wearable computers, etc.) and radio base stations. This information provides guidance for the selection of the different antenna geometries in terms of bandwidth, gain, field polarization, time-domain response, dimensions, and the materials useful for their realization and integration in modern communication systems.

  2. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-radiography, but using neutrons as the penetrating particles, there is in practice a nondestructive technique named neutron radiology. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) creating a latent image, the technique is named neutron radiography. A radiographic industrial film containing the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. A computed analysis of a film is possible using a facility with the following main components: a film illuminator, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure due to manufacturing processes or induced by working processes (for example, irradiation in the case of nuclear fuel). The quantitative determination is based on measurements of image parameters: dimensions and optical densities. The illuminator was built specifically for this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of Carl Zeiss Jena type, which was adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, was developed at INR Pitesti, which, together with the program SMTV II of the dedicated acquisition module SM 5010, can analyze the images on a film. The major application of the system was the quantitative analysis of a film containing the images of nuclear fuel pins beside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  3. Grid computing infrastructure, service, and applications

    CERN Document Server

    Jie, Wei; Chen, Jinjun

    2009-01-01

    Offering a comprehensive discussion of advances in grid computing, this book summarizes the concepts, methods, technologies, and applications. It covers topics such as philosophy, middleware, architecture, services, and applications. It also includes technical details to demonstrate how grid computing works in the real world

  4. Comprehensive Care

    Science.gov (United States)

    ... Comprehensive Care. Understand the importance of comprehensive MS care ... A complex disease requires a comprehensive approach. Today multiple sclerosis (MS) is not a ...

  5. Enhancing Next-Generation Sequencing-Guided Cancer Care Through Cognitive Computing.

    Science.gov (United States)

    Patel, Nirali M; Michelini, Vanessa V; Snell, Jeff M; Balu, Saianand; Hoyle, Alan P; Parker, Joel S; Hayward, Michele C; Eberhard, David A; Salazar, Ashley H; McNeillie, Patrick; Xu, Jia; Huettner, Claudia S; Koyama, Takahiko; Utro, Filippo; Rhrissorrakrai, Kahn; Norel, Raquel; Bilal, Erhan; Royyuru, Ajay; Parida, Laxmi; Earp, H Shelton; Grilley-Olson, Juneko E; Hayes, D Neil; Harvey, Stephen J; Sharpless, Norman E; Kim, William Y

    2018-02-01

    Using next-generation sequencing (NGS) to guide cancer therapy has created challenges in analyzing and reporting large volumes of genomic data to patients and caregivers. Specifically, providing current, accurate information on newly approved therapies and open clinical trials requires considerable manual curation performed mainly by human "molecular tumor boards" (MTBs). The purpose of this study was to determine the utility of cognitive computing as performed by Watson for Genomics (WfG) compared with a human MTB. One thousand eighteen patient cases that previously underwent targeted exon sequencing at the University of North Carolina (UNC) and subsequent analysis by the UNCseq informatics pipeline and the UNC MTB between November 7, 2011, and May 12, 2015, were analyzed with WfG, a cognitive computing technology for genomic analysis. Using a WfG-curated actionable gene list, we identified additional genomic events of potential significance (not discovered by traditional MTB curation) in 323 (32%) patients. The majority of these additional genomic events were considered actionable based upon their ability to qualify patients for biomarker-selected clinical trials. Indeed, the opening of a relevant clinical trial within 1 month prior to WfG analysis provided the rationale for identification of a new actionable event in nearly a quarter of the 323 patients. This automated analysis can potentially improve patient care by providing a rapid, comprehensive approach for data analysis and consideration of the up-to-date availability of clinical trials. The results of this study demonstrate that the interpretation and actionability of somatic next-generation sequencing results are evolving too rapidly to rely solely on human curation. Molecular tumor boards empowered by cognitive computing can significantly improve patient care by providing a fast, cost-effective, and comprehensive approach for data analysis in the delivery of precision medicine. Patients and physicians who

  6. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  7. How language production shapes language form and comprehension

    Directory of Open Access Journals (Sweden)

    Maryellen C MacDonald

    2013-04-01

    Full Text Available Language production processes can provide insight into how language comprehension works and into language typology (why languages tend to have certain characteristics more often than others). Drawing on work in memory retrieval, motor planning, and serial order in action planning, the Production-Distribution-Comprehension (PDC) account links work in the fields of language production, typology, and comprehension: (1) faced with substantial computational burdens of planning and producing utterances, language producers implicitly follow three biases in utterance planning that promote word order choices that reduce these burdens, thereby improving production fluency. (2) These choices, repeated over many utterances and individuals, shape the distributions of utterance forms in language. The claim that language form stems in large degree from producers' attempts to mitigate utterance planning difficulty is contrasted with alternative accounts in which form is driven by language use more broadly, language acquisition processes, or producers' attempts to create language forms that are easily understood by comprehenders. (3) Language perceivers implicitly learn the statistical regularities in their linguistic input, and they use this prior experience to guide comprehension of subsequent language. In particular, they learn to predict the sequential structure of linguistic signals, based on the statistics of previously-encountered input. Thus key aspects of comprehension behavior are tied to lexico-syntactic statistics in the language, which in turn derive from utterance planning biases promoting production of comparatively easy utterance forms over more difficult ones. This approach contrasts with classic theories in which comprehension behaviors are attributed to innate design features of the language comprehension system and associated working memory. The PDC instead links basic features of comprehension to a different source: production processes that shape

  8. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  9. Adapting computational text analysis to social science (and vice versa

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Full Text Available Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  10. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty that has long puzzled engineers and designers. At present, several computational methods of design by analysis have been developed and applied for calculating and categorizing the stress field of pressure vessel components, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, the GLOSS R-Node method and so on. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results obtained with the different calculation and analysis methods mentioned above. This is the main reason limiting the wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly. The computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are also outlined. Furthermore, the characteristics of the computational methods of design by analysis are summarized to help select the proper computational method when designing a pressure vessel component by analysis. (authors)
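
    As an illustration of the stress-equivalent-linearization step mentioned above, the sketch below decomposes a through-thickness stress distribution into membrane and bending components by numerical integration of the standard linearization formulas; the sample stress profile is invented.

```python
import numpy as np

def linearize_stress(x, sigma):
    """Membrane and bending components of a through-thickness stress profile.
    x: positions across the thickness (from 0 to t); sigma: stress at those positions."""
    t = x[-1] - x[0]
    xi = x - x[0]
    sigma_m = np.trapz(sigma, xi) / t                       # membrane: (1/t) * integral of sigma
    sigma_b = 6.0 / t**2 * np.trapz(sigma * (t / 2.0 - xi), xi)   # bending stress at the xi = 0 surface
    return sigma_m, sigma_b

# Hypothetical through-thickness profile: membrane + bending + a local peak near one surface.
x = np.linspace(0.0, 0.05, 101)                             # 50 mm wall
sigma = 80.0 + 120.0 * (1 - 2 * x / 0.05) + 30.0 * np.exp(-(x ** 2) / 1e-5)   # MPa
sigma_m, sigma_b = linearize_stress(x, sigma)
print(f"membrane = {sigma_m:.1f} MPa, bending = {sigma_b:.1f} MPa")
```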

  11. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  12. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design a secure data access for cloud computing platform is a big challenge. In this paper, we propose a secure data...... access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concern of cloud computing and then propose an integrated data access scheme for cloud computing, the procedure of the proposed scheme include parameter setup, key...... distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing....

  13. Aerodynamic analysis of Pegasus - Computations vs reality

    Science.gov (United States)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  14. Analysis of Comprehensive Utilization of Coconut Waste

    OpenAIRE

    Zheng, Kan; Liang, Dong; Zhang, Xirui

    2013-01-01

    This paper describes and analyzes the coconut cultivation in China, and the current comprehensive utilization of waste resources generated during cultivation and processing of coconut. The wastes generated in the process of cultivation include old coconut tree trunk, roots, withered coconut leaves, coconut flower and fallen cracking coconut, mainly used for biogas extraction, direct combustion and power generation, brewing, pharmacy, and processing of building materials; the wastes generated ...

  15. Conversation Analysis in Computer-Assisted Language Learning

    Science.gov (United States)

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  16. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress, which amalgamates numerous stress conditions and results in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19), under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, alkaline and high-pressure stress were predicted to be the optimal preconditioning toward drying. Taken together, we believe the presented strategy may serve as a prototypic example of the analysis and utility of quantitative-mass-spectrometry-based proteomics for studying bacterial physiology.

  17. Computer assisted analysis of research-based teaching method in English newspaper reading teaching

    Science.gov (United States)

    Jie, Zheng

    2017-06-01

    In recent years, the teaching of English newspaper reading has been developing rapidly. However, the teaching effect of the existing courses is not ideal. This paper applies the research-based teaching model to the teaching of English newspaper reading, investigates the current situation in higher vocational colleges, and analyzes the problems. It designs a teaching model for English newspaper reading and carries out computer-assisted empirical research. The results show that the teaching mode can use knowledge and ability to stimulate learners' interest and comprehensively improve their ability to read newspapers.

  18. Language and Cognitive Predictors of Text Comprehension: Evidence from Multivariate Analysis

    Science.gov (United States)

    Kim, Young-Suk

    2015-01-01

    Using data from children in South Korea (N = 145, M[subscript age] = 6.08), it was determined how low-level language and cognitive skills (vocabulary, syntactic knowledge, and working memory) and high-level cognitive skills (comprehension monitoring and theory of mind [ToM]) are related to listening comprehension and whether listening…

  19. Introducing remarks upon the analysis of computer systems performance

    International Nuclear Information System (INIS)

    Baum, D.

    1980-05-01

    Some of the basic ideas of analytical techniques used to study the behaviour of computer systems are presented. Single systems as well as networks of computers are viewed as stochastic dynamical systems which may be modelled by queueing networks. Therefore this report primarily serves as an introduction to probabilistic methods for the qualitative analysis of systems. It is supplemented by an application example of Chandy's collapsing method. (orig.) [de]

  20. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology, is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense.
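
    The notion of a minimal cut set can be illustrated with a brute-force enumeration over a toy fault tree; this sketch is not the FTAP algorithm (which handles arbitrary trees and prime implicants far more efficiently), and the tree and event names are hypothetical.

```python
# Toy illustration: find minimal cut sets of a small fault tree by enumerating
# combinations of basic events (feasible only for very small trees).
from itertools import combinations

BASIC_EVENTS = ["pump_a_fails", "pump_b_fails", "valve_fails", "power_fails"]

def top_event(state):
    """Hypothetical tree: TOP = power_fails OR valve_fails OR (pump_a AND pump_b)."""
    return state["power_fails"] or state["valve_fails"] or (
        state["pump_a_fails"] and state["pump_b_fails"])

def minimal_cut_sets(events, top):
    cut_sets = []
    for size in range(1, len(events) + 1):
        for combo in combinations(events, size):
            if any(known <= set(combo) for known in cut_sets):
                continue  # a known cut set is a subset, so this combination is not minimal
            state = {e: (e in combo) for e in events}
            if top(state):
                cut_sets.append(set(combo))
    return cut_sets

print(minimal_cut_sets(BASIC_EVENTS, top_event))
# e.g. [{'valve_fails'}, {'power_fails'}, {'pump_a_fails', 'pump_b_fails'}]
```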

  1. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase the assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were considered in order to determine the environmental impact categories. Sustainability analysis was conducted using the SolidWorks software. As a result, high-density PE gives the lowest values in the environmental impact categories together with a high maximum stress value.

  2. Computation for the analysis of designed experiments

    CERN Document Server

    Heiberger, Richard

    2015-01-01

    Addresses the statistical, mathematical, and computational aspects of the construction of packages and analysis of variance (ANOVA) programs. Includes a disk at the back of the book that contains all program codes in four languages, APL, BASIC, C, and FORTRAN. Presents illustrations of the dual space geometry for all designs, including confounded designs.
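
    As a pointer to what such ANOVA computations look like in a modern environment (the book's own code is in APL, BASIC, C and FORTRAN), here is an illustrative one-way ANOVA in Python with invented data, assuming SciPy is available.

```python
# Illustrative one-way ANOVA: test whether three treatment groups share a common mean.
from scipy import stats

treatment_a = [22.1, 23.4, 21.8, 24.0, 22.7]
treatment_b = [25.3, 26.1, 24.8, 25.9, 26.4]
treatment_c = [22.9, 23.1, 24.2, 23.6, 22.5]

f_stat, p_value = stats.f_oneway(treatment_a, treatment_b, treatment_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```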

  3. Development of a comprehensive nuclear materials accountancy system at JAEA

    International Nuclear Information System (INIS)

    Takeda, Hideyuki; Usami, Masayuki; Hirosawa, Naonori; Fujita, Yoshihisa; Kodani, Yoshiki; Komata, Kazuhiro

    2007-01-01

    The Japan Atomic Energy Agency (JAEA) is submitting various types of accounting reports of international controlled materials to the Ministry of Education, Culture, Sports, Science, and Technology (MEXT) based on domestic laws and regulations. JAEA developed a comprehensive Nuclear Material Accountancy System to achieve uniform management of the data of each facility by using a company-wide database. Personal computers in each facility are connected throughout the company using an in-house network to create the comprehensive Nuclear Material Accountancy System. This System uses personal computers to facilitate timely communication and for easy maintenance and operation. Efficient data processing and quality control functions for accountancy reporting are also realized by this System. In addition, the System has the ability to extract and summarize data about Plutonium Management in the company for public announcement. This report introduces and describes the details and functions of this System. (author)

  4. Effects of Strength of Accent on an L2 Interactive Lecture Listening Comprehension Test

    Science.gov (United States)

    Ockey, Gary J.; Papageorgiou, Spiros; French, Robert

    2016-01-01

    This article reports on a study which aimed to determine the effect of strength of accent on listening comprehension of interactive lectures. Test takers (N = 21,726) listened to an interactive lecture given by one of nine speakers and responded to six comprehension items. The test taker responses were analyzed with the Rasch computer program…
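
    For readers unfamiliar with the Rasch analysis mentioned here, the following sketch shows the dichotomous Rasch model on which such programs are based; the ability and difficulty values are invented and are not the study's data.

```python
# Dichotomous Rasch model: P(correct) depends only on ability minus item difficulty.
import math

def rasch_probability(ability, difficulty):
    """Probability of a correct response for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

item_difficulties = [-1.0, -0.3, 0.0, 0.5, 1.2, 2.0]   # six hypothetical comprehension items
theta = 0.4                                             # one hypothetical test taker's ability
expected_score = sum(rasch_probability(theta, b) for b in item_difficulties)
print(f"expected raw score out of {len(item_difficulties)}: {expected_score:.2f}")
```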

  5. CALPHAD calculation of phase diagrams : a comprehensive guide

    CERN Document Server

    Saunders, N; Miodownik, A P

    1998-01-01

    This monograph acts as a benchmark to current achievements in the field of Computer Coupling of Phase Diagrams and Thermochemistry, often called CALPHAD which is an acronym for Computer CALculation of PHAse Diagrams. It also acts as a guide to both the basic background of the subject area and the cutting edge of the topic, combining comprehensive discussions of the underlying physical principles of the CALPHAD method with detailed descriptions of their application to real complex multi-component materials. Approaches which combine both thermodynamic and kinetic models to interpret non-equilibrium phase transformations are also reviewed.

  6. Computed tomography of human joints and radioactive waste drums

    International Nuclear Information System (INIS)

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-01-01

X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.
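
    The reconstruction step at the heart of CT can be sketched with a standard filtered back projection on a synthetic phantom; this illustration assumes scikit-image is installed and is not the reconstruction code used in the work described above.

```python
# Filtered back projection on a synthetic "drum" phantom: a disc with a dense inclusion.
import numpy as np
from skimage.transform import radon, iradon

size = 128
y, x = np.mgrid[:size, :size] - size // 2
phantom = ((x**2 + y**2) < (0.4 * size) ** 2).astype(float)       # host disc
phantom[(np.abs(x - 15) < 5) & (np.abs(y + 10) < 5)] += 2.0       # dense inclusion

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=theta)           # forward projection
reconstruction = iradon(sinogram, theta=theta)   # filtered back projection (ramp filter)

print("max reconstruction error:", np.abs(reconstruction - phantom).max())
```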

  7. Analysis of Biosignals During Immersion in Computer Games.

    Science.gov (United States)

    Yeo, Mina; Lim, Seokbeen; Yoon, Gilwon

    2017-11-17

    The number of computer game users is increasing as computers and various IT devices in connection with the Internet are commonplace in all ages. In this research, in order to find the relevance of behavioral activity and its associated biosignal, biosignal changes before and after as well as during computer games were measured and analyzed for 31 subjects. For this purpose, a device to measure electrocardiogram, photoplethysmogram and skin temperature was developed such that the effect of motion artifacts could be minimized. The device was made wearable for convenient measurement. The game selected for the experiments was League of Legends™. Analysis on the pulse transit time, heart rate variability and skin temperature showed increased sympathetic nerve activities during computer game, while the parasympathetic nerves became less active. Interestingly, the sympathetic predominance group showed less change in the heart rate variability as compared to the normal group. The results can be valuable for studying internet gaming disorder.
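
    Two of the heart rate variability measures commonly reported in such studies, SDNN and RMSSD, can be computed directly from RR intervals; the interval values in this sketch are invented.

```python
# SDNN (overall variability) and RMSSD (short-term, parasympathetically driven variability)
# from a series of RR intervals in milliseconds.
import numpy as np

rr_ms = np.array([812, 798, 805, 790, 776, 801, 815, 788, 770, 795])

sdnn = rr_ms.std(ddof=1)                        # standard deviation of RR intervals
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))   # root mean square of successive differences
heart_rate = 60000.0 / rr_ms.mean()             # mean heart rate in beats per minute

print(f"HR = {heart_rate:.1f} bpm, SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```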

  8. Computer methods for transient fluid-structure analysis of nuclear reactors

    International Nuclear Information System (INIS)

    Belytschko, T.; Liu, W.K.

    1985-01-01

    Fluid-structure interaction problems in nuclear engineering are categorized according to the dominant physical phenomena and the appropriate computational methods. Linear fluid models that are considered include acoustic fluids, incompressible fluids undergoing small disturbances, and small amplitude sloshing. Methods available in general-purpose codes for these linear fluid problems are described. For nonlinear fluid problems, the major features of alternative computational treatments are reviewed; some special-purpose and multipurpose computer codes applicable to these problems are then described. For illustration, some examples of nuclear reactor problems that entail coupled fluid-structure analysis are described along with computational results

  9. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
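
    The basic idea of probabilistic structural analysis can be sketched with a plain Monte Carlo estimate of failure probability for a simple limit state g = R - S; NESSUS itself uses more efficient methods (advanced mean value, adaptive importance sampling), and the distributions here are illustrative.

```python
# Plain Monte Carlo estimate of P(failure) = P(resistance < applied stress).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
resistance = rng.normal(loc=320.0, scale=25.0, size=n)                # e.g. yield strength, MPa
stress = rng.lognormal(mean=np.log(220.0), sigma=0.15, size=n)        # applied stress, MPa

g = resistance - stress                 # limit-state function: failure when g < 0
p_failure = np.mean(g < 0.0)
print(f"estimated probability of failure: {p_failure:.2e}")
```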

  10. A Computational Analysis Model for Open-ended Cognitions

    Science.gov (United States)

    Morita, Junya; Miwa, Kazuhisa

In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition. Many computational models have simulated results of controlled psychological experiments successfully. However, there have been only a few attempts to apply the models to complex realistic phenomena. We call such a situation an ``open-ended situation''. In this study, MAC/FAC (``many are called, but few are chosen''), proposed by [Forbus 95], which models two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story, and retrieved cases that had been learned in their everyday life. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors/structural evaluation scores) using the algorithms of the MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores had a strong relation to the rated scores. These results support the MAC/FAC's theoretical assumption - different similarities are involved in the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
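
    The MAC stage's cheap comparison of content vectors can be illustrated with bag-of-words vectors and cosine similarity; the stories below are invented, and the FAC stage's structural evaluation is not shown.

```python
# Bag-of-words content vectors compared by cosine similarity (MAC-style cheap filter).
from collections import Counter
import math

def content_vector(text):
    return Counter(text.lower().split())

def cosine_similarity(v1, v2):
    dot = sum(v1[w] * v2[w] for w in set(v1) & set(v2))
    norm1 = math.sqrt(sum(c * c for c in v1.values()))
    norm2 = math.sqrt(sum(c * c for c in v2.values()))
    return dot / (norm1 * norm2) if norm1 and norm2 else 0.0

cue = content_vector("a farmer trades his cow for magic beans")
memory = {
    "jack":   content_vector("a boy trades a cow for beans that grow into a giant beanstalk"),
    "stocks": content_vector("an investor trades shares hoping for magic returns"),
}
for name, vec in memory.items():
    print(name, round(cosine_similarity(cue, vec), 3))
```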

  11. Comprehensive Study on Lexicon-based Ensemble Classification Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Łukasz Augustyniak

    2015-12-01

    Full Text Available We propose a novel method for counting sentiment orientation that outperforms supervised learning approaches in time and memory complexity and is not statistically significantly different from them in accuracy. Our method consists of a novel approach to generating unigram, bigram and trigram lexicons. The proposed method, called frequentiment, is based on calculating the frequency of features (words in the document and averaging their impact on the sentiment score as opposed to documents that do not contain these features. Afterwards, we use ensemble classification to improve the overall accuracy of the method. What is important is that the frequentiment-based lexicons with sentiment threshold selection outperform other popular lexicons and some supervised learners, while being 3–5 times faster than the supervised approach. We compare 37 methods (lexicons, ensembles with lexicon’s predictions as input and supervised learners applied to 10 Amazon review data sets and provide the first statistical comparison of the sentiment annotation methods that include ensemble approaches. It is one of the most comprehensive comparisons of domain sentiment analysis in the literature.
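
    A minimal lexicon-based scorer illustrates the general approach of summing word sentiment weights and applying a threshold; the tiny lexicon and threshold below are invented and are not the frequentiment lexicons of the paper.

```python
# Minimal lexicon-based sentiment scorer with a neutral band around zero.
tiny_lexicon = {"great": 1.0, "excellent": 1.5, "good": 0.8,
                "poor": -0.8, "terrible": -1.5, "waste": -1.0}

def sentiment(review, threshold=0.2):
    tokens = review.lower().replace(".", " ").replace(",", " ").split()
    score = sum(tiny_lexicon.get(tok, 0.0) for tok in tokens)
    score /= max(len(tokens), 1)            # normalise by document length
    if score > threshold:
        return "positive", score
    if score < -threshold:
        return "negative", score
    return "neutral", score

print(sentiment("Great sound quality, excellent battery."))
print(sentiment("Terrible build, a waste of money."))
```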

  12. Genome-wide identification of specific oligonucleotides using artificial neural network and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Chen Jiun-Ching

    2007-05-01

    Full Text Available Abstract Background Genome-wide identification of specific oligonucleotides (oligos is a computationally-intensive task and is a requirement for designing microarray probes, primers, and siRNAs. An artificial neural network (ANN is a machine learning technique that can effectively process complex and high noise data. Here, ANNs are applied to process the unique subsequence distribution for prediction of specific oligos. Results We present a novel and efficient algorithm, named the integration of ANN and BLAST (IAB algorithm, to identify specific oligos. We establish the unique marker database for human and rat gene index databases using the hash table algorithm. We then create the input vectors, via the unique marker database, to train and test the ANN. The trained ANN predicted the specific oligos with high efficiency, and these oligos were subsequently verified by BLAST. To improve the prediction performance, the ANN over-fitting issue was avoided by early stopping with the best observed error and a k-fold validation was also applied. The performance of the IAB algorithm was about 5.2, 7.1, and 6.7 times faster than the BLAST search without ANN for experimental results of 70-mer, 50-mer, and 25-mer specific oligos, respectively. In addition, the results of polymerase chain reactions showed that the primers predicted by the IAB algorithm could specifically amplify the corresponding genes. The IAB algorithm has been integrated into a previously published comprehensive web server to support microarray analysis and genome-wide iterative enrichment analysis, through which users can identify a group of desired genes and then discover the specific oligos of these genes. Conclusion The IAB algorithm has been developed to construct SpecificDB, a web server that provides a specific and valid oligo database of the probe, siRNA, and primer design for the human genome. We also demonstrate the ability of the IAB algorithm to predict specific oligos through
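
    The unique-marker idea can be sketched with a hash table that indexes every k-mer and keeps those mapping to exactly one gene; the sequences are invented, and the full IAB algorithm additionally involves the ANN prediction and BLAST verification described above.

```python
# Index every k-mer of a sequence collection and keep those occurring in exactly one gene.
from collections import defaultdict

sequences = {
    "geneA": "ATGCGTACGTTAGCGTACGATCG",
    "geneB": "ATGCGTACGTTAGCCTTAGGATC",
    "geneC": "TTGACCGTAGGCATCGATCGATT",
}

def unique_kmers(seqs, k=12):
    index = defaultdict(set)
    for name, seq in seqs.items():
        for i in range(len(seq) - k + 1):
            index[seq[i:i + k]].add(name)
    # a k-mer is a candidate specific oligo if it maps to exactly one gene
    return {kmer: next(iter(genes)) for kmer, genes in index.items() if len(genes) == 1}

markers = unique_kmers(sequences, k=12)
print(f"{len(markers)} candidate specific 12-mers")
```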

  13. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

    Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration

  14. Computational methods for fracture mechanics analysis of pressurized-thermal-shock experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Bryan, R.H.; Bryson, J.W.; Merkle, J.G.

    1984-01-01

Extensive computational analyses are required to determine material parameters and optimum pressure-temperature transients compatible with proposed pressurized-thermal-shock (PTS) test scenarios and with the capabilities of the PTS test facility at the Oak Ridge National Laboratory (ORNL). Computational economy has led to the application of techniques suitable for parametric studies involving the analysis of a large number of transients. These techniques, which include analysis capability for two- and three-dimensional (2-D and 3-D) superposition, inelastic ligament stability, and upper-shelf arrest, have been incorporated into the OCA/USA computer program. Features of the OCA/USA program are discussed, including applications to the PTS test configuration. (author)

  15. Computer-aided visualization and analysis system for sequence evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  16. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships between plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  17. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  18. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  19. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules, but the latter are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.
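
    As a toy analogue of the diffusion-theory calculations such systems perform, the following sketch solves a one-group, one-dimensional slab diffusion eigenvalue problem by finite differences and power iteration; the cross-section values are invented and bear no relation to VENTURE input.

```python
# One-group slab diffusion k-eigenvalue problem, zero-flux boundaries, power iteration.
import numpy as np

D, sigma_a, nu_sigma_f = 1.2, 0.03, 0.035    # diffusion coeff (cm), absorption and nu*fission (1/cm)
width, n = 100.0, 200                        # slab width (cm) and number of mesh cells
h = width / n

# finite-difference operator for -D*phi'' + sigma_a*phi
A = (np.diag(np.full(n, 2.0 * D / h**2 + sigma_a))
     + np.diag(np.full(n - 1, -D / h**2), 1)
     + np.diag(np.full(n - 1, -D / h**2), -1))

phi, k = np.ones(n), 1.0
for _ in range(200):                         # power iteration on the fission source
    phi_new = np.linalg.solve(A, nu_sigma_f * phi / k)
    k *= phi_new.sum() / phi.sum()           # eigenvalue update from source ratio
    phi = phi_new / np.linalg.norm(phi_new)  # normalise the flux shape

print(f"k-effective is approximately {k:.4f}")
```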

  20. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for man-machine interfaces (MMI) of a control room. It employs a computer to simulate the operating procedures performed on man-machine interfaces in a control room, provides quantified assessment, and at the same time analyzes operator error rates by means of human error rate prediction techniques. The problems of placing man-machine interfaces in a control room and of arranging instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  1. The Effect of Gloss Type and Mode on Iranian EFL Learners' Reading Comprehension

    Science.gov (United States)

    Sadeghi, Karim; Ahmadi, Negar

    2012-01-01

This study investigated the effects of three kinds of gloss conditions, that is, traditional non-CALL marginal gloss, computer-based audio gloss, and computer-based extended audio gloss, on the reading comprehension of Iranian EFL learners. To this end, three experimental groups and one control group, each comprising 15 participants, took part in this study.…

  2. ECG Signal Processing, Classification and Interpretation A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  3. Neural activity associated with metaphor comprehension: spatial analysis.

    Science.gov (United States)

    Sotillo, María; Carretié, Luis; Hinojosa, José A; Tapia, Manuel; Mercado, Francisco; López-Martín, Sara; Albert, Jacobo

    2005-01-03

    Though neuropsychological data indicate that the right hemisphere (RH) plays a major role in metaphor processing, other studies suggest that, at least during some phases of this processing, a RH advantage may not exist. The present study explores, through a temporally agile neural signal--the event-related potentials (ERPs)--, and through source-localization algorithms applied to ERP recordings, whether the crucial phase of metaphor comprehension presents or not a RH advantage. Participants (n=24) were submitted to a S1-S2 experimental paradigm. S1 consisted of visually presented metaphoric sentences (e.g., "Green lung of the city"), followed by S2, which consisted of words that could (i.e., "Park") or could not (i.e., "Semaphore") be defined by S1. ERPs elicited by S2 were analyzed using temporal principal component analysis (tPCA) and source-localization algorithms. These analyses revealed that metaphorically related S2 words showed significantly higher N400 amplitudes than non-related S2 words. Source-localization algorithms showed differential activity between the two S2 conditions in the right middle/superior temporal areas. These results support the existence of an important RH contribution to (at least) one phase of metaphor processing and, furthermore, implicate the temporal cortex with respect to that contribution.

  4. Computer-Assisted Digital Image Analysis of Plus Disease in Retinopathy of Prematurity.

    Science.gov (United States)

    Kemp, Pavlina S; VanderVeen, Deborah K

    2016-01-01

    The objective of this study is to review the current state and role of computer-assisted analysis in diagnosis of plus disease in retinopathy of prematurity. Diagnosis and documentation of retinopathy of prematurity are increasingly being supplemented by digital imaging. The incorporation of computer-aided techniques has the potential to add valuable information and standardization regarding the presence of plus disease, an important criterion in deciding the necessity of treatment of vision-threatening retinopathy of prematurity. A review of literature found that several techniques have been published examining the process and role of computer aided analysis of plus disease in retinopathy of prematurity. These techniques use semiautomated image analysis techniques to evaluate retinal vascular dilation and tortuosity, using calculated parameters to evaluate presence or absence of plus disease. These values are then compared with expert consensus. The study concludes that computer-aided image analysis has the potential to use quantitative and objective criteria to act as a supplemental tool in evaluating for plus disease in the setting of retinopathy of prematurity.
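
    One of the quantitative features such systems compute, a simple tortuosity index (arc length divided by chord length of a vessel centreline), can be sketched as follows; the centreline points are invented.

```python
# Tortuosity index of a traced vessel centreline: total path length over straight-line distance.
import numpy as np

centreline = np.array([[0.0, 0.0], [1.0, 0.6], [2.0, 0.9], [3.0, 0.5],
                       [4.0, -0.2], [5.0, 0.1], [6.0, 0.0]])   # (x, y) in pixels

arc_length = np.sum(np.linalg.norm(np.diff(centreline, axis=0), axis=1))
chord_length = np.linalg.norm(centreline[-1] - centreline[0])
tortuosity = arc_length / chord_length
print(f"tortuosity index = {tortuosity:.3f}")   # 1.0 means a perfectly straight vessel
```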

  5. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  6. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. This second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. This paper is applicable to low-level radioactive waste disposal system performance assessment
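
    The deterministic uncertainty analysis idea, propagating parameter variances through first-order sensitivities, can be sketched as follows; the model and uncertainties are illustrative, and the derivatives here come from finite differences rather than the code-calculus approach described above.

```python
# First-order (derivative-based) uncertainty propagation: var(y) = sum_i (dy/dx_i * sigma_i)^2.
import numpy as np

def model(x):
    k, s, q = x
    return q * np.exp(-k * 10.0) / s         # some response of interest (illustrative)

x0 = np.array([0.05, 2.0, 100.0])             # nominal parameter values
sigma = np.array([0.005, 0.1, 5.0])           # parameter standard deviations

eps = 1e-6
grad = np.array([(model(x0 + eps * np.eye(3)[i]) - model(x0)) / eps for i in range(3)])

var_y = np.sum((grad * sigma) ** 2)           # first-order variance propagation
print(f"y = {model(x0):.3f} +/- {np.sqrt(var_y):.3f} (1 sigma)")
```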

  7. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    Directory of Open Access Journals (Sweden)

    Ikeda Noriaki

    2006-10-01

    Full Text Available Abstract Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel interface, and the
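
    The core fixed-effect (inverse-variance) pooling that such meta-analysis packages perform can be sketched in a few lines; the effect sizes and standard errors below are invented.

```python
# Fixed-effect inverse-variance meta-analysis of study-level effect sizes.
import numpy as np

effects = np.array([0.30, 0.12, 0.45, 0.25, 0.18])   # e.g. log odds ratios from five studies
se = np.array([0.15, 0.10, 0.20, 0.12, 0.08])        # their standard errors

weights = 1.0 / se**2
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")
```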

  8. Comprehensive Analysis of CD8+-T-Cell Responses against Hepatitis C Virus Reveals Multiple Unpredicted Specificities

    OpenAIRE

    Lauer, Georg M.; Ouchi, Kei; Chung, Raymond T.; Nguyen, Tam N.; Day, Cheryl L.; Purkis, Deborah R.; Reiser, Markus; Kim, Arthur Y.; Lucas, Michaela; Klenerman, Paul; Walker, Bruce D.

    2002-01-01

    The hepatitis C virus (HCV)-specific CD8+-T-cell response is thought to play a critical role in HCV infection. Studies of these responses have largely relied on the analysis of a small number of previously described or predicted HCV epitopes, mostly restricted by HLA A2. In order to determine the actual breadth and magnitude of CD8+-T-cell responses in the context of diverse HLA class I alleles, we performed a comprehensive analysis of responses to all expressed HCV proteins. By using a panel...

  9. Status of computer codes available in AEOI for reactor physics analysis

    International Nuclear Information System (INIS)

    Karbassiafshar, M.

    1986-01-01

Many of the nuclear computer codes available in the Atomic Energy Organization of Iran (AEOI) can be used for physics analysis of an operating reactor or for design purposes. Grasp of the various methods involved and practical experience with these codes would be the starting point for interesting design studies or analysis of operating conditions of presently existing and future reactors. A review of the objectives and flowchart of commonly practiced procedures in reactor physics analysis of LWRs and related computer codes was made, extrapolating to the nationally and internationally available resources. Finally, effective utilization of the existing facilities is discussed and called for.

  10. The Effectiveness of Electronic Text and Pictorial Graphic Organizers to Improve Comprehension Related to Functional Skills

    Science.gov (United States)

    Douglas, Karen H.; Ayres, Kevin M.; Langone, John; Bramlett, Virginia Bell

    2011-01-01

    This study evaluated the effects of a computer-based instructional program to assist three students with mild to moderate intellectual disabilities in using pictorial graphic organizers as aids for increasing comprehension of electronic text-based recipes. Student comprehension of recipes was measured by their ability to verbally retell recipe…

  11. Componential analysis of kinship terminology a computational perspective

    CERN Document Server

    Pericliev, V

    2013-01-01

    This book presents the first computer program automating the task of componential analysis of kinship vocabularies. The book examines the program in relation to two basic problems: the commonly occurring inconsistency of componential models; and the huge number of alternative componential models.

  12. Analysis of sponge zones for computational fluid mechanics

    International Nuclear Information System (INIS)

    Bodony, Daniel J.

    2006-01-01

The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand-side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one-dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
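
    A minimal sketch of a sponge zone for the 1D advection equation shows the damping term -sigma(x)(q - q_ref) being ramped up near the outflow boundary; the discretisation and parameter values are illustrative and are not taken from the paper.

```python
# Sponge zone for q_t + c q_x = 0: the pulse is absorbed before reaching the right boundary.
import numpy as np

nx, L, c = 400, 10.0, 1.0
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.4 * dx / c

q = np.exp(-((x - 2.0) / 0.3) ** 2)           # initial Gaussian pulse
q_ref = np.zeros(nx)                           # target (quiescent) state

sigma = np.zeros(nx)                           # sponge strength, zero in the interior
sponge_start = 0.8 * L
mask = x > sponge_start
sigma[mask] = 50.0 * ((x[mask] - sponge_start) / (L - sponge_start)) ** 2

for _ in range(int(12.0 / dt)):                # run long enough for the pulse to enter the sponge
    dqdx = np.zeros(nx)
    dqdx[1:] = (q[1:] - q[:-1]) / dx           # first-order upwind derivative (c > 0)
    q = q - dt * c * dqdx - dt * sigma * (q - q_ref)

print("max residual amplitude:", np.abs(q).max())   # should be small after absorption
```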

  13. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  14. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50-million-event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting the production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  15. Power system stability modelling, analysis and control

    CERN Document Server

    Sallam, Abdelhay A

    2015-01-01

    This book provides a comprehensive treatment of the subject from both a physical and mathematical perspective and covers a range of topics including modelling, computation of load flow in the transmission grid, stability analysis under both steady-state and disturbed conditions, and appropriate controls to enhance stability.

  16. The state-of-the-art and problems of fuel element structural analysis

    International Nuclear Information System (INIS)

    Lassmann, K.

    1980-02-01

This study of fuel element structural analysis is arranged in two parts: In the first, self-contained, part the general basic principles of deterministic computer programs for structural analysis of fuel elements are reviewed critically and an approach is shown which can be used to expand the system with respect to statistical investigations. The second part contains technical details summarized in 11 publications, all of which appeared in periodicals with reviewer teams. The major aspects of this study are thought to be the following ones: Contributions to the 'philosophy' of fuel element structural analysis. Critical analysis of the basic structure of computer programs. Critical analysis of the mechanical concept of integral fuel rod computer programs. Establishment of a comprehensive computer program system (URANUS). Expansion from purely deterministic information by statistical analyses. Methodological and computer program developments for the analysis of fast accidents. (orig.)

  17. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis

    Directory of Open Access Journals (Sweden)

    Paul D. Morris, PhD

    2017-08-01

    Full Text Available Fractional flow reserve (FFR-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel “pseudotransient” analysis protocol for computing virtual fractional flow reserve (vFFR based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33% and more by microvascular physiology (59%. If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.

  18. Mathematical structures for computer graphics

    CERN Document Server

    Janke, Steven J

    2014-01-01

    A comprehensive exploration of the mathematics behind the modeling and rendering of computer graphics scenes Mathematical Structures for Computer Graphics presents an accessible and intuitive approach to the mathematical ideas and techniques necessary for two- and three-dimensional computer graphics. Focusing on the significant mathematical results, the book establishes key algorithms used to build complex graphics scenes. Written for readers with various levels of mathematical background, the book develops a solid foundation for graphics techniques and fills in relevant grap

  19. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  20. Comprehensive efficiency analysis of supercomputer resource usage based on system monitoring data

    Science.gov (United States)

    Mamaeva, A. A.; Shaykhislamov, D. I.; Voevodin, Vad V.; Zhumatiy, S. A.

    2018-03-01

    One of the main problems of modern supercomputers is the low efficiency of their usage, which leads to the significant idle time of computational resources, and, in turn, to the decrease in speed of scientific research. This paper presents three approaches to study the efficiency of supercomputer resource usage based on monitoring data analysis. The first approach performs an analysis of computing resource utilization statistics, which allows to identify different typical classes of programs, to explore the structure of the supercomputer job flow and to track overall trends in the supercomputer behavior. The second approach is aimed specifically at analyzing off-the-shelf software packages and libraries installed on the supercomputer, since efficiency of their usage is becoming an increasingly important factor for the efficient functioning of the entire supercomputer. Within the third approach, abnormal jobs – jobs with abnormally inefficient behavior that differs significantly from the standard behavior of the overall supercomputer job flow – are being detected. For each approach, the results obtained in practice in the Supercomputer Center of Moscow State University are demonstrated.

  1. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that supports the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enable accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  2. A comprehensive overview on the foundations of formal concept analysis

    Directory of Open Access Journals (Sweden)

    K. Sumangali

    2017-12-01

Full Text Available The immersion of voluminous collections of data is inevitable almost everywhere. The invention of mathematical models to analyse the patterns and trends of the data is an emerging necessity to extract and predict useful information in any Knowledge Discovery from Data (KDD) process. The Formal Concept Analysis (FCA) is an efficient mathematical model used in the process of KDD which is specially designed to portray the structure of the data in a context and depict the underlying patterns and hierarchies in it. Due to the huge increase in the application of FCA in various fields, the number of research and review articles on FCA has risen to a large extent. This review differs from the existing ones in presenting the comprehensive survey on the fundamentals of FCA in a compact and crisp manner to benefit beginners, and it focuses on the scalability issues in FCA. Further, we present the generic anatomy of FCA apart from its origin and growth at a primary level.
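
    The two derivation operators and the closure that define formal concepts can be illustrated by brute-force enumeration on a tiny invented context; this is not a scalable algorithm, which is precisely the issue the review highlights.

```python
# Enumerate all formal concepts (extent, intent) of a small binary context.
from itertools import combinations

objects = ["duck", "eagle", "trout", "bat"]
attributes = ["flies", "swims", "has_feathers", "nurses_young"]
incidence = {                                   # object -> set of attributes it has
    "duck":  {"flies", "swims", "has_feathers"},
    "eagle": {"flies", "has_feathers"},
    "trout": {"swims"},
    "bat":   {"flies", "nurses_young"},
}

def common_attributes(objs):                    # derivation operator A -> A'
    return set(attributes) if not objs else set.intersection(*(incidence[o] for o in objs))

def common_objects(attrs):                      # derivation operator B -> B'
    return {o for o in objects if attrs <= incidence[o]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        intent = common_attributes(set(objs))
        extent = common_objects(intent)         # closing A'' yields a concept extent
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(set(extent) or "{}", "|", set(intent) or "{}")
```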

  3. Updated Lagrangian finite element formulations of various biological soft tissue non-linear material models: a comprehensive procedure and review.

    Science.gov (United States)

    Townsend, Molly T; Sarigul-Klijn, Nesrin

    2016-01-01

Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive updated Lagrangian formulation procedure for various non-linear material models for the finite element analysis of biological soft tissues, including a definition of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.
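
    As a small illustration of the quantities involved, the sketch below evaluates the Cauchy stress of one common compressible neo-Hookean form from a deformation gradient; the material constants and the sample deformation gradient are illustrative and not tissue-specific, and this is not necessarily one of the constitutive laws reviewed in the paper.

```python
# Cauchy stress of a compressible neo-Hookean model: sigma = (mu/J)(b - I) + (lambda/J) ln(J) I.
import numpy as np

mu, lam = 0.2e6, 1.0e6                        # shear and first Lame parameters (Pa), illustrative

F = np.array([[1.10, 0.05, 0.0],              # a modest stretch plus shear
              [0.00, 0.95, 0.0],
              [0.00, 0.00, 1.02]])

J = np.linalg.det(F)                          # volume ratio
b = F @ F.T                                   # left Cauchy-Green tensor
I = np.eye(3)

sigma = (mu / J) * (b - I) + (lam / J) * np.log(J) * I
print("Cauchy stress (Pa):\n", np.round(sigma, 1))
```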

  4. Instrumentation, computer software and experimental techniques used in low-frequency internal friction studies at WNRE

    International Nuclear Information System (INIS)

    Sprugmann, K.W.; Ritchie, I.G.

    1980-04-01

A detailed and comprehensive account of the equipment, computer programs and experimental methods developed at the Whiteshell Nuclear Research Establishment for the study of low-frequency internal friction is presented. Part I describes the mechanical apparatus, electronic instrumentation and computer software, while Part II describes in detail the laboratory techniques and various types of experiments performed, together with data reduction and analysis. Experimental procedures for the study of internal friction as a function of temperature, strain amplitude or time are described. Computer control of these experiments using the free-decay technique is outlined. In addition, a pendulum constant-amplitude drive system is described. (auth)
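
    In free-decay measurements of this kind, the internal friction follows from the logarithmic decrement of successive oscillation amplitudes, Q^-1 = delta/pi; the amplitude values in this sketch are invented.

```python
# Internal friction from a free-decay trace via the logarithmic decrement.
import numpy as np

amplitudes = np.array([1.00, 0.96, 0.92, 0.885, 0.85, 0.817])   # successive peak amplitudes

deltas = np.log(amplitudes[:-1] / amplitudes[1:])    # logarithmic decrement per cycle
q_inverse = deltas.mean() / np.pi                    # internal friction Q^-1
print(f"delta = {deltas.mean():.4f}, Q^-1 = {q_inverse:.4e}")
```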

  5. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    In this article basic principles of the build-up of the informational-computer system for the neutron spectra analysis on a basis of measured reaction rates are given. The basic data files of the system, needed software and hardware for the system operation are described

  6. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  7. Computer Game Play as an Imaginary Stage for Reading: Implicit Spatial Effects of Computer Games Embedded in Hard Copy Books

    Science.gov (United States)

    Smith, Glenn Gordon

    2012-01-01

    This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games, the other group read it in regular book format…

  8. Comprehensive curation and analysis of global interaction networks in Saccharomyces cerevisiae

    Science.gov (United States)

    Reguly, Teresa; Breitkreutz, Ashton; Boucher, Lorrie; Breitkreutz, Bobby-Joe; Hon, Gary C; Myers, Chad L; Parsons, Ainslie; Friesen, Helena; Oughtred, Rose; Tong, Amy; Stark, Chris; Ho, Yuen; Botstein, David; Andrews, Brenda; Boone, Charles; Troyanskya, Olga G; Ideker, Trey; Dolinski, Kara; Batada, Nizar N; Tyers, Mike

    2006-01-01

    Background The study of complex biological networks and prediction of gene function has been enabled by high-throughput (HTP) methods for detection of genetic and protein interactions. Sparse coverage in HTP datasets may, however, distort network properties and confound predictions. Although a vast number of well substantiated interactions are recorded in the scientific literature, these data have not yet been distilled into networks that enable system-level inference. Results We describe here a comprehensive database of genetic and protein interactions, and associated experimental evidence, for the budding yeast Saccharomyces cerevisiae, as manually curated from over 31,793 abstracts and online publications. This literature-curated (LC) dataset contains 33,311 interactions, on the order of all extant HTP datasets combined. Surprisingly, HTP protein-interaction datasets currently achieve only around 14% coverage of the interactions in the literature. The LC network nevertheless shares attributes with HTP networks, including scale-free connectivity and correlations between interactions, abundance, localization, and expression. We find that essential genes or proteins are enriched for interactions with other essential genes or proteins, suggesting that the global network may be functionally unified. This interconnectivity is supported by a substantial overlap of protein and genetic interactions in the LC dataset. We show that the LC dataset considerably improves the predictive power of network-analysis approaches. The full LC dataset is available at the BioGRID () and SGD () databases. Conclusion Comprehensive datasets of biological interactions derived from the primary literature provide critical benchmarks for HTP methods, augment functional prediction, and reveal system-level attributes of biological networks. PMID:16762047

  9. Control-C and ACSL for computer-aided control systems engineering

    International Nuclear Information System (INIS)

    Schrick, B.; Anex, R.

    1986-01-01

    A computer-aided engineering package, called CTRL-C, provides an easy-to-use workbench for the analysis and design of multivariable systems. ACSL, a powerful simulation language, complements CTRL-C and provides a flexible evaluation tool. Combined, the CTRL-C/ACSL package is an interactive environment with a comprehensive set of tools for analysis, design, simulation, and evaluation. A unified software system is possible for engineering analysis and presentation. Recognizing that no software can satisfy all needs, CTRL-C provides consistent and flexible communication with the operating system and other software, most especially ACSL. CTRL-C and ACSL demonstrate that an interactive environment can be a powerful, accessible, and extensible tool for engineering

  10. Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity: Performance of the "i-ROP" System and Image Features Associated With Expert Diagnosis.

    Science.gov (United States)

    Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F

    2015-11-01

    We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease diagnosis suggests that it may be feasible to develop a fully automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.
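
    One plausible point-based tortuosity measure of the kind such systems compute is the ratio of a vessel centerline's arc length to its chord length. The sketch below applies that definition to a synthetic centerline and is not a reproduction of the i-ROP feature set.

```python
# Sketch: arc-length-over-chord-length tortuosity of a vessel centerline.
# The synthetic centerline below stands in for a segmented retinal vessel.
import numpy as np

def tortuosity(points: np.ndarray) -> float:
    """points: (N, 2) array of ordered centerline coordinates."""
    segment_lengths = np.linalg.norm(np.diff(points, axis=0), axis=1)
    arc_length = segment_lengths.sum()
    chord_length = np.linalg.norm(points[-1] - points[0])
    return arc_length / chord_length  # 1.0 for a perfectly straight vessel

t = np.linspace(0.0, np.pi, 50)
vessel = np.column_stack([t, 0.2 * np.sin(t)])   # a gently curved synthetic vessel
print(round(tortuosity(vessel), 3))
```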

  11. Computational Fatigue Life Analysis of Carbon Fiber Laminate

    Science.gov (United States)

    Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.

    2018-02-01

    In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for many of their components. Replacing components subjected to static or impact loads is less challenging than replacing components subjected to dynamic loading. Replacing components with composite materials demands many stages of parametric study, one of which is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of the composite material by using computational techniques. A composite plate with a hole at its center is considered for the study. The analysis is carried out on the (0°/90°/90°/90°/90°)s laminate sequence and the (45°/-45°)2s laminate sequence by using a computer script, and the life cycles for the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.
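
    For context, a stress-life (Basquin) hand calculation of cycles to failure is sketched below; the fatigue strength coefficient and exponent are generic placeholders rather than properties of the laminates studied, and the paper's computational approach is not reproduced here.

```python
# Sketch: stress-life (Basquin) estimate of cycles to failure. The fatigue
# strength coefficient and exponent below are illustrative assumptions, not
# properties of the laminates studied above.
def basquin_cycles(stress_amplitude_mpa: float,
                   fatigue_strength_coeff_mpa: float = 1500.0,
                   fatigue_strength_exponent: float = -0.1) -> float:
    """Invert sigma_a = sigma_f' * (2N)^b to get the number of cycles N."""
    two_n = (stress_amplitude_mpa / fatigue_strength_coeff_mpa) ** (1.0 / fatigue_strength_exponent)
    return two_n / 2.0

for sigma_a in (300.0, 400.0, 500.0):
    print(f"sigma_a = {sigma_a:5.1f} MPa -> N ~ {basquin_cycles(sigma_a):.3e} cycles")
```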

  12. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branches and angle-shaped geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
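
    A quick order-of-magnitude check often used alongside such CFD results is the Poiseuille wall-shear-stress estimate for fully developed laminar flow in a straight tube; the flow rate, radius and viscosity below are typical textbook values, not data from this study.

```python
# Sketch: Poiseuille wall-shear-stress estimate, tau_w = 4*mu*Q / (pi*R^3),
# for an order-of-magnitude check of laminar pipe flow. The flow rate, radius
# and blood viscosity below are typical textbook values (assumed).
import math

def poiseuille_wall_shear(flow_rate_m3s: float, radius_m: float,
                          viscosity_pa_s: float = 3.5e-3) -> float:
    return 4.0 * viscosity_pa_s * flow_rate_m3s / (math.pi * radius_m ** 3)

q = 5.0e-3 / 60.0   # ~5 L/min mean aortic flow, in m^3/s (assumed)
r = 0.0125          # ~12.5 mm aortic radius (assumed)
print(f"wall shear stress ~ {poiseuille_wall_shear(q, r):.2f} Pa")
```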

  13. Computing fundamentals IC3 edition

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    Kick start your journey into computing and prepare for your IC3 certification With this essential course book you'll be sending e-mails, surfing the web and understanding the basics of computing in no time. Written by Faithe Wempen, a Microsoft Office Master Instructor and author of more than 120 books, this complete guide to the basics has been tailored to provide comprehensive instruction on the full range of entry-level computing skills. It is a must for students looking to move into almost any profession, as entry-level computing courses have become a compulsory requirement in the modern w

  14. Two-dimensional chromatographic analysis using three second-dimension columns for continuous comprehensive analysis of intact proteins.

    Science.gov (United States)

    Zhu, Zaifang; Chen, Huang; Ren, Jiangtao; Lu, Juan J; Gu, Congying; Lynch, Kyle B; Wu, Si; Wang, Zhe; Cao, Chengxi; Liu, Shaorong

    2018-03-01

    We develop a new two-dimensional (2D) high performance liquid chromatography (HPLC) approach for intact protein analysis. Development of 2D HPLC has faced a bottleneck problem - limited second-dimension (second-D) separation speed. We solve this problem by incorporating multiple second-D columns so that several second-D separations can proceed in parallel. To demonstrate the feasibility of using this approach for comprehensive protein analysis, we select ion-exchange chromatography as the first dimension and reverse-phase chromatography as the second-D. We incorporate three second-D columns in an innovative way so that three reverse-phase separations can be performed simultaneously. We test this system for separating both standard proteins and E. coli lysates, achieve baseline resolution for eleven standard proteins, and obtain more than 500 peaks for E. coli lysates, an indication that the sample complexities are greatly reduced. We see fewer than 10 bands when each fraction of the second-D effluent is analyzed by sodium dodecyl sulfate - polyacrylamide gel electrophoresis (SDS-PAGE), compared to hundreds of SDS-PAGE bands when the original sample is analyzed. This approach could potentially be an excellent and general tool for protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
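
    The benefit of running several second-D columns in parallel can be illustrated with a simple round-robin scheduling calculation; the fraction interval and second-D run time below are invented for illustration and are not the authors' operating conditions.

```python
# Sketch: why parallel second-dimension columns relieve the speed bottleneck.
# First-dimension fractions are assigned round-robin to the available columns.
# The fraction interval and second-D run time are invented for illustration.
def total_time(n_fractions: int, fraction_interval_s: float,
               second_d_run_s: float, n_columns: int) -> float:
    finish = [0.0] * n_columns
    for i in range(n_fractions):
        collected = i * fraction_interval_s      # when this fraction becomes available
        col = i % n_columns                      # round-robin assignment
        finish[col] = max(finish[col], collected) + second_d_run_s
    return max(finish)

for cols in (1, 3):
    print(f"{cols} second-D column(s): {total_time(30, 60.0, 150.0, cols):.0f} s")
```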

  15. Ubiquitous computing in sports: A review and analysis.

    Science.gov (United States)

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade, and this development has propagated into applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - the development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context-awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  16. Visual computing scientific visualization and imaging systems

    CERN Document Server

    2014-01-01

    This volume aims to stimulate discussions on research involving the use of data and digital images as an approach to understanding, analysing and visualizing phenomena and experiments. The emphasis is placed not only on graphically representing data as a way of enhancing its visual analysis, but also on the imaging systems which contribute greatly to the comprehension of real cases. Scientific Visualization and Imaging Systems encompass multidisciplinary areas, with applications in many knowledge fields such as Engineering, Medicine, Material Science, Physics, Geology, Geographic Information Systems, among others. This book is a selection of 13 revised and extended research papers presented in the International Conference on Advanced Computational Engineering and Experimenting - ACE-X conferences 2010 (Paris), 2011 (Algarve), 2012 (Istanbul) and 2013 (Madrid). The examples were particularly chosen from materials research, medical applications, general concepts applied in simulations and image analysis and ot...

  17. Comparative Neutronics Analysis of DIMPLE S06 Criticality Benchmark with Contemporary Reactor Core Analysis Computer Code Systems

    Directory of Open Access Journals (Sweden)

    Wonkyeong Kim

    2015-01-01

    Full Text Available A high-leakage core has been known to be a challenging problem not only for a two-step homogenization approach but also for a direct heterogeneous approach. In this paper the DIMPLE S06 core, which is a small high-leakage core, has been analyzed by a direct heterogeneous modeling approach and by a two-step homogenization modeling approach, using contemporary code systems developed for reactor core analysis. The focus of this work is a comprehensive comparative analysis of the conventional approaches and codes with a small core design, the DIMPLE S06 critical experiment. The calculation procedure for the two approaches is explicitly presented in this paper. A comprehensive comparative analysis is performed on neutronics parameters: the multiplication factor and the assembly power distribution. Comparison of the two-group homogenized cross sections from each lattice physics code shows that the generated transport cross sections differ significantly depending on the transport approximation used to treat the anisotropic scattering effect. The necessity of the ADF to correct the discontinuity at the assembly interfaces is clearly shown by the flux distributions and the results of the two-step approach. Finally, the two approaches show consistent results for all codes, while the comparison with the reference generated by MCNP shows significant error for all codes except another Monte Carlo code, SERPENT2.
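
    For readers unfamiliar with the two-step approach, the sketch below shows the textbook two-group infinite-multiplication-factor formula that such codes apply to homogenized cross sections; the numerical values are invented and are not DIMPLE S06 data.

```python
# Sketch: textbook two-group infinite multiplication factor computed from
# homogenized cross sections (no up-scatter). The values are invented and are
# not DIMPLE S06 data.
def k_infinity(nu_sigf1: float, nu_sigf2: float,
               siga1: float, siga2: float, sigs12: float) -> float:
    return (nu_sigf1 + nu_sigf2 * sigs12 / siga2) / (siga1 + sigs12)

print(round(k_infinity(0.005, 0.10, 0.010, 0.080, 0.018), 4))
```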

  18. Computer image analysis of etched tracks from ionizing radiation

    Science.gov (United States)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.
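
    The generic flavour of such computer counting, thresholding a digitized micrograph and counting connected components per unit area, can be sketched as follows; the synthetic image and the threshold are placeholders, not the Oxford eXL workflow.

```python
# Sketch: counting track-like features by thresholding and connected-component
# labelling. The synthetic image and threshold are placeholders for a real
# digitized micrograph.
import numpy as np
from scipy import ndimage

def track_density(image: np.ndarray, threshold: float, pixel_area_um2: float) -> float:
    """Return candidate tracks per square micrometre for a grayscale image."""
    binary = image > threshold                 # tracks assumed brighter than background
    _, count = ndimage.label(binary)           # connected components = candidate tracks
    field_area = image.size * pixel_area_um2
    return count / field_area

rng = np.random.default_rng(0)
synthetic = rng.random((256, 256))             # stand-in for a real micrograph
print(f"{track_density(synthetic, 0.995, 0.01):.3f} tracks/um^2")
```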

  19. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Cradduck, T.D.; Knowles, L.G.

    1977-01-01

    The decision to buy a computer is difficult. The wide variety of computing systems available makes that decision even harder because each of the systems has unique advantages and disadvantages. The following list contains many of the essentials any computer system for nuclear medicine should embody: (1) sophisticated and reliable hardware with sufficient memory capacity to acquire or display at least 128 x 128 static images or 64 x 64 dynamic studies and with the facility for adding extra hardware and peripheral equipment at a later date; (2) a well-proved, general-purpose, real-time operating system to which the programs specific to the gamma camera have been interfaced and which will allow expansion or modification of both hardware and software in the future; (3) a display exhibiting at least 128 x 128 resolution, a monochrome mode with extended gray scale, and perhaps color, a varied set of programmed image formats, and a hardware system that includes local refresh capabilities; (4) a high-level language, such as FORTRAN or BASIC, with the ability to directly access all data files and interact with system programs, as well as a macroprogramming capability so the user may write his own programs for data manipulation and analysis; (5) a comprehensive yet generally applicable set of system programs to enable data acquisition, storage, analysis, and display. In addition to the above, one should expect the services of a team of well-trained maintenance technicians and engineers. The manufacturer should offer software support and exhibit a plan for continued development and upgrading of the software initially provided

  20. Computer-aided topological analysis of Nd-Fe-B ternary system

    International Nuclear Information System (INIS)

    Liu, G.; Xu, P.; Zhang, W.

    1993-01-01

    A three-dimensional, partially matrixed topological model of the Nd-Fe-B ternary phase diagram has been established, based on experimental results assessed comprehensively with the aid of the computer-aided design and graphics software AutoCAD (R10) and application programs developed in this work. Vertical sections at 5.88 at.% B, Nd:B = 1:1, Fe-Nd2Fe14B-Nd, and Nd2Fe17-Nd2Fe7B6 have been cut out from the model and the corresponding phase relationships have been analyzed. Among them, those on the Nd-rich portions of both the sections at 5.88 at.% B and Nd:B = 2:1, and those on the Nd2Fe14B-Nd section, are given for the first time. (author)

  1. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    Science.gov (United States)

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.
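
    A minimal watershed-style peak detection on a toy two-dimensional intensity surface can be written with scikit-image as below; retention-time shift correction, the key point of the comparison above, is deliberately not shown.

```python
# Sketch: watershed-style peak detection on a toy GCxGC-like intensity surface.
# Retention-time shift correction is not shown.
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Toy chromatogram: two overlapping Gaussian peaks on a 2D retention-time grid.
x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
surface = np.exp(-((x + 1) ** 2 + y ** 2)) + np.exp(-((x - 1) ** 2 + (y - 0.5) ** 2))

marker_coords = peak_local_max(surface, min_distance=10)
markers = np.zeros_like(surface, dtype=int)
markers[tuple(marker_coords.T)] = np.arange(1, len(marker_coords) + 1)

labels = watershed(-surface, markers, mask=surface > 0.05)
print("detected peak regions:", labels.max())  # expected: 2
```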

  2. Interactive computing in BASIC an introduction to interactive computing and a practical course in the BASIC language

    CERN Document Server

    Sanderson, Peter C

    1973-01-01

    Interactive Computing in BASIC: An Introduction to Interactive Computing and a Practical Course in the BASIC Language provides a general introduction to the principles of interactive computing and a comprehensive practical guide to the programming language Beginners All-purpose Symbolic Instruction Code (BASIC). The book starts by providing an introduction to computers and discussing the aspects of terminal usage, programming languages, and the stages in writing and testing a program. The text then discusses BASIC with regard to methods in writing simple arithmetical programs, control stateme

  3. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates the computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover this system will provide the best solution for developing large and complicated software and simulating complex and large-scaled phenomena in computational science and engineering. A prototype has already been developed and the validation and verification of an integrated platform will be scheduled by using the prototype in 2003. In the validation and verification, fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platform for quantum chemistry and bio-mechanical system are planned.

  4. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates the computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover this system will provide the best solution for developing large and complicated software and simulating complex and large-scaled phenomena in computational science and engineering. A prototype has already been developed and the validation and verification of an integrated platform will be scheduled by using the prototype in 2003. In the validation and verification, fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platform for quantum chemistry and bio-mechanical system are planned

  5. Functional and shape data analysis

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This textbook for courses on function data analysis and shape data analysis describes how to define, compare, and mathematically represent shapes, with a focus on statistical modeling and inference. It is aimed at graduate students in analysis in statistics, engineering, applied mathematics, neuroscience, biology, bioinformatics, and other related areas. The interdisciplinary nature of the broad range of ideas covered—from introductory theory to algorithmic implementations and some statistical case studies—is meant to familiarize graduate students with an array of tools that are relevant in developing computational solutions for shape and related analyses. These tools, gleaned from geometry, algebra, statistics, and computational science, are traditionally scattered across different courses, departments, and disciplines; Functional and Shape Data Analysis offers a unified, comprehensive solution by integrating the registration problem into shape analysis, better preparing graduate students for handling fu...

  6. Prioritizing strategies for comprehensive liver cancer control in Asia: a conjoint analysis

    Directory of Open Access Journals (Sweden)

    Bridges John FP

    2012-10-01

    Full Text Available Abstract Background Liver cancer is a complex and burdensome disease, with Asia accounting for 75% of known cases. Comprehensive cancer control requires the use of multiple strategies, but various stakeholders may have different views as to which strategies should have the highest priority. This study identified priorities across multiple strategies for comprehensive liver cancer control (CLCC) from the perspective of liver cancer clinical, policy, and advocacy stakeholders in China, Japan, South Korea and Taiwan. Concordance of priorities was assessed across the region and across respondent roles. Methods Priorities for CLCC were examined as part of a cross-sectional survey of liver cancer experts. Respondents completed several conjoint-analysis choice tasks to prioritize 11 strategies. In each task, respondents judged which of two competing CLCC plans, consisting of mutually exclusive and exhaustive subsets of the strategies, would have the greatest impact. The dependent variable was the chosen plan, which was then regressed on the strategies of the different plans. The restricted least squares (RLS) method was utilized to compare aggregate and stratified models, and t-tests and Wald tests were used to test for significance and concordance, respectively. Results Eighty respondents (69.6%) were eligible and completed the survey. Their primary interests were hepatitis (26%), hepatocellular carcinoma (HCC) (58%), metastatic liver cancer (10%) and transplantation (6%). The most preferred strategies were monitoring at-risk populations, clinician education and national guidelines. Transplantation infrastructure (p=0.009) was valued lower in China, measuring social burden (p=0.037) was valued higher in Taiwan, and national guidelines (p=0.025) were valued higher in China. Priorities did not differ across stakeholder groups (p=0.438). Conclusions Priorities for CLCC in Asia include monitoring at-risk populations, clinician education, national guidelines

  7. Comprehensive analysis of the specificity of transcription activator-like effector nucleases

    DEFF Research Database (Denmark)

    Juillerat, Alexandre; Dubois, Gwendoline; Valton, Julien

    2014-01-01

    A key issue when designing and using DNA-targeting nucleases is specificity. Ideally, an optimal DNA-targeting tool has only one recognition site within a genomic sequence. In practice, however, almost all designer nucleases available today can accommodate one to several mutations within their target site. The ability to predict the specificity of targeting is thus highly desirable. Here, we describe the first comprehensive experimental study focused on the specificity of the four commonly used repeat variable diresidues (RVDs; NI:A, HD:C, NN:G and NG:T) incorporated in transcription activator-like effector nucleases (TALEN). The analysis of >15 500 unique TALEN/DNA cleavage profiles allowed us to monitor the specificity gradient of the RVDs along a TALEN/DNA binding array and to present a specificity scoring matrix for RVD/nucleotide association. Furthermore, we report that TALEN can only...
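
    A scoring matrix of the kind described can be applied to candidate sites as sketched below; the per-nucleotide values are invented placeholders, not the matrix derived from the published cleavage profiles.

```python
# Sketch: scoring a candidate target site against an RVD/nucleotide matrix.
# The matrix values are invented placeholders, not the published scoring matrix.
RVD_MATRIX = {
    "NI": {"A": 1.0, "C": 0.2, "G": 0.1, "T": 0.2},
    "HD": {"A": 0.1, "C": 1.0, "G": 0.1, "T": 0.2},
    "NN": {"A": 0.3, "C": 0.1, "G": 1.0, "T": 0.1},
    "NG": {"A": 0.2, "C": 0.2, "G": 0.1, "T": 1.0},
}

def site_score(rvds, sequence):
    """Multiplicative score of a TALE repeat array against a DNA sequence."""
    score = 1.0
    for rvd, base in zip(rvds, sequence):
        score *= RVD_MATRIX[rvd][base]
    return score

array = ["NI", "HD", "NN", "NG"]      # recognizes A, C, G, T in order
print(site_score(array, "ACGT"))      # on-target site
print(site_score(array, "ACGA"))      # single mismatch at the last position
```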

  8. Constraint Programming for Context Comprehension

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2014-01-01

    A close similarity is demonstrated between context comprehension, such as discourse analysis, and constraint programming. The constraint store takes the role of a growing knowledge base learned throughout the discourse, and a suitable constraint solver does the job of incorporating new pieces...

  9. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    International Nuclear Information System (INIS)

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-01-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
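
    For orientation, the sketch below shows the standard hydraulic diameter and a laminar wall-drag estimate that the equivalent-diameter approach generalizes; the subchannel geometry and flow values are invented, and the velocity-profile-based construction described above is not reproduced.

```python
# Sketch: classical hydraulic diameter and a laminar wall-drag estimate.
# The cell geometry and flow values are invented; the paper's construction of
# an equivalent diameter from velocity profiles is not reproduced here.
def hydraulic_diameter(flow_area_m2: float, wetted_perimeter_m: float) -> float:
    return 4.0 * flow_area_m2 / wetted_perimeter_m

def laminar_wall_shear(rho: float, velocity: float, d_h: float, mu: float) -> float:
    """tau_w = (f/4) * rho*u^2/2 with f = 64/Re for laminar flow."""
    re = rho * velocity * d_h / mu
    friction_factor = 64.0 / re
    return (friction_factor / 4.0) * rho * velocity ** 2 / 2.0

d_h = hydraulic_diameter(1.0e-4, 4.0e-2)   # a small subchannel-like cell
tau = laminar_wall_shear(1000.0, 0.1, d_h, 1.0e-3)
print(f"D_h = {d_h * 1000:.1f} mm, tau_w = {tau:.3f} Pa")
```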

  10. Analysis of electronic circuits using digital computers

    International Nuclear Information System (INIS)

    Tapu, C.

    1968-01-01

    Various programmes have been proposed for studying electronic circuits with the help of computers. It is shown here how it is possible to use the programme ECAP, developed by IBM, to study the behaviour of an operational amplifier from different points of view: direct-current, alternating-current and transient-state analysis, optimisation of the open-loop gain, and study of the reliability. (author) [fr]

  11. Computational content analysis of European Central Bank statements

    NARCIS (Netherlands)

    Milea, D.V.; Almeida, R.J.; Sharef, N.M.; Kaymak, U.; Frasincar, F.

    2012-01-01

    In this paper we present a framework for the computational content analysis of European Central Bank (ECB) statements. Based on this framework, we provide two approaches that can be used in a practical context. Both approaches use the content of ECB statements to predict upward and downward movement

  12. Automatic design of optical systems by digital computer

    Science.gov (United States)

    Casad, T. A.; Schmidt, L. F.

    1967-01-01

    Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.

  13. Similarity, Clustering, and Scaling Analyses for the Foreign Exchange Market ---Comprehensive Analysis on States of Market Participants with High-Frequency Financial Data---

    Science.gov (United States)

    Sato, A.; Sakai, H.; Nishimura, M.; Holyst, J. A.

    This article proposes mathematical methods to quantify the states of market participants in the foreign exchange market (FX market) and to conduct comprehensive analysis of the behavior of market participants by means of high-frequency financial data. Based on econophysics tools and perspectives, we study similarity measures for both rate movements and quotation activities among various currency pairs. We also perform clustering analysis on market states for observation days, and find a scaling relationship between the mean values of quotation activities and their standard deviations. Using these mathematical methods we can visualize the states of the FX market comprehensively. Finally, we conclude that the states of market participants vary over time due to both external and internal factors.
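
    The generic econophysics recipe behind such an analysis, a correlation-based distance followed by hierarchical clustering, is sketched below on random placeholder returns rather than market data.

```python
# Sketch: correlation-based distances and hierarchical clustering of currency
# pairs. The returns are random placeholders, not market data.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
returns = rng.standard_normal((500, 4))        # 500 ticks x 4 currency pairs

corr = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - corr))             # standard econophysics distance
np.fill_diagonal(dist, 0.0)

condensed = squareform(dist, checks=False)
clusters = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print("cluster labels:", clusters)
```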

  14. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific-analysis computer programs for year-2000 compliance is part of AECL' s year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  15. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  16. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the necessary information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information comes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
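
    The core ranking step of an FMEA-style assessment can be sketched as a Risk Priority Number calculation; the failure modes and 1-10 ratings below are invented for illustration and are not taken from the QFMEA model itself.

```python
# Sketch: Risk Priority Number ranking, RPN = probability x severity x
# detection difficulty. The failure modes and 1-10 ratings are invented.
failure_modes = [
    ("wellbore casing leak", 4, 9, 6),
    ("caprock fracture",     2, 10, 8),
    ("pipeline corrosion",   5, 6, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, probability, severity, detection in ranked:
    print(f"{name:22s} RPN = {probability * severity * detection}")
```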

  17. Minimal canonical comprehensive Gröbner systems

    OpenAIRE

    Manubens, Montserrat; Montes, Antonio

    2009-01-01

    This is the continuation of Montes' paper "On the canonical discussion of polynomial systems with parameters''. In this paper, we define the Minimal Canonical Comprehensive Gröbner System of a parametric ideal and fix under which hypothesis it exists and is computable. An algorithm to obtain a canonical description of the segments of the Minimal Canonical CGS is given, thus completing the whole MCCGS algorithm (implemented in Maple and Singular). We show its high utility for applications, suc...

  18. High fidelity thermal-hydraulic analysis using CFD and massively parallel computers

    International Nuclear Information System (INIS)

    Weber, D.P.; Wei, T.Y.C.; Brewster, R.A.; Rock, Daniel T.; Rizwan-uddin

    2000-01-01

    Thermal-hydraulic analyses play an important role in design and reload analysis of nuclear power plants. These analyses have historically relied on early generation computational fluid dynamics capabilities, originally developed in the 1960s and 1970s. Over the last twenty years, however, dramatic improvements in both computational fluid dynamics codes in the commercial sector and in computing power have taken place. These developments offer the possibility of performing large scale, high fidelity, core thermal hydraulics analysis. Such analyses will allow a determination of the conservatism employed in traditional design approaches and possibly justify the operation of nuclear power systems at higher powers without compromising safety margins. The objective of this work is to demonstrate such a large scale analysis approach using a state of the art CFD code, STAR-CD, and the computing power of massively parallel computers, provided by IBM. A high fidelity representation of a current generation PWR was analyzed with the STAR-CD CFD code and the results were compared to traditional analyses based on the VIPRE code. Current design methodology typically involves a simplified representation of the assemblies, where a single average pin is used in each assembly to determine the hot assembly from a whole core analysis. After determining this assembly, increased refinement is used in the hot assembly, and possibly some of its neighbors, to refine the analysis for purposes of calculating DNBR. This latter calculation is performed with sub-channel codes such as VIPRE. The modeling simplifications that are used involve the approximate treatment of surrounding assemblies and coarse representation of the hot assembly, where the subchannel is the lowest level of discretization. In the high fidelity analysis performed in this study, both restrictions have been removed. Within the hot assembly, several hundred thousand to several million computational zones have been used, to

  19. Introduction to scientific computing and data analysis

    CERN Document Server

    Holmes, Mark H

    2016-01-01

    This textbook provides an introduction to numerical computing and its applications in science and engineering. The topics covered include those usually found in an introductory course, as well as those that arise in data analysis. This includes optimization and regression-based methods using a singular value decomposition. The emphasis is on problem solving, and there are numerous exercises throughout the text concerning applications in engineering and science. The essential role of the mathematical theory underlying the methods is also considered, both for understanding how the method works, as well as how the error in the computation depends on the method being used. The MATLAB codes used to produce most of the figures and data tables in the text are available on the author's website and SpringerLink.

  20. COMPREHENSIVE REVIEW OF AES AND RSA SECURITY ALGORITHMS IN CLOUD COMPUTING

    OpenAIRE

    Shubham Kansal*, Harkiran Kaur

    2017-01-01

    Cloud Computing is referred to as a revolutionary approach that has changed IT and business integration. It has benefits for almost every type of IT requirement: it can be used by enterprises to cut their IT costs, and by individuals as a storage solution with disaster recovery. One major problem that exists with Cloud Computing, in the present scenario, is the security and privacy of the data. Encryption is the most important part of the security if you own a priva...

  1. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights of the future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide effectiveness analysis of the proposed countermeasures.

  2. The reading comprehension skills in English, on the professionals training in education in third year of the Labour Education and computing career

    Directory of Open Access Journals (Sweden)

    Agnes Aydely Leal

    2016-03-01

    Full Text Available This article presents in detail the different stages through which the teaching of Labour Education and Information Technology has passed, an analysis of documents from the different curricula with which we have worked and are currently working, and the history of the development of reading comprehension skills in English, as well as English for specific purposes, in the training of professionals in both profiles.

  3. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes an analysis and shows its use in analysing the strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is furthermore to show the strengths, weaknesses, opportunities and threats when using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, a way of protecting the existing data in the database and the data that will be collected in the diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on samples (examples) from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering it himself/herself.

  4. Investigation on structural analysis computer program of spent nuclear fuel shipping cask

    International Nuclear Information System (INIS)

    Yagawa, Ganki; Ikushima, Takeshi.

    1987-10-01

    This report describes the work done by the Sub-Committee of the Research Cooperation Committee (RC-62) of the Japan Society of Mechanical Engineers, entrusted by the Japan Atomic Energy Research Institute. The principal accomplishments are summarized as follows: (1) Regarding the survey of structural analysis methods for spent fuel shipping casks, several documents were reviewed which explain the features and applications of dedicated computer programs for impact analysis based on 2- or 3-dimensional finite element or finite difference methods, such as HONDO, STEALTH and DYNA-3D. (2) In the comparative evaluation of the existing computer programs, common benchmark test problems for the 9 m vertical drop impact of an axisymmetric lead cylinder with and without stainless steel clads were adopted, and calculational evaluations taking the strain rate effect into account were carried out. (3) The impact analysis algorithms of the computer programs were evaluated, and the requirements for computer programs to be developed in the future and an index for further studies have been clarified. (author)

  5. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  6. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched, continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction, and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
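
    A much-simplified version of the kind of computerized analysis the review surveys is to flag sustained narrow-band energy in a spectrogram; the synthetic signal, frequency band and threshold below are illustrative assumptions, not a validated detector.

```python
# Sketch: flagging wheeze-like frames as sustained narrow-band energy in a
# spectrogram. The synthetic signal, band limits and threshold are assumptions.
import numpy as np
from scipy.signal import spectrogram

fs = 4000                                  # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(3)
signal = 0.5 * np.sin(2 * np.pi * 400 * t) + 0.1 * rng.standard_normal(t.size)

freqs, times, sxx = spectrogram(signal, fs=fs, nperseg=256)
band = (freqs > 100) & (freqs < 1000)      # typical wheeze frequency range
peak_ratio = sxx[band].max(axis=0) / (sxx[band].mean(axis=0) + 1e-12)
wheeze_frames = peak_ratio > 10            # narrow-band dominance per time frame

print(f"fraction of frames flagged as wheeze-like: {wheeze_frames.mean():.2f}")
```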

  7. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  8. Comprehensive Thematic T-Matrix Reference Database: A 2014-2015 Update

    Science.gov (United States)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2015-01-01

    The T-matrix method is one of the most versatile and efficient direct computer solvers of the macroscopic Maxwell equations and is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper is the seventh update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists a number of earlier publications overlooked previously.

  9. Computational issues in the analysis of nonlinear two-phase flow dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Rosa, Mauricio A. Pinheiro [Centro Tecnico Aeroespacial (CTA-IEAv), Sao Jose dos Campos, SP (Brazil). Inst. de Estudos Avancados. Div. de Energia Nuclear], e-mail: pinheiro@ieav.cta.br; Podowski, Michael Z. [Rensselaer Polytechnic Institute, New York, NY (United States)

    2001-07-01

    This paper is concerned with the issue of computer simulations of flow-induced instabilities in boiling channels and systems. A computational model is presented for the time-domain analysis of nonlinear oscillations in interconnected parallel boiling channels, and the results of model testing and validation are shown. One of the main concerns here has been to show the importance of performing numerical testing regarding the selection of a proper numerical integration method and the associated nodalization and time step, as well as to demonstrate the convergence of the numerical solution prior to any analysis. (author)
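
    A bare-bones example of the time-step convergence check the paper argues should precede any stability analysis is sketched below on a generic damped oscillator, not on the boiling-channel model itself.

```python
# Sketch: a time-step convergence check with explicit Euler on a generic
# damped oscillator (x'' + 0.4 x' + x = 0), not the boiling-channel model.
def integrate(dt: float, t_end: float = 10.0) -> float:
    x, v = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        x, v = x + dt * v, v + dt * (-0.4 * v - x)
    return x

coarse, fine, finer = integrate(0.01), integrate(0.005), integrate(0.0025)
print(f"x(10): {coarse:.6f}  {fine:.6f}  {finer:.6f}")
print(f"successive differences: {abs(fine - coarse):.2e}  {abs(finer - fine):.2e}")
```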

  10. Quality Assurance of Computer Programs for Photopeak Integration in Activation Analysis

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    1978-01-01

    The purpose of a computer program for quantitative activation analysis is basically to produce information on the ratio of radioactive decays of a specific radionuclide observed by a detector from two alternative sources. It is assumed that at least one of the sources is known to contain the radionuclide in question, and qualitative analysis is therefore needed only to the extent that the decay characteristics of this radionuclide could be confused with those of other possible radionuclides, thus interfering with its determination. The quality of these computer programs can only be assured...

  11. Computational image analysis of Suspension Plasma Sprayed YSZ coatings

    Directory of Open Access Journals (Sweden)

    Michalak Monika

    2017-01-01

    Full Text Available The paper presents computational studies of microstructure- and topography-related features of suspension plasma sprayed (SPS) coatings of yttria-stabilized zirconia (YSZ). The study mainly covers the porosity assessment, provided by ImageJ software analysis. The influence of boundary conditions, defined by (i) circularity and (ii) size limits, on the computed values of porosity is also investigated. Additionally, a digital topography evaluation is performed: a confocal laser scanning microscope (CLSM) and a scanning electron microscope (SEM) operating in Shape from Shading (SFS) mode measure the surface roughness of the deposited coatings. The computed values of porosity and roughness are related to the variables of the spraying process, which influence the morphology of the coatings and determine the possible fields of their applications.
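
    An area-fraction porosity estimate of the kind obtained with ImageJ can be sketched as thresholding plus a minimum-size filter; the synthetic image, threshold and size limit below are illustrative assumptions, and the circularity criterion is omitted.

```python
# Sketch: area-fraction porosity from a thresholded cross-section image with a
# minimum-size filter. The synthetic image, threshold and size limit are
# illustrative; the circularity criterion is not implemented.
import numpy as np
from scipy import ndimage

def porosity_percent(gray: np.ndarray, threshold: float, min_pore_px: int = 5) -> float:
    pores = gray < threshold                          # pores assumed darker than coating
    labels, n = ndimage.label(pores)
    sizes = ndimage.sum(pores, labels, np.arange(1, n + 1))
    keep_ids = np.nonzero(sizes >= min_pore_px)[0] + 1
    kept = np.isin(labels, keep_ids)
    return 100.0 * kept.sum() / gray.size

rng = np.random.default_rng(2)
fake = ndimage.gaussian_filter(rng.random((512, 512)), sigma=5)  # stand-in micrograph
threshold = np.quantile(fake, 0.10)                              # darkest 10% as pores
print(f"porosity ~ {porosity_percent(fake, threshold):.2f} %")
```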

  12. Comprehensive sequence analysis of nine Usher syndrome genes in the UK National Collaborative Usher Study

    OpenAIRE

    Le Quesne Stabej, Polona; Saihan, Zubin; Rangesh, Nell; Steele-Stallard, Heather B; Ambrose, John; Coffey, Alison; Emmerson, Jenny; Haralambous, Elene; Hughes, Yasmin; Steel, Karen P; Luxon, Linda M; Webster, Andrew R; Bitner-Glindzicz, Maria

    2011-01-01

    Background Usher syndrome (USH) is an autosomal recessive disorder comprising retinitis pigmentosa, hearing loss and, in some cases, vestibular dysfunction. It is clinically and genetically heterogeneous with three distinctive clinical types (I-III) and nine Usher genes identified. This study is a comprehensive clinical and genetic analysis of 172 Usher patients and evaluates the contribution of digenic inheritance. Methods The genes MYO7A, USH1C, CDH23, PCDH15, USH1G, USH2A, GPR98, WHRN, CLR...

  13. Analysis of the computed tomography in the acute abdomen

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Moraes, Everton; Haygert, Carlos Jesus Pereira; Antunes, Paulo Sergio Pase; Gazzoni, Fernando; Lopes, Luis Felipe Dias

    2007-01-01

    Introduction: This study aims to test the capacity of computed tomography to assist in the diagnosis and management of the acute abdomen. Material and method: This is a longitudinal and prospective study in which patients with a diagnosis of acute abdomen were analyzed. A total of 105 cases of acute abdomen were obtained, and after application of the exclusion criteria 28 patients were included in the study. Results: Computed tomography changed the diagnostic hypothesis of the physicians in 50% of the cases (p 0.05); 78.57% of the patients had a surgical indication before computed tomography and 67.86% after computed tomography (p = 0.0546). An accurate diagnosis by computed tomography, when compared to the anatomopathologic examination and the final diagnosis, was observed in 82.14% of the cases (p = 0.013). When the analysis was done dividing the patients into surgical and nonsurgical groups, an accuracy of 89.28% (p 0.0001) was obtained. A difference of 7.2 days of hospitalization (p = 0.003) was obtained compared with the mean for acute abdomen managed without computed tomography. Conclusion: Computed tomography correlates well with the anatomopathology and has great accuracy for surgical indication; it increases the physicians' diagnostic confidence, reduces hospitalization time, reduces the number of surgeries and is cost-effective. (author)

  14. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component in nuclear security. As technology advances, it is anticipated that computer and computing systems will be used to an even greater degree in all aspects of plant operations including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials

  15. The application of computer image analysis in life sciences and environmental engineering

    Science.gov (United States)

    Mazur, R.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.

    2014-04-01

    The main aim of the article was to present research on the application of computer image analysis in Life Sciences and Environmental Engineering. The authors used different methods of computer image analysis in developing an innovative biotest for modern biomonitoring of water quality. The created tools were based on live organisms, the bioindicators Lemna minor L. and Hydra vulgaris Pallas, together with computer image analysis methods for the assessment of negative reactions during the exposure of the organisms to selected water toxicants. All of these methods belong to acute toxicity tests and are particularly essential in the ecotoxicological assessment of water pollutants. The developed bioassays can be used not only in scientific research but are also applicable in environmental engineering and agriculture, in the study of the adverse effects on water quality of various compounds used in agriculture and industry.

  16. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    Science.gov (United States)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
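
    As an aside on the classical post-processing that Shor's algorithm relies on: once the order r of a randomly chosen base a modulo N is known, the factors of N follow from a gcd computation. The sketch below (not from the paper) brute-forces the order classically purely for illustration; the quantum computer's only job is to find r efficiently, and the retry loop mirrors the point made above that several runs are generally needed.

```python
from math import gcd
import random

def order(a, N):
    """Brute-force multiplicative order of a mod N (the step a quantum
    computer performs efficiently via period finding)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_postprocessing(N, max_tries=20):
    """Try random bases a; a run 'fails' when r is odd or a**(r//2) is
    congruent to -1 mod N, so several runs may be needed."""
    for _ in range(max_tries):
        a = random.randrange(2, N - 1)
        g = gcd(a, N)
        if g > 1:                      # lucky: a already shares a factor with N
            return g, N // g
        r = order(a, N)
        if r % 2:                      # odd order -> retry with another a
            continue
        y = pow(a, r // 2, N)
        if y == N - 1:                 # trivial square root -> retry
            continue
        p = gcd(y - 1, N)
        if 1 < p < N:
            return p, N // p
    return None

print(shor_classical_postprocessing(15))   # e.g. (3, 5)
```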

  17. Comprehensive computational design of ordered peptide macrocycles

    Science.gov (United States)

    Hosseinzadeh, Parisa; Bhardwaj, Gaurav; Mulligan, Vikram Khipple; Shortridge, Matthew D.; Craven, Timothy W.; Pardo-Avila, Fátima; Rettie, Stephen A.; Kim, David E.; Silva, Daniel-Adriano; Ibrahim, Yehia M.; Webb, Ian K.; Cort, John R.; Adkins, Joshua N.; Varani, Gabriele; Baker, David

    2018-01-01

    Mixed-chirality peptide macrocycles such as cyclosporine are among the most potent therapeutics identified to date, but there is currently no way to systematically search the structural space spanned by such compounds. Natural proteins do not provide a useful guide: Peptide macrocycles lack regular secondary structures and hydrophobic cores, and can contain local structures not accessible with L-amino acids. Here, we enumerate the stable structures that can be adopted by macrocyclic peptides composed of L- and D-amino acids by near-exhaustive backbone sampling followed by sequence design and energy landscape calculations. We identify more than 200 designs predicted to fold into single stable structures, many times more than the number of currently available unbound peptide macrocycle structures. Nuclear magnetic resonance structures of 9 of 12 designed 7- to 10-residue macrocycles, and three 11- to 14-residue bicyclic designs, are close to the computational models. Our results provide a nearly complete coverage of the rich space of structures possible for short peptide macrocycles and vastly increase the available starting scaffolds for both rational drug design and library selection methods. PMID:29242347

  18. A review of research and recent trends in analysis of composite plates

    Indian Academy of Sciences (India)

    Pravin Kulkarni

    2018-06-07

    Jun 7, 2018 ... This work aims to provide a comprehensive review of research in the structural analysis of composite plates, covering the study of composite materials and families of approaches that balance computational cost and result accuracy.

  19. Visual Cluster Analysis for Computing Tasks at Workflow Management System of the ATLAS Experiment

    CERN Document Server

    Grigoryeva, Maria; The ATLAS collaboration

    2018-01-01

    Hundreds of petabytes of experimental data in high energy and nuclear physics (HENP) have already been obtained by unique scientific facilities such as LHC, RHIC and KEK. As the accelerators are modernized (energy and luminosity increased), data volumes grow rapidly and have reached the exabyte scale, which also increases the number of analysis and data processing tasks continuously competing for computational resources. This growth is met by expanding the computing environment with high-performance computing resources, forming a heterogeneous distributed computing environment (hundreds of distributed computing centers). In addition, errors occur while executing data analysis and processing tasks, caused by software and hardware failures. With a distributed model of data processing and analysis, the optimization of data management and workload systems becomes a fundamental task, and the ...

  20. Cost analysis of a school-based comprehensive malaria program in primary schools in Sikasso region, Mali.

    Science.gov (United States)

    Maccario, Roberta; Rouhani, Saba; Drake, Tom; Nagy, Annie; Bamadio, Modibo; Diarra, Seybou; Djanken, Souleymane; Roschnik, Natalie; Clarke, Siân E; Sacko, Moussa; Brooker, Simon; Thuilliez, Josselin

    2017-06-12

    The expansion of malaria prevention and control to school-aged children is receiving increasing attention, but there are still limited data on the costs of intervention. This paper analyses the costs of a comprehensive school-based intervention strategy, delivered by teachers, that included participatory malaria educational activities, distribution of long lasting insecticide-treated nets (LLIN), and Intermittent Parasite Clearance in schools (IPCs) in southern Mali. Costs were collected alongside a randomised controlled trial conducted in 80 primary schools in Sikasso Region in Mali in 2010-2012. Cost data were compiled between November 2011 and March 2012 for the 40 intervention schools (6413 children). A provider perspective was adopted. Using an ingredients approach, costs were classified by cost category and by activity. Total costs and cost per child were estimated for the actual intervention, as well as for a simpler version of the programme more suited for scale-up by the government. Univariate sensitivity analysis was performed. The economic cost of the comprehensive intervention was estimated at $10.38 per child (financial cost $8.41), with malaria education, LLIN distribution and IPCs costing $2.13 (20.5%), $5.53 (53.3%) and $2.72 (26.2%) per child respectively. Human resources were found to be the key cost driver, and training costs were the greatest contributor to overall programme costs. Sensitivity analysis showed that an adapted intervention delivering one LLIN instead of two would lower the economic cost to $8.66 per child; and that excluding LLIN distribution in schools altogether, for example in settings where malaria control already includes universal distribution of LLINs at community-level, would reduce costs to $4.89 per child. A comprehensive school-based control strategy may be a feasible and affordable way to address the burden of malaria among schoolchildren in the Sahel.
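
    To make the ingredients-approach arithmetic above concrete, the short sketch below recombines the per-child figures reported in the abstract; the underlying ingredient lists and quantities are not reproduced, and the scenario line is only an illustration (the study's own adapted scenario, which also adjusts other ingredients, arrives at $4.89 per child).

```python
# Back-of-the-envelope sketch of the ingredients approach: costs are grouped
# by activity and divided by the number of children covered. Only the
# per-child figures quoted in the abstract are used here.
children = 6413
cost_per_child = {"malaria education": 2.13, "LLIN distribution": 5.53, "IPCs": 2.72}

total_per_child = sum(cost_per_child.values())          # ~10.38 USD
total_programme = total_per_child * children
print(f"economic cost per child: ${total_per_child:.2f}")
print(f"total programme cost:    ${total_programme:,.0f}")

# Simple scenario: drop school LLIN distribution (e.g. already covered by
# community-level campaigns). The study reports $4.89 per child once other
# ingredient changes are also included.
no_llin = total_per_child - cost_per_child["LLIN distribution"]
print(f"without school LLINs:    ${no_llin:.2f} per child")
```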

  1. Cost analysis of a school-based comprehensive malaria program in primary schools in Sikasso region, Mali

    Directory of Open Access Journals (Sweden)

    Roberta Maccario

    2017-06-01

    Full Text Available Abstract Background The expansion of malaria prevention and control to school-aged children is receiving increasing attention, but there are still limited data on the costs of intervention. This paper analyses the costs of a comprehensive school-based intervention strategy, delivered by teachers, that included participatory malaria educational activities, distribution of long lasting insecticide-treated nets (LLIN, and Intermittent Parasite Clearance in schools (IPCs in southern Mali. Methods Costs were collected alongside a randomised controlled trial conducted in 80 primary schools in Sikasso Region in Mali in 2010-2012. Cost data were compiled between November 2011 and March 2012 for the 40 intervention schools (6413 children. A provider perspective was adopted. Using an ingredients approach, costs were classified by cost category and by activity. Total costs and cost per child were estimated for the actual intervention, as well as for a simpler version of the programme more suited for scale-up by the government. Univariate sensitivity analysis was performed. Results The economic cost of the comprehensive intervention was estimated to $10.38 per child (financial cost $8.41 with malaria education, LLIN distribution and IPCs costing $2.13 (20.5%, $5.53 (53.3% and $2.72 (26.2% per child respectively. Human resources were found to be the key cost driver, and training costs were the greatest contributor to overall programme costs. Sensitivity analysis showed that an adapted intervention delivering one LLIN instead of two would lower the economic cost to $8.66 per child; and that excluding LLIN distribution in schools altogether, for example in settings where malaria control already includes universal distribution of LLINs at community-level, would reduce costs to $4.89 per child. Conclusions A comprehensive school-based control strategy may be a feasible and affordable way to address the burden of malaria among schoolchildren in the Sahel.

  2. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to obtain radiation attenuation calculation in non-homogenous radiation shielding and irregular geometry. By determining radiation source strength, geometrical shape of radiation source, location, dimension and geometrical shape of radiation shielding, radiation level of a point at certain position from radiation source can be calculated. By using a computer program, calculation result of radiation distribution analysis can be obtained for some analytical points simultaneously. (author)

  3. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  4. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  5. Two-dimensional liquid chromatography consisting of twelve second-dimension columns for comprehensive analysis of intact proteins.

    Science.gov (United States)

    Ren, Jiangtao; Beckner, Matthew A; Lynch, Kyle B; Chen, Huang; Zhu, Zaifang; Yang, Yu; Chen, Apeng; Qiao, Zhenzhen; Liu, Shaorong; Lu, Joann J

    2018-05-15

    A comprehensive two-dimensional liquid chromatography (LCxLC) system consisting of twelve columns in the second dimension was developed for comprehensive analysis of intact proteins in complex biological samples. The system consisted of an ion-exchange column in the first dimension and the twelve reverse-phase columns in the second dimension; all thirteen columns were monolithic and prepared inside 250 µm i.d. capillaries. These columns were assembled together through the use of three valves and an innovative configuration. The effluent from the first dimension was continuously fractionated and sequentially transferred into the twelve second-dimension columns, while the second-dimension separations were carried out in a series of batches (six columns per batch). This LCxLC system was tested first using standard proteins followed by real-world samples from E. coli. Baseline separation was observed for eleven standard proteins and hundreds of peaks were observed for the real-world sample analysis. Two-dimensional liquid chromatography, often considered as an effective tool for mapping proteins, is seen as laborious and time-consuming when configured offline. Our online LCxLC system with increased second-dimension columns promises to provide a solution to overcome these hindrances. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
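
    For readers unfamiliar with the algorithms being modelled, the sketch below is a canonical particle swarm optimization loop that tracks two simple proxies: the population spread around the current best solution (in the spirit of the paper's variation rate) and the improvement of the best objective value (in the spirit of the progress rate). The paper's exact definitions are not reproduced; everything here is an illustrative assumption.

```python
# Minimal canonical PSO on a toy objective, with illustrative diversity and
# improvement indicators (not the paper's formal indices).
import random

def sphere(x):                                  # toy objective to minimize
    return sum(xi * xi for xi in x)

def pso(dim=5, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                       # personal bests
    g = min(P, key=sphere)[:]                   # global best
    prev_best = sphere(g)
    variation = progress = 0.0
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]
        best = sphere(g)
        variation = sum(sum((x[d] - g[d]) ** 2 for d in range(dim)) for x in X) / n
        progress = prev_best - best             # improvement this iteration
        prev_best = best
    return sphere(g), variation, progress

print(pso())
```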

  7. Affective Computing and Intelligent Interaction

    CERN Document Server

    2012-01-01

    2012 International Conference on Affective Computing and Intelligent Interaction (ICACII 2012) was the most comprehensive conference focused on the various aspects of advances in Affective Computing and Intelligent Interaction. The conference provided a rare opportunity to bring together worldwide academic researchers and practitioners for exchanging the latest developments and applications in this field such as Intelligent Computing, Affective Computing, Machine Learning, Business Intelligence and HCI.   This volume is a collection of 119 papers selected from 410 submissions from universities and industries all over the world, based on their quality and relevancy to the conference. All of the papers have been peer-reviewed by selected experts.  

  8. Computer skills for the next generation of healthcare executives.

    Science.gov (United States)

    Côté, Murray J; Van Enyde, Donald F; DelliFraine, Jami L; Tucker, Stephen L

    2005-01-01

    Students beginning a career in healthcare administration must possess an array of professional and management skills in addition to a strong fundamental understanding of the field of healthcare administration. Proficient computer skills are a prime example of an essential management tool for healthcare administrators. However, it is unclear which computer skills are absolutely necessary for healthcare administrators and the extent of congruency between the computer skills possessed by new graduates and the needs of senior healthcare professionals. Our objectives in this research are to assess which computer skills are the most important to senior healthcare executives and recent healthcare administration graduates and examine the level of agreement between the two groups. Based on a survey of senior healthcare executives and graduate healthcare administration students, we identify a comprehensive and pragmatic array of computer skills and categorize them into four groups, according to their importance, for making recent health administration graduates valuable in the healthcare administration workplace. Traditional parametric hypothesis tests are used to assess congruency between responses of senior executives and of recent healthcare administration graduates. For each skill, responses of the two groups are averaged to create an overall ranking of the computer skills. Not surprisingly, both groups agreed on the importance of computer skills for recent healthcare administration graduates. In particular, computer skills such as word processing, graphics and presentation, using operating systems, creating and editing databases, spreadsheet analysis, using imported data, e-mail, using electronic bulletin boards, and downloading information were among the highest ranked computer skills necessary for recent graduates. However, there were statistically significant differences in perceptions between senior executives and healthcare administration students as to the extent

  9. Advanced Selling: A Comprehensive Course Sales Project

    Science.gov (United States)

    Yarrington-Young, Susan; Castleberry, Stephen B.; Coleman, Joshua T.

    2016-01-01

    A comprehensive project for the Advanced Selling course that has been tested at three universities is introduced. After selecting an industry and a company, students engage in a complete industry analysis, a company sales analysis, a sales-specific SWOT analysis, complete a ride day with a salesperson in that firm, then present their findings in a…

  10. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko were prepared for the first three codes. Because of the high computing cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been done with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been done with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  11. A Video-Based CALL Program for Proficient and Less-Proficient L2 Learners' Comprehension Ability, Incidental Vocabulary Acquisition

    Science.gov (United States)

    Lin, Lu-Fang

    2010-01-01

    This study investigates first whether news video in a computer-assisted language learning (CALL) program can foster second language (L2) comprehension and incidental acquisition of adjectives, nouns, and verbs. Second, this study examines the relationship between the participants' vocabulary acquisition and their video comprehension. The…

  12. Comprehensive proteomic analysis of the wheat pathogenic fungus Zymoseptoria tritici.

    Science.gov (United States)

    Yang, Fen; Yin, Qi

    2016-01-01

    Zymoseptoria tritici causes Septoria tritici blotch disease of wheat. To obtain a comprehensive protein dataset of this fungal pathogen, proteomes of Z. tritici growing in nutrient-limiting and rich media and in vivo at a late stage of wheat infection were fractionated by 1D gel or strong cation exchange (SCX) chromatography and analyzed by LC-MS/MS. A total of 5731, 5376 and 3168 Z. tritici proteins were confidently identified from these conditions, respectively. Of these in vitro and in planta proteins, 9 and 11% were predicted to contain signal peptides, respectively. Functional classification analysis revealed the proteins were involved in the various cellular activities. Comparison of three distinct protein expression profiles demonstrates the elevated carbohydrate, lipid and secondary metabolisms, transport, protein processing and energy production specifically in the host environment, in contrast to the enhancement of signaling, defense, replication, transcription and cell division in vitro. The data provide useful targets towards a better understanding of the molecular basis of Z. tritici growth, development, stress response and pathogenicity. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer aided tolerancing (CAT) method, compares the strengths and shortcomings of the three tolerance analysis methods (Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis), and presents the basic procedure and relevant details of CAT. As the study objects, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the scheme of the tolerance distribution is optimized based on the analysis results. (authors)
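
    As a rough illustration of the contrast between worst-case and Monte-Carlo tolerance analysis mentioned above, the sketch below stacks three hypothetical dimensions into a gap; the nominal values, tolerances and normality assumption are all placeholders, not data from the reactor internals studied in the paper.

```python
import random
import statistics

# Hypothetical linear tolerance stack: gap = A - (B + C); nominal dimensions
# and symmetric tolerances are illustrative only.
dims = {"A": (100.0, 0.20), "B": (60.0, 0.10), "C": (39.5, 0.10)}

def worst_case_gap():
    a, ta = dims["A"]; b, tb = dims["B"]; c, tc = dims["C"]
    nominal = a - b - c
    return nominal - (ta + tb + tc), nominal + (ta + tb + tc)

def monte_carlo_gap(n=100_000):
    gaps = []
    for _ in range(n):
        # assume each dimension is normal with tolerance = 3 sigma
        a = random.gauss(dims["A"][0], dims["A"][1] / 3)
        b = random.gauss(dims["B"][0], dims["B"][1] / 3)
        c = random.gauss(dims["C"][0], dims["C"][1] / 3)
        gaps.append(a - b - c)
    return statistics.mean(gaps), statistics.stdev(gaps)

print("worst-case gap range:     ", worst_case_gap())
print("Monte Carlo gap mean/std: ", monte_carlo_gap())
```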

  14. Efficacy of computer technology-based HIV prevention interventions: a meta-analysis.

    Science.gov (United States)

    Noar, Seth M; Black, Hulda G; Pierce, Larson B

    2009-01-02

    To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P < 0.001); significant effects were also observed for outcomes such as number of sexual partners and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have similar efficacy to more traditional human-delivered interventions. Given their low cost to deliver, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.
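
    The aggregation step described above is, at its core, an inverse-variance weighted mean of per-study effect sizes. The sketch below shows that arithmetic for a fixed-effect model with made-up study values; the pooled d = 0.259 reported in the abstract came from the 12 actual trials, which are not reproduced here.

```python
import math

# Hypothetical per-study effect sizes d and their variances (placeholders).
studies = [(0.31, 0.012), (0.18, 0.020), (0.27, 0.008), (0.22, 0.015)]

def fixed_effect_pooled(studies):
    weights = [1.0 / v for _, v in studies]       # inverse-variance weights
    d_bar = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))            # standard error of pooled d
    z = d_bar / se
    ci = (d_bar - 1.96 * se, d_bar + 1.96 * se)   # 95% confidence interval
    return d_bar, ci, z

d_bar, ci, z = fixed_effect_pooled(studies)
print(f"pooled d = {d_bar:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), Z = {z:.2f}")
```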

  15. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held at Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems such as parameter settings for controlling processes in bioreactors and other processes, resource-constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, localization of an abrupt atmospheric contamination source, and so on. It shows how to develop algorithms for them based on new metaheuristic methods such as evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  16. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  17. Comprehensive techno-economic analysis of wastewater-based algal biofuel production: A case study.

    Science.gov (United States)

    Xin, Chunhua; Addy, Min M; Zhao, Jinyu; Cheng, Yanling; Cheng, Sibo; Mu, Dongyan; Liu, Yuhuan; Ding, Rijia; Chen, Paul; Ruan, Roger

    2016-07-01

    Combining algae cultivation and wastewater treatment for biofuel production is considered a feasible way for resource utilization. An updated comprehensive techno-economic analysis method that integrates resource availability into techno-economic analysis was employed to evaluate wastewater-based algal biofuel production with consideration of wastewater treatment improvement, greenhouse gas emissions, biofuel production costs, and coproduct utilization. An innovative approach consisting of microalgae cultivation on centrate wastewater, microalgae harvest through flocculation, solar drying of biomass, pyrolysis of biomass to bio-oil, and utilization of co-products, was analyzed and shown to yield profound positive results in comparison with others. The estimated break-even selling price of the biofuel ($2.23/gallon) is very close to the acceptable level. The approach would have better overall benefits, and the internal rate of return would increase up to 18.7%, if three critical components, namely cultivation, harvest, and downstream conversion, could achieve breakthroughs. Copyright © 2016 Elsevier Ltd. All rights reserved.
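
    The two headline figures of such an analysis, internal rate of return and break-even selling price, reduce to simple cash-flow arithmetic. The sketch below uses placeholder numbers only; it is not a reconstruction of the study's cash flows or assumptions.

```python
# Minimal sketch of IRR (via bisection on NPV) and a break-even selling
# price; all figures are illustrative placeholders.

def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-6):
    """Bisection on NPV(rate) = 0; assumes a sign change in [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Year 0 capital cost followed by 10 years of net revenue (arbitrary units).
cashflows = [-1_000_000] + [180_000] * 10
print(f"IRR ~ {irr(cashflows):.1%}")

def breakeven_price(annual_gallons, annual_cost, coproduct_credit=0.0):
    """Selling price at which fuel revenue covers annual cost net of credits."""
    return (annual_cost - coproduct_credit) / annual_gallons

print(f"break-even price ~ ${breakeven_price(400_000, 1_000_000):.2f}/gallon")
```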

  18. Computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226 Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a mini-computer with a dual floppy disk storage medium. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer and BASIC language is used for data processing
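
    The rapid estimate such a system produces ultimately comes down to converting a net peak count into a specific activity. The sketch below shows that generic gamma-spectrometry arithmetic; the detector efficiency, counting time and sample mass are illustrative assumptions, not parameters of the ORNL system.

```python
# Generic gamma-spectrometry arithmetic: net counts -> specific activity.
def ra226_specific_activity(net_counts, live_time_s, mass_kg,
                            efficiency=0.25, gamma_yield=0.036):
    """Bq/kg from net counts in the 186 keV region of 226Ra.
    efficiency and counting geometry here are assumed values."""
    count_rate = net_counts / live_time_s           # counts per second
    activity_bq = count_rate / (efficiency * gamma_yield)
    return activity_bq / mass_kg

# e.g. 5400 net counts in 600 s from a 0.5 kg soil sample
print(f"{ra226_specific_activity(5400, 600, 0.5):.0f} Bq/kg")
```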

  19. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project

  20. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system containing computer codes developed as modules to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, along with aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  1. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  2. A Second Look at Dwyer's Studies by Means of Meta-Analysis: The Effects of Pictorial Realism on Text Comprehension and Vocabulary.

    Science.gov (United States)

    Reinwein, Joachim; Huberdeau, Lucie

    A meta-analysis examined a series of studies by F.M. Dwyer on the effect of illustrations on text comprehension. Principal component analysis was used to reduce the four posttests used by Dwyer to more fundamental factors of learning, followed by analyses of variance. All nine studies (involving secondary-school and college students) in which…

  3. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part "creates" the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  4. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Kolar, P.

    2000-01-01

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part 'creates' the problem specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated to an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies, (C) 2000 Elsevier Science...

  5. Comprehensive analysis of the dynamic behavior of grid-connected DFIG-based wind turbines under LVRT conditions

    DEFF Research Database (Denmark)

    Alsmadi, Yazan M.; Xu, Longya; Blaabjerg, Frede

    2015-01-01

    Power generation and grid stability have become key issues in the last decade. The high penetration of large-capacity wind generation into the electric power grid has led to serious concerns about its influence on the dynamic behavior of power systems. The Low-Voltage Ride-Through (LVRT) capability of wind turbines during grid faults is one of the core requirements to ensure stability in the power grid during transients. The doubly-fed induction generators (DFIGs) offer several advantages when utilized in wind turbines, but discussions about their LVRT capabilities are limited. This paper presents a comprehensive study of the LVRT of grid-connected DFIG-based wind turbines. It provides a detailed investigation of the transient characteristics and the dynamic behavior of DFIGs during symmetrical and asymmetrical grid voltage sags. A detailed theoretical study supported by computer...

  6. A Comprehensive Analysis on Wearable Acceleration Sensors in Human Activity Recognition.

    Science.gov (United States)

    Janidarmian, Majid; Roshan Fekr, Atena; Radecka, Katarzyna; Zilic, Zeljko

    2017-03-07

    Sensor-based motion recognition integrates the emerging area of wearable sensors with novel machine learning techniques to make sense of low-level sensor data and provide rich contextual information in a real-life application. Although the Human Activity Recognition (HAR) problem has been drawing the attention of researchers, it is still a subject of much debate due to the diverse nature of human activities and their tracking methods. Finding the best predictive model for this problem while considering different sources of heterogeneity can be very difficult to analyze theoretically, which stresses the need for an experimental study. Therefore, in this paper, we first create the most complete dataset, focusing on accelerometer sensors, with various sources of heterogeneity. We then conduct an extensive analysis of feature representations and classification techniques (the most comprehensive comparison yet, with 293 classifiers) for activity recognition. Principal component analysis is applied to reduce the feature vector dimension while keeping essential information. The average classification accuracy of eight sensor positions is reported to be 96.44% ± 1.62% with 10-fold evaluation, whereas an accuracy of 79.92% ± 9.68% is reached in the subject-independent evaluation. This study presents significant evidence that we can build predictive models for the HAR problem under more realistic conditions, and still achieve highly accurate results.
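
    The evaluation pattern described above (handcrafted features from accelerometer windows, PCA for dimensionality reduction, cross-validated classifiers) can be sketched in a few lines of scikit-learn. The data, feature count and classifier choice below are placeholders, not the paper's 293-classifier benchmark.

```python
# Synthetic data stands in for real accelerometer feature vectors.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 1000 windows x 60 handcrafted features, 6 activity classes (placeholders)
X, y = make_classification(n_samples=1000, n_features=60, n_informative=20,
                           n_classes=6, random_state=0)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),          # keep components explaining 95% variance
    RandomForestClassifier(n_estimators=200, random_state=0),
)

scores = cross_val_score(model, X, y, cv=10)   # 10-fold cross-validation
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```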

  7. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing will be involved in this framework: multiple local distributed computing environments connected by local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to perform the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable to perform the simulation of the multi-scale structural analysis.

  8. CASKETSS: a computer code system for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1989-02-01

    A computer program, CASKETSS, has been developed for the thermal and structural analysis of nuclear fuel shipping casks. CASKETSS means a modular code system for CASK Evaluation code system Thermal and Structural Safety. The main features of CASKETSS are as follows: (1) Thermal and structural analysis computer programs for one-, two- and three-dimensional geometries are contained in the code system. (2) Some of the computer programs in the code system have been programmed to provide near optimal speed on vector processing computers. (3) Data libraries for thermal and structural analysis are provided in the code system. (4) An input data generator is provided in the code system. (5) A graphic computer program is provided in the code system. In the paper, a brief illustration of the calculation method, input data and sample calculations are presented. (author)

  9. The Effectiveness of Grammar Learning in Improving Reading Comprehension of English Majors

    Institute of Scientific and Technical Information of China (English)

    田晓

    2015-01-01

    The importance of grammar knowledge has always been neglected in reading comprehension. To help English teachers and learners see the value of grammar analysis, this paper therefore explores the correlation between grammar and reading comprehension. Forty-four freshmen majoring in English were involved in the experiment, completing two tests of grammar and reading comprehension respectively, followed a week later by a personal interview for some exceptional cases. The results of the data analysis show that grammar analysis, together with vocabulary, emotion and other factors, affects learners' reading comprehension to a certain degree. It is suggested that language teachers as well as learners should therefore attach importance to learning grammatical knowledge.

  10. Data processing of X-ray fluorescence analysis using an electronic computer

    International Nuclear Information System (INIS)

    Yakubovich, A.L.; Przhiyalovskij, S.M.; Tsameryan, G.N.; Golubnichij, G.V.; Nikitin, S.A.

    1979-01-01

    Problems of data processing for multi-element (17-element) X-ray fluorescence analysis of tungsten and molybdenum ores are considered. The analysis was carried out using a silicon-lithium spectrometer with an energy resolution of about 300 eV and a 1024-channel analyzer. The characteristic radiation of the elements was excited with two 109Cd radioisotope sources with a total activity of 10 mCi. The measurement period was 400 s. The data obtained were processed with a computer using the ''Proba-1'' and ''Proba-2'' programs. Data processing algorithms and computer calculation results are presented

  11. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Sarah; Devenish, Robin [Nuclear Physics Laboratory, Oxford University (United Kingdom)

    1989-07-15

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, was reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'.

  12. [Recent advances in analysis of petroleum geological samples by comprehensive two-dimensional gas chromatography].

    Science.gov (United States)

    Gao, Xuanbo; Chang, Zhenyang; Dai, Wei; Tong, Ting; Zhang, Wanfeng; He, Sheng; Zhu, Shukui

    2014-10-01

    Abundant geochemical information can be acquired by analyzing the chemical compositions of petroleum geological samples, and the information obtained provides scientific evidence for petroleum exploration. However, these samples are complicated and can be easily influenced by physical (e.g. evaporation, emulsification, natural dispersion, dissolution and sorption), chemical (photodegradation) and biological (mainly microbial degradation) weathering processes. Therefore, it is very difficult to analyze petroleum geological samples, and they cannot be effectively separated by traditional gas chromatography/mass spectrometry. A newly developed separation technique, comprehensive two-dimensional gas chromatography (GC x GC), has unique advantages in complex sample analysis, and recently it has been applied to petroleum geological samples. This article mainly reviews the research progress of the last five years, the main problems, and future research directions for GC x GC applied in the area of petroleum geology.

  13. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    Science.gov (United States)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  14. Gas analysis by computer-controlled microwave rotational spectrometry

    International Nuclear Information System (INIS)

    Hrubesh, L.W.

    1978-01-01

    Microwave rotational spectrometry has inherently high resolution and is thus nearly ideal for qualitative gas mixture analysis. Quantitative gas analysis is also possible by a simplified method which utilizes the ease with which molecular rotational transitions can be saturated at low microwave power densities. This article describes a computer-controlled microwave spectrometer which is used to demonstrate for the first time a totally automated analysis of a complex gas mixture. Examples are shown for a complete qualitative and quantitative analysis, in which a search of over 100 different compounds is made in less than 7 min, with sensitivity for most compounds in the 10 to 100 ppm range. This technique is expected to find increased use in view of the reduced complexity and increased reliability of microwave spectrometers and because of new energy-related applications for analysis of mixtures of small molecules

  15. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values
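
    For readers unfamiliar with the characteristics method mentioned above, the sketch below is a minimal, frictionless single-pipe illustration (constant-head reservoir upstream, instantaneous valve closure downstream). It is not the HAMOC program and omits friction, column separation and the vessel-driven pressure transient handled by the code; all parameters are assumed values.

```python
# Minimal frictionless method-of-characteristics (MOC) water-hammer sketch.
G = 9.81                      # gravity, m/s^2
A_WAVE = 1000.0               # pressure-wave speed, m/s
LENGTH, AREA = 600.0, 0.05    # pipe length (m) and flow area (m^2)
N = 20                        # number of reaches
H_RES, Q0 = 100.0, 0.05       # reservoir head (m), initial flow (m^3/s)

dx = LENGTH / N
dt = dx / A_WAVE              # MOC time step (Courant number = 1)
B = A_WAVE / (G * AREA)       # characteristic impedance

H = [H_RES] * (N + 1)         # steady, frictionless initial state
Q = [Q0] * (N + 1)
max_head_at_valve = H[N]

for _ in range(200):                              # march 200 time steps
    Hn, Qn = H[:], Q[:]
    for i in range(1, N):                         # interior grid points
        cp = H[i - 1] + B * Q[i - 1]              # C+ characteristic
        cm = H[i + 1] - B * Q[i + 1]              # C- characteristic
        Hn[i] = 0.5 * (cp + cm)
        Qn[i] = (cp - cm) / (2 * B)
    cm = H[1] - B * Q[1]                          # upstream: reservoir fixes head
    Hn[0], Qn[0] = H_RES, (H_RES - cm) / B
    cp = H[N - 1] + B * Q[N - 1]                  # downstream: closed valve, Q = 0
    Hn[N], Qn[N] = cp, 0.0
    H, Q = Hn, Qn
    max_head_at_valve = max(max_head_at_valve, H[N])

# The peak should match the Joukowsky surge a*V0/g above the reservoir head.
print(f"simulated peak head at valve: {max_head_at_valve:.1f} m")
print(f"Joukowsky estimate:           {H_RES + A_WAVE * (Q0 / AREA) / G:.1f} m")
```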

  16. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study made detailed estimates of the reliability of TRU waste repository concepts in a crystalline rock setting by applying a comprehensive sensitivity analysis program. We examined each component and the groundwater scenario of the geological repository and prepared systematic bases to examine the reliability from the point of view of comprehensiveness. Models and data were refined to examine the reliability. Based on an existing TRU waste repository concept, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which will be decided quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of these specifications, the reliability is re-examined for combinations of these parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including fractured media and permeable matrix media; and refinement of tools to develop reliable combinations of parameters. It is important to continue this study because the disposal concepts and specifications for waste containing TRU nuclides at various sites shall be determined rationally and safely through these studies. (author)

  17. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    Science.gov (United States)

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.

  18. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  19. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing conveys syntactic aspects (data transduction, conditioning and reduction); and semantic aspects (feature extraction to describe and characterize the signal and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
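
    The acquisition step described above (sampling at a fixed rate and quantizing to a given amplitude resolution) can be made concrete with a few lines of code. The sampling rate, bit depth and input range below are illustrative assumptions, not values from the article.

```python
# Minimal sketch of sampling and quantization, as an ADC card would perform.
import math

def acquire(signal, duration_s, fs, n_bits, v_range=(-5.0, 5.0)):
    """Sample `signal(t)` at fs Hz and quantize to n_bits over v_range."""
    lo, hi = v_range
    levels = 2 ** n_bits
    step = (hi - lo) / levels
    samples = []
    for k in range(int(duration_s * fs)):
        v = min(max(signal(k / fs), lo), hi)              # clip to input range
        code = min(int((v - lo) / step), levels - 1)      # integer ADC code
        samples.append(code)
    return samples

# e.g. a 10 Hz sine standing in for an EEG-band component: 256 Hz, 12 bits
eeg_like = lambda t: 2.0 * math.sin(2 * math.pi * 10 * t)
codes = acquire(eeg_like, duration_s=1.0, fs=256, n_bits=12)
print(len(codes), min(codes), max(codes))
```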

  20. Comprehensive RNA-Seq Analysis on the Regulation of Tomato Ripening by Exogenous Auxin.

    Directory of Open Access Journals (Sweden)

    Jiayin Li

    Full Text Available Auxin has been shown to modulate the fruit ripening process. However, the molecular mechanisms underlying auxin regulation of fruit ripening are still not clear. Illumina RNA sequencing was performed on mature green cherry tomato fruit 1 and 7 days after auxin treatment, with untreated fruit as a control. The results showed that exogenous auxin maintained system 1 ethylene synthesis and delayed the onset of system 2 ethylene synthesis and the ripening process. At the molecular level, genes associated with stress resistance were significantly up-regulated, but genes related to carotenoid metabolism, cell degradation and energy metabolism were strongly down-regulated by exogenous auxin. Furthermore, genes encoding DNA demethylases were inhibited by auxin, whereas genes encoding cytosine-5 DNA methyltransferases were induced, which contributed to the maintenance of high methylation levels in the nucleus and thus inhibited the ripening process. Additionally, exogenous auxin altered the expression patterns of ethylene and auxin signaling-related genes that were induced or repressed in the normal ripening process, suggesting significant crosstalk between these two hormones during tomato ripening. The present work is the first comprehensive transcriptome analysis of auxin-treated tomato fruit during ripening. Our results provide comprehensive insights into the effects of auxin on the tomato ripening process and the mechanism of crosstalk between auxin and ethylene.

  1. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  2. Polygonal approximation and scale-space analysis of closed digital curves

    CERN Document Server

    Ray, Kumar S

    2013-01-01

    This book covers the most important topics in the area of pattern recognition, object recognition, computer vision, robot vision, medical computing, computational geometry, and bioinformatics systems. Students and researchers will find a comprehensive treatment of polygonal approximation and its real-life applications. The book not only explains the theoretical aspects but also presents applications with detailed design parameters. The systematic development of the concept of polygonal approximation of digital curves and its scale-space analysis are useful and attractive to scholars in many fields.

  3. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  4. The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology

    Science.gov (United States)

    Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo

    2014-01-01

    The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…

  5. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…
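
    The testing loop described above can be pictured with a toy sketch: expand phrase-structure rules and fill terminal slots by random lexical substitution, then inspect the generated strings for errors in the rule set. The grammar and lexicon here are invented for illustration and are not taken from the study.

      # Toy phrase-structure grammar with random lexical substitution (illustrative only).
      import random

      RULES = {
          "S": [["NP", "VP"]],
          "NP": [["Det", "N"]],
          "VP": [["V", "NP"]],
      }
      LEXICON = {
          "Det": ["the", "a"],
          "N": ["linguist", "grammar", "computer"],
          "V": ["tests", "generates"],
      }

      def expand(symbol):
          """Recursively expand a symbol; terminals get a random lexical substitution."""
          if symbol in LEXICON:
              return [random.choice(LEXICON[symbol])]
          production = random.choice(RULES[symbol])
          return [word for part in production for word in expand(part)]

      random.seed(0)
      for _ in range(3):
          print(" ".join(expand("S")))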

  6. Prioritizing strategies for comprehensive liver cancer control in Asia: a conjoint analysis.

    Science.gov (United States)

    Bridges, John F P; Dong, Liming; Gallego, Gisselle; Blauvelt, Barri M; Joy, Susan M; Pawlik, Timothy M

    2012-10-30

    Liver cancer is a complex and burdensome disease, with Asia accounting for 75% of known cases. Comprehensive cancer control requires the use of multiple strategies, but various stakeholders may have different views as to which strategies should have the highest priority. This study identified priorities across multiple strategies for comprehensive liver cancer control (CLCC) from the perspective of liver cancer clinical, policy, and advocacy stakeholders in China, Japan, South Korea and Taiwan. Concordance of priorities was assessed across the region and across respondent roles. Priorities for CLCC were examined as part of a cross-sectional survey of liver cancer experts. Respondents completed several conjoint-analysis choice tasks to prioritize 11 strategies. In each task, respondents judged which of two competing CLCC plans, consisting of mutually exclusive and exhaustive subsets of the strategies, would have the greatest impact. The dependent variable was the chosen plan, which was then regressed on the strategies of different plans. The restricted least squares (RLS) method was utilized to compare aggregate and stratified models, and t-tests and Wald tests were used to test for significance and concordance, respectively. Eighty respondents (69.6%) were eligible and completed the survey. Their primary interests were hepatitis (26%), hepatocellular carcinoma (HCC) (58%), metastatic liver cancer (10%) and transplantation (6%). The most preferred strategies were monitoring at-risk populations (p<0.001), clinician education (p<0.001), and national guidelines (p<0.001). Most priorities were concordant across sites except for three strategies: transplantation infrastructure (p=0.009) was valued lower in China, measuring social burden (p=0.037) was valued higher in Taiwan, and national guidelines (p=0.025) was valued higher in China. Priorities did not differ across stakeholder groups (p=0.438). Priorities for CLCC in Asia include monitoring at-risk populations
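
    The choice tasks lend themselves to a simple regression picture: each task pits plan A against plan B, and the chosen plan is regressed on the difference in strategy composition between the two. The sketch below uses synthetic data and an ordinary least-squares linear probability model purely for illustration; the study itself applied restricted least squares to the real responses.

      # Synthetic choice-based conjoint sketch (not the study's data or exact estimator).
      import numpy as np

      rng = np.random.default_rng(1)
      n_tasks, n_strategies = 400, 11
      true_weights = rng.normal(size=n_strategies)            # unknown strategy priorities
      plan_a = rng.integers(0, 2, size=(n_tasks, n_strategies))
      plan_b = rng.integers(0, 2, size=(n_tasks, n_strategies))

      x = plan_a - plan_b                                      # difference in plan composition
      utility_gap = x @ true_weights + rng.normal(scale=0.5, size=n_tasks)
      chose_a = (utility_gap > 0).astype(float)                # 1 if plan A judged to have greater impact

      # Linear probability model: estimated coefficients rank the strategies
      beta, *_ = np.linalg.lstsq(x, chose_a - 0.5, rcond=None)
      print("estimated priority order of strategies:", np.argsort(-beta))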

  7. NJOY: a comprehensive ENDF/B processing system

    International Nuclear Information System (INIS)

    MacFarlane, R.E.; Barrett, R.J.; Muir, D.W.; Boicourt, R.M.

    1978-01-01

    NJOY is the successor to the MINX code. It provides an efficient and accurate capability for processing ENDF/B-IV and -V data for use in fast reactor, thermal reactor, fusion reactor, shielding, and weapons analysis. NJOY produces neutron cross sections and group-to-group scattering matrices, heat production cross sections, photon production matrices, photon interaction cross sections and group-to-group matrices, delayed neutron spectra, thermal scattering cross sections and matrices, and cross-section covariances. Detailed pointwise cross sections, heating KERMA factors, thermal cross sections, and energy-to-energy thermal matrices are also available for plotting and Monte Carlo applications. NJOY currently processes all types of data on ENDF/B except for the decay chain and fission product yield files. NJOY provides output in the CCCC ISOTXS, BRKOXS, and DLAYXS formats, in DTF/ANISN format, and in a new comprehensive format called MATXS. Other important features of NJOY include free-format input, efficient binary I/O, dynamic storage allocation, an extremely modular structure, an accurate center-of-mass Gaussian integration for two-body scattering, and a flux calculator that makes it possible to compute accurate self-shielded cross sections when wide and intermediate-width resonance effects are important. NJOY is a single, integrated, efficient system that produces almost all of the basic cross sections required for multigroup methods of nuclear analysis. 3 figures, 4 tables
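
    The group-averaging that processing systems of this kind perform amounts to a flux-weighted collapse of pointwise data, roughly sigma_g = ∫ sigma(E) phi(E) dE / ∫ phi(E) dE over each group. The sketch below illustrates that collapse with an invented cross section and weighting spectrum; it is not ENDF/B data and ignores self-shielding.

      # Illustrative flux-weighted multigroup collapse (toy data, not ENDF/B).
      import numpy as np

      def trapezoid(y, x):
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      energy = np.logspace(-5, 7, 2000)                 # eV, fine pointwise grid
      sigma = 5.0 + 50.0 / np.sqrt(energy)              # toy 1/v-like cross section (barns)
      flux = 1.0 / energy                               # toy 1/E weighting spectrum
      group_bounds = np.logspace(-5, 7, 13)             # 12 coarse groups

      sigma_g = []
      for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
          mask = (energy >= lo) & (energy <= hi)
          sigma_g.append(trapezoid(sigma[mask] * flux[mask], energy[mask]) /
                         trapezoid(flux[mask], energy[mask]))
      print(np.round(sigma_g, 2))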

  8. Computer-Mediated Collaborative Learning

    Science.gov (United States)

    Beatty, Ken; Nunan, David

    2004-01-01

    The study reported here investigates collaborative learning at the computer. Ten pairs of students were presented with a series of comprehension questions about Mary Shelley's novel "Frankenstein or a Modern Prometheus" along with a CD-ROM, "Frankenstein Illuminated," containing the novel and a variety of source material. Five students worked with…

  9. Improvement of the computing speed of the FBR fuel pin bundle deformation analysis code 'BAMBOO'

    International Nuclear Information System (INIS)

    Ito, Masahiro; Uwaba, Tomoyuki

    2005-04-01

    JNC has developed a coupled analysis system consisting of the fuel pin bundle deformation analysis code 'BAMBOO' and the thermal hydraulics analysis code 'ASFRE-IV' for the purpose of evaluating the integrity of a subassembly under the BDI condition. This coupled analysis took considerable computation time because it requires convergent calculations to obtain numerically stationary solutions for thermal and mechanical behaviors. We therefore reduced the computation time of the BAMBOO code analysis to make the coupled analysis practicable. 'BAMBOO' is a FEM code, and as such its matrix calculations consume a large memory area to temporarily store intermediate results when solving simultaneous linear equations. The code used the hard disk drive (HDD) as a virtual memory area to save random access memory (RAM). However, the use of the HDD increased the computation time because input/output (I/O) processing with the HDD took much time in data accesses. We improved the code so that it performs I/O processing only in RAM during matrix calculations and runs on high-performance computers. This improvement considerably increased the CPU occupation rate during the simulation and reduced the total simulation time of the BAMBOO code to about one-seventh of that before the improvement. (author)
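
    The essence of the change, keeping intermediate matrix results in RAM instead of spilling them to disk, can be illustrated with a toy timing comparison. This is not the BAMBOO solver; the matrix sizes and file layout are invented, and the point is only that disk I/O in the inner loop of a matrix calculation is far slower than in-memory storage.

      # Toy comparison: intermediate results kept in RAM vs. round-tripped through disk.
      import numpy as np
      import os, tempfile, time

      n, iterations = 800, 20
      a = np.random.rand(n, n)

      def solve_in_memory():
          cache = {}
          for i in range(iterations):
              cache[i] = a @ a                                   # intermediate stays in RAM
          return cache[iterations - 1]

      def solve_via_disk(tmpdir):
          last = None
          for i in range(iterations):
              path = os.path.join(tmpdir, f"step_{i}.npy")
              np.save(path, a @ a)                               # intermediate spilled to disk
              last = np.load(path)                               # and read back for the next step
          return last

      t0 = time.perf_counter(); solve_in_memory(); t_ram = time.perf_counter() - t0
      with tempfile.TemporaryDirectory() as d:
          t0 = time.perf_counter(); solve_via_disk(d); t_disk = time.perf_counter() - t0
      print(f"in-memory: {t_ram:.2f} s, via disk: {t_disk:.2f} s")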

  10. Corporate Social Reporting: a Comprehensive Picture of Indonesian Mining Companies

    Directory of Open Access Journals (Sweden)

    Rosinta Ria Panggabean

    2014-09-01

    Full Text Available Recently, stakeholders have demanded that a company's CSR reporting provide social and environmental information as well as the financial information reported in the financial statements. This research questioned whether the CSR reporting of Indonesian mining companies may be regarded as a mechanism by which social and environmental accountability is discharged. The purpose of this research is to provide a content analysis framework and information on the comprehensiveness of Corporate Social Responsibility (CSR) reporting by Indonesian mining companies. The methodology used is content analysis with a framework derived from the GRI G3.1 Guidelines. Comprehensive reporting contains three types of information for each disclosed CSR item: (i) vision and goals, (ii) management approach, and (iii) performance indicator. The framework was used to assess the comprehensiveness of CSR reports by analyzing the 2012 financial reports and annual reports of Indonesian listed mining companies. The content analysis of the CSR reporting of the listed mining companies in Indonesia shows a low level of comprehensive reporting. This finding agrees with those of prior studies on the completeness of CSR reporting and adds to the debate regarding whether the CSR reporting of Indonesian mining companies can be considered a mechanism for discharging social and environmental accountability.
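
    The scoring rule behind such a framework can be sketched directly: each disclosed CSR item is checked for the three information types, and the share of items carrying all three gives a comprehensiveness score. The items and flags below are hypothetical and only illustrate the mechanics, not the study's data.

      # Hypothetical comprehensiveness scoring of disclosed CSR items.
      from dataclasses import dataclass

      @dataclass
      class CsrItem:
          name: str
          vision_and_goals: bool
          management_approach: bool
          performance_indicator: bool

          def is_comprehensive(self):
              # An item counts as comprehensively reported only if all three types are present.
              return self.vision_and_goals and self.management_approach and self.performance_indicator

      def comprehensiveness_score(items):
          """Share of disclosed items reported with all three information types."""
          return sum(i.is_comprehensive() for i in items) / len(items)

      report = [
          CsrItem("emissions", True, True, False),
          CsrItem("community", True, True, True),
          CsrItem("labour practices", False, True, False),
      ]
      print(f"comprehensive items: {comprehensiveness_score(report):.0%}")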

  11. A comprehensive classification system for lipids

    NARCIS (Netherlands)

    Fahy, E.; Subramaniam, S.; Brown, H.A.; Glass, C.K.; Merrill, A.H.; Murphy, R.C.; Raetz, C.R.H.; Russell, D.W.; Seyama, Y.; Shaw, W.; Shimizu, T.; Spener, F.; van Meer, G.|info:eu-repo/dai/nl/068570368; VanNieuwenhze, M.S.; White, S.H.|info:eu-repo/dai/nl/304843539; Witztum, J.; Dennis, E.A.

    2005-01-01

    Lipids are produced, transported, and recognized by the concerted actions of numerous enzymes, binding proteins, and receptors. A comprehensive analysis of lipid molecules, “lipidomics,” in the context of genomics and proteomics is crucial to understanding cellular physiology and pathology;

  12. Comprehensive entropy weight observability-controllability risk ...

    African Journals Online (AJOL)

    Decision making for water resource planning is often related to social, economic and environmental factors. There are various methods for making decisions about water resource planning alternatives and measures with various shortcomings. A comprehensive entropy weight observability-controllability risk analysis ...
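
    Although the abstract is truncated here, the entropy-weight step common to this family of methods is easy to sketch: criteria whose values vary more across the alternatives receive larger weights. The decision matrix below is invented, and the observability-controllability risk analysis itself involves further steps not shown.

      # Entropy weights for a small, invented decision matrix
      # (rows = planning alternatives, columns = evaluation criteria).
      import numpy as np

      decision_matrix = np.array([
          [0.7, 120.0, 3.0],
          [0.5, 180.0, 2.0],
          [0.9,  90.0, 4.0],
      ])

      p = decision_matrix / decision_matrix.sum(axis=0)        # column-wise normalization
      k = 1.0 / np.log(decision_matrix.shape[0])
      entropy = -k * np.sum(p * np.log(p), axis=0)             # entropy of each criterion
      weights = (1.0 - entropy) / np.sum(1.0 - entropy)        # more dispersion, more weight
      print(np.round(weights, 3))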

  13. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where classical deterministic software is interfaced with the non-linear fitting numerical techniques available in various symbolic environments. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
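
    The derivation pattern described for the stochastic perturbation technique can be reproduced in any symbolic engine. The record uses MAPLE; SymPy is substituted below purely for illustration, and the response function is an arbitrary toy, not one of the paper's boundary value problems.

      # Second-order stochastic perturbation of a toy response u(b) with random parameter b.
      import sympy as sp

      b, b0, sigma = sp.symbols("b b0 sigma", positive=True)
      u = 1 / b**2                                  # toy response depending on random parameter b

      # Taylor expansion of the response about the mean value b0
      u0 = u.subs(b, b0)
      du = sp.diff(u, b).subs(b, b0)
      d2u = sp.diff(u, b, 2).subs(b, b0)

      # Second-order estimates of the first two probabilistic moments,
      # using E[b - b0] = 0 and E[(b - b0)**2] = sigma**2 for the input randomness
      mean_u = sp.simplify(u0 + sp.Rational(1, 2) * d2u * sigma**2)
      var_u = sp.simplify(du**2 * sigma**2)
      print("E[u]   =", mean_u)
      print("Var[u] =", var_u)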

  14. Comprehensive analysis of the mutation spectrum in 301 German ALS families.

    Science.gov (United States)

    Müller, Kathrin; Brenner, David; Weydt, Patrick; Meyer, Thomas; Grehl, Torsten; Petri, Susanne; Grosskreutz, Julian; Schuster, Joachim; Volk, Alexander E; Borck, Guntram; Kubisch, Christian; Klopstock, Thomas; Zeller, Daniel; Jablonka, Sibylle; Sendtner, Michael; Klebe, Stephan; Knehr, Antje; Günther, Kornelia; Weis, Joachim; Claeys, Kristl G; Schrank, Berthold; Sperfeld, Anne-Dorte; Hübers, Annemarie; Otto, Markus; Dorst, Johannes; Meitinger, Thomas; Strom, Tim M; Andersen, Peter M; Ludolph, Albert C; Weishaupt, Jochen H

    2018-04-12

    Recent advances in amyotrophic lateral sclerosis (ALS) genetics have revealed that mutations in any of more than 25 genes can cause ALS, mostly as an autosomal-dominant Mendelian trait. Detailed knowledge about the genetic architecture of ALS in a specific population will be important for genetic counselling but also for genotype-specific therapeutic interventions. Here we combined fragment length analysis, repeat-primed PCR, Southern blotting, Sanger sequencing and whole exome sequencing to obtain a comprehensive profile of genetic variants in ALS disease genes in 301 German pedigrees with familial ALS. We report C9orf72 mutations as well as variants in consensus splice sites and non-synonymous variants in protein-coding regions of ALS genes. We furthermore estimate their pathogenicity by taking into account type and frequency of the respective variant as well as segregation within the families. 49% of our German ALS families carried a likely pathogenic variant in at least one of the earlier identified ALS genes. In 45% of the ALS families, likely pathogenic variants were detected in C9orf72, SOD1, FUS, TARDBP or TBK1, whereas the relative contribution of the other ALS genes in this familial ALS cohort was 4%. We identified several previously unreported rare variants and demonstrated the absence of likely pathogenic variants in some of the recently described ALS disease genes. We here present a comprehensive genetic characterisation of German familial ALS. The present findings are of importance for genetic counselling in clinical practice, for molecular research and for the design of diagnostic gene panels or genotype-specific therapeutic interventions in Europe. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. New Mexico’s comprehensive impaired-driving program : crash data analysis.

    Science.gov (United States)

    2014-03-01

    In late 2004, the National Highway Traffic Safety Administration provided funds through a Cooperative Agreement to the New Mexico Department of Transportation to demonstrate a process for implementing a comprehensive State impaired-driving system. NH...

  16. Comprehensive Assessment of Osteoporosis and Bone Fragility with CT Colonography

    Science.gov (United States)

    Murthy, Naveen S.; Khosla, Sundeep; Clarke, Bart L.; Bruining, David H.; Kopperdahl, David L.; Lee, David C.; Keaveny, Tony M.

    2016-01-01

    Purpose: To evaluate the ability of additional analysis of computed tomographic (CT) colonography images to provide a comprehensive osteoporosis assessment. Materials and Methods: This Health Insurance Portability and Accountability Act–compliant study was approved by our institutional review board with a waiver of informed consent. Diagnosis of osteoporosis and assessment of fracture risk were compared between biomechanical CT analysis and dual-energy x-ray absorptiometry (DXA) in 136 women (age range, 43–92 years), each of whom underwent CT colonography and DXA within a 6-month period (between January 2008 and April 2010). Blinded to the DXA data, biomechanical CT analysis was retrospectively applied to the CT images by using phantomless calibration and finite element analysis to measure bone mineral density and bone strength at the hip and spine. Regression, Bland-Altman, and reclassification analyses and paired t tests were used to compare results. Results: For bone mineral density T scores at the femoral neck, biomechanical CT analysis was highly correlated (R2 = 0.84) with DXA, did not differ from DXA (P = .15, paired t test), and was able to identify osteoporosis (as defined by DXA) with 100% sensitivity in eight of eight patients (95% confidence interval [CI]: 67.6%, 100%) and 98.4% specificity in 126 of 128 patients (95% CI: 94.5%, 99.6%). Considering both the hip and spine, the classification of patients at high risk for fracture by biomechanical CT analysis (those with osteoporosis or "fragile bone strength") agreed well with the classification of clinical osteoporosis by DXA (T score ≤−2.5 at the hip or spine), with 82.8% sensitivity in 24 of 29 patients (95% CI: 65.4%, 92.4%) and 85.7% specificity in 66 of 77 patients (95% CI: 76.2%, 91.8%). Conclusion: Retrospective biomechanical CT analysis of CT colonography performed for colorectal cancer screening provides a comprehensive osteoporosis assessment without requiring changes in imaging protocols.
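
    The sensitivity and specificity figures quoted above can be reproduced from the raw counts; the interval widths in the abstract are consistent with Wilson score intervals, which are used in the sketch below for illustration.

      # Sensitivity/specificity with Wilson score 95% CIs from the abstract's counts.
      from math import sqrt

      def wilson_ci(successes, n, z=1.96):
          """Wilson score interval for a binomial proportion."""
          p = successes / n
          denom = 1 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      for label, hits, total in [("sensitivity (femoral neck)", 8, 8),
                                 ("specificity (femoral neck)", 126, 128)]:
          low, high = wilson_ci(hits, total)
          print(f"{label}: {hits/total:.1%} (95% CI {low:.1%} to {high:.1%})")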

  17. Quantum computing from the ground up

    CERN Document Server

    Perry, Riley Tipton

    2012-01-01

    Quantum computing - the application of quantum mechanics to information - represents a fundamental break from classical information and promises to dramatically increase a computer's power. Many difficult problems, such as the factorization of large numbers, have so far resisted attack by classical computers yet are easily solved with quantum computers. If they become feasible, quantum computers will end standard practices such as RSA encryption. Most of the books or papers on quantum computing require (or assume) prior knowledge of certain areas such as linear algebra or quantum mechanics. The majority of the currently-available literature is hard to understand for the average computer enthusiast or interested layman. This text attempts to teach quantum computing from the ground up in an easily readable way, providing a comprehensive tutorial that includes all the necessary mathematics, computer science and physics.

  18. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    Science.gov (United States)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
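
    One building block of such a framework, recording the probability of agents moving from one behaviour pattern to another, reduces to a transition matrix over pattern labels. The label sequences below are invented; in the paper the patterns themselves are derived statistically from ABM executions.

      # Estimate a behaviour-pattern transition matrix from labelled agent trajectories.
      import numpy as np

      n_patterns = 3
      # rows = agents, columns = time steps; entries = behaviour-pattern id (hypothetical)
      labels = np.array([
          [0, 0, 1, 1, 2],
          [0, 1, 1, 2, 2],
          [1, 1, 0, 0, 0],
      ])

      counts = np.zeros((n_patterns, n_patterns))
      for agent in labels:
          for a, b in zip(agent[:-1], agent[1:]):
              counts[a, b] += 1

      row_sums = counts.sum(axis=1, keepdims=True)
      transition = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
      print(np.round(transition, 2))                 # transition[i, j] = P(next pattern j | current i)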

  19. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated in this work, where, through the computer-aided modelling system ICAS-MoT, two first-principles models have been investigated with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and support the development of operational and optimizing control strategies. Model 1 is taken from the literature and is commonly used for the low-conversion region, while Model 2 has...

  20. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    Science.gov (United States)

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphic processing unit-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of the deformation analysis: it is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and the intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a graphic processing unit. We present and compare two different designs, based on the matrix-free and conventional preconditioned conjugate gradients algorithms, for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
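
    The matrix-free design mentioned above means the solver only ever needs the action of the system matrix on a vector. The sketch below shows that idea with a 1-D Laplacian standing in for the FEM tangent stiffness and NumPy on the CPU standing in for the GPU implementation; it is not the authors' code.

      # Matrix-free conjugate gradients: the operator is applied, never assembled.
      import numpy as np
      from scipy.sparse.linalg import LinearOperator, cg

      n = 1000

      def laplacian_matvec(x):
          """Apply the 1-D Laplacian stencil [-1, 2, -1] without forming the matrix."""
          y = 2.0 * x
          y[:-1] -= x[1:]
          y[1:] -= x[:-1]
          return y

      A = LinearOperator((n, n), matvec=laplacian_matvec, dtype=float)
      b = np.ones(n)
      x, info = cg(A, b)                             # info == 0 signals convergence
      print("converged:", info == 0,
            "residual norm:", np.linalg.norm(laplacian_matvec(x) - b))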