WorldWideScience

Sample records for computational approach reveals

  1. Computational Approaches for Revealing the Structure of Membrane Transporters: Case Study on Bilitranslocase

    Directory of Open Access Journals (Sweden)

    Katja Venko

    Full Text Available The structural and functional details of transmembrane proteins are vastly underexplored, mostly due to experimental difficulties regarding their solubility and stability. Currently, the majority of transmembrane protein structures are still unknown, and this presents a huge experimental and computational challenge. Nowadays, thanks to X-ray crystallography and NMR spectroscopy, over 3000 structures of membrane proteins have been solved, among them only a few hundred unique ones. Due to the vast biological and pharmaceutical interest in the elucidation of the structure and the functional mechanisms of transmembrane proteins, several computational methods have been developed to overcome the experimental gap. If combined with experimental data, computational information enables rapid, low-cost and successful prediction of the molecular structure of unsolved proteins. The reliability of the predictions depends on the availability and accuracy of experimental data associated with structural information. In this review, the following methods are proposed for in silico structure elucidation: sequence-dependent predictions of transmembrane regions, predictions of transmembrane helix–helix interactions, helix arrangements in membrane models, and testing their stability with molecular dynamics simulations. We also demonstrate the use of these computational methods by proposing a model for the molecular structure of the transmembrane protein bilitranslocase. Bilitranslocase is a bilirubin membrane transporter which shares similar tissue distribution and functional properties with some members of the Organic Anion Transporter family and is the only member classified in the Bilirubin Transporter Family. Given its unique properties, bilitranslocase is a potentially interesting drug target. Keywords: Membrane proteins, Bilitranslocase, 3D protein structure, Transmembrane region predictors, Helix–helix interactions
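
    The sequence-dependent prediction of transmembrane regions mentioned in the review is classically based on hydropathy analysis. The sketch below illustrates the general idea with a Kyte-Doolittle sliding window; the window size, threshold, and toy sequence are illustrative assumptions, not a reimplementation of the predictors the review surveys.

    ```python
    # Minimal sketch of sequence-based transmembrane (TM) region prediction via a
    # Kyte-Doolittle hydropathy sliding window. Window size and threshold are
    # common heuristics, not the settings of any specific predictor.

    KD = {  # Kyte-Doolittle hydropathy scale
        'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
        'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
        'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
        'Y': -1.3, 'V': 4.2,
    }

    def tm_segments(seq, window=19, threshold=1.6):
        """Return (start, end) indices of candidate TM helices: stretches whose
        windowed average hydropathy exceeds the threshold."""
        scores = [sum(KD[a] for a in seq[i:i + window]) / window
                  for i in range(len(seq) - window + 1)]
        segments, start = [], None
        for i, s in enumerate(scores):
            if s > threshold and start is None:
                start = i                      # window entered a hydrophobic run
            elif s <= threshold and start is not None:
                segments.append((start, i + window - 1))
                start = None
        if start is not None:
            segments.append((start, len(seq) - 1))
        return segments

    # Toy sequence: a hydrophobic stretch flanked by polar residues
    print(tm_segments("MKRDDE" + "LIVLLAVFAGLLIVAVLLA" + "KRDDESQN"))
    ```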

  2. Computational Approaches Reveal New Insights into Regulation and Function of Non-coding RNAs and their Targets

    KAUST Repository

    Alam, Tanvir

    2016-11-28

    previously known to harbor LD motifs, and we experimentally confirmed some of our predicted motifs. This novel discovery will expand our knowledge of cancer metastasis and will facilitate therapeutic targeting that links specific ncRNAs, via paxillin proteins, to diseases. Finally, through bioinformatics approaches, we identified lncRNAs as markers that distinguish classical from alternative activation of macrophages. This result may prove useful in the diagnosis of infectious diseases.

  3. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions until the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...
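
    The while-program model is easy to mimic in a modern language. As a hedged sketch of that style (Python standing in for the lean PASCAL subset), the following multiplies two natural numbers using nothing but assignment, successor/predecessor arithmetic, comparisons, and while loops:

    ```python
    # A "while-program" for multiplication: only assignment, +1/-1 arithmetic,
    # comparisons, and while loops are used, mirroring the lean language of
    # while-programs described above.

    def multiply(x, y):
        result = 0
        while y > 0:
            t = x
            while t > 0:          # add x to result, one unit at a time
                result = result + 1
                t = t - 1
            y = y - 1
        return result

    assert multiply(6, 7) == 42
    ```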

  4. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  5. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process. Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  6. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it has become clear that mobile cloud computing was established by integrating mobile computing with cloud computing, gaining both storage space and processing speed. Integrating healthcare applications and services is one of the data-intensive approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach leads to integrate all of ...

  7. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture".

  8. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existing computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples of solutions to partial differential equations and of applications in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  9. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
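
    The 'computational' (rather than 'symbolic') treatment of fuzzy input can be illustrated by encoding each fuzzy observation as a symmetric triangular number (center, spread) and running two conventional least-squares fits. This is a generic sketch of that idea, not the paper's exact formulation; the data and the spread model are invented for illustration.

    ```python
    import numpy as np

    # Generic sketch of computational fuzzy regression: each fuzzy observation is
    # a symmetric triangular number (center c_i, spread s_i); centers and spreads
    # are each fitted by ordinary least squares, keeping the procedure close to
    # conventional regression as described above.

    rng = np.random.default_rng(0)
    n = 30
    x = rng.uniform(0, 10, n)
    centers = 2.0 + 1.5 * x + rng.normal(0, 0.3, n)   # fuzzy centers
    spreads = 0.5 + 0.1 * x + rng.normal(0, 0.05, n)  # fuzzy half-widths

    X = np.column_stack([np.ones(n), x])
    beta_c = np.linalg.lstsq(X, centers, rcond=None)[0]  # model for centers
    beta_s = np.linalg.lstsq(X, spreads, rcond=None)[0]  # model for spreads

    x_new = 5.0
    c_hat = beta_c @ [1.0, x_new]
    s_hat = beta_s @ [1.0, x_new]
    print(f"prediction at x=5: center {c_hat:.2f} with spread {s_hat:.2f}")
    ```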

  10. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im

  11. What is computation: An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  12. Integrative approaches to computational biomedicine

    Science.gov (United States)

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques, particularly modelling and simulation, to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as the '4Ps': predictive, preventative, personalized and participatory medicine.

  13. Infinitesimal symmetries: a computational approach

    International Nuclear Information System (INIS)

    Kersten, P.H.M.

    1985-01-01

    This thesis is concerned with computational aspects of the determination of infinitesimal symmetries and Lie-Baecklund transformations of differential equations. Moreover, some problems are calculated explicitly. A brief introduction to concepts in the theory of symmetries and Lie-Baecklund transformations relevant to this thesis is given. The mathematical formalism is briefly reviewed. The jet bundle formulation is chosen because, by its algebraic nature, objects can be described very precisely, which makes it appropriate for implementation. A number of procedures are discussed which make it possible to carry out these computations, which are very extensive in practice, with the help of a computer. The Lie algebras of infinitesimal symmetries of a number of differential equations in mathematical physics are established and some of their applications are discussed, i.e., the Maxwell equations, the nonlinear diffusion equation, the nonlinear Schroedinger equation, the nonlinear Dirac equations and the self-dual SU(2) Yang-Mills equations. Lie-Baecklund transformations of Burgers' equation, the classical Boussinesq equation and the Massive Thirring Model are determined. Furthermore, nonlocal Lie-Baecklund transformations of the last equation are derived. (orig.)

  14. Computational approach in zeolite science

    NARCIS (Netherlands)

    Pidko, E.A.; Santen, van R.A.; Chester, A.W.; Derouane, E.G.

    2009-01-01

    This chapter presents an overview of different computational methods and their application to various fields of zeolite chemistry. We will discuss static lattice methods based on interatomic potentials to predict zeolite structures and topologies, Monte Carlo simulations for the investigation of

  15. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2007-01-01

    The era of seemingly unlimited growth in processor performance is over: single chip architectures can no longer overcome the performance limitations imposed by the power they consume and the heat they generate. Today, Intel and other semiconductor firms are abandoning the single fast processor model in favor of multi-core microprocessors--chips that combine two or more processors in a single package. In the fourth edition of Computer Architecture, the authors focus on this historic shift, increasing their coverage of multiprocessors and exploring the most effective ways of achieving parallelism

  16. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  17. Quantum Computing: a Quantum Group Approach

    OpenAIRE

    Wang, Zhenghan

    2013-01-01

    There is compelling theoretical evidence that quantum physics will change the face of information science. Exciting progress has been made during the last two decades towards the building of a large scale quantum computer. A quantum group approach stands out as a promising route to this holy grail, and provides hope that we may have quantum computers in our future.

  18. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  19. Cognitive Approaches for Medicine in Cloud Computing.

    Science.gov (United States)

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibilities of using the meaning-based approach to data description and analysis will be proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing aim to support the processes of protecting data against unauthorised takeover and serve to enhance the data management processes. The accomplishment of the proposed tasks will be the definition of algorithms for the execution of meaning-based data interpretation processes in safe Cloud Computing. • We propose cognitive methods for data description. • We propose techniques for securing data in Cloud Computing. • We describe applications of cognitive approaches in medicine.

  20. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  1. Computational fluid dynamics a practical approach

    CERN Document Server

    Tu, Jiyuan; Liu, Chaoqun

    2018-01-01

    Computational Fluid Dynamics: A Practical Approach, Third Edition, is an introduction to CFD fundamentals and commercial CFD software for solving engineering problems. The book is designed for a wide variety of engineering students new to CFD, and for practicing engineers learning CFD for the first time. Combining an appropriate level of mathematical background, worked examples, computer screenshots, and step-by-step processes, this book walks the reader through modeling and computing, as well as interpreting CFD results. This new edition has been updated throughout, with new content and improved figures, examples and problems.

  2. Computational neuropharmacology: dynamical approaches in drug discovery.

    Science.gov (United States)

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.

  3. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  4. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  5. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  6. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami, with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags far behind that of DNA origami. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  7. Introducing Computational Approaches in Intermediate Mechanics

    Science.gov (United States)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large-amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment-of-inertia tensors, calculating gravitational potentials for various sources); finding eigenvalues and eigenvectors of matrices (diagonalizing the moment-of-inertia tensor, finding principal axes); and generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members, as appropriate, to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)
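
    One of the course's representative exercises, the large-amplitude pendulum, shows why the numerical treatment is needed: for large angles sin(θ) cannot be replaced by θ. A minimal sketch with a standard ODE integrator follows; the parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Large-amplitude pendulum: theta'' = -(g/L) sin(theta). At large amplitude
    # the small-angle approximation fails, so the equation is integrated
    # numerically, as in the computational exercises described above.

    g, L = 9.81, 1.0

    def pendulum(t, y):
        theta, omega = y
        return [omega, -(g / L) * np.sin(theta)]

    theta0 = np.radians(170.0)  # nearly inverted: strongly non-linear regime
    sol = solve_ivp(pendulum, (0.0, 10.0), [theta0, 0.0],
                    dense_output=True, rtol=1e-8)

    t = np.linspace(0.0, 10.0, 5)
    print(np.degrees(sol.sol(t)[0]))  # angle (degrees) at sample times
    ```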

  8. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  9. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  10. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches that describe knee physiotherapy by introducing a new dimension of foot loading relative to the knee axis alignment, producing an improved functional status of the patient. New physiotherapeutic applications then become possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  11. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should

  12. A computational approach to animal breeding.

    Science.gov (United States)

    Berger-Wolf, Tanya Y; Moore, Cristopher; Saia, Jared

    2007-02-07

    We propose a computational model of mating strategies for controlled animal breeding programs. A mating strategy in a controlled breeding program is a heuristic with some optimization criteria as a goal. Thus, it is appropriate to use the computational tools available for analysis of optimization heuristics. In this paper, we propose the first discrete model of the controlled animal breeding problem and analyse heuristics for two possible objectives: (1) breeding for maximum diversity and (2) breeding a target individual. These two goals are representative of conservation biology and agricultural livestock management, respectively. We evaluate several mating strategies and provide upper and lower bounds for the expected number of matings. While the population parameters may vary and can change the actual number of matings for a particular strategy, the order of magnitude of the number of expected matings and the relative competitiveness of the mating heuristics remains the same. Thus, our simple discrete model of the animal breeding problem provides a novel viable and robust approach to designing and comparing breeding strategies in captive populations.
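
    The "breeding for maximum diversity" objective can be made concrete with a toy heuristic: repeatedly mate the least-related available pair according to a kinship matrix. This is a hedged illustration of the kind of mating strategy being analysed, not one of the paper's actual strategies; the kinship matrix here is random.

    ```python
    import itertools
    import numpy as np

    # Toy greedy heuristic for a maximum-diversity breeding objective:
    # repeatedly mate the least-related available pair under a kinship matrix.
    # Illustrative only; not one of the strategies analysed in the paper.

    rng = np.random.default_rng(2)
    n = 6
    K = rng.uniform(0, 1, (n, n))
    K = (K + K.T) / 2            # kinship is symmetric
    np.fill_diagonal(K, 1.0)     # an individual is fully related to itself

    available = set(range(n))
    matings = []
    while len(available) >= 2:
        i, j = min(itertools.combinations(sorted(available), 2),
                   key=lambda p: K[p])   # least-related pair mates next
        matings.append((i, j))
        available -= {i, j}

    print(matings)
    ```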

  13. Computation within the auxiliary field approach

    International Nuclear Information System (INIS)

    Baeurle, S.A.

    2003-01-01

    Recently, the classical auxiliary field methodology has been developed as a new simulation technique for performing calculations within the framework of classical statistical mechanics. Since the approach suffers from a sign problem, a judicious choice of the sampling algorithm, allowing fast statistical convergence and efficient generation of field configurations, is of fundamental importance for a successful simulation. In this paper we focus on the computational aspects of this simulation methodology. We introduce two different types of algorithms: the single-move auxiliary field Metropolis Monte Carlo algorithm, and two new classes of force-based algorithms which enable multiple-move propagation. In addition, to further optimize the sampling, we describe a preconditioning scheme which permits each field degree of freedom to be treated individually with regard to its evolution through the auxiliary field configuration space. Finally, we demonstrate the validity and assess the competitiveness of these algorithms on a representative practical example. We believe that they may also provide an interesting possibility for enhancing the computational efficiency of other auxiliary field methodologies

  14. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995): the effect depends sensitively on subtle parameter changes (such as the response-stimulus interval). This sensitivity of the negative priming effect holds great potential for applications in research on memory, selective attention, and ageing. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
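
    The single adaptive-threshold idea can be sketched as an accumulator whose decision threshold tracks the ongoing activation; residual activation or inhibition carried over from the previous trial then shortens or lengthens the time to threshold. The sketch below is a generic illustration of such a mechanism with invented parameters, not the CISAM itself.

    ```python
    # Generic adaptive-threshold sketch: activation relaxes toward its input
    # drive while the decision threshold adapts toward the activation level.
    # Residual activation (positive priming) or inhibition (negative priming)
    # from the previous trial changes the time to reach threshold.

    def time_to_threshold(initial_activation, gain=0.2, adapt=0.05,
                          input_drive=1.0, theta0=0.8, max_steps=1000):
        a, theta = initial_activation, theta0
        for step in range(max_steps):
            a += gain * (input_drive - a)    # activation approaches the input
            theta += adapt * (a - theta)     # threshold tracks the activation
            if a >= theta:
                return step                  # "response" is emitted
        return max_steps

    baseline = time_to_threshold(0.0)
    positive = time_to_threshold(0.3)    # residual activation from last trial
    negative = time_to_threshold(-0.3)   # residual inhibition from last trial
    print(baseline, positive, negative)  # positive < baseline < negative
    ```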

  15. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC FP7 project is reported and an associated blueprint prototype implementation is presented.

  16. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria, and these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genomes, and containing mobility genes so that they can be integrated into the host genome. In this review, we will discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources will also be discussed, as researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.

  17. Interactions and functionalities of the gut revealed by computational approaches

    NARCIS (Netherlands)

    Benis, Nirupama

    2017-01-01

    The gastrointestinal tract is the subject of much research owing to its role as the gatekeeper of an organism’s health. The tissue acts as a barrier to keep out harmful substances like pathogens and toxins while absorbing nutrients that arise from the digestion of dietary components in

  18. A comparative approach to closed-loop computation.

    Science.gov (United States)

    Roth, E; Sponberg, S; Cowan, N J

    2014-04-01

    Neural computation is inescapably closed-loop: the nervous system processes sensory signals to shape motor output, and motor output consequently shapes sensory input. Technological advances have enabled neuroscientists to close, open, and alter feedback loops in a wide range of experimental preparations. The experimental capability of manipulating the topology (that is, how information can flow between subsystems) provides new opportunities to understand the mechanisms and computations underlying behavior. These experiments encompass a spectrum of approaches, from fully open-loop, restrained preparations to the fully closed-loop character of free behavior. Control theory and system identification provide a clear computational framework for relating these experimental approaches. We describe recent progress and new directions for translating experiments at one level in this spectrum to predictions at another level. Operating across this spectrum can reveal new understanding of how low-level neural mechanisms relate to high-level function during closed-loop behavior. Copyright © 2013 Elsevier Ltd. All rights reserved.
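
    The control-theoretic framing can be made concrete with a toy system-identification exercise: simulate a simple plant inside a feedback loop and recover its parameters from the recorded input-output data. The sketch below uses an invented first-order discrete plant with proportional feedback; it illustrates the general method, not any preparation from the review.

    ```python
    import numpy as np

    # Toy closed-loop system identification: a discrete first-order plant
    # x[k+1] = a*x[k] + b*u[k] runs under proportional feedback plus a probe
    # signal, and (a, b) are recovered from the data by least squares.

    rng = np.random.default_rng(1)
    a_true, b_true, k_fb = 0.9, 0.5, 0.4

    x = np.zeros(200)
    u = np.zeros(200)
    for k in range(199):
        # feedback plus a persistent probe keeps identification well-posed
        u[k] = -k_fb * x[k] + rng.normal(0.0, 1.0)
        x[k + 1] = a_true * x[k] + b_true * u[k] + rng.normal(0.0, 0.01)

    Phi = np.column_stack([x[:-1], u[:-1]])    # regress x[k+1] on (x[k], u[k])
    a_hat, b_hat = np.linalg.lstsq(Phi, x[1:], rcond=None)[0]
    print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
    ```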

  19. Bioinformatic approaches reveal metagenomic characterization of soil microbial community.

    Directory of Open Access Journals (Sweden)

    Zhuofei Xu

    Full Text Available As is well known, soil is a complex ecosystem harboring the most prokaryotic biodiversity on the Earth. In recent years, the advent of high-throughput sequencing techniques has greatly facilitated the progress of soil ecological studies. However, how to effectively understand the underlying biological features of large-scale sequencing data is a new challenge. In the present study, we used 33 publicly available metagenomes from diverse soil sites (i.e. grassland, forest soil, desert, Arctic soil, and mangrove sediment) and integrated some state-of-the-art computational tools to explore the phylogenetic and functional characterizations of the microbial communities in soil. Microbial composition and metabolic potential in soils were comprehensively illustrated at the metagenomic level. A spectrum of metagenomic biomarkers containing 46 taxa and 33 metabolic modules were detected to be significantly differential, and these could be used as indicators to distinguish at least one of the five soil communities. The co-occurrence associations between complex microbial compositions and functions were inferred by network-based approaches. Our results, together with the established bioinformatic pipelines, should provide a foundation for future research into the relation between soil biodiversity and ecosystem function.

  20. a Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  1. Substrate channel in nitrogenase revealed by a molecular dynamics approach.

    Science.gov (United States)

    Smith, Dayle; Danyal, Karamatullah; Raugei, Simone; Seefeldt, Lance C

    2014-04-15

    Mo-dependent nitrogenase catalyzes the biological reduction of N2 to two NH3 molecules at FeMo-cofactor buried deep inside the MoFe protein. Access of substrates, such as N2, to the active site is likely restricted by the surrounding protein, requiring substrate channels that lead from the surface to the active site. Earlier studies on crystallographic structures of the MoFe protein have suggested three putative substrate channels. Here, we have utilized submicrosecond atomistic molecular dynamics simulations to allow the nitrogenase MoFe protein to explore its conformational space in an aqueous solution at physiological ionic strength, revealing a putative substrate channel. The viability of this observed channel was tested by examining the free energy of passage of N2 from the surface through the channel to FeMo-cofactor, resulting in the discovery of a very low energy barrier. These studies point to a viable substrate channel in nitrogenase that appears during thermal motions of the protein in an aqueous environment and that approaches a face of FeMo-cofactor earlier implicated in substrate binding.

  2. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  3. CREATIVE APPROACHES TO COMPUTER SCIENCE EDUCATION

    Directory of Open Access Journals (Sweden)

    V. B. Raspopov

    2010-04-01

    Full Text Available Using the example of the PPS «Toolbox of multimedia lessons “For Children About Chopin”», we demonstrate the possibility of involving creative students in developing software packages for educational purposes. Similar projects can be assigned to school and college students studying computer science and informatics, and implemented under the teachers’ supervision as advanced assignments or thesis projects within a high-school IT or Computer Science course, a college course of Applied Scientific Research, or as part of preparation for students’ participation in Computer Science or IT competitions of the Youth Academy of Sciences (MAN in Russian or Ukrainian).

  4. Computer science approach to quantum control

    International Nuclear Information System (INIS)

    Janzing, D.

    2006-01-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  5. Computational and Experimental Approaches to Visual Aesthetics

    Science.gov (United States)

    Brachmann, Anselm; Redies, Christoph

    2017-01-01

    Aesthetics has been the subject of long-standing debates by philosophers and psychologists alike. In psychology, it is generally agreed that aesthetic experience results from an interaction between perception, cognition, and emotion. By experimental means, this triad has been studied in the field of experimental aesthetics, which aims to gain a better understanding of how aesthetic experience relates to fundamental principles of human visual perception and brain processes. Recently, researchers in computer vision have also gained interest in the topic, giving rise to the field of computational aesthetics. With computing hardware and methodology developing at a high pace, the modeling of perceptually relevant aspects of aesthetic stimuli has huge potential. In this review, we present an overview of recent developments in computational aesthetics and how they relate to experimental studies. In the first part, we cover topics such as the prediction of ratings, style and artist identification, as well as computational methods in art history, such as the detection of influences among artists or of forgeries. We also describe currently used computational algorithms, such as classifiers and deep neural networks. In the second part, we summarize results from the field of experimental aesthetics and cover several isolated image properties that are believed to have an effect on the aesthetic appeal of visual stimuli. Their relation to each other and to findings from computational aesthetics is discussed. Moreover, we compare the strategies in the two fields of research and suggest that both fields would greatly profit from a joint research effort. We hope to encourage researchers from both disciplines to work more closely together in order to understand visual aesthetics from an integrated point of view. PMID:29184491

  6. Computational Approaches to Chemical Hazard Assessment

    Science.gov (United States)

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration, Evaluation, Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  7. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  8. Approaching Engagement towards Human-Engaged Computing

    DEFF Research Database (Denmark)

    Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

    2018-01-01

    Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the increasingly intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to

  9. Computational and mathematical approaches to societal transitions

    NARCIS (Netherlands)

    J.S. Timmermans (Jos); F. Squazzoni (Flaminio); J. de Haan (Hans)

    2008-01-01

    After an introduction of the theoretical framework and concepts of transition studies, this article gives an overview of how structural change in social systems has been studied from various disciplinary perspectives. This overview first leads to the conclusion that computational and

  10. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism (C++ AMP) framework recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011), we document a ...

  11. A Constructive Induction Approach to Computer Immunology

    Science.gov (United States)

    1999-03-01

    [LVM98] Lamont, Gary B., David A. Van Veldhuizen, and Robert E. Marmelstein, A Distributed Architecture for a Self-Adaptive Computer Virus... Artificial Intelligence, Herndon, VA, 1995. [MVL98] Marmelstein, Robert E., David A. Van Veldhuizen, and Gary B. Lamont. Modeling & Analysis

  12. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    This thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  13. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standard...

  14. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking...

  15. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.
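
    The segment-extract-classify workflow the review describes can be sketched end to end on synthetic "cell images". Everything below (the synthetic data, the simple area/intensity features, the random-forest classifier) is an invented illustration of the pattern, not a method from the review.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Sketch of the feature-extraction -> classification pipeline: make a toy
    # "cell image", threshold it (a crude segmentation), extract features, and
    # classify the phenotype. Real pipelines use far richer features.

    rng = np.random.default_rng(3)

    def make_image(phenotype):
        img = rng.normal(0.1, 0.02, (32, 32))
        r = 6 if phenotype == 0 else 10          # phenotype changes "cell" size
        yy, xx = np.mgrid[:32, :32]
        img[(yy - 16) ** 2 + (xx - 16) ** 2 < r ** 2] += 0.8
        return img

    def features(img):
        mask = img > 0.5                         # crude segmentation
        return [mask.sum(),                      # area
                img[mask].mean(),                # mean intensity in the "cell"
                img.std()]                       # global contrast

    labels = rng.integers(0, 2, 200)
    X = np.array([features(make_image(y)) for y in labels])

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print(cross_val_score(clf, X, labels, cv=5).mean())  # ~1.0 on this toy task
    ```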

  16. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement through parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm for further reduction of computational time through algorithm modifications, and integrates it with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare computational efficiency and consider the potential applications of the optimized trajectories. The paper shows that in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
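
    The cluster option works because route optimizations for different airport pairs are independent, so they parallelize trivially. Below is a minimal sketch of that pattern with Python's multiprocessing; the great-circle distance here is only a stand-in for the actual wind-optimal trajectory optimizer, and the coordinates are approximate.

    ```python
    from multiprocessing import Pool
    import math

    # Pattern behind the multi-computer option: independent airport-pair
    # computations are farmed out to worker processes. great_circle_nm is a
    # stand-in cost function, not FACET's wind-optimal optimizer.

    def great_circle_nm(pair):
        (lat1, lon1), (lat2, lon2) = pair
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        d = math.acos(math.sin(p1) * math.sin(p2) +
                      math.cos(p1) * math.cos(p2) * math.cos(dlon))
        return 3440.1 * d  # Earth radius in nautical miles

    pairs = [((37.6, -122.4), (40.6, -73.8)),   # SFO-JFK (approximate)
             ((51.5, -0.5), (35.6, 139.8)),     # LHR-HND
             ((-33.9, 151.2), (34.0, -118.4))]  # SYD-LAX

    if __name__ == "__main__":
        with Pool() as pool:                    # one worker per CPU core
            for nm in pool.map(great_circle_nm, pairs):
                print(f"{nm:.0f} nm")
    ```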

  17. Computational approach to large quantum dynamical problems

    International Nuclear Information System (INIS)

    Friesner, R.A.; Brunet, J.P.; Wyatt, R.E.; Leforestier, C.; Binkley, S.

    1987-01-01

    The organizational structure is described for a new program that permits computations on a variety of quantum mechanical problems in chemical dynamics and spectroscopy. Particular attention is devoted to developing and using algorithms that exploit the capabilities of current vector supercomputers. A key component in this procedure is the recursive transformation of the large sparse Hamiltonian matrix into a much smaller tridiagonal matrix. An application to time-dependent laser-molecule energy transfer is presented. The rate of energy deposition in the multimode molecule for systematic variations in the molecular intermode coupling parameters is emphasized.
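
    The recursive sparse-to-tridiagonal reduction described here is essentially a Lanczos iteration; a minimal NumPy sketch (dense test matrix for brevity, no reorthogonalization, so it is illustrative rather than production-grade):

    # Sketch: Lanczos recursion reducing a Hermitian matrix H to a small
    # tridiagonal matrix T (alpha on the diagonal, beta off the diagonal).
    import numpy as np

    def lanczos(H, v0, m):
        n = len(v0)
        V = np.zeros((m, n))
        alpha, beta = np.zeros(m), np.zeros(m - 1)
        V[0] = v0 / np.linalg.norm(v0)
        w = H @ V[0]
        alpha[0] = V[0] @ w
        w -= alpha[0] * V[0]
        for j in range(1, m):
            beta[j - 1] = np.linalg.norm(w)
            V[j] = w / beta[j - 1]
            w = H @ V[j] - beta[j - 1] * V[j - 1]
            alpha[j] = V[j] @ w
            w -= alpha[j] * V[j]
        return alpha, beta  # defines the m-by-m tridiagonal matrix T

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 200)); H = (A + A.T) / 2
    alpha, beta = lanczos(H, rng.standard_normal(200), 30)
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    print(np.sort(np.linalg.eigvalsh(T))[:3])  # extremal eigenvalues converge first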

  18. A complex network approach to cloud computing

    International Nuclear Information System (INIS)

    Travieso, Gonzalo; Ruggiero, Carlos Antônio; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2016-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the clients' tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate processing performance in cloud systems underlain by Erdős–Rényi (ER) and Barabási–Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of the cost of communication between the client and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter, the ER topology provides better performance than the BA for smaller average degrees, and the opposite holds for larger average degrees. With respect to cost, smaller values are found in the BA topology irrespective of the average degree. In addition, we also verified that it is easier to find good servers in ER than in BA networks. Surprisingly, balance and cost are not strongly affected by the presence of communities. However, for a well-defined community network, we found that it is important to assign each server to a different community so as to achieve better performance.
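
    The reported experiment is straightforward to reproduce in outline; a sketch assuming networkx, with hop distance to the nearest of two randomly placed servers as the cost and the client split between servers as the balance (sizes and seeds are illustrative):

    # Sketch: communication cost and load balance for two servers placed on
    # Erdos-Renyi vs Barabasi-Albert graphs (sizes and seeds are illustrative).
    import networkx as nx
    import random

    def cost_and_balance(G, seed=0):
        random.seed(seed)
        s1, s2 = random.sample(list(G.nodes), 2)
        d1 = nx.single_source_shortest_path_length(G, s1)
        d2 = nx.single_source_shortest_path_length(G, s2)
        costs, load1 = [], 0
        for v in G.nodes:
            if v in (s1, s2):
                continue
            c1, c2 = d1.get(v, float("inf")), d2.get(v, float("inf"))
            costs.append(min(c1, c2))
            load1 += c1 <= c2                   # client goes to the nearer server
        n = len(costs)
        balance = min(load1, n - load1) / max(load1, n - load1)
        return sum(costs) / n, balance          # mean cost, balance in (0, 1]

    er = nx.gnp_random_graph(500, 8 / 499, seed=1)   # mean degree ~ 8
    ba = nx.barabasi_albert_graph(500, 4, seed=1)    # mean degree ~ 8
    print("ER:", cost_and_balance(er))
    print("BA:", cost_and_balance(ba))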

  19. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to novel and important CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including the new class of cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. Applications of FNNs to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty and a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, together with an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and results of their application for bankruptcy ris...

  20. Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.

    Science.gov (United States)

    Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi

    2016-06-01

    Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether the contribution of combinatorial regulation to the state transition is better than that of a single perturbation and, if so, what the optimal combination strategy is, are significant issues from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Unlike existing methods, the bifurcation-based approach depends only on stable state responses to stimuli, because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.
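
    The essence of the method, tracking how stable states appear and vanish as perturbation strength varies, can be illustrated on a toy bistable switch; the model and parameters below are our own illustration, not the paper's:

    # Sketch: scan a stimulus s and count stable fixed points of the bistable
    # toy model dx/dt = s + k*x**2/(1 + x**2) - x (illustrative only).
    import numpy as np

    def stable_states(s, k=2.0):
        # Fixed points solve -x**3 + (s + k)*x**2 - x + s = 0, a cubic in x.
        roots = np.roots([-1.0, s + k, -1.0, s])
        real = roots[np.abs(roots.imag) < 1e-9].real
        # Stability: the derivative of the vector field must be negative.
        f_prime = lambda x: k * 2 * x / (1 + x**2) ** 2 - 1
        return sorted(x for x in real if x >= 0 and f_prime(x) < 0)

    for s in np.linspace(0.0, 0.4, 9):
        print(f"s = {s:.2f}  stable states: {[round(x, 3) for x in stable_states(s)]}")
    # The stimulus value where the count drops from 2 to 1 marks the bifurcation;
    # combinatorial perturbations shift where this transition occurs.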

  1. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin; Alkhalifah, Tariq Ali

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes by successively solving elliptically isotropic traveltime problems. The method shows good accuracy and is very simple to implement.

  2. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  3. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to modeling tree crown development: experimental (i.e. regression-based), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The assumption common to all three is that a tree can be regarded as a fractal object, a collection of self-similar parts that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model against experimental data. Different stages of the above-mentioned approaches are described in the paper. Experimental data for spruce, a description of the computer modeling system, and a variant of the computer model are presented. (author). 9 refs, 4 figs

  4. Bioinspired Computational Approach to Missing Value Estimation

    Directory of Open Access Journals (Sweden)

    Israel Edem Agbehadji

    2018-01-01

    Full Text Available Missing data occur when values of variables in a dataset are not stored. Estimating these missing values is a significant step during the data cleansing phase of a big data management approach. Missing data may result from nonresponse or omitted entries, and if not handled properly may produce inaccurate results during data analysis. Although traditional methods such as maximum likelihood estimation can extrapolate missing values, this paper proposes a bioinspired method based on the behavior of birds, specifically the Kestrel bird. This paper describes the behavior and characteristics of the Kestrel bird, a bioinspired approach, in modeling an algorithm to estimate missing values. The proposed algorithm (KSA) was compared with the WSAMP, Firefly, and BAT algorithms. The results were evaluated using the mean absolute error (MAE). Statistical tests (the Wilcoxon signed-rank test and the Friedman test) were conducted to assess the performance of the algorithms. The results of the Wilcoxon test indicate that time does not have a significant effect on performance, and that the difference in estimation quality between the paired algorithms was significant; the results of the Friedman test ranked KSA as the best evolutionary algorithm.
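
    The KSA algorithm itself is not reproduced here, but the evaluation protocol described (hold out known entries, impute, score by MAE, compare paired algorithms with the Wilcoxon signed-rank test) can be sketched; simple mean and median imputers stand in for the compared estimators:

    # Sketch of the evaluation protocol only: mask known values, impute,
    # score with mean absolute error, compare two imputers pairwise.
    import numpy as np
    from scipy.stats import wilcoxon

    rng = np.random.default_rng(42)
    X = rng.normal(50, 10, size=(200, 5))           # synthetic "complete" data
    mask = rng.random(X.shape) < 0.1                # 10% of entries held out

    def impute(Xm, stat):
        out = Xm.copy()
        fill = stat(out, axis=0)                    # per-column statistic
        idx = np.where(np.isnan(out))
        out[idx] = np.take(fill, idx[1])
        return out

    Xm = X.copy(); Xm[mask] = np.nan
    errs_a = np.abs(impute(Xm, np.nanmean)[mask] - X[mask])    # per-entry errors
    errs_b = np.abs(impute(Xm, np.nanmedian)[mask] - X[mask])
    print("MAE mean-imputer:  ", errs_a.mean())
    print("MAE median-imputer:", errs_b.mean())
    print("Wilcoxon signed-rank p-value:", wilcoxon(errs_a, errs_b).pvalue)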

  5. Computational fluid dynamics in ventilation: Practical approach

    Science.gov (United States)

    Fontaine, J. R.

    The potential of computational fluid dynamics (CFD) for designing ventilation systems is shown through the simulation of five practical cases. The following examples are considered: capture of pollutants on a surface treating tank equipped with a unilateral suction slot in the presence of a disturbing air draft opposed to suction; dispersion of solid aerosols inside fume cupboards; performance comparison of two general ventilation systems in a silkscreen printing workshop; ventilation of a large open painting area; and oil fog removal inside a mechanical engineering workshop. Whereas the first two problems are analyzed through two-dimensional numerical simulations, the three other cases require three-dimensional modeling. For the surface treating tank case, numerical results are compared with laboratory experiment data. All simulations are carried out using EOL, a CFD software package specially devised to deal with air quality problems in industrial ventilated premises. It contains many analysis tools to interpret the results in terms familiar to the industrial hygienist. Much experimental work has been undertaken to validate the predictions of EOL for ventilation flows.

  6. Metabolomics Approach Reveals Integrated Metabolic Network Associated with Serotonin Deficiency

    Science.gov (United States)

    Weng, Rui; Shen, Sensen; Tian, Yonglu; Burton, Casey; Xu, Xinyuan; Liu, Yi; Chang, Cuilan; Bai, Yu; Liu, Huwei

    2015-07-01

    Serotonin is an important neurotransmitter that broadly participates in various biological processes. While serotonin deficiency has been associated with multiple pathological conditions such as depression, schizophrenia, Alzheimer's disease and Parkinson's disease, the serotonin-dependent mechanisms remain poorly understood. This study therefore aimed to identify novel biomarkers and metabolic pathways perturbed by serotonin deficiency using a metabolomics approach, in order to gain new metabolic insights into serotonin deficiency-related molecular mechanisms. Serotonin deficiency was achieved through pharmacological inhibition of tryptophan hydroxylase (Tph) using p-chlorophenylalanine (pCPA) or genetic knockout of the neuronal specific Tph2 isoform. This dual approach improved specificity for the serotonin deficiency-associated biomarkers while minimizing nonspecific effects of pCPA treatment or Tph2 knockout (Tph2-/-). Non-targeted metabolic profiling and a targeted pCPA dose-response study identified 21 biomarkers in the pCPA-treated mice, while 17 metabolites in the Tph2-/- mice were found to be significantly altered compared with the control mice. These newly identified biomarkers were associated with amino acid, energy, purine, lipid and gut microflora metabolism. Oxidative stress was also found to be significantly increased in the serotonin-deficient mice. These new biomarkers and the overall metabolic pathways may provide new understanding of the serotonin deficiency-associated mechanisms under multiple pathological states.

  7. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches, represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
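
    The abstract does not spell out the formal definitions, so the sketch below uses stand-in definitions chosen only to illustrate the idea of monitoring variety and optimality during a population-based search; the names variation_rate and progress_rate here are ours, not the paper's:

    # Illustrative only: one plausible way to monitor a population-based search.
    # These definitions are stand-ins, not the paper's formal ones.
    import numpy as np

    def variation_rate(pop_prev, pop_next, tol=1e-12):
        """Fraction of individuals that moved between two iterations."""
        moved = np.any(np.abs(pop_next - pop_prev) > tol, axis=1)
        return moved.mean()

    def progress_rate(best_prev, best_next):
        """Relative improvement of the best (minimized) objective value."""
        return (best_prev - best_next) / max(abs(best_prev), 1e-12)

    f = lambda x: np.sum(x**2, axis=1)                  # sphere test function
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(30, 2))
    for it in range(5):
        new_pop = pop + rng.normal(0, 0.5, pop.shape)   # toy mutation step
        keep = f(new_pop) < f(pop)                      # greedy selection
        next_pop = np.where(keep[:, None], new_pop, pop)
        print(it, "variation:", variation_rate(pop, next_pop),
              "progress:", progress_rate(f(pop).min(), f(next_pop).min()))
        pop = next_pop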

  8. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  9. Digging deeper on "deep" learning: A computational ecology approach.

    Science.gov (United States)

    Buscema, Massimo; Sacco, Pier Luigi

    2017-01-01

    We propose an alternative approach to "deep" learning that is based on computational ecologies of structurally diverse artificial neural networks, and on dynamic associative memory responses to stimuli. Rather than focusing on massive computation of many different examples of a single situation, we opt for model-based learning and adaptive flexibility. Cross-fertilization of learning processes across multiple domains is the fundamental feature of human intelligence that must inform "new" artificial intelligence.

  10. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  11. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  12. Optimization of rootkit revealing system resources – A game theoretic approach

    Directory of Open Access Journals (Sweden)

    K. Muthumanickam

    2015-10-01

    Full Text Available A malicious rootkit is a collection of programs designed with the intent of infecting and monitoring a victim computer without the user's permission. After the victim has been compromised, the remote attacker can easily cause further damage. In order to infect, compromise and monitor, rootkits adopt Native Application Programming Interface (API) hooking techniques. To reveal hidden rootkits, current rootkit detection techniques check the different data structures which hold references to native APIs. Verifying these data structures requires a large amount of system resources, because the number of APIs in these data structures is quite large. Game theory is a useful mathematical tool to simulate network attacks. In this paper, a mathematical model is framed to optimize resource consumption using game theory. To the best of our knowledge, this is the first work proposed to optimize resource consumption while revealing rootkit presence using game theory. A non-cooperative game model is used to formulate the problem. Analysis and simulation results show that our game-theoretic model can effectively reduce resource consumption by selectively monitoring the number of APIs on the Windows platform.
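
    The underlying trade-off, exhaustive monitoring is safe but expensive, can be captured by a classic two-player inspection game; the sketch below computes the mixed-strategy equilibrium of such a game with illustrative payoffs (not the paper's model):

    # Sketch: mixed-strategy equilibrium of a 2x2 "inspect vs. hook" game.
    # Payoff numbers are illustrative, not taken from the paper.
    # Defender rows: monitor / idle; attacker cols: hook API / stay quiet.
    D = [[  5.0, -1.0],    # monitoring catches a hook (+5) but costs when idle (-1)
         [-10.0,  0.0]]    # an unmonitored hook is very costly (-10)
    A = [[-5.0, 0.0],      # attacker caught (-5) or deterred (0)
         [10.0, 0.0]]      # successful hook (+10)

    # Standard 2x2 indifference conditions: the attacker hooks with probability
    # q that makes the defender indifferent between its rows, and the defender
    # monitors with probability p that makes the attacker indifferent.
    q = (D[1][1] - D[0][1]) / (D[0][0] - D[0][1] - D[1][0] + D[1][1])
    p = (A[1][1] - A[1][0]) / (A[0][0] - A[0][1] - A[1][0] + A[1][1])

    print(f"defender monitors a given API with prob {p:.2f}")
    print(f"attacker hooks with prob {q:.2f}")
    # Interpretation: resources can be cut by monitoring each API only with
    # probability p instead of exhaustively checking every data structure.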

  13. A Scalable Permutation Approach Reveals Replication and Preservation Patterns of Network Modules in Large Datasets.

    Science.gov (United States)

    Ritchie, Scott C; Watts, Stephen; Fearnley, Liam G; Holt, Kathryn E; Abraham, Gad; Inouye, Michael

    2016-07-01

    Network modules (topologically distinct groups of edges and nodes) that are preserved across datasets can reveal common features of organisms, tissues, cell types, and molecules. Many statistics to identify such modules have been developed, but testing their significance requires heuristics. Here, we demonstrate that current methods for assessing module preservation are systematically biased and produce skewed p values. We introduce NetRep, a rapid and computationally efficient method that uses a permutation approach to score module preservation without assuming data are normally distributed. NetRep produces unbiased p values and can distinguish between true and false positives during multiple hypothesis testing. We use NetRep to quantify preservation of gene coexpression modules across murine brain, liver, adipose, and muscle tissues. Complex patterns of multi-tissue preservation were revealed, including a liver-derived housekeeping module that displayed adipose- and muscle-specific association with body weight. Finally, we demonstrate the broader applicability of NetRep by quantifying preservation of bacterial networks in gut microbiota between men and women. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
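
    The core permutation idea can be sketched in a few lines: recompute a module statistic under random node permutations of the test network and take the empirical tail probability as the p value. The statistic below (mean within-module edge weight) is a simple stand-in for NetRep's module-preservation statistics:

    # Sketch: permutation p-value for module preservation (toy statistic:
    # mean within-module edge weight in the test network).
    import numpy as np

    def preservation_stat(W, module):
        sub = W[np.ix_(module, module)]
        return sub[np.triu_indices_from(sub, k=1)].mean()

    def permutation_pvalue(W_test, module, n_perm=10_000, seed=0):
        rng = np.random.default_rng(seed)
        observed = preservation_stat(W_test, module)
        n, k = len(W_test), len(module)
        null = np.array([preservation_stat(W_test, rng.choice(n, k, replace=False))
                         for _ in range(n_perm)])
        # One-sided empirical p-value with the +1 correction to avoid p = 0.
        return (np.sum(null >= observed) + 1) / (n_perm + 1)

    rng = np.random.default_rng(1)
    W = np.abs(rng.normal(0, 0.2, (100, 100))); W = (W + W.T) / 2
    module = np.arange(10)
    W[np.ix_(module, module)] += 0.5             # plant a preserved module
    print("p =", permutation_pvalue(W, module))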

  14. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis and graphic display relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  15. Do Energy Efficiency Standards Improve Quality? Evidence from a Revealed Preference Approach

    Energy Technology Data Exchange (ETDEWEB)

    Houde, Sebastien [Univ. of Maryland, College Park, MD (United States); Spurlock, C. Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    Minimum energy efficiency standards have occupied a central role in U.S. energy policy for more than three decades, but little is known about their welfare effects. In this paper, we employ a revealed preference approach to quantify the impact of past revisions in energy efficiency standards on product quality. The micro-foundation of our approach is a discrete choice model that allows us to compute a price-adjusted index of vertical quality. Focusing on the appliance market, we show that several standard revisions during the period 2001-2011 have led to an increase in quality. We also show that these standards have had a modest effect on prices, and in some cases they even led to decreases in prices. For revision events where overall quality increases and prices decrease, the consumer welfare effect of tightening the standards is unambiguously positive. Finally, we show that after controlling for the effect of improvement in energy efficiency, standards have induced an expansion of quality in the non-energy dimension. We discuss how imperfect competition can rationalize these results.

  16. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    Full Text Available A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of different approaches are outlined, and an attempt is made to systematize the research conducted and to classify scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when studying problems such as Internet addiction and dependent behavior in general. In the author's opinion, the dialectical approach integrates the experience of research conducted within the socio-psychological framework and focuses on the observed inconsistencies in the phenomenon of Internet addiction: the compensatory nature of Internet activity, whereby people who turn to the Internet are in a dysfunctional life situation.

  17. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is...

  18. Cloud Computing - A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location, through networks. Cloud computing is gradually replacing the traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in the organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  19. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Full Text Available Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using perceptions of the students about the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  20. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer matching hardware and software infrastructure while managing the varying degrees of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Integration of cloud computing with parallel computing is also expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery Cloud would be well suited to managing drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  1. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic linkages in type 2 diabetes (T2D) through genome-wide associations, disease similarities, and published empirical evidence. Ten environmental chemicals were found to be potentially linked to T2D; the highest scores were observed for arsenic, 2,3,7,8-tetrachlorodibenzo-p-dioxin, hexachlorobenzene...

  2. The soft computing-based approach to investigate allergic diseases: a systematic review.

    Science.gov (United States)

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, bayesian networks and fuzzy logic to investigate their performances in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period starting from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performances. We observed promising results with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.

  3. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (keff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) able to identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
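
    The correlation step reduces to simple statistics over benchmark calculations; a sketch computing the mean bias and an illustrative upper limit for calculated keff values (the benchmark numbers are fabricated, and formal validations use one-sided tolerance factors rather than the plain factor of 2 used here):

    # Sketch: bias statistics for a criticality code over benchmark experiments.
    # The keff values below are fabricated placeholders.
    import numpy as np

    calc_keff = np.array([0.9982, 1.0015, 0.9991, 1.0008, 0.9975, 0.9998])
    exp_keff = np.ones_like(calc_keff)      # benchmarks are critical (keff = 1)

    bias = calc_keff - exp_keff
    mean_bias = bias.mean()
    s = bias.std(ddof=1)

    # A simple (non-rigorous) limit: mean bias minus 2 sample standard deviations.
    # Formal validations replace the plain 2 with a one-sided tolerance factor.
    upper_safe_keff = 1.0 + mean_bias - 2 * s
    print(f"mean bias        = {mean_bias:+.4f}")
    print(f"std of bias      = {s:.4f}")
    print(f"illustrative USL = {upper_safe_keff:.4f}")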

  4. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  5. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so analyzing it is quite a difficult task for computer forensics. Performing the forensic analysis of documents within a limited period of time therefore requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link, and average link, in accordance...
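
    A minimal sketch of the clustering step the survey covers, using TF-IDF features with K-means (scikit-learn assumed; the agglomerative single/complete/average-link variants mentioned swap in via its AgglomerativeClustering class):

    # Sketch: cluster seized documents by content with TF-IDF + K-means.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    docs = [
        "wire transfer to offshore account",
        "offshore account routing numbers",
        "family vacation photos from the beach",
        "beach trip photo album",
    ]
    X = TfidfVectorizer(stop_words="english").fit_transform(docs)
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    for doc, label in zip(docs, km.labels_):
        print(label, "|", doc)   # similar documents land in the same cluster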

  6. Spontaneous Movements of a Computer Mouse Reveal Egoism and In-group Favoritism.

    Science.gov (United States)

    Maliszewski, Norbert; Wojciechowski, Łukasz; Suszek, Hubert

    2017-01-01

    The purpose of the project was to assess whether the first spontaneous movements of a computer mouse, when making an assessment on a scale presented on the screen, may express a respondent's implicit attitudes. In Study 1, the altruistic behaviors of 66 students were assessed. The students were led to believe that the task they were performing was also being performed by another person and they were asked to distribute earnings between themselves and the partner. The participants performed the tasks under conditions with and without distractors. With the distractors, in the first few seconds spontaneous mouse movements on the scale expressed a selfish distribution of money, while later the movements gravitated toward more altruism. In Study 2, 77 Polish students evaluated a painting by a Polish/Jewish painter on a scale. They evaluated it under conditions of full or distracted cognitive abilities. Spontaneous movements of the mouse on the scale were analyzed. In addition, implicit attitudes toward both Poles and Jews were measured with the Implicit Association Test (IAT). A significant association between implicit attitudes (IAT) and spontaneous evaluation of images using a computer mouse was observed in the group with the distractor. The participants with strong implicit in-group favoritism of Poles revealed stronger preference for the Polish painter's work in the first few seconds of mouse movement. Taken together, these results suggest that spontaneous mouse movements may reveal egoism (in-group favoritism), i.e., processes that were not observed in the participants' final decisions (clicking on the scale).

  7. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
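
    Stripped to its essentials, the sampling step is a random-walk Metropolis loop over damage parameters, with a cheap surrogate in place of the finite element model; a one-parameter sketch with a toy surrogate and Gaussian measurement noise (all numbers illustrative):

    # Sketch: random-walk Metropolis over a single damage parameter.
    # `surrogate` is a toy stand-in for the sparse-grid FE surrogate.
    import numpy as np

    rng = np.random.default_rng(0)
    surrogate = lambda a: 2.0 * a + 0.5 * a**2       # strain response vs. crack size
    true_a, sigma = 1.3, 0.05
    data = surrogate(true_a) + rng.normal(0, sigma, size=20)   # noisy sensors

    def log_post(a):
        if not 0.0 < a < 5.0:                        # uniform prior on (0, 5)
            return -np.inf
        return -0.5 * np.sum((data - surrogate(a)) ** 2) / sigma**2

    samples, a = [], 1.0
    lp = log_post(a)
    for _ in range(20_000):
        a_new = a + rng.normal(0, 0.05)              # random-walk proposal
        lp_new = log_post(a_new)
        if np.log(rng.random()) < lp_new - lp:       # Metropolis accept/reject
            a, lp = a_new, lp_new
        samples.append(a)
    post = np.array(samples[5_000:])                 # drop burn-in
    print(f"crack size estimate: {post.mean():.3f} +/- {post.std():.3f}")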

  8. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms. Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures...

  9. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics) to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
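
    The simplest strategy mentioned, pairwise statistical correlation analysis, looks like this in practice; a sketch using Spearman correlations between taxa and metabolite abundances with Benjamini-Hochberg FDR control (data are synthetic):

    # Sketch: all-pairs Spearman correlation between taxa and metabolites,
    # with Benjamini-Hochberg correction across the pair matrix.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_samples = 40
    taxa = rng.lognormal(size=(n_samples, 5))        # 5 microbial taxa
    metab = rng.lognormal(size=(n_samples, 8))       # 8 metabolites
    metab[:, 0] += 2.0 * taxa[:, 0]                  # plant one true association

    pvals, pairs = [], []
    for i in range(taxa.shape[1]):
        for j in range(metab.shape[1]):
            rho, p = spearmanr(taxa[:, i], metab[:, j])
            pairs.append((i, j, rho)); pvals.append(p)

    # Benjamini-Hochberg: compare sorted p-values against (rank/m) * alpha.
    order = np.argsort(pvals); m, alpha = len(pvals), 0.05
    cutoff = max((k for k in range(m)
                  if pvals[order[k]] <= (k + 1) / m * alpha), default=-1)
    for k in range(cutoff + 1):
        i, j, rho = pairs[order[k]]
        print(f"taxon {i} ~ metabolite {j}: rho = {rho:+.2f}")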

  10. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority of these drive tumor progression. We present the results of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  11. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
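
    The workhorse step is a single OpenCV call; a sketch of the circle Hough transform on a synthetic cutout (the thresholds are illustrative and would need tuning for real survey images):

    # Sketch: detect circular/arc patterns in a candidate image cutout
    # with the circle Hough transform (OpenCV). Parameters are illustrative.
    import cv2
    import numpy as np

    img = np.zeros((128, 128), dtype=np.uint8)
    cv2.circle(img, (64, 64), 30, 255, 2)            # synthetic "Einstein ring"
    blurred = cv2.GaussianBlur(img, (5, 5), 1.5)     # suppress pixel noise

    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=40,
        param1=100,   # Canny upper threshold used internally
        param2=20,    # accumulator threshold: lower = more (possibly false) circles
        minRadius=10, maxRadius=60)

    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"ring candidate at ({x}, {y}), radius {r} px")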

  12. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is also presented. It automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
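
    The iterative solver named here is the standard conjugate gradient method; a compact NumPy sketch for a symmetric positive-definite stiffness system K u = f:

    # Sketch: conjugate gradient iteration for K u = f (K symmetric
    # positive-definite, as in a linear-elastic finite element system).
    import numpy as np

    def conjugate_gradient(K, f, tol=1e-10, max_iter=1000):
        u = np.zeros_like(f)
        r = f - K @ u                 # residual
        p = r.copy()                  # search direction
        rs = r @ r
        for _ in range(max_iter):
            Kp = K @ p
            alpha = rs / (p @ Kp)
            u += alpha * p
            r -= alpha * Kp
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return u

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 50))
    K = A @ A.T + 50 * np.eye(50)     # guarantee positive-definiteness
    f = rng.standard_normal(50)
    u = conjugate_gradient(K, f)
    print("residual norm:", np.linalg.norm(K @ u - f))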

  13. Computer-oriented approach to fault-tree construction

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1976-11-01

    A methodology for systematically constructing fault trees for general complex systems is developed and applied, via the Computer Automated Tree (CAT) program, to several systems. A means of representing component behavior by decision tables is presented. The method developed allows the modeling of components with various combinations of electrical, fluid and mechanical inputs and outputs. Each component can have multiple internal failure mechanisms which combine with the states of the inputs to produce the appropriate output states. The generality of this approach allows not only the modeling of hardware, but human actions and interactions as well. A procedure for constructing and editing fault trees, either manually or by computer, is described. The techniques employed result in a complete fault tree, in standard form, suitable for analysis by current computer codes. Methods of describing the system, defining boundary conditions and specifying complex TOP events are developed in order to set up the initial configuration for which the fault tree is to be constructed. The approach used allows rapid modifications of the decision tables and systems to facilitate the analysis and comparison of various refinements and changes in the system configuration and component modeling.
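
    The decision-table representation is easy to picture as a lookup from input states and internal failure modes to output states; a toy sketch (the component, its states and failure modes are invented for illustration):

    # Sketch: a decision table for a valve component, mapping (input flow,
    # internal failure mode) to output flow. States and modes are invented.
    DECISION_TABLE = {
        # (input_state, internal_mode): output_state
        ("flow",    "ok"):           "flow",
        ("flow",    "stuck_closed"): "no_flow",
        ("flow",    "leak"):         "reduced_flow",
        ("no_flow", "ok"):           "no_flow",
        ("no_flow", "stuck_closed"): "no_flow",
        ("no_flow", "leak"):         "no_flow",
    }

    def output_state(input_state, internal_mode):
        return DECISION_TABLE[(input_state, internal_mode)]

    # Fault-tree construction works backwards: which combinations explain an
    # undesired output (this component's contribution to the TOP event)?
    top_event = "no_flow"
    causes = [key for key, out in DECISION_TABLE.items() if out == top_event]
    print("combinations producing", top_event, ":", causes)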

  14. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    Full Text Available A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.
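
    The flux balance analysis mentioned reduces to a linear program: maximize an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds. A tiny sketch on a made-up three-reaction nitrogen-assimilation chain (scipy assumed):

    # Sketch: flux balance analysis as a linear program on a toy network.
    # Reactions: v0 = N uptake, v1 = assimilation, v2 = biomass export.
    import numpy as np
    from scipy.optimize import linprog

    S = np.array([[1, -1,  0],      # metabolite A: produced by v0, consumed by v1
                  [0,  1, -1]])     # metabolite B: produced by v1, consumed by v2
    bounds = [(0, 10), (0, 8), (0, None)]   # uptake capped at 10, enzyme cap at 8

    # linprog minimizes, so negate the objective to maximize the biomass flux v2.
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal fluxes:", res.x)  # limited by the tightest upstream constraint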

  15. Computational approaches in the design of synthetic receptors - A review.

    Science.gov (United States)

    Cowen, Todd; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as "plastic antibodies" - high affinity robust synthetic receptors which can be optimally designed, and produced for a much reduced cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the published relevant literature since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller-Plesset, etc.) or DFT) and the purpose for which they were used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of MIP polymerisation reaction. The further advances in molecular modelling and computational design of synthetic receptors in particular will have serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Analytical and computational approaches to define the Aspergillus niger secretome

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Adrian; Butler, Gregory D.; Powlowski, Justin; Panisko, Ellen A.; Baker, Scott E.

    2009-03-01

    We used computational and mass spectrometric approaches to characterize the Aspergillus niger secretome. The 11,200 gene models predicted in the genome of A. niger strain ATCC 1015 were the data source for the analysis. Depending on the computational methods used, 691 to 881 proteins were predicted to be secreted proteins. We cultured A. niger in six different media and analyzed the extracellular proteins produced using mass spectrometry. A total of 222 proteins were identified, with 39 proteins expressed under all six conditions and 74 proteins expressed under only one condition. The secreted proteins identified by mass spectrometry were used to guide the correction of about 20 gene models. Additional analysis focused on extracellular enzymes of interest for biomass processing. Of the 63 glycoside hydrolases predicted to be capable of hydrolyzing cellulose, hemicellulose or pectin, 94% of the exo-acting enzymes and only 18% of the endo-acting enzymes were experimentally detected.

  17. Fast reactor safety and computational thermo-fluid dynamics approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Shimizu, Takeshi

    1993-01-01

    This article provides a brief description of the safety principles on which liquid metal cooled fast breeder reactors (LMFBRs) are based and the roles of computation in safety practices. A number of thermohydraulics models have been developed to date that successfully describe several of the important types of fluid and material motion encountered in the analysis of postulated accidents in LMFBRs. Most of these models use a mixture of implicit and explicit numerical solution techniques to solve a set of conservation equations formulated in Eulerian coordinates, with special techniques included for specific situations. Typical computational thermo-fluid dynamics approaches are discussed, in particular for analyses of the physical phenomena relevant to fuel subassembly thermohydraulics design and those involving the motion of molten materials in the core over a large scale. (orig.)

  18. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
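
    At the "most fundamental level" referred to here, each exposure pathway is a product of a few factors; a spreadsheet-equivalent sketch for the drinking-water ingestion pathway at unit concentration (the coefficients are placeholders, not values from the cited codes):

    # Sketch: hand calculation of an ingestion dose at unit concentration,
    # the kind of check done here with spreadsheets. Numbers are placeholders.
    conc_water = 1.0          # Bq/L, unit concentration by construction
    intake_rate = 2.0         # L/day of drinking water
    exposure_days = 365       # days/year
    dcf_ingestion = 2.8e-8    # Sv/Bq, placeholder dose conversion factor

    annual_intake = conc_water * intake_rate * exposure_days      # Bq/year
    dose = annual_intake * dcf_ingestion                          # Sv/year
    print(f"annual ingestion dose: {dose:.3e} Sv "
          f"({dose * 1e3:.3f} mSv) per unit Bq/L")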

  19. Revealing −1 Programmed Ribosomal Frameshifting Mechanisms by Single-Molecule Techniques and Computational Methods

    Directory of Open Access Journals (Sweden)

    Kai-Chun Chang

    2012-01-01

    Full Text Available Programmed ribosomal frameshifting (PRF) serves as an intrinsic translational regulation mechanism employed by some viruses to control the ratio between structural and enzymatic proteins. Most viral mRNAs which use PRF adopt an H-type pseudoknot to stimulate −1 PRF. The relationship between the thermodynamic stability and the frameshifting efficiency of pseudoknots has not been fully understood. Recently, single-molecule force spectroscopy has revealed that the frequency of −1 PRF correlates with the unwinding forces required for disrupting pseudoknots, and that some of the unwinding work dissipates irreversibly due to the torsional restraint of pseudoknots. Complementary to single-molecule techniques, computational modeling provides insights into global motions of the ribosome, whose structural transitions during frameshifting have not yet been elucidated in atomic detail. Taken together, recent advances in biophysical tools may help to develop antiviral therapies that target the ubiquitous −1 PRF mechanism among viruses.

  1. Approaching multiphase flows from the perspective of computational fluid dynamics

    International Nuclear Information System (INIS)

    Banas, A.O.

    1992-01-01

    Thermalhydraulic simulation methodologies based on subchannel and porous-medium concepts are briefly reviewed and contrasted with the general approach of Computational Fluid Dynamics (CFD). An outline of the advanced CFD methods for single-phase turbulent flows is followed by a short discussion of the unified formulation of averaged equations for turbulent and multiphase flows. Some of the recent applications of CFD at Chalk River Laboratories are discussed, and the complementary role of CFD with regard to the established thermalhydraulic methods of analysis is indicated. (author). 8 refs

  2. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    Science.gov (United States)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  3. Computational redesign reveals allosteric mutation hotspots of organophosphate hydrolase that enhance organophosphate hydrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Reed B. [Univ. of North Carolina, Chapel Hill, NC (United States); Ding, Feng [Clemson Univ., SC (United States); Ye, Dongmei [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ackerman, Eric [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dokholyan, Nikolay V. [Univ. of North Carolina, Chapel Hill, NC (United States)

    2015-04-01

    Organophosphates are widely used for peaceful (agriculture) and military purposes (chemical warfare agents). The extraordinary toxicity of organophosphates and the risk of their deployment make it critical to develop means for their rapid and efficient deactivation. Organophosphate hydrolase (OPH) already plays an important role in organophosphate remediation, but is insufficient for therapeutic or prophylactic purposes, primarily due to low substrate affinity. Current efforts focus on directly modifying the active site to differentiate substrate specificity and increase catalytic activity. Here, we present a novel strategy for enhancing the general catalytic efficiency of OPH through computational redesign of the residues that are allosterically coupled to the active site, and validate our design by mutagenesis. Specifically, we identify five such hot-spot residues for allosteric regulation and assay these mutants for hydrolysis activity against paraoxon, a chemical-weapons simulant. A high percentage of the predicted mutants exhibit enhanced activity over wild-type (kcat = 16.63 s⁻¹), such as T199I/T54I (899.5 s⁻¹) and C227V/T199I/T54I (848 s⁻¹), while the Km remains relatively unchanged in our high-throughput cell-free expression system. Further computational studies of protein dynamics reveal four distinct distal regions coupled to the active site that display significant changes in conformational dynamics upon these identified mutations. These results validate a computational design method that is both efficient and easily adapted as a general procedure for enzymatic enhancement.

  4. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    International Nuclear Information System (INIS)

    Khan, Junaid Ali; Raja, Muhammad Asif Zahoor; Qureshi, Ijaz Mansoor

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike other numerical techniques of comparable accuracy. With the advent of neuroprocessors and digital signal processors, the method is particularly attractive because of the expected gains in execution speed. (general)
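
    The core idea, a network output trained to satisfy an ODE residual via evolutionary search rather than backpropagation, can be sketched briefly. The following is a minimal illustration, not the authors' exact GA/pattern-search hybrid: a one-hidden-layer network is evolved to satisfy y' = -2xy with y(0) = 1, whose exact solution is exp(-x²).

        import numpy as np

        # Minimal sketch: y_hat(x) = sum_i v_i * tanh(w_i * x + b_i) is fitted so
        # that the unsupervised residual of y' + 2*x*y = 0 with y(0) = 1 is small.
        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 20)   # collocation points
        H = 6                           # hidden units; parameters pack (w, b, v)

        def net(p, x):
            w, b, v = p[:H], p[H:2*H], p[2*H:]
            return np.tanh(np.outer(x, w) + b) @ v

        def net_dx(p, x):
            w, b, v = p[:H], p[H:2*H], p[2*H:]
            s = np.tanh(np.outer(x, w) + b)
            return (1.0 - s**2) @ (w * v)

        def fitness(p):
            residual = net_dx(p, x) + 2.0 * x * net(p, x)   # ODE: y' = -2xy
            boundary = net(p, np.array([0.0]))[0] - 1.0     # y(0) = 1
            return np.mean(residual**2) + boundary**2

        # Simple (mu + lambda) evolution with Gaussian mutation as global search.
        pop = rng.normal(size=(40, 3 * H))
        for _ in range(300):
            scores = np.array([fitness(p) for p in pop])
            elite = pop[np.argsort(scores)[:10]]
            pop = np.vstack([elite, elite.repeat(3, axis=0)
                             + rng.normal(scale=0.1, size=(30, 3 * H))])

        best = min(pop, key=fitness)
        print("max |error| vs exp(-x^2):", np.abs(net(best, x) - np.exp(-x**2)).max())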

  5. Microarray-based cancer prediction using soft computing approach.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, for they are based on decision rules. Our results demonstrate that very simple models may perform well on cancer molecular prediction and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
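
    As a hedged outline of this recipe, the sketch below screens a discriminative "gene" pair with a plain univariate F-test and derives human-readable decision rules from a shallow tree; it substitutes these standard tools for the paper's rough-set machinery and runs on synthetic data.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Synthetic stand-in for an expression matrix: 100 samples x 2000 genes.
        X, y = make_classification(n_samples=100, n_features=2000,
                                   n_informative=10, random_state=0)

        # Screen a highly discriminative gene pair by a univariate F-test.
        # (For honest error estimates, selection should be nested inside the CV.)
        X_pair = SelectKBest(f_classif, k=2).fit_transform(X, y)

        # A depth-2 tree over the selected pair yields simple decision rules.
        tree = DecisionTreeClassifier(max_depth=2, random_state=0)
        print("CV accuracy:", cross_val_score(tree, X_pair, y, cv=5).mean())
        print(export_text(tree.fit(X_pair, y)))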

  6. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  7. Error characterization for asynchronous computations: Proxy equation approach

    Science.gov (United States)

    Sallai, Gabriella; Mittal, Ankita; Girimaji, Sharath

    2017-11-01

    Numerical techniques for asynchronous fluid flow simulations are currently under development to enable efficient utilization of massively parallel computers. These numerical approaches attempt to accurately solve the time evolution of transport equations using spatial information at different time levels. The truncation error of asynchronous methods can be divided into two parts: delay-dependent (EA), or asynchronous, error and delay-independent (ES), or synchronous, error. The focus of this study is a specific asynchronous error mitigation technique called the proxy-equation approach. The aim of this study is to examine these errors as a function of the characteristic wavelength of the solution. Mitigation of asynchronous effects requires that the asynchronous error be smaller than the synchronous truncation error. For a simple convection-diffusion equation, proxy-equation error analysis identifies a critical initial wavenumber, λc. At smaller wavenumbers, synchronous errors are larger than asynchronous errors. We examine various approaches to increase the value of λc in order to improve the range of applicability of the proxy-equation approach.
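
    The wavenumber dependence of these errors can be reproduced in a toy setting. The sketch below is illustrative only (it is not the proxy-equation analysis itself): a 1D periodic convection-diffusion equation is advanced explicitly while one grid point reads a right-neighbor value that is one time step stale, mimicking a delayed processor boundary, and the resulting error is compared with the synchronous one for several initial wavenumbers k.

        import numpy as np

        c, nu, N, dt, steps = 1.0, 0.01, 128, 1e-4, 2000
        x = np.linspace(0, 2 * np.pi, N, endpoint=False)
        dx = x[1] - x[0]

        def run(k, asynchronous):
            u = np.sin(k * x)
            u_prev = u.copy()
            for _ in range(steps):
                right, left = np.roll(u, -1), np.roll(u, 1)
                if asynchronous:          # one "processor boundary" lags a step
                    right[N // 2] = np.roll(u_prev, -1)[N // 2]
                u_prev = u
                u = u + dt * (-c * (right - left) / (2 * dx)
                              + nu * (right - 2 * u + left) / dx**2)
            exact = np.sin(k * (x - c * dt * steps)) * np.exp(-nu * k**2 * dt * steps)
            return np.abs(u - exact).max()

        for k in (2, 8, 16):
            print(f"k={k}: sync_err={run(k, False):.2e}  async_err={run(k, True):.2e}")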

  8. Optimization of rootkit revealing system resources – A game theoretic approach

    OpenAIRE

    Muthumanickam, K.; Ilavarasan, E.

    2015-01-01

    A malicious rootkit is a collection of programs designed with the intent of infecting and monitoring the victim computer without the user's permission. After the victim has been compromised, the remote attacker can easily cause further damage. In order to infect, compromise and monitor, rootkits adopt the Native Application Programming Interface (API) hooking technique. To reveal the hidden rootkits, current rootkit detection techniques check different data structures which hold reference to Native...

  9. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has been mostly considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on Principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain (not very restrictive) conditions, the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
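
    A toy version of the reinforcement dynamics can be simulated directly; the sketch below is a deliberate simplification of the master-worker model, and all parameter values are invented. Workers choose between costly honest computation and free defection, the master audits a fraction of rounds and punishes detected defectors, and each worker nudges its propensity toward whichever action paid off.

        import numpy as np

        rng = np.random.default_rng(1)
        W, rounds = 50, 3000
        audit_p, reward, punish, cost = 0.4, 1.0, 2.0, 0.3   # audit rate above break-even
        prop = np.full(W, 0.5)      # each worker's propensity to compute honestly

        for _ in range(rounds):
            honest = rng.random(W) < prop
            payoff = np.where(honest, reward - cost, 0.0)    # honest work always earns
            if rng.random() < audit_p:
                payoff += np.where(honest, 0.0, -punish)     # audited defectors punished
            else:
                payoff += np.where(honest, 0.0, reward)      # unaudited defectors profit
            # Reinforcement: push the propensity toward the action that paid off.
            prop = np.clip(prop + 0.02 * payoff * np.where(honest, 1.0, -1.0), 0.01, 0.99)

        print("mean propensity to compute honestly:", round(prop.mean(), 3))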

  10. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institute, Moscow (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  11. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can likewise be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
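
    For a candidate strategy evaluated by simulation, the risk measures involved are simple to estimate from the sampled costs. A minimal sketch follows, with a lognormal distribution standing in for the Monte Carlo execution-cost samples.

        import numpy as np

        rng = np.random.default_rng(0)
        costs = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)  # stand-in samples

        alpha = 0.95
        var = np.quantile(costs, alpha)          # Value-at-Risk at level alpha
        cvar = costs[costs >= var].mean()        # CVaR: mean cost in the worst tail
        objective = 0.5 * costs.mean() + 0.5 * cvar   # example risk-weighted objective
        print(f"E[cost]={costs.mean():.3f}  VaR={var:.3f}  CVaR={cvar:.3f}  obj={objective:.3f}")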

  12. Radiological management of blunt polytrauma with computed tomography and angiography: an integrated approach

    Energy Technology Data Exchange (ETDEWEB)

    Kurdziel, J.C.; Dondelinger, R.F.; Hemmer, M.

    1987-01-01

    107 polytraumatized patients who had experienced blunt trauma were worked up at admission with computed tomography of the thorax, abdomen and pelvis following a computed tomography study of the brain: significant lesions were revealed in 98 (90%) patients. 79 (74%) patients showed trauma to the thorax, and abdominal or pelvic trauma was evident in 69 (64%) patients. No false positive diagnosis was established. 5 traumatic findings were missed. Emergency angiography was indicated in 3 (3%) patients following the computed tomography examination. 3 other trauma patients were submitted directly to angiography without computed tomography examination during the time period in which this study was completed. Embolization was carried out in 5/6 patients. No thoracotomy was needed. 13 (12%) patients underwent laparotomy following computed tomography. Overall mortality during the hospital stay was 14% (15/107). No patient died from visceral bleeding. Conservative management of blunt polytrauma patients can be advocated in almost 90% of visceral lesions. Computed tomography coupled with angiography and embolization represents an adequate integrated approach to the management of blunt polytrauma patients.

  14. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  15. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is well underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight into a patient's medical condition. The paper details different interactive features of the tool which offer the potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into a better understanding of different medical conditions. This new knowledge often contributes towards improved diagnosis and treatment solutions for patients. But the healthcare industry lags behind in reaping the immediate benefits of this new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, there has been a drive promoting a transition towards the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or offered interactive visualization with 2D/3D images generated from an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called the 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based medicine and make judicious decisions.

  16. Solvent effect on indocyanine dyes: A computational approach

    International Nuclear Information System (INIS)

    Bertolino, Chiara A.; Ferrari, Anna M.; Barolo, Claudia; Viscardi, Guido; Caputo, Giuseppe; Coluccia, Salvatore

    2006-01-01

    The solvatochromic behaviour of a series of indocyanine dyes (Dyes I-VIII) was investigated by quantum chemical calculations. The effect of the polymethine chain length and of the indolenine structure has been satisfactorily reproduced by semiempirical Pariser-Parr-Pople (PPP) calculations. The solvatochromism of 3,3,3',3'-tetramethyl-N,N'-diethylindocarbocyanine iodide (Dye I) has been investigated in depth within the ab initio time-dependent density functional theory (TD-DFT) approach. Dye I undergoes non-polar solvation, and a linear correlation has been identified between absorption shifts and refractive index. Computed absorption λmax and oscillator strengths obtained by TD-DFT are in good agreement with the experimental data.

  17. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain on diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. Those approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  18. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated with prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set by using a high confidence human..., and procymidone exerted their effects mainly via interference with steroidogenesis and nuclear receptors. Prochloraz was associated with a large number of human diseases, and together with tebuconazole showed several significant associations to Testicular Dysgenesis Syndrome. Mancozeb showed a differential mode...

  19. Vehicular traffic noise prediction using soft computing approach.

    Science.gov (United States)

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the Patiala city in India. The input variables include the traffic volume per hour, percentage of heavy vehicles and average speed of vehicles. The performance of the four models is compared on the basis of performance criteria of coefficient of determination, mean square error and accuracy. 10-fold cross validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
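
    The modeling recipe (tabular traffic features, a Random Forest regressor, 10-fold cross-validation) is easy to reproduce in outline. The sketch below uses synthetic stand-ins for the measured inputs and for Leq, so the numbers are illustrative only.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([
            rng.uniform(100, 3000, n),   # traffic volume (vehicles/hour)
            rng.uniform(0, 40, n),       # percentage of heavy vehicles
            rng.uniform(20, 70, n),      # average speed (km/h)
        ])
        # Toy Leq: noise rises with log-volume, heavy-vehicle share, and speed.
        leq = (40 + 8 * np.log10(X[:, 0]) + 0.15 * X[:, 1] + 0.1 * X[:, 2]
               + rng.normal(0, 1.5, n))

        model = RandomForestRegressor(n_estimators=200, random_state=0)
        scores = cross_val_score(model, X, leq, cv=10, scoring="r2")
        print(f"10-fold CV R^2: {scores.mean():.3f} +/- {scores.std():.3f}")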

  20. An Organic Computing Approach to Self-organising Robot Ensembles

    Directory of Open Access Journals (Sweden)

    Sebastian Albrecht von Mammen

    2016-11-01

    Full Text Available Similar to the Autonomic Computing initiative, which has mainly been advancing techniques for self-optimisation focussing on computing systems and infrastructures, Organic Computing (OC) has been driving the development of system design concepts and algorithms for self-adaptive systems at large. Examples of application domains include, for instance, traffic management and control, cloud services, communication protocols, and robotic systems. Such an OC system typically consists of a potentially large set of autonomous and self-managed entities, where each entity acts with a local decision horizon. By means of cooperation of the individual entities, the behaviour of the entire ensemble system is derived. In this article, we present our work on how autonomous, adaptive robot ensembles can benefit from OC technology. Our elaborations are aligned with the different layers of an observer/controller framework which provides the foundation for the individuals' adaptivity at system design-level. Relying on an extended Learning Classifier System (XCS) in combination with adequate simulation techniques, this basic system design empowers robot individuals to improve their individual and collaborative performances, e.g. by means of adapting to changing goals and conditions. Not only for the sake of generalisability, but also because of its enormous transformative potential, we stage our research in the domain of robot ensembles that are typically comprised of several quad-rotors and that organise themselves to fulfil spatial tasks such as maintenance of building facades or the collaborative search for mobile targets. Our elaborations detail the architectural concept, provide examples of individual self-optimisation as well as of the optimisation of collaborative efforts, and we show how the user can control the ensembles at multiple levels of abstraction. We conclude with a summary of our approach and an outlook on possible future steps.

  1. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

  2. Computational integration of homolog and pathway gene module expression reveals general stemness signatures.

    Directory of Open Access Journals (Sweden)

    Martina Koeva

    Full Text Available The stemness hypothesis states that all stem cells use common mechanisms to regulate self-renewal and multi-lineage potential. However, gene expression meta-analyses at the single gene level have failed to identify a significant number of genes selectively expressed by a broad range of stem cell types. We hypothesized that stemness may be regulated by modules of homologs. While the expression of any single gene within a module may vary from one stem cell type to the next, it is possible that the expression of the module as a whole is required so that the expression of different, yet functionally-synonymous, homologs is needed in different stem cells. Thus, we developed a computational method to test for stem cell-specific gene expression patterns from a comprehensive collection of 49 murine datasets covering 12 different stem cell types. We identified 40 individual genes and 224 stemness modules with reproducible and specific up-regulation across multiple stem cell types. The stemness modules included families regulating chromatin remodeling, DNA repair, and Wnt signaling. Strikingly, the majority of modules represent evolutionarily related homologs. Moreover, a score based on the discovered modules could accurately distinguish stem cell-like populations from other cell types in both normal and cancer tissues. This scoring system revealed that both mouse and human metastatic populations exhibit higher stemness indices than non-metastatic populations, providing further evidence for a stem cell-driven component underlying the transformation to metastatic disease.

  4. A computational approach to climate science education with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for i/o and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
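
    A minimal usage sketch, based on CLIMLAB's documented quickstart, is below; exact attribute names may differ between versions of the package.

        import climlab

        ebm = climlab.EBM()        # 1D diffusive energy-balance model, preset parameters
        print(ebm)                 # summary of subprocesses (albedo, insolation, ...)
        ebm.integrate_years(5.0)   # step the model forward to near equilibrium
        print(ebm.state['Ts'])     # zonal-mean surface temperature after 5 years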

  5. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  6. A computational approach to finding novel targets for existing drugs.

    Directory of Open Access Journals (Sweden)

    Yvonne Y Li

    2011-09-01

    Full Text Available Repositioning existing drugs for new therapeutic uses is an efficient approach to drug discovery. We have developed a computational drug repositioning pipeline to perform large-scale molecular docking of small molecule drugs against protein drug targets, in order to map the drug-target interaction space and find novel interactions. Our method emphasizes removing false positive interaction predictions using criteria from known interaction docking, consensus scoring, and specificity. In all, our database contains 252 human protein drug targets that we classify as reliable-for-docking as well as 4621 approved and experimental small molecule drugs from DrugBank. These were cross-docked, then filtered through stringent scoring criteria to select top drug-target interactions. In particular, we used MAPK14 and the kinase inhibitor BIM-8 as examples where our stringent thresholds enriched the predicted drug-target interactions with known interactions up to 20 times compared to standard score thresholds. We validated nilotinib as a potent MAPK14 inhibitor in vitro (IC50 40 nM), suggesting a potential use for this drug in treating inflammatory diseases. The published literature indicated experimental evidence for 31 of the top predicted interactions, highlighting the promising nature of our approach. Novel interactions discovered may lead to the drug being repositioned as a therapeutic treatment for its off-target's associated disease, and may provide added insight into the drug's mechanism of action and side effects.

  7. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Full Text Available Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next generation anti-HIV drugs.

  8. Computed tomography of the lung. A pattern approach. 2. ed.

    International Nuclear Information System (INIS)

    Verschakelen, Johny A.; Wever, Walter de

    2018-01-01

    Computed Tomography of the Lung: A Pattern Approach aims to enable the reader to recognize and understand the CT signs of lung diseases and diseases with pulmonary involvement as a sound basis for diagnosis. After an introductory chapter, basic anatomy and its relevance to the interpretation of CT appearances is discussed. Advice is then provided on how to approach a CT scan of the lungs, and the different distribution and appearance patterns of disease are described. Subsequent chapters focus on the nature of these patterns, identify which diseases give rise to them, and explain how to differentiate between the diseases. The concluding chapter presents a large number of typical and less typical cases that will help the reader to practice application of the knowledge gained from the earlier chapters. Since the first edition, the book has been adapted and updated, with the inclusion of many new figures and case studies. It will be an invaluable asset both for radiologists and pulmonologists in training and for more experienced specialists wishing to update their knowledge.

  9. Optical computing - an alternate approach to trigger processing

    International Nuclear Information System (INIS)

    Cleland, W.E.

    1981-01-01

    The enormous rate reduction factors required by most ISABELLE experiments suggest that we should examine every conceivable approach to trigger processing. One approach that has not received much attention by high energy physicists is optical data processing. The past few years have seen rapid advances in optoelectronic technology, stimulated mainly by the military and the communications industry. An intriguing question is whether one can utilize this technology together with the optical computing techniques that have been developed over the past two decades to develop a rapid trigger processor for high energy physics experiments. Optical data processing is a method for performing a few very specialized operations on data which is inherently two dimensional. Typical operations are the formation of convolution or correlation integrals between the input data and information stored in the processor in the form of an optical filter. Optical processors are classed as coherent or incoherent, according to the spatial coherence of the input wavefront. Typically, in a coherent processor a laser beam is modulated with a photographic transparency which represents the input data. In an incoherent processor, the input may be an incoherently illuminated transparency, but self-luminous objects, such as an oscilloscope trace, have also been used. We consider here an incoherent processor in which the input data is converted into an optical wavefront through the excitation of an array of point sources - either light emitting diodes or injection lasers

  10. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, or the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  11. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships within an ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  12. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Hemodynamically significant lesions were identified by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation with the invasive assessment of FFR was 0.729 (P < 0.001). Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
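
    The surrogate-modeling idea can be outlined in a few lines: learn a fast regressor that maps lesion features to the FFR value an expensive model would produce. In the sketch below the "physics" is replaced by a synthetic formula and the features are invented, so only the workflow is meaningful.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        stenosis = rng.uniform(0.0, 0.9, n)    # fractional diameter reduction
        length = rng.uniform(5.0, 30.0, n)     # lesion length (mm)
        ffr = 1.0 - 0.6 * stenosis**2 - 0.004 * length + rng.normal(0, 0.01, n)

        X = np.column_stack([stenosis, length])
        X_tr, X_te, y_tr, y_te = train_test_split(X, ffr, random_state=0)
        model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
        pred = model.predict(X_te)

        print("corr vs physics-based target:", np.corrcoef(pred, y_te)[0, 1])
        sig = y_te <= 0.8                      # hemodynamically significant lesions
        print("accuracy at FFR<=0.8 cutoff:", np.mean((pred <= 0.8) == sig))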

  13. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

    Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of information conservation. Such machine models, which are known as reversible models of computation, have been examined both from a theoretical perspective and from an engineering perspective. While a bundle of many isolated successful findings and applications concerning reversible computing exists, there is still no uniform and consistent theory that is general in the sense of giving a model-independent account of the field.

  14. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupled with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both the system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
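
    The elementary computation underneath such a model, the conditional probability of a component failure given an observed system failure, can be shown by exact enumeration on a toy network; the structure and probabilities below are invented for illustration.

        from itertools import product

        # Toy system: it fails if component A fails, or if B and C both fail.
        p_fail = {"A": 0.02, "B": 0.10, "C": 0.15}

        def system_failed(state):   # state maps component -> True if failed
            return state["A"] or (state["B"] and state["C"])

        post = {c: 0.0 for c in p_fail}
        evidence = 0.0
        for bits in product([True, False], repeat=3):
            state = dict(zip(p_fail, bits))
            prob = 1.0
            for c, failed in state.items():
                prob *= p_fail[c] if failed else 1.0 - p_fail[c]
            if system_failed(state):
                evidence += prob
                for c, failed in state.items():
                    if failed:
                        post[c] += prob

        print({c: round(post[c] / evidence, 3) for c in post})  # P(failed | system failed)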

  15. Computational modeling reveals dendritic origins of GABA(A-mediated excitation in CA1 pyramidal neurons.

    Directory of Open Access Journals (Sweden)

    Naomi Lewin

    Full Text Available GABA is the key inhibitory neurotransmitter in the adult central nervous system, but in some circumstances can lead to a paradoxical excitation that has been causally implicated in diverse pathologies from endocrine stress responses to diseases of excitability including neuropathic pain and temporal lobe epilepsy. We undertook a computational modeling approach to determine plausible ionic mechanisms of GABA(A)-dependent excitation in isolated post-synaptic CA1 hippocampal neurons because it may constitute a trigger for pathological synchronous epileptiform discharge. In particular, the interplay between intracellular chloride accumulation via the GABA(A) receptor and extracellular potassium accumulation via the K/Cl co-transporter KCC2 in promoting GABA(A)-mediated excitation is complex. Experimentally it is difficult to determine the ionic mechanisms of depolarizing current since potassium transients are challenging to isolate pharmacologically and much GABA signaling occurs in small, difficult to measure, dendritic compartments. To address this problem and determine plausible ionic mechanisms of GABA(A)-mediated excitation, we built a detailed biophysically realistic model of the CA1 pyramidal neuron that includes processes critical for ion homeostasis. Our results suggest that in dendritic compartments, but not in the somatic compartments, chloride buildup is sufficient to cause dramatic depolarization of the GABA(A) reversal potential and dominating bicarbonate currents that provide a substantial current source to drive whole-cell depolarization. The model simulations predict that extracellular K(+) transients can augment GABA(A)-mediated excitation, but not cause it. Our model also suggests the potential for GABA(A)-mediated excitation to promote network synchrony depending on interneuron synapse location - excitatory positive-feedback can occur when interneurons synapse onto distal dendritic compartments, while interneurons projecting to the perisomatic

  16. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    In this paper, a new computationally efficient image registration method is presented. ... The proposed method requires less computational time as compared to traditional methods.

  17. Computational approach for a pair of bubble coalescence process

    International Nuclear Information System (INIS)

    Nurul Hasan; Zalinawati binti Zakaria

    2011-01-01

    The coalescence of bubbles has great value in the mineral recovery and oil industries. In this paper, two co-axial bubbles rising in a cylinder are modelled to study the coalescence of bubbles for four computational experimental test cases. The Reynolds number (Re) is chosen between 8.50 and 10, the Bond number Bo ∼ 4.25-50, and the Morton number M ∼ 0.0125-14.7. The viscosity ratio (μ_r) and density ratio (ρ_r) of liquid to bubble are kept constant (100 and 850, respectively). It was found that the Bo number has a significant effect on the coalescence process for constant Re, μ_r and ρ_r. The bubble-bubble distance over time was validated against published experimental data. The results show that the VOF approach can be used to model these phenomena accurately. The surface tension was changed to alter Bo, and the density of the fluids to alter Re and M, keeping μ_r and ρ_r the same. It was found that for lower Bo, bubble coalescence is slower and the pocket at the lower part of the leading bubble is less concave (towards downward), which is supported by the experimental data.
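
    The dimensionless groups that parameterize these cases follow directly from the fluid properties. The sketch below computes them for one illustrative set of values (not taken from the paper) chosen to land inside the quoted ranges.

        g = 9.81                  # gravitational acceleration (m/s^2)
        rho_l = 1000.0            # liquid density (kg/m^3)
        rho_b = rho_l / 850.0     # bubble density from the ratio rho_r = 850
        mu_l = 0.12               # liquid viscosity (Pa*s), illustrative
        sigma = 0.05              # surface tension (N/m), illustrative
        d, U = 0.01, 0.11         # bubble diameter (m) and rise velocity (m/s)

        Re = rho_l * U * d / mu_l                                   # Reynolds number
        Bo = (rho_l - rho_b) * g * d**2 / sigma                     # Bond number
        Mo = g * mu_l**4 * (rho_l - rho_b) / (rho_l**2 * sigma**3)  # Morton number
        print(f"Re={Re:.2f}  Bo={Bo:.2f}  Mo={Mo:.4f}")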

  18. An Integrated Soft Computing Approach to Hughes Syndrome Risk Assessment.

    Science.gov (United States)

    Vilhena, João; Rosário Martins, M; Vicente, Henrique; Grañeda, José M; Caldeira, Filomena; Gusmão, Rodrigo; Neves, João; Neves, José

    2017-03-01

    The AntiPhospholipid Syndrome (APS) is an acquired autoimmune disorder induced by high levels of antiphospholipid antibodies that cause arterial and venous thrombosis, as well as pregnancy-related complications and morbidity, as clinical manifestations. This autoimmune hypercoagulable state, usually known as Hughes syndrome, has severe consequences for patients, being one of the main causes of thrombotic disorders and death. It is therefore important to be preventive and to be aware of how probable it is for a given patient to develop the syndrome. Despite the updating of the antiphospholipid syndrome classification, the diagnosis remains difficult to establish. Additional research on clinically relevant antibodies and standardization of their quantification are required in order to improve antiphospholipid syndrome risk assessment. Thus, this work focuses on the development of a diagnosis decision support system in terms of a formal agenda built on a Logic Programming approach to knowledge representation and reasoning, complemented with a computational framework based on Artificial Neural Networks. The proposed model improves the diagnosis, properly classifying the patients that really present this pathology (sensitivity higher than 85%), as well as classifying the absence of APS (specificity close to 95%).
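
    As an outline of the neural-network stage, the sketch below trains a small classifier on synthetic stand-in features and reports the same sensitivity and specificity figures; the data and architecture are invented for illustration.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.metrics import confusion_matrix
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        # Synthetic stand-in for clinical/antibody features (class 1 = APS).
        X, y = make_classification(n_samples=1000, n_features=12,
                                   weights=[0.7], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0).fit(X_tr, y_tr)
        tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
        print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")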

  19. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for the experimental efforts stabilizing new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework, and the data will hence be publicly available.
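
    The geometric part of such a screen is compact: compute the in-plane misfit strains for each candidate substrate and rank by an elastic strain-energy density. The sketch below uses an isotropic plane-stress estimate, u = E/(2(1 - nu^2)) * (ea^2 + eb^2 + 2*nu*ea*eb), with made-up lattice constants and moduli rather than the paper's VO2/TiO2 values.

        # Hypothetical film and substrate in-plane lattice constants (meters).
        E_mod, poisson = 150e9, 0.3        # film Young's modulus (Pa), Poisson ratio
        film_a, film_b = 4.59e-10, 5.44e-10
        substrates = {"sub1": (4.65e-10, 5.41e-10), "sub2": (4.70e-10, 5.60e-10)}

        for name, (sa, sb) in substrates.items():
            ea = (sa - film_a) / film_a    # misfit strain along a
            eb = (sb - film_b) / film_b    # misfit strain along b
            u = E_mod / (2 * (1 - poisson**2)) * (ea**2 + eb**2 + 2 * poisson * ea * eb)
            print(f"{name}: misfit=({ea:+.2%}, {eb:+.2%})  u~{u / 1e6:.1f} MJ/m^3")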

  20. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  1. Teaching Pervasive Computing to CS Freshmen: A Multidisciplinary Approach

    NARCIS (Netherlands)

    Silvis-Cividjian, Natalia

    2015-01-01

    Pervasive Computing is a growing area in research and commercial reality. Despite this extensive growth, there is no clear consensus on how and when to teach it to students. We report on an innovative attempt to teach this subject to first year Computer Science students. Our course combines computer

  2. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoyable game.

  3. Elucidating Ligand-Modulated Conformational Landscape of GPCRs Using Cloud-Computing Approaches.

    Science.gov (United States)

    Shukla, Diwakar; Lawrenz, Morgan; Pande, Vijay S

    2015-01-01

    G-protein-coupled receptors (GPCRs) are a versatile family of membrane-bound signaling proteins. Despite the recent successes in obtaining crystal structures of GPCRs, much needs to be learned about the conformational changes associated with their activation. Furthermore, the mechanism by which ligands modulate the activation of GPCRs has remained elusive. Molecular simulations provide a way of obtaining a detailed atomistic description of GPCR activation dynamics. However, simulating GPCR activation is challenging due to the long timescales involved and the associated challenge of gaining insights from the "Big" simulation datasets. Here, we demonstrate how cloud-computing approaches have been used to tackle these challenges and obtain insights into the activation mechanism of GPCRs. In particular, we review the use of Markov state model (MSM)-based sampling algorithms for sampling milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2-AR. MSMs of agonist- and inverse-agonist-bound β2-AR reveal multiple activation pathways and show how ligands function via modulation of the ensemble of activation pathways. We target this ensemble of conformations with computer-aided drug design approaches, with the goal of designing drugs that interact more closely with diverse receptor states, for overall increased efficacy and specificity. We conclude by discussing how cloud-based approaches present a powerful and broadly available tool for routinely studying complex biological systems. © 2015 Elsevier Inc. All rights reserved.
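
    As a schematic of the MSM machinery this record refers to (not the authors' implementation), the sketch below estimates a row-stochastic transition matrix from discretized trajectories at a chosen lag time and recovers the stationary distribution; the trajectories and state count are toy values.

    ```python
    import numpy as np

    def msm_transition_matrix(dtrajs, n_states, lag):
        """Row-stochastic MSM transition matrix at a given lag time, estimated
        from discretized trajectories (lists of microstate indices)."""
        counts = np.zeros((n_states, n_states))
        for traj in dtrajs:
            for i, j in zip(traj[:-lag], traj[lag:]):
                counts[i, j] += 1
        counts += 1e-8                      # guard against empty rows
        return counts / counts.sum(axis=1, keepdims=True)

    # Toy example: two short trajectories over three microstates
    T = msm_transition_matrix([[0, 0, 1, 2, 2, 1], [2, 2, 1, 0, 0, 0]], 3, lag=1)
    evals, evecs = np.linalg.eig(T.T)       # stationary distribution: left eigenvector at eigenvalue 1
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    print(pi / pi.sum())
    ```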

  4. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. The approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network, and it allowed us to solve the network's current problems successfully. A description of our approach is presented below, along with the most interesting results of our work. (author)

  5. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict the effects of these SNPs on protein function accurately would represent a major advance toward understanding these diseases. To date several attempts have been made to predict the effects of such mutations. The most successful of these is a computational approach called ''Sorting Intolerant From Tolerant'' (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
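
    SIFT itself is not reproduced here, but the conservation idea it builds on can be sketched generically: score each alignment column by one minus its normalized Shannon entropy, so that fully conserved (and hence likely functionally important) positions score 1. The toy alignment below is invented for illustration.

    ```python
    import math
    from collections import Counter

    def column_conservation(alignment):
        """Per-position conservation of a protein multiple-sequence alignment,
        scored as 1 - normalized Shannon entropy (1 = fully conserved)."""
        scores = []
        for column in zip(*alignment):
            residues = [r for r in column if r != "-"]   # ignore gaps
            counts = Counter(residues)
            n = len(residues)
            entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
            scores.append(1.0 - entropy / math.log2(20))  # 20-letter amino-acid alphabet
        return scores

    # Toy alignment: position 1 is fully conserved, positions 2 and 3 vary
    print(column_conservation(["MKV", "MRV", "MKL", "MQI"]))
    ```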

  6. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research a Mars Precision Landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over
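
    Since the abstract names Particle Swarm Optimization as its core method, a minimal generic PSO loop is sketched below (not the author's implementation); the quadratic test function standing in for landing error, the swarm size, and the inertia and attraction coefficients are arbitrary choices.

    ```python
    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer: velocities are pulled toward each
        particle's personal best and the swarm's global best."""
        rng = np.random.default_rng(0)
        x = rng.uniform(-5.0, 5.0, (n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            fx = np.apply_along_axis(f, 1, x)
            improved = fx < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], fx[improved]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    # Toy stand-in for landing error: squared distance from a target point
    print(pso(lambda p: np.sum((p - 1.0) ** 2), dim=2))
    ```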

  7. Three-dimension visualization of transnasal approach for revealing the metasellar organization

    Directory of Open Access Journals (Sweden)

    Liang XUE

    2012-07-01

    Full Text Available Objective To improve anatomical understanding of the metasellar organization by investigating it via the transnasal approach in a virtual-reality (VR) setting. Methods Twenty-eight patients, with spontaneous subarachnoid hemorrhage but without pathological changes of the nasal cavity or sella turcica, underwent lamellar imaging examination and CT angiography with a Discovery Ultra 16. The data were collected and entered into the Dextroscope in DICOM format. Visualization research was carried out via the transnasal approach in a virtual-reality (VR) setting. Results The anatomic structures of the transnasal approach could be observed dynamically and spatially. When exposing the lateral border of the cavernous carotid artery, it was important to excise the ethmoid cornu, open the posterior ethmoid sinus and sphenopalatine foramen, control the sphenopalatine artery, properly drill out the pterygoid process and reveal the pterygoid canal. Conclusion The key points of the transnasal approach to exposing the metasellar structures are to remove the ethmoid cornu, uncinate process and bone of the anterior region of the sphenoidal sinus, and to control the sphenopalatine artery. The cavernous carotid arteries are the most important anatomic structures and should be adequately exposed and conserved.

  8. A Soft Computing Approach to Kidney Diseases Evaluation.

    Science.gov (United States)

    Neves, José; Martins, M Rosário; Vilhena, João; Neves, João; Gomes, Sabino; Abelha, António; Machado, José; Vicente, Henrique

    2015-10-01

    Kidney (renal) failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease depicts anomalous kidney function and/or makeup, and there is evidence that treatment may avoid or delay its progression, either by reducing or preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to accomplish with traditional methodologies and existing problem-solving tools. Hence, in this work, we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks to weigh the Degree-of-Confidence in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the

  9. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    OpenAIRE

    Grover Kearns

    2010-01-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting stu...

  10. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving many kinds of problems and provide promising solutions. Due to their popularity, soft computing approaches have also been applied to healthcare data for effectively diagnosing diseases, obtaining better results than traditional approaches. Soft computing approaches can adapt themselves to the problem domain. Another aspect is a good balance between exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable and efficient, and hence suitable and competent for health care data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, a large number of soft computing approaches have been applied for effectively diagnosing and predicting diseases from healthcare data; some of these are particle swarm optimization, genetic algorithms, artificial neural networks, and support vector machines. A detailed discussion of these approaches is presented in the literature section. This work summarizes the soft computing approaches used in the healthcare domain in the last decade. These approaches are categorized into five categories based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed in the above categories, and all discussed techniques are also summarized in tables. This work also focuses on the accuracy rates of soft computing techniques, and tabular information is provided for

  11. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  12. Gesture Recognition by Computer Vision : An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  13. Thermodynamic and relative approach to compute glass-forming ...

    Indian Academy of Sciences (India)

    models) characteristic: the isobaric heat capacity (Cp) of oxides, and execute a mathematical treatment of oxides thermodynamic data. We note this coefficient as thermodynamical relative glass-forming ability (ThRGFA) and formulate a model to compute it. Computed values of 2nd, 3rd, 4th and 5th period metal oxides ...

  14. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
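
    A quantum annealer minimizes a quadratic unconstrained binary optimization (QUBO) objective. The sketch below is a classical stand-in, assuming a hypothetical three-cell binary permeability field whose QUBO matrix encodes data misfit and priors; it exhaustively finds the configuration an annealer would sample as a low-energy state.

    ```python
    import itertools
    import numpy as np

    def qubo_energy(q, Q):
        """Energy of binary configuration q under QUBO matrix Q: q^T Q q."""
        q = np.asarray(q)
        return float(q @ Q @ q)

    def brute_force_qubo(Q):
        """Exhaustive minimum of a small QUBO; a quantum annealer samples
        low-energy states of the same objective in hardware."""
        n = Q.shape[0]
        best = min(itertools.product([0, 1], repeat=n),
                   key=lambda q: qubo_energy(q, Q))
        return np.array(best), qubo_energy(best, Q)

    # Hypothetical 3-cell inverse problem: diagonal terms encode priors,
    # off-diagonal terms penalize mismatch with head observations
    Q = np.array([[-1.0, 0.5, 0.0],
                  [0.5, -0.8, 0.5],
                  [0.0, 0.5, -1.2]])
    print(brute_force_qubo(Q))
    ```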

  15. Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.

    Science.gov (United States)

    Yamauchi, Takashi; Xiao, Kunchen

    2018-04-01

    Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Participants were induced into positive or negative emotional states by music, film clips, or emotional pictures, and they indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance of the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under the curve and direction changes help infer emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
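
    The exact feature definitions are the authors'; as a generic reading of two features named in the abstract, the sketch below computes the area between a cursor path and the straight start-to-end line, plus a count of horizontal direction changes, for an invented trajectory.

    ```python
    import numpy as np

    def cursor_features(xy):
        """Two illustrative trajectory features from an (n, 2) array of cursor
        samples: area under the curve relative to the straight start-to-end
        line, and the number of direction changes along x."""
        xy = np.asarray(xy, dtype=float)
        t = np.linspace(0.0, 1.0, len(xy))
        straight = xy[0] + t[:, None] * (xy[-1] - xy[0])
        auc = np.trapz(np.linalg.norm(xy - straight, axis=1), t)
        dx = np.sign(np.diff(xy[:, 0]))
        flips = int(np.sum(dx[:-1] * dx[1:] < 0))
        return auc, flips

    print(cursor_features([[0, 0], [1, 2], [3, 1], [2, 3], [4, 4]]))
    ```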

  16. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computer that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  17. The Promise of Systems Biology Approaches for Revealing Host Pathogen Interactions in Malaria

    Directory of Open Access Journals (Sweden)

    Meghan Zuck

    2017-11-01

    Full Text Available Despite global eradication efforts over the past century, malaria remains a devastating public health burden, causing almost half a million deaths annually (WHO, 2016). A detailed understanding of the mechanisms that control malaria infection has been hindered by technical challenges of studying a complex parasite life cycle in multiple hosts. While many interventions targeting the parasite have been implemented, the complex biology of Plasmodium poses a major challenge, and must be addressed to enable eradication. New approaches for elucidating key host-parasite interactions, and predicting how the parasite will respond in a variety of biological settings, could dramatically enhance the efficacy and longevity of intervention strategies. The field of systems biology has developed methodologies and principles that are well poised to meet these challenges. In this review, we focus our attention on the Liver Stage of the Plasmodium lifecycle and issue a “call to arms” for using systems biology approaches to forge a new era in malaria research. These approaches will reveal insights into the complex interplay between host and pathogen, and could ultimately lead to novel intervention strategies that contribute to malaria eradication.

  18. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. To teach such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written up those courses as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  19. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  20. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling stems from current trends toward strengthening the general-education and worldview functions of computer science, which call for additional research into the…

  1. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  2. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
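
    For flavor, here is a minimal example of the kind of sampling-based technique such a book covers (not an excerpt from it): inverse-transform sampling of an exponential distribution, with a Monte Carlo check of its mean.

    ```python
    import math
    import random

    def sample_exponential(rate):
        """Inverse-transform sampling: push a uniform draw through the
        inverse CDF of the exponential distribution."""
        u = random.random()
        return -math.log(1.0 - u) / rate

    # Monte Carlo estimate of the mean; should approach 1/rate = 0.5
    samples = [sample_exponential(2.0) for _ in range(100_000)]
    print(sum(samples) / len(samples))
    ```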

  3. A Computer-Supported Method to Reveal and Assess Personal Professional Theories in Vocational Education

    Science.gov (United States)

    van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.

    2016-01-01

    This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…

  4. Towards an Approach of Semantic Access Control for Cloud Computing

    Science.gov (United States)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, mutual understandability among distributed Access Control Policies (ACPs) has become an important issue in the security field of cloud computing. Semantic Web technology provides a solution for the semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in cloud computing environments. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL language can effectively solve the interoperability issue of distributed ACPs. This study enriches the research on applying Semantic Web technology in the field of security, and provides a new way of thinking about access control in cloud computing.

  5. Diffusive Wave Approximation to the Shallow Water Equations: Computational Approach

    KAUST Repository

    Collier, Nathan; Radwan, Hany; Dalcin, Lisandro; Calo, Victor M.

    2011-01-01

    We discuss the use of time adaptivity applied to the one dimensional diffusive wave approximation to the shallow water equations. A simple and computationally economical error estimator is discussed which enables time-step size adaptivity

  6. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science considers conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or a particular domain based on ISI published JCR (Journal Citation Report). Although this data covers most of important journals, it lacks computer science conference and ...

  7. New Approaches to Quantum Computing using Nuclear Magnetic Resonance Spectroscopy

    International Nuclear Information System (INIS)

    Colvin, M; Krishnan, V V

    2003-01-01

    The power of a quantum computer (QC) relies on the fundamental concept of superposition in quantum mechanics, which allows an inherent large-scale parallelization of computation. In a QC, binary information embodied in a quantum system, such as the spin degrees of freedom of a spin-1/2 particle, forms the qubits (quantum mechanical bits), over which appropriate logic gates perform the computation. In classical computers, the basic unit of information is the bit, which can take a value of either 0 or 1. Bits are connected together by logic gates to form logic circuits to implement complex logical operations. The expansion of modern computers has been driven by the development of faster, smaller and cheaper logic gates. As the size of the logic gates shrinks toward atomic dimensions, the performance of such a system is no longer classical but is instead governed by quantum mechanics. Quantum computers offer the potentially superior prospect of solving computational problems that are intractable for classical computers, such as efficient database searches and cryptography. A variety of algorithms have been developed recently, most notably Shor's algorithm for factorizing long numbers into prime factors in polynomial time and Grover's quantum search algorithm. These algorithms were of only theoretical interest until recently, when several methods were proposed to build an experimental QC. These methods include trapped ions, cavity QED, coupled quantum dots, Josephson junctions, spin resonance transistors, linear optics and nuclear magnetic resonance. Nuclear magnetic resonance (NMR) is uniquely capable of constructing small QCs, and several algorithms have been implemented successfully. NMR-QC differs from other implementations in one important way: it is not a single QC, but a statistical ensemble of them. Thus, quantum computing based on NMR is considered ensemble quantum computing. In NMR quantum computing, the spins with

  8. Educational games for brain health: revealing their unexplored potential through a neurocognitive approach

    Directory of Open Access Journals (Sweden)

    Patrick eFissler

    2015-07-01

    Full Text Available Educational games link the motivational nature of games with the learning of knowledge and skills. Here, we go beyond effects on these learning outcomes. We review two lines of evidence which indicate the currently unexplored potential of educational games to promote brain health: first, gaming with specific neurocognitive demands (e.g., executive control), and second, educational learning experiences (e.g., studying foreign languages) improve brain health markers. These markers include cognitive ability, brain function, and brain structure. As educational games allow the combination of specific neurocognitive demands with educational learning experiences, they seem to be optimally suited for promoting brain health. We propose a neurocognitive approach to reveal this unexplored potential of educational games in future research.

  9. Revealing the functions of the transketolase enzyme isoforms in Rhodopseudomonas palustris using a systems biology approach.

    Directory of Open Access Journals (Sweden)

    Chia-Wei Hu

    Full Text Available BACKGROUND: Rhodopseudomonas palustris (R. palustris) is a purple non-sulfur anoxygenic phototrophic bacterium that belongs to the class of proteobacteria. It is capable of absorbing atmospheric carbon dioxide and converting it to biomass via the process of photosynthesis and the Calvin-Benson-Bassham (CBB) cycle. Transketolase is a key enzyme involved in the CBB cycle. Here, we reveal the functions of transketolase isoforms I and II in R. palustris using a systems biology approach. METHODOLOGY/PRINCIPAL FINDINGS: By measuring growth ability, we found that transketolase could enhance the autotrophic growth and biomass production of R. palustris. Microarray and real-time quantitative PCR revealed that transketolase isoforms I and II were involved in different carbon metabolic pathways. In addition, immunogold staining demonstrated that the two transketolase isoforms had different spatial localizations: transketolase I was primarily associated with the intracytoplasmic membrane (ICM) but transketolase II was mostly distributed in the cytoplasm. Comparative proteomic analysis and network construction of transketolase over-expression and negative control (NC) strains revealed that protein folding, transcriptional regulation, amino acid transport and CBB-cycle-associated carbon metabolism were enriched in the transketolase I over-expressed strain. In contrast, ATP synthesis, carbohydrate transport, glycolysis-associated carbon metabolism and CBB-cycle-associated carbon metabolism were enriched in the transketolase II over-expressed strain. Furthermore, ATP synthesis assays showed a significant increase in ATP synthesis in the transketolase II over-expressed strain. A PEPCK activity assay showed that PEPCK activity was higher in transketolase over-expressed strains than in the negative control strain. CONCLUSIONS/SIGNIFICANCE: Taken together, our results indicate that the two isoforms of transketolase in R. palustris could affect photoautotrophic growth

  10. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework for the smart grid environment that creates a small integrated energy hub supporting real-time computing for handling large volumes of data. A stochastic programming model with a cloud computing scheme is developed for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI interface and the Gurobi optimizer in Matlab, in order to reduce the electricity demand by creating energy networks in a smart hub approach.

  11. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  12. Environmental sciences and computations: a modular data based systems approach

    International Nuclear Information System (INIS)

    Crawford, T.V.; Bailey, C.E.

    1975-07-01

    A major computer code for environmental calculations is under development at the Savannah River Laboratory. The primary aim is to develop a flexible, efficient capability to calculate, for all significant pathways, the dose to man resulting from releases of radionuclides from the Savannah River Plant and from other existing and potential radioactive sources in the southeastern United States. The environmental sciences programs at SRP are described, with emphasis on the development of the calculational system. It is being developed as a modular data-based system within the framework of the larger JOSHUA Computer System, which provides data management, terminal, and job execution facilities. (U.S.)

  13. Computer assisted pyeloplasty in children the retroperitoneal approach

    DEFF Research Database (Denmark)

    Olsen, L H; Jorgensen, T M

    2004-01-01

    PURPOSE: We describe the first series of computer-assisted retroperitoneoscopic pyeloplasty in children using the Da Vinci Surgical System (Intuitive Surgical, Inc., Mountain View, California) with regard to setup, method, operation time, complications and preliminary outcome. The small space... with the Da Vinci Surgical System. With the patient in a lateral semiprone position the retroperitoneal space was developed by blunt and balloon dissection. Three ports were placed for the computer-assisted system and one for assistance. Pyeloplasty was performed with the mounted system placed behind

  14. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth
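
    A minimal sketch of the gPC idea for a single standard normal input, assuming probabilists' Hermite polynomials as the basis: the coefficients are projections c_k = E[f(Z) He_k(Z)] / k!, evaluated here by Gauss quadrature. This is an illustration, not material from the book.

    ```python
    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    def gpc_coefficients(f, order, n_quad=40):
        """Coefficients c_k of f(Z), Z ~ N(0,1), in probabilists' Hermite
        polynomials He_k: c_k = E[f(Z) He_k(Z)] / k!."""
        nodes, weights = hermegauss(n_quad)
        weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the Gaussian measure
        fvals = f(nodes)
        coeffs = []
        for k in range(order + 1):
            basis = np.zeros(k + 1)
            basis[k] = 1.0                          # select He_k
            ck = np.sum(weights * fvals * hermeval(nodes, basis)) / math.factorial(k)
            coeffs.append(ck)
        return np.array(coeffs)

    # Demo: f(z) = z**2 expands exactly as He_0 + He_2 (c0 = 1, c2 = 1)
    print(np.round(gpc_coefficients(lambda z: z ** 2, order=4), 6))
    ```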

  15. A Systems Biology Approach Reveals Converging Molecular Mechanisms that Link Different POPs to Common Metabolic Diseases.

    Science.gov (United States)

    Ruiz, Patricia; Perlina, Ally; Mumtaz, Moiz; Fowler, Bruce A

    2016-07-01

    A number of epidemiological studies have identified statistical associations between persistent organic pollutants (POPs) and metabolic diseases, but testable hypotheses regarding underlying molecular mechanisms to explain these linkages have not been published. We assessed the underlying mechanisms of POPs that have been associated with metabolic diseases; three well-known POPs [2,3,7,8-tetrachlorodibenzodioxin (TCDD), 2,2´,4,4´,5,5´-hexachlorobiphenyl (PCB 153), and 4,4´-dichlorodiphenyldichloroethylene (p,p´-DDE)] were studied. We used advanced database search tools to delineate testable hypotheses and to guide laboratory-based research studies into underlying mechanisms by which this POP mixture could produce or exacerbate metabolic diseases. For our searches, we used proprietary systems biology software (MetaCore™/MetaDrug™) to conduct advanced search queries for the underlying interactions database, followed by directional network construction to identify common mechanisms for these POPs within two or fewer interaction steps downstream of their primary targets. These common downstream pathways belong to various cytokine and chemokine families with experimentally well-documented causal associations with type 2 diabetes. Our systems biology approach allowed identification of converging pathways leading to activation of common downstream targets. To our knowledge, this is the first study to propose an integrated global set of step-by-step molecular mechanisms for a combination of three common POPs using a systems biology approach, which may link POP exposure to diseases. Experimental evaluation of the proposed pathways may lead to development of predictive biomarkers of the effects of POPs, which could translate into disease prevention and effective clinical treatment strategies. Ruiz P, Perlina A, Mumtaz M, Fowler BA. 2016. A systems biology approach reveals converging molecular mechanisms that link different POPs to common metabolic diseases. Environ

  16. Thermodynamic and relative approach to compute glass-forming

    Indian Academy of Sciences (India)

    This study deals with the evaluation of the glass-forming ability (GFA) of oxides and is a critical reading of the Sun and Rawson thermodynamic approach to quantifying this aptitude. Both approaches are adequate but ambiguous regarding the behaviour of some oxides (tendency to amorphization or crystallization). Indeed, ZrO2 and ...

  17. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  18. A Cellular Automata Approach to Computer Vision and Image Processing.

    Science.gov (United States)

    1980-09-01

  19. New approach for virtual machines consolidation in heterogeneous computing systems

    Czech Academy of Sciences Publication Activity Database

    Fesl, Jan; Cehák, J.; Doležalová, Marie; Janeček, J.

    2016-01-01

    Vol. 9, No. 12 (2016), pp. 321-332. ISSN 1738-9968. Institutional support: RVO:60077344. Keywords: consolidation * virtual machine * distributed. Subject RIV: JD - Computer Applications, Robotics. http://www.sersc.org/journals/IJHIT/vol9_no12_2016/29.pdf

  20. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  1. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
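
    The event-based network itself is not reproduced here; for reference, this sketch computes the textbook state-vector result that such a simulation must reproduce for the two gates named in the abstract, preparing a Bell state from |00> with a Hadamard followed by a controlled-NOT.

    ```python
    import numpy as np

    # Gate matrices the event-based simulation must reproduce on average
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]])     # control = first qubit

    # |00> -> (H on first qubit) -> CNOT = Bell state (|00> + |11>)/sqrt(2)
    state = np.array([1, 0, 0, 0], dtype=complex)
    state = np.kron(H, np.eye(2)) @ state
    state = CNOT @ state
    print(np.round(state, 3))
    ```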

  2. Computational approaches to cognition: the bottom-up view.

    Science.gov (United States)

    Koch, C

    1993-04-01

    How can higher-level aspects of cognition, such as figure-ground segregation, object recognition, selective focal attention and ultimately even awareness, be implemented at the level of synapses and neurons? A number of theoretical studies emerging from the connectionist and computational neuroscience communities are starting to address these issues using neurally plausible models.

  3. Linguistics, Computers, and the Language Teacher. A Communicative Approach.

    Science.gov (United States)

    Underwood, John H.

    This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…

  4. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    2018-03-05

    computer modeling used as a research method applied in the process ... conclusions discuss the benefits for students who analyzed the ... accounting education process the case study method should not .... providing travel safety information to passengers ... from literature readings with practical problems.

  5. R for cloud computing an approach for data scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era) and helps the user navigate the wealth of information in R and its 4000 packages, as well as transition the same analytics to the cloud. With this information the reader can select both cloud vendors and the sometimes confusing cloud ecosystem, as well as the R packages that can help process the analytical tasks with minimum effort and cost, and maximum usefulness and customization. The use of Graphical User Interfaces (GUIs) and step-by-step screenshot tutorials is emphasized in this book to lessen the famous learning curve of R and some of the needless confusion created in cloud computing that hinders its widespread adoption. This will help you kick-start analytics on the cloud, including chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  6. A "Service-Learning Approach" to Teaching Computer Graphics

    Science.gov (United States)

    Hutzel, Karen

    2007-01-01

    The author taught a computer graphics course through a service-learning framework to undergraduate and graduate students in the spring of 2003 at Florida State University (FSU). The students in this course participated in learning a software program along with youths from a neighboring, low-income, primarily African-American community. Together,…

  7. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.

  8. Probing the mutational interplay between primary and promiscuous protein functions: a computational-experimental approach.

    Science.gov (United States)

    Garcia-Seisdedos, Hector; Ibarra-Molero, Beatriz; Sanchez-Ruiz, Jose M

    2012-01-01

    Protein promiscuity is of considerable interest due to its role in adaptive metabolic plasticity, its fundamental connection with molecular evolution and also because of its biotechnological applications. Current views on the relation between primary and promiscuous protein activities stem largely from laboratory evolution experiments aimed at increasing promiscuous activity levels. Here, on the other hand, we attempt to assess the main features of the simultaneous modulation of the primary and promiscuous functions during the course of natural evolution. The computational/experimental approach we propose for this task involves the following steps: a function-targeted, statistical coupling analysis of evolutionary data is used to determine a set of positions likely linked to the recruitment of a promiscuous activity for a new function; a combinatorial library of mutations on this set of positions is prepared and screened for both the primary and the promiscuous activities; a partial-least-squares reconstruction of the full combinatorial space is carried out; finally, an approximation to the Pareto set of variants with optimal primary/promiscuous activities is derived. Application of the approach to the emergence of folding catalysis in thioredoxin scaffolds reveals an unanticipated scenario: diverse patterns of primary/promiscuous activity modulation are possible, including a moderate (but likely significant in a biological context) simultaneous enhancement of both activities. We show that this scenario can be most simply explained on the basis of the conformational diversity hypothesis, although alternative interpretations cannot be ruled out. Overall, the results reported may help clarify the mechanisms of the evolution of new functions. From a different viewpoint, the partial-least-squares-reconstruction/Pareto-set-prediction approach we have introduced provides the computational basis for an efficient directed-evolution protocol aimed at the simultaneous
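
    The final step, approximating the Pareto set of variants with optimal primary/promiscuous activities, can be sketched generically: keep every variant that no other variant dominates in both objectives. The activity values below are hypothetical, not data from the study.

    ```python
    def pareto_front(points):
        """Indices of variants not dominated in (primary, promiscuous) activity:
        no other variant is >= in both objectives and > in at least one."""
        front = []
        for i, (p1, q1) in enumerate(points):
            dominated = any(
                (p2 >= p1 and q2 >= q1) and (p2 > p1 or q2 > q1)
                for j, (p2, q2) in enumerate(points) if j != i
            )
            if not dominated:
                front.append(i)
        return front

    # Hypothetical (primary, promiscuous) activities for five variants
    activities = [(1.0, 0.1), (0.8, 0.4), (0.6, 0.6), (0.9, 0.2), (0.5, 0.5)]
    print(pareto_front(activities))   # -> [0, 1, 2, 3]
    ```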

  9. Computational models reveal a passive mechanism for cell migration in the crypt.

    Directory of Open Access Journals (Sweden)

    Sara-Jane Dunn

    Full Text Available Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of factors together coordinate migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequence of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt.

  10. A Discrete Approach to Computer-Oriented Calculus.

    Science.gov (United States)

    Gordon, Sheldon P.

    1979-01-01

    Some of the implications and advantages of an instructional approach using results from the calculus of finite differences and finite sums, both for motivation and as tools leading to applications, are discussed. (MP)

  11. TOWARD HIGHLY SECURE AND AUTONOMIC COMPUTING SYSTEMS: A HIERARCHICAL APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hsien-Hsin S

    2010-05-11

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, introspective, and capable of self-healing under improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) System-wide, unified introspection techniques for autonomic systems, (2) Secure information-flow microarchitecture, (3) Memory-centric security architecture, (4) Authentication control and its implications for security, (5) Digital rights management, (6) Microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  12. Software approach to automatic patching of analog computer

    Science.gov (United States)

    1973-01-01

    The Automatic Patching Verification program (APV) is described which provides the hybrid computer programmer with a convenient method of performing a static check of the analog portion of his study. The static check insures that the program is patched as specified, and that the computing components being used are operating correctly. The APV language the programmer uses to specify his conditions and interconnections is similar to the FORTRAN language in syntax. The APV control program reads APV source program statements from an assigned input device. Each source program statement is processed immediately after it is read. A statement may select an analog console, set an analog mode, set a potentiometer or DAC, or read from the analog console and perform a test. Statements are read and processed sequentially. If an error condition is detected, an output occurs on an assigned output device. When an end statement is read, the test is terminated.

  13. New Computational Approach to Electron Transport in Irregular Graphene Nanostructures

    Science.gov (United States)

    Mason, Douglas; Heller, Eric; Prendergast, David; Neaton, Jeffrey

    2009-03-01

    For novel graphene devices of nanoscale-to-macroscopic scale, many aspects of their transport properties are not easily understood due to difficulties in fabricating devices with regular edges. Here we develop a framework to efficiently calculate and potentially screen electronic transport properties of arbitrary nanoscale graphene device structures. A generalization of the established recursive Green's function method is presented, providing access to arbitrary device and lead geometries with substantial computer-time savings. Using single-orbital nearest-neighbor tight-binding models and the Green's function-Landauer scattering formalism, we will explore the transmission function of irregular two-dimensional graphene-based nanostructures with arbitrary lead orientation. Prepared by LBNL under contract DE-AC02-05CH11231 and supported by the U.S. Dept. of Energy Computer Science Graduate Fellowship under grant DE-FG02-97ER25308.
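
    As a baseline for the Green's function-Landauer formalism mentioned here (a minimal sketch, not the generalized recursive method the abstract describes), the code below computes the transmission of a clean 1D nearest-neighbor tight-binding chain with wide-band-limit leads; all parameter values are illustrative.

    ```python
    import numpy as np

    def transmission(E, n_sites=10, t=-1.0, eta=1e-6):
        """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dag] for a 1D
        nearest-neighbor tight-binding chain with wide-band-limit leads."""
        H = np.diag(np.full(n_sites - 1, t), 1)
        H = H + H.T                                    # device Hamiltonian
        sigma_L = np.zeros((n_sites, n_sites), complex)
        sigma_R = np.zeros((n_sites, n_sites), complex)
        sigma_L[0, 0] = -0.5j                          # wide-band lead self-energies
        sigma_R[-1, -1] = -0.5j
        G = np.linalg.inv((E + 1j * eta) * np.eye(n_sites) - H - sigma_L - sigma_R)
        gamma_L = 1j * (sigma_L - sigma_L.conj().T)
        gamma_R = 1j * (sigma_R - sigma_R.conj().T)
        return float(np.trace(gamma_L @ G @ gamma_R @ G.conj().T).real)

    print(round(transmission(0.0), 3))   # mid-band transmission of a clean chain
    ```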

  14. Computer animations of color markings reveal the function of visual threat signals in Neolamprologus pulcher.

    Science.gov (United States)

    Balzarini, Valentina; Taborsky, Michael; Villa, Fabienne; Frommen, Joachim G

    2017-02-01

    Visual signals, including changes in coloration and color patterns, are frequently used by animals to convey information. During contests, body coloration and its changes can be used to assess an opponent's state or motivation. Communication of aggressive propensity is particularly important in group-living animals with a stable dominance hierarchy, as the outcome of aggressive interactions determines the social rank of group members. Neolamprologus pulcher is a cooperatively breeding cichlid showing frequent within-group aggression. Both sexes exhibit two vertical black stripes on the operculum that vary naturally in shape and darkness. During frontal threat displays these patterns are actively exposed to the opponent, suggesting a signaling function. To investigate the role of operculum stripes during contests we manipulated their darkness in computer animated pictures of the fish. We recorded the responses in behavior and stripe darkness of test subjects to which these animated pictures were presented. Individuals with initially darker stripes were more aggressive against the animations and showed more operculum threat displays. Operculum stripes of test subjects became darker after exposure to an animation exhibiting a pale operculum than after exposure to a dark operculum animation, highlighting the role of the darkness of this color pattern in opponent assessment. We conclude that (i) the black stripes on the operculum of N. pulcher are a reliable signal of aggression and dominance, (ii) these markings play an important role in opponent assessment, and (iii) 2D computer animations are well suited to elicit biologically meaningful short-term aggressive responses in this widely used model system of social evolution.

  15. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the entrance and exit tunnel perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering

  16. A Neural Information Field Approach to Computational Cognition

    Science.gov (United States)

    2016-11-18

    The report describes neural field models of the effects of distraction during list memory; these distractions include short and long delays before recall, and continuous distraction (forced rehearsal) ... memory encoding and replay in hippocampus [Computational Neuroscience Society (CNS), p. 166, 2014; D. A. Pinotsis, Neural Field Coding of Short Term ...]. Further work modeled the performance of children learning to count in a SPA model; proposed a new SPA model of cognitive load using the N-back task; developed a new model of the

  17. A Novel Biometric Approach for Authentication In Pervasive Computing Environments

    OpenAIRE

    Rachappa,; Divyajyothi M G; D H Rao

    2016-01-01

    The paradigm of embedding computing devices in our surrounding environment has gained more interest in recent days. Along with contemporary technology come challenges, the most important being security and privacy. Keeping in mind the compactness and memory constraints of pervasive devices, the biometric techniques proposed for identification should be robust and dynamic. In this work, we propose an emerging scheme that is based on a few exclusive human traits and characte...

  18. Modeling Cu{sup 2+}-Aβ complexes from computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Alí-Torres, Jorge [Departamento de Química, Universidad Nacional de Colombia- Sede Bogotá, 111321 (Colombia); Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona, E-mail: Mariona.Sodupe@uab.cat [Departament de Química, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2015-09-15

    Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox active Cu{sup 2+} metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu{sup 2+}-Aβ complexes is thus important for a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with Cu{sup 2+}-Aβ coordination and to build plausible Cu{sup 2+}-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  19. Mind the gap: an attempt to bridge computational and neuroscientific approaches to study creativity

    Science.gov (United States)

    Wiggins, Geraint A.; Bhattacharya, Joydeep

    2014-01-01

    Creativity is the hallmark of human cognition and is behind every innovation, scientific discovery, piece of music, artwork, and idea that have shaped our lives, from ancient times till today. Yet scientific understanding of creative processes is quite limited, mostly due to the traditional belief that considers creativity a mysterious puzzle, a paradox, defying empirical enquiry. Recently, there has been increasing interest in revealing the neural correlates of human creativity. Though many of these studies, pioneering in nature, help demystify creativity, the field is still dominated by popular beliefs associating creativity with “right brain thinking”, “divergent thinking”, “altered states” and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars’ Global Workspace Theory (GWT; Baars, 1988) enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity. PMID:25104930

  20. Mind the Gap: An attempt to bridge computational and neuroscientific approaches to study creativity

    Directory of Open Access Journals (Sweden)

    Geraint eWiggins

    2014-07-01

    Full Text Available Creativity is the hallmark of human cognition, yet scientific understanding of creative processes is limited. However, there is increasing interest in revealing the neural correlates of human creativity. Though many of these studies, pioneering in nature, help demystify creativity, the field is still dominated by popular beliefs associating creativity with right brain thinking, divergent thinking, altered states and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars' global workspace theory (Baars, 1988) enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity.

  1. A Biologically-Based Computational Approach to Drug Repurposing for Anthrax Infection

    Directory of Open Access Journals (Sweden)

    Jane P. F. Bai

    2017-03-01

    Full Text Available Developing drugs to treat the toxic effects of lethal toxin (LT and edema toxin (ET produced by B. anthracis is of global interest. We utilized a computational approach to score 474 drugs/compounds for their ability to reverse the toxic effects of anthrax toxins. For each toxin or drug/compound, we constructed an activity network by using its differentially expressed genes, molecular targets, and protein interactions. Gene expression profiles of drugs were obtained from the Connectivity Map and those of anthrax toxins in human alveolar macrophages were obtained from the Gene Expression Omnibus. Drug rankings were based on the ability of a drug/compound’s mode of action in the form of a signaling network to reverse the effects of anthrax toxins; literature reports were used to verify the top 10 and bottom 10 drugs/compounds identified. Simvastatin and bepridil with reported in vitro potency for protecting cells from LT and ET toxicities were computationally ranked fourth and eighth. The other top 10 drugs were fenofibrate, dihydroergotamine, cotinine, amantadine, mephenytoin, sotalol, ifosfamide, and mefloquine; literature mining revealed their potential protective effects from LT and ET toxicities. These drugs are worthy of investigation for their therapeutic benefits and might be used in combination with antibiotics for treating B. anthracis infection.
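
    As an illustration of the scoring idea only (not the authors' full network-based method), the following Python sketch, assuming hypothetical gene names and fold-change values, ranks a drug by how strongly its expression signature anti-correlates with a toxin signature, in the spirit of Connectivity Map style reversal scoring:

        import numpy as np

        def reversal_score(toxin_sig, drug_sig):
            """Score how strongly a drug signature opposes a toxin signature.

            Signatures map gene -> differential expression (log fold change).
            Negative correlation over shared genes means the drug tends to
            reverse the toxin-induced changes, so we return the negated r.
            """
            shared = sorted(set(toxin_sig) & set(drug_sig))
            if len(shared) < 3:
                return 0.0
            t = np.array([toxin_sig[g] for g in shared])
            d = np.array([drug_sig[g] for g in shared])
            r = np.corrcoef(t, d)[0, 1]
            return -r  # +1 = perfect reversal, -1 = perfect mimicry

        # Toy example with hypothetical signatures
        toxin = {"TNF": 2.1, "IL6": 1.8, "NFKB1": 1.2, "CASP3": 0.9}
        drug = {"TNF": -1.5, "IL6": -1.1, "NFKB1": -0.4, "CASP3": 0.2}
        print(reversal_score(toxin, drug))  # near +1: repurposing candidate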

  2. Conformational Dynamics of apo-GlnBP Revealed by Experimental and Computational Analysis

    KAUST Repository

    Feng, Yitao

    2016-10-13

    The glutamine binding protein (GlnBP) binds l-glutamine and cooperates with its cognate transporters during glutamine uptake. Crystal structure analysis has revealed an open and a closed conformation for apo- and holo-GlnBP, respectively. However, the detailed conformational dynamics have remained unclear. Herein, we combined NMR spectroscopy, MD simulations, and single-molecule FRET techniques to decipher the conformational dynamics of apo-GlnBP. The NMR residual dipolar couplings of apo-GlnBP were in good agreement with a MD-derived structure ensemble consisting of four metastable states. The open and closed conformations are the two major states. This four-state model was further validated by smFRET experiments and suggests the conformational selection mechanism in ligand recognition of GlnBP. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim

  3. Conformational Dynamics of apo-GlnBP Revealed by Experimental and Computational Analysis

    KAUST Repository

    Feng, Yitao; Zhang, Lu; Wu, Shaowen; Liu, Zhijun; Gao, Xin; Zhang, Xu; Liu, Maili; Liu, Jianwei; Huang, Xuhui; Wang, Wenning

    2016-01-01

    The glutamine binding protein (GlnBP) binds l-glutamine and cooperates with its cognate transporters during glutamine uptake. Crystal structure analysis has revealed an open and a closed conformation for apo- and holo-GlnBP, respectively. However, the detailed conformational dynamics have remained unclear. Herein, we combined NMR spectroscopy, MD simulations, and single-molecule FRET techniques to decipher the conformational dynamics of apo-GlnBP. The NMR residual dipolar couplings of apo-GlnBP were in good agreement with a MD-derived structure ensemble consisting of four metastable states. The open and closed conformations are the two major states. This four-state model was further validated by smFRET experiments and suggests the conformational selection mechanism in ligand recognition of GlnBP. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim

  4. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  5. Revealing Soil Structure and Functional Macroporosity along a Clay Gradient Using X-ray Computed Tomography

    DEFF Research Database (Denmark)

    Naveed, Muhammad; Møldrup, Per; Arthur, Emmanuel

    2013-01-01

    The influence of clay content on soil-pore structure development and the relative importance of macroporosity in governing convective fluid flow are two key challenges toward better understanding and quantifying soil ecosystem functions. In this study, soil physical measurements (soil-water retention and air permeability) and X-ray computed tomography (CT) scanning were combined and used at two scales on intact soil columns (100 and 580 cm3). The columns were sampled along a natural clay gradient at six locations (L1, L2, L3, L4, L5 and L6 with 0.11, 0.16, 0.21, 0.32, 0.38 and 0.46 kg kg−1 clay content, respectively) at a field site in Lerbjerg, Denmark. The water-holding capacity of soils markedly increased with increasing soil clay content, while significantly higher air permeability was observed for the L1 to L3 soils than for the L4 to L6 soils. Higher air permeability values ...

  6. Heterogeneous vesiculation of 2011 El Hierro xeno-pumice revealed by X-ray computed microtomography

    Science.gov (United States)

    Berg, S. E.; Troll, V. R.; Deegan, F. M.; Burchardt, S.; Krumbholz, M.; Mancini, L.; Polacci, M.; Carracedo, J. C.; Soler, V.; Arzilli, F.; Brun, F.

    2016-12-01

    During the first week of the 2011 El Hierro submarine eruption, abundant light-coloured pumiceous, high-silica volcanic bombs coated in dark basanite were found floating on the sea. The composition of the light-coloured frothy material ('xeno-pumice') is akin to that of sedimentary rocks from the region, but the textures resemble felsic magmatic pumice, leaving their exact mode of formation unclear. To help decipher their origin, we investigated representative El Hierro xeno-pumice samples using X-ray computed microtomography for their internal vesicle shapes, volumes, and bulk porosity, as well as for the spatial arrangement and size distributions of vesicles in three dimensions (3D). We find a wide range of vesicle morphologies, which are especially variable around small fragments of rock contained in the xeno-pumice samples. Notably, these rock fragments are almost exclusively of sedimentary origin, and we therefore interpret them as relicts of the original sedimentary ocean crust protolith(s). The irregular vesiculation textures observed probably resulted from pulsatory release of volatiles from multiple sources during xeno-pumice formation, most likely by successive release of pore water and mineral water during incremental heating and decompression of the sedimentary protoliths.

  7. Clinical study on eating disorders. Brain atrophy revealed by cranial computed tomography scans

    Energy Technology Data Exchange (ETDEWEB)

    Nishiwaki, Shinichi

    1988-06-01

    Cranial computed tomography (CT) scans were reviewed in 34 patients with anorexia nervosa (Group I) and 22 with bulimia (Group II) to elucidate the cause and pathological significance of morphological brain alterations. The findings were compared with those from 47 normal women. The incidence of brain atrophy was significantly higher in Group I (17/34, 50%) and Group II (11/22, 50%) than in the control group (3/47, 6%). In Group I, there were significant increases in the left septum-caudate distance, the maximum width of the interhemispheric fissure, the widths of the Sylvian fissures adjacent to the skull on both sides, and the maximum width of the third ventricle. Significant increases in the maximum width of the interhemispheric fissure and the width of the left-side Sylvian fissure adjacent to the skull were also noted in Group II. Ventricular brain ratios were significantly higher in Groups I and II than in the control group (6.76 and 7.29 vs 4.55). Brain atrophy did not correlate with age, body weight, malnutrition, eating behavior, depression, thyroid function, EEG findings, or intelligence scale. In Group I, serum cortisol levels after the administration of dexamethasone were correlated with ventricular brain ratio. (Namekawa, K) 51 refs.

  8. Changes to a modelling approach with the use of computer

    DEFF Research Database (Denmark)

    Andresen, Mette

    2006-01-01

    This paper reports on a Ph.D. project, which was part of a larger research and development project (see www.matnatverdensklasse.dk). In the reported part of the project, each student had had a laptop at his disposal for at least two years. The Ph.D. project inquires into the try-out in four classes of teaching materials on differential equations. One of the objectives of the project was changes at two levels: 1) changes at curriculum level and 2) changes in the intentions of modelling and using models. The paper relates the changes at these two levels and discusses how the use of computers can serve ...

  9. Diffusive Wave Approximation to the Shallow Water Equations: Computational Approach

    KAUST Repository

    Collier, Nathan

    2011-05-14

    We discuss the use of time adaptivity applied to the one-dimensional diffusive wave approximation to the shallow water equations. A simple and computationally economical error estimator is discussed which enables time-step size adaptivity. This robust adaptive time discretization corrects the initial time step size to achieve a user-specified bound on the discretization error and allows time step size variations of several orders of magnitude. In particular, the one-dimensional results presented in this work feature a change of four orders of magnitude in the time step size over the entire simulation.
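
    The paper's error estimator is specific to the diffusive wave discretization, but the adaptivity mechanism itself can be sketched generically. A minimal Python sketch, assuming forward Euler on a toy ODE with a standard step-doubling controller (tolerance, safety factor and growth limits are illustrative choices):

        import numpy as np

        def adaptive_euler(f, y0, t_end, tol=1e-4, dt=1e-3):
            """Integrate y' = f(t, y) with forward Euler; a full step is
            compared against two half steps and dt is adapted so the local
            error estimate stays near tol."""
            t, y, ts = 0.0, float(y0), [0.0]
            while t < t_end:
                dt = min(dt, t_end - t)
                y_full = y + dt * f(t, y)
                y_half = y + 0.5 * dt * f(t, y)
                y_two = y_half + 0.5 * dt * f(t + 0.5 * dt, y_half)
                err = abs(y_two - y_full)  # local error estimate
                if err <= tol:             # accept the step
                    t, y = t + dt, y_two
                    ts.append(t)
                # first-order controller; dt may vary by orders of magnitude
                dt *= min(4.0, max(0.25, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
            return np.array(ts), y

        ts, y_final = adaptive_euler(lambda t, y: -5.0 * y, 1.0, 2.0)
        print(len(ts), y_final)  # exact value: exp(-10) ≈ 4.54e-5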

  10. Sinc-Approximations of Fractional Operators: A Computing Approach

    Directory of Open Access Journals (Sweden)

    Gerd Baumann

    2015-06-01

    Full Text Available We discuss a new approach to represent fractional operators by Sinc approximation using convolution integrals. A spin-off of the convolution representation is an effective inverse Laplace transform. Several examples demonstrate the application of the method to different practical problems.
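
    The convolution-integral construction is beyond a short example, but the cardinal Sinc approximation underlying such methods is easy to demonstrate. A minimal Python sketch, assuming a smooth function that decays on the real line; step size h and truncation N are arbitrary choices:

        import numpy as np

        def sinc_approx(f, x, h=0.5, N=30):
            """Cardinal Sinc approximation:
            f(x) ≈ sum_{k=-N..N} f(kh) * sinc((x - kh) / h).
            np.sinc uses the normalized convention sin(pi x)/(pi x)."""
            nodes = np.arange(-N, N + 1) * h
            return sum(f(kh) * np.sinc((x - kh) / h) for kh in nodes)

        f = lambda x: np.exp(-x ** 2)
        x = np.linspace(-3, 3, 7)
        print(np.max(np.abs(sinc_approx(f, x) - f(x))))  # small error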

  11. Computational Approaches Reveal New Insights into Regulation and Function of Non; coding RNAs and their Targets

    KAUST Repository

    Alam, Tanvir

    2016-01-01

    Regulation and function of protein-coding genes are increasingly well-understood, but no comparable evidence exists for non-coding RNA (ncRNA) genes, which appear to be more numerous than protein-coding genes. We developed a novel machine

  12. Novel Polyurethane Matrix Systems Reveal a Particular Sustained Release Behavior Studied by Imaging and Computational Modeling.

    Science.gov (United States)

    Campiñez, María Dolores; Caraballo, Isidoro; Puchkov, Maxim; Kuentz, Martin

    2017-07-01

    The aim of the present work was to better understand the drug-release mechanism from sustained release matrices prepared with two new polyurethanes, using a novel in silico formulation tool based on 3-dimensional cellular automata. For this purpose, two polymers and theophylline as model drug were used to prepare binary matrix tablets. Each formulation was simulated in silico, and its release behavior was compared to the experimental drug release profiles. Furthermore, the polymer distributions in the tablets were imaged by scanning electron microscopy (SEM) and the changes produced by the tortuosity were quantified and verified using experimental data. The obtained results showed that the polymers exhibited a surprisingly high ability to control drug release at low excipient concentrations (only 10% w/w of excipient controlled the release of drug for almost 8 h). The mesoscopic in silico model helped to reveal how the novel biopolymers were controlling drug release. The mechanism was found to be a special geometrical arrangement of the excipient particles, creating an almost continuous barrier surrounding the drug in a very effective way, comparable to lipid or waxy excipients but with the advantages of a much higher compactability, stability, and absence of excipient polymorphism.
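
    The barrier-formation mechanism can be illustrated with a much-simplified two-dimensional cellular automaton (the study used a 3-dimensional model with a calibrated rule set; the grid size, dissolution rule and probabilities below are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(0)
        N, steps = 60, 150
        excipient_fraction = 0.10   # 10% w/w barrier-forming polymer
        # Cell states: 0 = dissolved/medium, 1 = drug, 2 = insoluble excipient
        grid = np.ones((N, N), dtype=np.int8)
        grid[rng.random((N, N)) < excipient_fraction] = 2
        total_drug = np.count_nonzero(grid == 1)

        def exposed(grid):
            """Drug cells with a dissolved neighbor or on the tablet edge."""
            medium = grid == 0
            touch = np.zeros_like(medium)
            touch[1:, :] |= medium[:-1, :]
            touch[:-1, :] |= medium[1:, :]
            touch[:, 1:] |= medium[:, :-1]
            touch[:, :-1] |= medium[:, 1:]
            touch[0, :] = touch[-1, :] = touch[:, 0] = touch[:, -1] = True
            return (grid == 1) & touch

        for t in range(steps):
            front = exposed(grid)
            # each exposed drug cell dissolves with fixed probability per step
            grid[front & (rng.random(grid.shape) < 0.5)] = 0

        released = 1.0 - np.count_nonzero(grid == 1) / total_drug
        print(f"fraction released after {steps} steps: {released:.2f}")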

  13. Mechanisms of Neurofeedback: A Computation-theoretic Approach.

    Science.gov (United States)

    Davelaar, Eddy J

    2018-05-15

    Neurofeedback training is a form of brain training in which information about a neural measure is fed back to the trainee who is instructed to increase or decrease the value of that particular measure. This paper focuses on electroencephalography (EEG) neurofeedback in which the neural measures of interest are the brain oscillations. To date, the neural mechanisms that underlie successful neurofeedback training are still unexplained. Such an understanding would benefit researchers, funding agencies, clinicians, regulatory bodies, and insurance firms. Based on recent empirical work, an emerging theory couched firmly within computational neuroscience is proposed that advocates a critical role of the striatum in modulating EEG frequencies. The theory is implemented as a computer simulation of peak alpha upregulation, but in principle any frequency band at one or more electrode sites could be addressed. The simulation successfully learns to increase its peak alpha frequency and demonstrates the influence of threshold setting - the threshold that determines whether positive or negative feedback is provided. Analyses of the model suggest that neurofeedback can be likened to a search process that uses importance sampling to estimate the posterior probability distribution over striatal representational space, with each representation being associated with a distribution of values of the target EEG band. The model provides an important proof of concept to address pertinent methodological questions about how to understand and improve EEG neurofeedback success. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
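
    A toy Python simulation, loosely inspired by the proposed mechanism rather than the author's implementation, shows how threshold-based feedback can shift sampling over candidate "striatal representations" toward those associated with a higher alpha readout (all parameters are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        n_states = 50
        # each representation is associated with its own alpha value (Hz)
        state_alpha = rng.normal(10.0, 1.0, n_states)
        weights = np.ones(n_states)   # selection propensities
        threshold = 10.0              # reward when readout exceeds this

        for trial in range(2000):
            s = rng.choice(n_states, p=weights / weights.sum())
            alpha = state_alpha[s] + rng.normal(0.0, 0.3)  # noisy EEG readout
            if alpha > threshold:
                weights[s] *= 1.05    # positive feedback: reinforce
            else:
                weights[s] *= 0.98    # negative feedback: weaken

        print("baseline mean alpha:", state_alpha.mean())
        print("trained (weighted) mean alpha:",
              np.average(state_alpha, weights=weights))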

  14. A uniform approach for programming distributed heterogeneous computing systems.

    Science.gov (United States)

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  15. Novel approach for dam break flow modeling using computational intelligence

    Science.gov (United States)

    Seyedashraf, Omid; Mehrabi, Mohammad; Akhtari, Ali Akbar

    2018-04-01

    A new methodology based on the computational intelligence (CI) system is proposed and tested for modeling the classic 1D dam-break flow problem. The reason to seek a new solution lies in the shortcomings of the existing analytical and numerical models. This includes the difficulty of using the exact solutions and the unwanted fluctuations which arise in the numerical results. In this research, the application of the radial-basis-function (RBF) and multi-layer-perceptron (MLP) systems is detailed for the solution of twenty-nine dam-break scenarios. The models are developed using the length of the channel, the depths of the up- and downstream sections, time, and distance as the inputs. Moreover, the depths and velocities of each computational node in the flow domain are considered as the model outputs. The models are validated against the analytical solution and the Lax-Wendroff and MacCormack FDM schemes. The findings indicate that the employed CI models are able to replicate the overall shape of the shock and rarefaction waves. Furthermore, the MLP system outperforms RBF and the tested numerical schemes. A new monolithic equation is proposed based on the best fitting model, which can be used as an efficient alternative to the existing piecewise analytic equations.
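
    The regression setup can be sketched with scikit-learn, assuming the named input quantities and hypothetical training data; the toy_solution function below is only a stand-in for the analytical dam-break solution that supplied ground truth in the study:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # inputs: channel length, upstream depth, downstream depth, time, distance
        X = rng.uniform([100, 2, 0.1, 0, 0],
                        [1000, 10, 2, 60, 1000], size=(5000, 5))

        def toy_solution(X):
            """Hypothetical stand-in for the exact dam-break solution."""
            L, hu, hd, t, x = X.T
            g = 9.81
            c = np.sqrt(g * hu)
            depth = np.clip(hu - 0.3 * (x - L / 2) / (c * (t + 1)), hd, hu)
            velocity = 2.0 * (c - np.sqrt(g * depth))
            return np.column_stack([depth, velocity])

        y = toy_solution(X)  # outputs: depth and velocity at each node
        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500,
                             random_state=0).fit(X, y)
        print("R^2 on training data:", model.score(X, y))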

  16. A 3D computer graphics approach to brachytherapy planning.

    Science.gov (United States)

    Weichert, Frank; Wawro, Martin; Wilke, Carsten

    2004-06-01

    Intravascular brachytherapy (IVB) can significantly reduce the risk of restenosis after interventional treatment of stenotic arteries, if planned and applied correctly. In order to facilitate computer-based IVB planning, a three-dimensional reconstruction of the stenotic artery based on intravascular ultrasound (IVUS) sequences is desirable. For this purpose, the frames of the IVUS sequence are properly aligned in space, and possible gaps in between the IVUS frames are filled by interpolation with radial basis functions known from scattered data interpolation. The alignment procedure uses additional information obtained from biplane X-ray angiography performed simultaneously during the capturing of the IVUS sequence. After IVUS images and biplane angiography data are acquired from the patient, the vessel-wall borders and the IVUS catheter are detected by an active contour algorithm. Next, the twist (relative orientation) between adjacent IVUS frames is determined by a sequential triangulation method. The absolute orientation of each frame is established by a stochastic analysis based on anatomical landmarks. Finally, the reconstructed 3D vessel model is visualized by methods of combined volume and polygon rendering. The reconstruction is then used for the computation of the radiation distribution within the tissue, emitted from a beta-radiation source. All these steps are performed during the percutaneous intervention.
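
    The gap-filling step can be sketched with SciPy's RBFInterpolator, assuming synthetic geometry: scattered wall-radius samples from sparse frame positions are fitted with a smooth radial-basis surface and evaluated between frames (encoding the angle as cos/sin to respect periodicity is a choice of this sketch, not necessarily of the paper):

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        rng = np.random.default_rng(0)
        z_frames = np.sort(rng.uniform(0, 40, 15))  # frame positions (mm)
        angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)

        Z, A = np.meshgrid(z_frames, angles, indexing="ij")
        radius = 3.0 + 0.5 * np.sin(A) + 0.02 * Z   # synthetic wall radius (mm)

        # fit a smooth surface r(z, angle) through the scattered samples
        pts = np.column_stack([Z.ravel(), np.cos(A).ravel(), np.sin(A).ravel()])
        rbf = RBFInterpolator(pts, radius.ravel(), smoothing=1e-3)

        # evaluate between frames to fill a longitudinal gap
        query = np.column_stack([np.full(angles.shape, 17.3),
                                 np.cos(angles), np.sin(angles)])
        print(rbf(query)[:5])  # interpolated radii at z = 17.3 mm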

  17. Large-scale computations on histology images reveal grade-differentiating parameters for breast cancer

    Directory of Open Access Journals (Sweden)

    Katsinis Constantine

    2006-10-01

    Full Text Available Background: Tumor classification is inexact and largely dependent on the qualitative pathological examination of the images of the tumor tissue slides. In this study, our aim was to develop an automated computational method to classify Hematoxylin and Eosin (H&E) stained tissue sections based on cancer tissue texture features. Methods: Image processing of histology slide images was used to detect and identify adipose tissue, extracellular matrix, morphologically distinct cell nuclei types, and the tubular architecture. The texture parameters derived from image analysis were then applied to classify images in a supervised classification scheme using the histologic grade of a testing set as guidance. Results: The histologic grade assigned by pathologists to invasive breast carcinoma images strongly correlated with both the presence and extent of cell nuclei with dispersed chromatin and the architecture, specifically the extent of presence of tubular cross sections. The two parameters that differentiated tumor grade in this study were (1) the number density of cell nuclei with dispersed chromatin and (2) the number density of tubular cross sections identified through image processing as white blobs surrounded by a continuous string of cell nuclei. Classification based on subdivisions of a whole slide image containing a high concentration of cancer cell nuclei consistently agreed with the grade classification of the entire slide. Conclusion: The automated image analysis and classification presented in this study demonstrate the feasibility of developing clinically relevant classification of histology images based on micro-texture. This method provides pathologists with an invaluable quantitative tool for evaluating the components of the Nottingham system for breast tumor grading and for avoiding intra-observer variability, thus increasing the consistency of the decision-making process.
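
    A minimal sketch of the final classification stage, assuming the two grade-differentiating densities have already been extracted by image processing; the feature distributions and the classifier choice (logistic regression) are illustrative, not those of the paper:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # per-image features: (density of nuclei with dispersed chromatin,
        #                      density of tubular cross sections)
        low_grade = np.column_stack([rng.normal(20, 5, 100),
                                     rng.normal(30, 6, 100)])
        high_grade = np.column_stack([rng.normal(45, 7, 100),
                                      rng.normal(8, 4, 100)])
        X = np.vstack([low_grade, high_grade])
        y = np.array([0] * 100 + [1] * 100)  # 0 = low grade, 1 = high grade

        clf = LogisticRegression().fit(X, y)
        # many dispersed-chromatin nuclei, few tubules -> predicted high grade
        print(clf.predict([[40.0, 10.0]]))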

  18. Charge Transport in LDPE Nanocomposites Part II—Computational Approach

    Directory of Open Access Journals (Sweden)

    Anh T. Hoang

    2016-03-01

    Full Text Available A bipolar charge transport model is employed to investigate the remarkable reduction in dc conductivity of low-density polyethylene (LDPE) based material filled with uncoated nanofillers (reported in the first part of this work). The effect of temperature on charge transport is considered and the model outcomes are compared with measured conduction currents. The simulations reveal that the contribution of charge carrier recombination to the total transport process becomes more significant at elevated temperatures. Among the effects caused by the presence of nanoparticles, reduced charge injection at the electrodes has been found to be the most essential one. Possible mechanisms for charge injection at different temperatures are therefore discussed.

  19. A Computational Approach to the Quantification of Animal Camouflage

    Science.gov (United States)

    2014-06-01

    ... and Norm Farr, for providing great feedback on my research and encouragement along the way. Finally, I thank my dad and my sister, for their love ... that live in different habitats. Another approach, albeit logistically difficult, would be to transport cuttlefish native to a chromatically poor habitat to a chromatically rich habitat. Many such challenges remain in the field of sensory ecology, not just of cephalopods in marine habitats but many

  20. Engineering approach to model and compute electric power markets settlements

    International Nuclear Information System (INIS)

    Kumar, J.; Petrov, V.

    2006-01-01

    Back-office accounting settlement activities are an important part of market operations in Independent System Operator (ISO) organizations. A potential way to measure ISO market design correctness is to analyze how well market price signals create incentives or penalties for creating an efficient market to achieve market design goals. Market settlement rules are an important tool for implementing price signals, which are fed back to participants via the settlement activities of the ISO. ISOs are currently faced with the challenge of high volumes of data resulting from the increasing size of markets and ever-changing market designs, as well as the growing complexity of wholesale energy settlement business rules. This paper analyzed the problem and presented a practical engineering solution using an approach based on mathematical formulation and modeling of large scale calculations. The paper also presented critical comments on differences among settlement design approaches to electric power market design, as well as further areas of development. The paper provided a brief introduction to wholesale energy market settlement systems and discussed problem formulation. An actual settlement implementation framework and a discussion of the results and conclusions were also presented. It was concluded that a proper engineering approach to this domain can yield satisfying results by formalizing wholesale energy settlements. Significant improvements were observed in the initial preparation phase, scoping and effort estimation, implementation and testing. 5 refs., 2 figs

  1. Understanding Rifampicin Resistance in Tuberculosis through a Computational Approach

    Directory of Open Access Journals (Sweden)

    Satish Kumar

    2014-12-01

    Full Text Available The disease tuberculosis, caused by Mycobacterium tuberculosis (MTB), remains a major cause of morbidity and mortality in developing countries. The evolution of drug-resistant tuberculosis poses a foremost threat to global health. Most drug-resistant MTB clinical strains show resistance to isoniazid and rifampicin (RIF), the frontline anti-tuberculosis drugs. Mutation in rpoB, the beta subunit of DNA-directed RNA polymerase of MTB, is reported to be a major cause of RIF resistance. Amongst mutations in the well-defined 81-base-pair central region of the rpoB gene, mutations at codons 450 (S450L) and 445 (H445Y) are mainly associated with RIF resistance. In this study, we modeled two resistant mutants of rpoB (S450L and H445Y) using Modeller9v10 and performed a docking analysis with RIF using AutoDock4.2, comparing the docking results of these mutants with those for the wild-type rpoB. The docking results revealed that RIF inhibited the wild-type rpoB more effectively, with lower binding energy, than the rpoB mutants. The rpoB mutants interacted with RIF with positive binding energy, revealing that RIF cannot effectively inhibit them and thus explaining the resistance. Subsequently, this was verified by molecular dynamics simulations. This in silico evidence may help us understand RIF resistance in rpoB mutant strains.

  2. Computer aided fixture design - A case based approach

    Science.gov (United States)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and placed into position so that assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (application programming interface) module, for a better retrieval procedure that reduces computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
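
    The case-based retrieval core can be sketched independently of the VB/SolidWorks implementation: stored fixture designs are reduced to feature vectors and retrieval returns the nearest normalized match. The case names, features and values below are hypothetical:

        import numpy as np

        cases = {  # past fixture designs as feature vectors
            "plate_drilling": [200.0, 120.0, 15.0, 4, 0],
            "block_milling":  [150.0, 150.0, 80.0, 6, 1],
            "shaft_turning":  [300.0,  40.0, 40.0, 2, 2],
        }
        # features: length_mm, width_mm, height_mm, n_locators, op_type

        def retrieve(query, cases):
            """Return the stored case nearest to the query
            (range-normalized Euclidean distance)."""
            mat = np.array(list(cases.values()), dtype=float)
            scale = mat.max(axis=0) - mat.min(axis=0) + 1e-9
            d = np.linalg.norm((mat - np.asarray(query)) / scale, axis=1)
            return list(cases)[int(np.argmin(d))]

        print(retrieve([210.0, 110.0, 20.0, 4, 0], cases))  # plate_drilling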

  3. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    Science.gov (United States)

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  4. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  5. A dynamical-systems approach for computing ice-affected streamflow

    Science.gov (United States)

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.
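
    One way to make "dynamic simulation and parameter estimation of site-specific equations" concrete is recursive least squares with a forgetting factor, which lets ice-effect coefficients drift through a winter season. This is a generic illustration under invented data, not the published formulation:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 120
        temp = rng.normal(-5.0, 4.0, n)                  # air temperature (deg C)
        stage = 2.0 + 0.3 * np.sin(np.arange(n) / 10.0)  # river stage (m)
        flow = 10.0 + 8.0 * stage + 0.4 * temp + rng.normal(0, 0.5, n)

        theta = np.zeros(2)   # [stage coefficient, temperature coefficient]
        P = np.eye(2) * 100.0 # parameter covariance
        lam = 0.98            # forgetting factor: allows drift over time
        for t in range(n):
            x = np.array([stage[t], temp[t]])
            y = flow[t] - 10.0  # remove (assumed known) base flow
            k = P @ x / (lam + x @ P @ x)
            theta += k * (y - x @ theta)
            P = (P - np.outer(k, x @ P)) / lam

        print("estimated coefficients:", theta)  # close to [8.0, 0.4]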

  6. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    Science.gov (United States)

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  7. Visualising the invisible: a network approach to reveal the informal social side of student learning.

    Science.gov (United States)

    Hommes, J; Rienties, B; de Grave, W; Bos, G; Schuwirth, L; Scherpbier, A

    2012-12-01

    World-wide, universities in health sciences have transformed their curricula to include collaborative learning and facilitate the students' learning process. Interaction has been acknowledged to be the synergistic element in this learning context. However, students spend the majority of their time outside their classroom, and interaction does not stop outside the classroom. Therefore we studied how informal social interaction influences student learning. Moreover, to explore what really matters in the students' learning process, a model was tested of how the well-known constructs of prior performance, motivation and social integration relate to informal social interaction and student learning. 301 undergraduate medical students participated in this cross-sectional quantitative study. Informal social interaction was assessed using self-reported surveys following the network approach. Students' individual motivation, social integration and prior performance were assessed by the Academic Motivation Scale, the College Adaptation Questionnaire and students' GPA, respectively. A factual knowledge test represented students' learning. All social networks were significantly positively associated with student learning: friendships (β = 0.11), providing information to other students (β = 0.16), receiving information from other students (β = 0.25). Structural equation modelling revealed a model in which social networks increased student learning (r = 0.43), followed by prior performance (r = 0.31). In contrast to prior literature, students' academic motivation and social integration were not associated with students' learning. Students' informal social interaction is strongly associated with students' learning. These findings underline the need to change our focus from the formal context (classroom) to the informal context to optimize student learning and deliver modern medics.

  8. Phylogeny of haemosporidian blood parasites revealed by a multi-gene approach.

    Science.gov (United States)

    Borner, Janus; Pick, Christian; Thiede, Jenny; Kolawole, Olatunji Matthew; Kingsley, Manchang Tanyi; Schulze, Jana; Cottontail, Veronika M; Wellinghausen, Nele; Schmidt-Chanasit, Jonas; Bruchhaus, Iris; Burmester, Thorsten

    2016-01-01

    The apicomplexan order Haemosporida is a clade of unicellular blood parasites that infect a variety of reptilian, avian and mammalian hosts. Among them are the agents of human malaria, parasites of the genus Plasmodium, which pose a major threat to human health. Illuminating the evolutionary history of Haemosporida may help us in understanding their enormous biological diversity, as well as tracing the multiple host switches and associated acquisitions of novel life-history traits. However, the deep-level phylogenetic relationships among major haemosporidian clades have remained enigmatic because the datasets employed in phylogenetic analyses were severely limited in either gene coverage or taxon sampling. Using a PCR-based approach that employs a novel set of primers, we sequenced fragments of 21 nuclear genes from seven haemosporidian parasites of the genera Leucocytozoon, Haemoproteus, Parahaemoproteus, Polychromophilus and Plasmodium. After addition of genomic data from 25 apicomplexan species, the unreduced alignment comprised 20,580 bp from 32 species. Phylogenetic analyses were performed based on nucleotide, codon and amino acid data employing Bayesian inference, maximum likelihood and maximum parsimony. All analyses resulted in highly congruent topologies. We found consistent support for a basal position of Leucocytozoon within Haemosporida. In contrast to all previous studies, we recovered a sister group relationship between the genera Polychromophilus and Plasmodium. Within Plasmodium, the sauropsid and mammal-infecting lineages were recovered as sister clades. Support for these relationships was high in nearly all trees, revealing a novel phylogeny of Haemosporida, which is robust to the choice of the outgroup and the method of tree inference. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. The occurrence of Toxocara species in naturally infected broiler chickens revealed by molecular approaches.

    Science.gov (United States)

    Zibaei, M; Sadjjadi, S M; Maraghi, S

    2017-09-01

    Consuming raw and undercooked meat is known to enhance the risk of human toxocariasis because Toxocara species have a wide range of paratenic hosts, including chickens. The aim of this study was to identify species of Toxocara in naturally infected broiler chickens using molecular approaches. A polymerase chain reaction (PCR) method was used for the differentiation of Toxocara canis and Toxocara cati larvae recovered from tissues and organs, and identified by microscopic observations. Thirty-three 35- to 47-day-old broiler chickens were used for examination of Toxocara larvae. The duodenum, liver, lungs, heart, kidneys, skeletal muscles and brain of each chicken were examined using the pepsin method, and DNA from each tissue was extracted as the template for PCR assay. The findings revealed that 5 of 33 (15.2%) broiler chickens were infected with Toxocara larvae. Larvae were recovered from the liver (n = 19), duodenum (n = 8), skeletal muscles (n = 8) and brain (n = 2) of broiler chickens naturally infected with Toxocara spp. The results showed that the frequencies of the species in the chickens were T. canis larvae (n = 5, 83.3%) and T. cati larvae (n = 1, 16.7%). Our data from the present study demonstrated the importance of broiler chickens as a paratenic host for the parasite's life cycle in the environment. The implementation of DNA amplification as a routine diagnostic technique is a specific and alternative method for identification of Toxocara larvae, and allowed the observation of specific species under field conditions within the locations where broiler chickens are typically raised and exposed to Toxocara spp. eggs or larvae.

  10. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    Science.gov (United States)

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Perturbation approach for nuclear magnetic resonance solid-state quantum computation

    Directory of Open Access Journals (Sweden)

    G. P. Berman

    2003-01-01

    Full Text Available The dynamics of a nuclear-spin quantum computer with a large number (L=1000) of qubits is considered using a perturbation approach. Small parameters are introduced and used to compute the error in an implementation of entanglement between remote qubits, using a sequence of radio-frequency pulses. The error is computed up to the different orders of the perturbation theory and tested using an exact numerical solution.

  12. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  13. Safe manning of merchant ships: an approach and computer tool

    DEFF Research Database (Denmark)

    Alapetite, Alexandre; Kozin, Igor

    2017-01-01

    In the shipping industry, staffing expenses have become a vital competition parameter. In this paper, an approach and a software tool are presented to support decisions on the staffing of merchant ships. The tool is implemented in the form of a Web user interface that makes use of discrete-event simulation and allows estimation of the workload and of whether different scenarios are successfully performed, taking into account the number of crewmembers, watch schedules, distribution of competencies, and others. The software library ‘SimManning’ at the core of the project is provided as open source.

  14. A Novel Approach for ATC Computation in Deregulated Environment

    Directory of Open Access Journals (Sweden)

    C. K. Babulal

    2006-09-01

    Full Text Available This paper presents a novel method for the determination of Available Transfer Capability (ATC) based on fuzzy logic. An Adaptive Neuro-Fuzzy Inference System (ANFIS) is used to determine the step length of the Homotopy continuation power flow method by considering the values of the load bus voltage and the change in load bus voltage. The approach is compared with an already available method. The proposed method determines ATC for various transactions by considering the thermal limit, voltage limit and static voltage stability limit, and is tested on the WSCC 9-bus system, the New England 39-bus system and an Indian 181-bus system.

  15. Classification of methods of production of computer forensic by usage approach of graph theory

    Directory of Open Access Journals (Sweden)

    Anna Ravilyevna Smolina

    2016-06-01

    Full Text Available A classification of methods of production of computer forensic evidence based on a graph-theory approach is proposed. Using this classification, it is possible to accelerate and simplify the search for such methods and to automate the process.

  16. Classification of methods of production of computer forensic by usage approach of graph theory

    OpenAIRE

    Anna Ravilyevna Smolina; Alexander Alexandrovich Shelupanov

    2016-01-01

    A classification of methods of production of computer forensic evidence based on a graph-theory approach is proposed. Using this classification, it is possible to accelerate and simplify the search for such methods and to automate the process.

  17. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Having come to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come ...

  18. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  19. A functional analytic approach to computer-interactive mathematics.

    Science.gov (United States)

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M; Ninness, Sharon K

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed.

  20. Computationally efficient model predictive control algorithms a neural network approach

    CERN Document Server

    Ławryńczuk, Maciej

    2014-01-01

    This book thoroughly discusses computationally efficient (suboptimal) Model Predictive Control (MPC) techniques based on neural models. The subjects treated include:
    · A few types of suboptimal MPC algorithms in which a linear approximation of the model or of the predicted trajectory is successively calculated on-line and used for prediction.
    · Implementation details of the MPC algorithms for feedforward perceptron neural models, neural Hammerstein models, neural Wiener models and state-space neural models.
    · The MPC algorithms based on neural multi-models (inspired by the idea of predictive control).
    · The MPC algorithms with neural approximation with no on-line linearization.
    · The MPC algorithms with guaranteed stability and robustness.
    · Cooperation between the MPC algorithms and set-point optimization.
    Thanks to linearization (or neural approximation), the presented suboptimal algorithms do not require d...

  1. Granular computing and decision-making interactive and iterative approaches

    CERN Document Server

    Chen, Shyi-Ming

    2015-01-01

    This volume is devoted to interactive and iterative processes of decision-making - I2 Fuzzy Decision Making, in brief. Decision-making is inherently interactive. Fuzzy sets help realize human-machine communication in an efficient way by facilitating a two-way interaction in a friendly and transparent manner. Human-centric interaction is of paramount relevance as a leading guiding design principle of decision support systems. The volume provides the reader with updated and in-depth material on the conceptually appealing and practically sound methodology and practice of I2 Fuzzy Decision Making. The book engages a wealth of methods of fuzzy sets and Granular Computing, and brings new concepts, architectures and practice of fuzzy decision-making, providing the reader with various application studies. The book is aimed at a broad audience of researchers and practitioners in numerous disciplines in which decision-making processes play a pivotal role and serve as a vehicle to produce solutions to existing problems.

  2. Strategic Cognitive Sequencing: A Computational Cognitive Neuroscience Approach

    Directory of Open Access Journals (Sweden)

    Seth A. Herd

    2013-01-01

    Full Text Available We address strategic cognitive sequencing, the “outer loop” of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or “self-instruction”). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a “bridging” state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  3. Promises and Pitfalls of Computer-Supported Mindfulness: Exploring a Situated Mobile Approach

    Directory of Open Access Journals (Sweden)

    Ralph Vacca

    2017-12-01

    Full Text Available Computer-supported mindfulness (CSM) is a burgeoning area filled with varied approaches such as mobile apps and EEG headbands. However, many of the approaches focus on providing meditation guidance. The ubiquity of mobile devices may provide new opportunities to support mindfulness practices that are more situated in everyday life. In this paper, a new situated mindfulness approach is explored through a specific mobile app design. Through an experimental design, the approach is compared to traditional audio-based mindfulness meditation, and a mind-wandering control, over a one-week period. The study demonstrates the viability of a situated mobile mindfulness approach to induce mindfulness states. However, phenomenological aspects of the situated mobile approach suggest both promises and pitfalls for computer-supported mindfulness using a situated approach.

  4. Advanced Computational Modeling Approaches for Shock Response Prediction

    Science.gov (United States)

    Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee

    2015-01-01

    Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high frequency transient structural response, typically from a few tens of Hz to several hundreds of kHz. (2) Lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable to predict shock environments over the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.

  5. A zero-dimensional approach to compute real radicals

    Directory of Open Access Journals (Sweden)

    Silke J. Spang

    2008-04-01

    Full Text Available The notion of real radicals is a fundamental tool in Real Algebraic Geometry. It takes the role of the radical ideal in Complex Algebraic Geometry. In this article I shall describe the zero-dimensional approach and the efficiency improvements I found during the work on my diploma thesis at the University of Kaiserslautern (cf. [6]). The main focus of this article is on maximal ideals and the properties they have to fulfil to be real. New theorems and properties about maximal ideals are introduced which yield a heuristic prepare_max that splits the maximal ideals into three classes, namely real, not real, and the class where we cannot be sure whether they are real or not. For the latter we have to apply a coordinate change into general position until we are sure about realness. Finally this constructs a randomized algorithm for real radicals. The underlying theorems and algorithms are described in detail.

  6. A Review of Lightweight Thread Approaches for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Castello, Adrian; Pena, Antonio J.; Seo, Sangmin; Mayo, Rafael; Balaji, Pavan; Quintana-Orti, Enrique S.

    2016-09-12

    High-level, directive-based solutions are becoming the programming models (PMs) of the multi/many-core architectures. Several solutions relying on operating system (OS) threads work perfectly with a moderate number of cores. However, exascale systems will spawn hundreds of thousands of threads in order to exploit their massive parallel architectures, and thus conventional OS threads are too heavy for that purpose. Several lightweight thread (LWT) libraries have recently appeared, offering lighter mechanisms to tackle massive concurrency. In order to examine the suitability of LWTs in high-level runtimes, we develop a set of microbenchmarks consisting of commonly found patterns in current parallel codes. Moreover, we study the semantics offered by some LWT libraries in order to expose the similarities between different LWT application programming interfaces. This study reveals that a reduced set of LWT functions can be sufficient to cover the common parallel code patterns and that those LWT libraries perform better than OS threads-based solutions in cases where task and nested parallelism are becoming more popular with new architectures.

  7. Effects of artificial gravity on the cardiovascular system: Computational approach

    Science.gov (United States)

    Diaz Artiles, Ana; Heldt, Thomas; Young, Laurence R.

    2016-09-01

    steady-state cardiovascular behavior during sustained artificial gravity and exercise. Further validation of the model was performed using experimental data from the combined exercise and artificial gravity experiments conducted on the MIT CRC, and these results will be presented separately in future publications. This unique computational framework can be used to simulate a variety of centrifuge configurations and exercise intensities to improve understanding and inform decisions about future implementation of artificial gravity in space.

  8. Implementation of a Novel Educational Modeling Approach for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sara Ouahabi

    2014-12-01

    Full Text Available The Cloud model is cost-effective because customers pay for their actual usage without upfront costs, and scalable because it can be used more or less depending on the customers’ needs. Due to its advantages, the Cloud has been increasingly adopted in many areas, such as banking, e-commerce, the retail industry, and academia. In education, the Cloud is used to manage the large volume of educational resources produced across many universities. Keeping content interoperable in an inter-university Cloud is not always easy: the diffusion of pedagogical content on the Cloud by different e-learning institutions leads to heterogeneous content, which influences the quality of teaching offered to teachers and learners. For this reason comes the idea of using IMS-LD coupled with metadata in the Cloud. This paper presents the implementation of our previously proposed educational modeling, combining a J2EE application with the Reload editor to model heterogeneous content in the Cloud. The approach we followed focuses on keeping Educational Cloud content interoperable for teachers and learners, and facilitates the identification, reuse, sharing, and adaptation of teaching and learning resources in the Cloud.

  9. A Hybrid Soft Computing Approach for Subset Problems

    Directory of Open Access Journals (Sweden)

    Broderick Crawford

    2013-01-01

    Full Text Available Subset problems (set partitioning, packing, and covering) are formal models for many practical optimization problems. A set partitioning problem determines how the items in one set (S) can be partitioned into smaller subsets. All items in S must be contained in one and only one partition. Related problems are set packing (all items must be contained in zero or one partitions) and set covering (all items must be contained in at least one partition). Here, we present a hybrid solver based on ant colony optimization (ACO) combined with arc consistency for solving this kind of problem. ACO is a swarm intelligence metaheuristic inspired by the foraging behavior of ants. It can solve complex combinatorial problems for which traditional mathematical techniques may fail. In constraint programming, on the other hand, the search space of a Constraint Satisfaction Problem can be dramatically reduced by enforcing arc consistency, either prior to or during search. Our hybrid approach was tested with set covering and set partitioning dataset benchmarks. It was observed that the performance of ACO improved when this filtering technique was embedded in its constructive phase, as sketched below.
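
    A minimal sketch of the flavor of such a hybrid, assuming a standard set covering formulation (a universe U and cost-weighted subsets); this is our own toy ACO with a simple redundancy filter in the constructive phase, not the authors' solver:

        import random

        # Toy set covering instance: cover U with minimum total cost.
        U = set(range(8))
        subsets = [({0, 1, 2}, 2.0), ({2, 3, 4}, 2.0), ({4, 5}, 1.0),
                   ({5, 6, 7}, 2.0), ({0, 3, 6}, 2.5), ({1, 4, 7}, 2.5)]

        def construct(pheromone):
            """Build one cover; filter out columns that would add no new element."""
            uncovered, chosen = set(U), []
            while uncovered:
                # Filtering step: keep only useful (consistent) candidates.
                cand = [j for j, (s, _) in enumerate(subsets) if s & uncovered]
                # Probability ~ pheromone * greedy heuristic (new elements per cost).
                w = [pheromone[j] * len(subsets[j][0] & uncovered) / subsets[j][1]
                     for j in cand]
                j = random.choices(cand, weights=w)[0]
                chosen.append(j)
                uncovered -= subsets[j][0]
            return chosen, sum(subsets[j][1] for j in chosen)

        def aco(iters=200, ants=10, rho=0.1):
            pheromone = [1.0] * len(subsets)
            best, best_cost = None, float('inf')
            for _ in range(iters):
                for _ in range(ants):
                    sol, cost = construct(pheromone)
                    if cost < best_cost:
                        best, best_cost = sol, cost
                # Evaporate, then reinforce the best-so-far solution.
                pheromone = [p * (1 - rho) for p in pheromone]
                for j in best:
                    pheromone[j] += 1.0 / best_cost
            return best, best_cost

        print(aco())   # prints a low-cost cover and its total cost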

  10. Driving profile modeling and recognition based on soft computing approach.

    Science.gov (United States)

    Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya

    2009-04-01

    Advancements in biometrics-based authentication have led to its increasing prominence, and it is being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart cards as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach and could be incorporated into existing vehicle security systems to form a multimodal identification system and offer a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior in order to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely, the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. The performances were compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and a statistical method based on the GMM. Extensive testing was conducted, and the results show great potential in the use of the FNN for real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for use by law enforcement and companies dealing with bus and truck drivers.
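
    A hedged sketch of the GMM side of such a pipeline (the fuzzy neural classifiers are omitted): one Gaussian mixture per driver is fitted to pedal-pressure feature vectors, and an unknown trace is attributed to the driver whose model gives the highest likelihood. The data here are synthetic, and scikit-learn's GaussianMixture stands in for the paper's GMM implementation.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        def fake_trace(mean, n=500):
            """Synthetic [accelerator, brake] pressure features for one driver."""
            return rng.normal(mean, 0.3, size=(n, 2))

        # Enrollment: fit one GMM per known driver.
        profiles = {name: GaussianMixture(n_components=3, random_state=0).fit(fake_trace(mu))
                    for name, mu in {"driver_a": (0.2, 0.8), "driver_b": (0.7, 0.3)}.items()}

        # Identification: pick the enrolled model with the highest average log-likelihood.
        unknown = fake_trace((0.7, 0.3), n=200)
        scores = {name: gmm.score(unknown) for name, gmm in profiles.items()}
        print(max(scores, key=scores.get), scores)   # expected: driver_b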

  11. Fibre recruitment and shape changes of knee ligaments during motion: as revealed by a computer graphics-based model.

    Science.gov (United States)

    Lu, T W; O'Connor, J J

    1996-01-01

    A computer graphics-based model of the knee ligaments in the sagittal plane was developed for the simulation and visualization of the shape changes and fibre recruitment process of the ligaments during motion under unloaded and loaded conditions. The cruciate and collateral ligaments were modelled as ordered arrays of fibres which link attachment areas on the tibia and femur. Fibres slacken and tighten as the ligament attachment areas on the bones rotate and translate relative to each other. A four-bar linkage, composed of the femur, tibia and selected isometric fibres of the two cruciates, was used to determine the motion of the femur relative to the tibia during passive (unloaded) movement. Fibres were assumed to slacken in a Euler buckling mode when the distances between their attachments are less than chosen reference lengths. The ligament shape changes and buckling patterns are demonstrated with computer graphics. When the tibia is translated anteriorly or posteriorly relative to the femur by muscle forces and external loads, some ligament fibres tighten and are recruited progressively to transmit increasing shear forces. The shape changes and fibre recruitment patterns predicted by the model compare well qualitatively with experimental results reported in the literature. The computer graphics approach provides insight into the micro behaviour of the knee ligaments. It may help to explain ligament injury mechanisms and provide useful information to guide the design of ligament replacements.
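
    The slack/taut rule the model uses is simple to state: a fibre carries load only once the distance between its attachments exceeds its reference length. A minimal sketch of that recruitment rule (our illustration, with a made-up linear fibre law and made-up attachment coordinates):

        import numpy as np

        def fibre_forces(tibia_pts, femur_pts, ref_lengths, k=1.0):
            """Tension in each ligament fibre: zero when slack (L <= L0),
            linear in the stretch when taut (L > L0)."""
            lengths = np.linalg.norm(femur_pts - tibia_pts, axis=1)
            stretch = lengths - ref_lengths
            return np.where(stretch > 0.0, k * stretch, 0.0)

        # Three fibres of one ligament; femur attachments lie posterior to
        # the tibial ones, so anterior tibial translation stretches the fibres.
        tibia = np.array([[0.0, 0.0], [5.0, 0.0], [10.0, 0.0]])
        femur = np.array([[-3.0, 30.0], [2.0, 30.0], [7.0, 30.0]])
        L0 = np.array([30.1, 30.3, 29.9])

        tibia_shifted = tibia + np.array([2.0, 0.0])   # 2 mm anterior translation
        print(fibre_forces(tibia, femur, L0))          # fibre 2 is slack at rest
        print(fibre_forces(tibia_shifted, femur, L0))  # fibre 2 is recruited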

  12. An Augmented Incomplete Factorization Approach for Computing the Schur Complement in Stochastic Optimization

    KAUST Repository

    Petra, Cosmin G.; Schenk, Olaf; Lubin, Miles; Gärtner, Klaus

    2014-01-01

    We present a scalable approach and implementation for solving stochastic optimization problems on high-performance computers. In this work we revisit the sparse linear algebra computations of the parallel solver PIPS with the goal of improving the shared-memory performance and decreasing the time to solution. These computations consist of solving sparse linear systems with multiple sparse right-hand sides and are needed in our Schur-complement decomposition approach to compute the contribution of each scenario to the Schur matrix. Our novel approach uses an incomplete augmented factorization implemented within the PARDISO linear solver and an outer BiCGStab iteration to efficiently absorb pivot perturbations occurring during factorization. This approach is capable of both efficiently using the cores inside a computational node and exploiting sparsity of the right-hand sides. We report on the performance of the approach on high-performance computers when solving stochastic unit commitment problems of unprecedented size (billions of variables and constraints) that arise in the optimization and control of electrical power grids. Our numerical experiments suggest that supercomputers can be efficiently used to solve power grid stochastic optimization problems with thousands of scenarios under the strict "real-time" requirements of power grid operators. To our knowledge, this has not been possible prior to the present work. © 2014 Society for Industrial and Applied Mathematics.
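
    The core linear algebra step is easy to sketch: each scenario contributes a block B_i^T A_i^{-1} B_i to the Schur matrix, and the point is to do the A_i^{-1} B_i solves with one factorization and many right-hand sides. A dense toy version under the standard bordered block structure (our sketch with random data, not PIPS itself):

        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        rng = np.random.default_rng(1)

        def schur_contribution(A_i, B_i):
            """Contribution B_i^T A_i^{-1} B_i of one scenario to the Schur matrix.
            A_i is factorized once; all columns of B_i are solved against it."""
            lu = lu_factor(A_i)                 # one factorization per scenario
            X = lu_solve(lu, B_i)               # multiple right-hand sides at once
            return B_i.T @ X

        n, m, scenarios = 50, 5, 4
        S = np.eye(m) * scenarios * 10.0        # toy first-stage block
        for _ in range(scenarios):
            A_i = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned block
            B_i = rng.standard_normal((n, m))
            S -= schur_contribution(A_i, B_i)   # assemble the Schur complement
        print(S.shape)   # (5, 5): the small dense first-stage system left to solve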

  13. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D), all using the same conceptual model.

  14. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    Science.gov (United States)

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  15. Energy-aware memory management for embedded multimedia systems a computer-aided design approach

    CERN Document Server

    Balasa, Florin

    2011-01-01

    Energy-Aware Memory Management for Embedded Multimedia Systems: A Computer-Aided Design Approach presents recent computer-aided design (CAD) ideas that address memory management tasks, particularly the optimization of energy consumption in the memory subsystem. It explains how to efficiently implement CAD solutions, including theoretical methods and novel algorithms. The book covers various energy-aware design techniques, including data-dependence analysis techniques, memory size estimation methods, extensions of mapping approaches, and memory banking approaches. It shows how these techniques

  16. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    Science.gov (United States)

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward delivery and reward anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but showed a negative trend when tested after 24 h. In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
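
    The reward computations invoked here are standard reinforcement-learning quantities. As a hedged sketch (ours, not the authors' full model), a delta-rule learner exposes the trial-by-trial prediction errors and expected values that the paper relates to memory encoding:

        import numpy as np

        rng = np.random.default_rng(2)

        def delta_rule(rewards, alpha=0.2):
            """Track the expected value V and prediction error PE on each trial."""
            V, values, errors = 0.0, [], []
            for r in rewards:
                values.append(V)          # expected value at encoding time
                pe = r - V                # reward prediction error
                errors.append(pe)
                V += alpha * pe           # delta-rule update
            return np.array(values), np.array(errors)

        rewards = rng.binomial(1, 0.7, size=20)   # probabilistic reward schedule
        V, PE = delta_rule(rewards)
        # e.g. regress later memory scores on |PE| and V, trial by trial
        print(np.round(V, 2))
        print(np.round(PE, 2))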

  17. A systems level approach reveals new gene regulatory modules in the developing ear

    OpenAIRE

    Chen, Jingchen; Tambalo, Monica; Barembaum, Meyer; Ranganathan, Ramya; Simões-Costa, Marcos; Bronner, Marianne E.; Streit, Andrea

    2017-01-01

    The inner ear is a complex vertebrate sense organ, yet it arises from a simple epithelium, the otic placode. Specification towards otic fate requires diverse signals and transcriptional inputs that act sequentially and/or in parallel. Using the chick embryo, we uncover novel genes in the gene regulatory network underlying otic commitment and reveal dynamic changes in gene expression. Functional analysis of selected transcription factors reveals the genetic hierarchy underlying the transition ...

  18. Data fusion in X-ray computed tomography using a superiorization approach.

    Science.gov (United States)

    Schrapp, Michael J; Herman, Gabor T

    2014-05-01

    X-ray computed tomography (CT) is an important and widespread inspection technique in industrial non-destructive testing. However, large-sized and heavily absorbing objects cause artifacts due to either the lack of penetration of the specimen in specific directions or by having data from only a limited angular range of views. In such cases, valuable information about the specimen is not revealed by the CT measurements alone. Further imaging modalities, such as optical scanning and ultrasonic testing, are able to provide data (such as an edge map) that are complementary to the CT acquisition. In this paper, a superiorization approach (a newly developed method for constrained optimization) is used to incorporate the complementary data into the CT reconstruction; this allows precise localization of edges that are not resolvable from the CT data by itself. Superiorization, as presented in this paper, exploits the fact that the simultaneous algebraic reconstruction technique (SART), often used for CT reconstruction, is resilient to perturbations; i.e., it can be modified to produce an output that is as consistent with the CT measurements as the output of unmodified SART, but is more consistent with the complementary data. The application of this superiorized SART method to measured data of a turbine blade demonstrates a clear improvement in the quality of the reconstructed image.
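
    A schematic of the superiorization idea under simplifying assumptions (ours, not the paper's implementation): between iterations of a basic algebraic reconstruction step, the iterate is nudged by a small, shrinking perturbation that reduces a secondary criterion, here distance to a prior built from complementary data. Because the perturbations are summable, the drive toward data consistency is preserved.

        import numpy as np

        def art_step(x, A, b, lam=0.1):
            """One simultaneous ART/SART-like step toward consistency Ax = b."""
            return x + lam * A.T @ (b - A @ x) / np.linalg.norm(A, ord=2) ** 2

        def superiorized(A, b, prior, iters=200, beta0=1.0):
            """ART steps interleaved with shrinking steps toward a secondary target."""
            x = np.zeros(A.shape[1])
            for k in range(iters):
                g = x - prior                       # gradient of 0.5 * ||x - prior||^2
                norm_g = np.linalg.norm(g)
                if norm_g > 0:
                    x = x - beta0 * 0.5 ** k * g / norm_g   # summable perturbations
                x = art_step(x, A, b)               # perturbation-resilient base step
            return x

        rng = np.random.default_rng(3)
        A = rng.standard_normal((30, 10))           # toy projection matrix
        x_true = np.repeat([0.0, 1.0], 5)           # piecewise-constant "object"
        b = A @ x_true
        prior = x_true + rng.normal(0, 0.05, 10)    # complementary (edge-like) data
        print(np.round(superiorized(A, b, prior), 2))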

  19. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms

    Science.gov (United States)

    Widdows, Kate L.; Panitchob, Nuttanont; Crocker, Ian P.; Please, Colin P.; Hanson, Mark A.; Sibley, Colin P.; Johnstone, Edward D.; Sengers, Bram G.; Lewis, Rohan M.; Glazier, Jocelyn D.

    2015-01-01

    Uptake of system L amino acid substrates into isolated placental plasma membrane vesicles in the absence of opposing side amino acid (zero-trans uptake) is incompatible with the concept of obligatory exchange, where influx of amino acid is coupled to efflux. We therefore hypothesized that system L amino acid exchange transporters are not fully obligatory and/or that amino acids are initially present inside the vesicles. To address this, we combined computational modeling with vesicle transport assays and transporter localization studies to investigate the mechanisms mediating [14C]l-serine (a system L substrate) transport into human placental microvillous plasma membrane (MVM) vesicles. The carrier model provided a quantitative framework to test the 2 hypotheses that l-serine transport occurs by either obligate exchange or nonobligate exchange coupled with facilitated transport (mixed transport model). The computational model could only account for experimental [14C]l-serine uptake data when the transporter was not exclusively in exchange mode, best described by the mixed transport model. MVM vesicle isolates contained endogenous amino acids allowing for potential contribution to zero-trans uptake. Both L-type amino acid transporter (LAT)1 and LAT2 subtypes of system L were distributed to MVM, with l-serine transport attributed to LAT2. These findings suggest that exchange transporters do not function exclusively as obligate exchangers.—Widdows, K. L., Panitchob, N., Crocker, I. P., Please, C. P., Hanson, M. A., Sibley, C. P., Johnstone, E. D., Sengers, B. G., Lewis, R. M., Glazier, J. D. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms. PMID:25761365

  20. ROMANIA’S SPECIALIZATION IN TRADE TOWARDS EU-27 - A REVEALED COMPARATIVE ADVANTAGE APPROACH

    Directory of Open Access Journals (Sweden)

    Popa Angela Cristina

    2012-07-01

    Full Text Available "International competitiveness" is a complex topic which raised over time many questions and theories on key factors that underpin it and is still subject to wide debate. Such analysis proves to be necessary under the new requirements raised by the participation of Romanian organizations in the European and global competitive environment in which competiting for new markets can be a platform of economic recovery. As companies compete for markets and resources, national economies compete with each other to achieve performance in a specific activity: for example, we can say that Romania has become less competitive in clothing production, and competitive in cars production. But it makes sense to say that "Romania has become more or less competitive as the economy?". The answer is no."Competitiveness" is a meaningless word when referring to national economies. Deniying Romanian competitiveness in a particular industry does not mean that Romania's economy is less competitive. The decline in these industries may be a manifestation of their change in production factors endowment or necessary reallocation these factors from old activities with comparative advantage to new ones. This paper aims to examine the structural competitiveness of Romania vis-a-vis EU-27. Empirical analysis is based on Revealed Comparative Advantage (RCA, an indicator often used in international trade analysis. Section II reviews the empirical literature on the comparative advantage and the competitiveness of Romania, highlighting various theories and approaches, alternative measures of RCA indices are presented in the section III, section IV reports empirical results and the final section draws some conclusions based on the findings. In 2009, in terms of orientation of the foreign investors towards the economic sectors, according to NACE Rev. 2 Classification, the direct foreign investments were directed mainly to Manufactured goods (31,1% of total, within its best represented

  1. Synchrotron-radiation-based X-ray micro-computed tomography reveals dental bur debris under dental composite restorations.

    Science.gov (United States)

    Hedayat, Assem; Nagy, Nicole; Packota, Garnet; Monteith, Judy; Allen, Darcy; Wysokinski, Tomasz; Zhu, Ning

    2016-05-01

    Dental burs are used extensively in dentistry to mechanically prepare tooth structures for restorations (fillings), yet little has been reported on the bur debris left behind in the teeth, and whether it poses potential health risks to patients. Here we aim to image dental bur debris under dental fillings, and to point out the potential health hazards that can be caused by this debris when left in direct contact with the biological surroundings, specifically when the debris is made of a non-biocompatible material. Non-destructive micro-computed tomography using the BioMedical Imaging & Therapy facility 05ID-2 beamline at the Canadian Light Source was pursued at 50 keV and at a pixel size of 4 µm to image dental bur fragments under a composite resin dental filling. The bur's cutting edges that produced the fragment were also chemically analyzed. The technique revealed dental bur fragments of different sizes in different locations on the floor of the prepared surface of the teeth and under the filling, which places them in direct contact with the dentinal tubules and the dentinal fluid circulating within them. Dispersive X-ray spectroscopy elemental analysis of the dental bur edges revealed that the fragments are made of tungsten carbide-cobalt, which is bio-incompatible.

  2. Mechanical influences on morphogenesis of the knee joint revealed through morphological, molecular and computational analysis of immobilised embryos.

    Directory of Open Access Journals (Sweden)

    Karen A Roddy

    2011-02-01

    Full Text Available Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint.

  3. Mechanical Influences on Morphogenesis of the Knee Joint Revealed through Morphological, Molecular and Computational Analysis of Immobilised Embryos

    Science.gov (United States)

    Roddy, Karen A.; Prendergast, Patrick J.; Murphy, Paula

    2011-01-01

    Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint. PMID:21386908

  4. Chest computed tomography of a patient revealing severe hypoxia due to amniotic fluid embolism: a case report

    Directory of Open Access Journals (Sweden)

    Inui Daisuke

    2010-02-01

    Full Text Available Introduction: Amniotic fluid embolism is one of the most severe complications in the peripartum period. Because its onset is abrupt and fulminant, it is unlikely that there will be time to examine the condition using thoracic computed tomography (CT). We report a case of life-threatening amniotic fluid embolism where chest CT in the acute phase was obtained. Case presentation: A 22-year-old Asian Japanese primiparous woman was suspected of having an amniotic fluid embolism. After a Cesarean section for cephalopelvic disproportion, her respiratory condition deteriorated. Her chest CT images were examined. CT findings revealed diffuse homogeneous ground-glass shadow in her bilateral peripheral lung fields. She was therefore transferred to our hospital. On admission to our hospital's intensive care unit, she was found to have severe hypoxemia, with SpO2 of 50% on a reservoir mask with 15 L/min oxygen. She was intubated with the support of noninvasive positive pressure ventilation. She was successfully extubated on the sixth day, and discharged from the hospital on the twentieth day. Conclusion: This is the first case report describing amniotic fluid embolism in which CT revealed an acute respiratory distress syndrome-like shadow.

  5. Scientific and methodic approaches to reveal stability essence at the industrial enterprises and its functional components

    Directory of Open Access Journals (Sweden)

    Lyulyov Oleksii Valentynovych

    2016-12-01

    Full Text Available The article provides a theoretical analysis of the scientific approaches to defining the concept of "stability" that exist in the scientific literature. Five different approaches to interpreting the concept of "enterprise stability", understood as an open economic system, are distinguished. On this basis, the author's own definition of enterprise stability is formed. Based on an analysis of the main tendencies in the changes of industry development factors for 2006-2015 and of future expectations, the main functional constituents of enterprise stability are identified. The author suggests using an approach based on self-organizing artificial neural networks to evaluate the degree of stability of industrial enterprises.

  6. Cultural Distance-Aware Service Recommendation Approach in Mobile Edge Computing

    Directory of Open Access Journals (Sweden)

    Yan Li

    2018-01-01

    Full Text Available In the era of big data, traditional computing systems and paradigms are not efficient and are even difficult to use. For high-performance big data processing, mobile edge computing is emerging as a complementary framework to cloud computing. In this new computing architecture, services are provided within close proximity of mobile users by servers at the edge of the network. The traditional collaborative filtering recommendation approach only focuses on the similarity extracted from the rating data, which may lead to an inaccurate expression of user preference. In this paper, we propose a cultural distance-aware service recommendation approach which focuses not only on the similarity but also on the local characteristics and preferences of users. Our approach employs the cultural distance to express the user preference and combines it with similarity to predict the user ratings and recommend the services with higher predicted ratings. In addition, considering the extreme sparsity of the rating data, missing rating prediction based on collaborative filtering is introduced in our approach. The experimental results based on real-world datasets show that our approach outperforms the traditional recommendation approaches in terms of the reliability of recommendation.

  7. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y=η(θ)+ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
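
    A compressed sketch of the calibration loop under stated assumptions (a one-dimensional parameter, a toy stand-in for the physics model, and scikit-learn's Gaussian process standing in for the paper's response surface); the point is that the MCMC evaluates the cheap emulator, never η itself:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(4)

        eta = lambda theta: np.sin(3 * theta) + theta      # stand-in "physics model"
        theta_true, sigma = 0.6, 0.05
        y_obs = eta(theta_true) + rng.normal(0, sigma)

        # Ensemble of expensive model runs -> emulator (response surface).
        design = np.linspace(0, 1, 12).reshape(-1, 1)
        gp = GaussianProcessRegressor().fit(design, eta(design.ravel()))

        def log_post(theta):
            if not 0.0 <= theta <= 1.0:                    # uniform prior on [0, 1]
                return -np.inf
            mu = gp.predict(np.array([[theta]]))[0]        # emulator, not eta
            return -0.5 * ((y_obs - mu) / sigma) ** 2

        # Random-walk Metropolis on the emulated posterior.
        theta, lp, chain = 0.5, log_post(0.5), []
        for _ in range(5000):
            prop = theta + rng.normal(0, 0.1)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta)
        print(np.mean(chain[1000:]), "~", theta_true)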

  8. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis (CAD) approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research in algorithmic and computer-based approaches to medical imaging applications and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer-Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer-Aided Diagnosis Techniques.

  9. Letting the ‘cat’ out of the bag: pouch young development of the extinct Tasmanian tiger revealed by X-ray computed tomography

    Science.gov (United States)

    Spoutil, Frantisek; Prochazka, Jan; Black, Jay R.; Medlock, Kathryn; Paddle, Robert N.; Knitlova, Marketa; Hipsley, Christy A.

    2018-01-01

    The Tasmanian tiger or thylacine (Thylacinus cynocephalus) was an iconic Australian marsupial predator that was hunted to extinction in the early 1900s. Despite sharing striking similarities with canids, they failed to evolve many of the specialized anatomical features that characterize carnivorous placental mammals. These evolutionary limitations are thought to arise from functional constraints associated with the marsupial mode of reproduction, in which otherwise highly altricial young use their well-developed forelimbs to climb to the pouch and mouth to suckle. Here we present the first three-dimensional digital developmental series of the thylacine throughout its pouch life using X-ray computed tomography on all known ethanol-preserved specimens. Based on detailed skeletal measurements, we refine the species growth curve to improve age estimates for the individuals. Comparison of allometric growth trends in the appendicular skeleton (fore- and hindlimbs) with that of other placental and marsupial mammals revealed that despite their unique adult morphologies, thylacines retained a generalized early marsupial ontogeny. Our approach also revealed mislabelled specimens that possessed large epipubic bones (vestigial in thylacine) and differing vertebral numbers. All of our generated CT models are publicly available, preserving their developmental morphology and providing a novel digital resource for future studies of this unique marsupial. PMID:29515893

  10. Computer based virtual reality approach towards its application in an accidental emergency at nuclear power plant

    International Nuclear Information System (INIS)

    Yan Jun; Yao Qingshan

    1999-01-01

    Virtual reality is a computer-based system for creating and experiencing virtual worlds. As an emerging branch of the computer discipline, this approach is expanding rapidly and is widely used in a variety of industries such as national defence, research, engineering, medicine and air navigation. The author presents the fundamentals of virtual reality, in an attempt to study some aspects of interest for its use in emergency planning for accidents at nuclear power plants.

  11. A Representational Approach to Knowledge and Multiple Skill Levels for Broad Classes of Computer Generated Forces

    Science.gov (United States)

    1997-12-01

    The approach, proposed by Van Veldhuizen and Hutson (37), extends the general architecture to support both a domain-independent approach to implementing CGFs and …

  12. D-Wave's Approach to Quantum Computing: 1000-qubits and Counting!

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    In this talk I will describe D-Wave's approach to quantum computing, including the system architecture of our 1000-qubit D-Wave 2X, its programming model, and performance benchmarks. Furthermore, I will describe how the native optimization and sampling capabilities of the quantum processor can be exploited to tackle problems in a variety of fields including medicine, machine learning, physics, and computational finance.
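
    The native problem form behind that optimization capability is the QUBO: minimize x^T Q x over binary vectors x. A tiny brute-force illustration of the form (no D-Wave hardware or SDK involved; on an annealer one would submit Q instead of enumerating):

        import itertools
        import numpy as np

        # QUBO for a 3-variable toy problem: minimize x^T Q x over x in {0, 1}^3.
        # Diagonal entries reward choosing a variable; off-diagonal entries
        # penalize choosing adjacent variables together (an independent-set flavor).
        Q = np.array([[-1.0,  2.0,  0.0],
                      [ 0.0, -1.0,  2.0],
                      [ 0.0,  0.0, -1.0]])

        energy = lambda x: np.array(x) @ Q @ np.array(x)
        best = min(itertools.product([0, 1], repeat=3), key=energy)
        print(best, energy(best))   # (1, 0, 1) with energy -2.0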

  13. Mathematics of shape description a morphological approach to image processing and computer graphics

    CERN Document Server

    Ghosh, Pijush K

    2009-01-01

    Image processing problems are often not well defined because real images are contaminated with noise and other uncertain factors. In Mathematics of Shape Description, the authors take a mathematical approach to address these problems, using the morphological and set-theoretic approach to image processing and computer graphics and presenting a simple shape model based on two basic shape operators called Minkowski addition and decomposition. This book is ideal for professional researchers and engineers in information processing, image measurement, shape description, shape representation and computer graphics. Postgraduate and advanced undergraduate students in pure and applied mathematics, computer science, robotics and engineering will also benefit from this book. Key features: explains the fundamental and advanced relationships between algebraic systems and shape description through the set-theoretic approach; promotes the interaction of image processing, geometry and mathematics in the field of algebraic geometry; …
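
    Minkowski addition, the first of the two shape operators, has a one-line definition: A ⊕ B = {a + b : a in A, b in B}. A direct sketch on finite point sets (a hedged illustration only; morphological implementations typically operate on images or polygons):

        import numpy as np

        def minkowski_sum(A, B):
            """A (+) B = {a + b : a in A, b in B} for finite point sets."""
            sums = A[:, None, :] + B[None, :, :]           # all pairwise sums
            return np.unique(sums.reshape(-1, A.shape[1]), axis=0)

        square = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
        cross = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]])
        print(minkowski_sum(square, cross))   # the square dilated by the cross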

  14. A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry

    Science.gov (United States)

    Forster, J.; Entrup, B.

    2017-10-01

    In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and of four specific complaint classes in correspondence documents between insurance clients and an insurance company. A cognitive computing approach combines classical natural language processing methods, machine learning algorithms and the evaluation of hypotheses. Our approach combines a MaxEnt machine learning algorithm with language modelling, tf-idf and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1 score of 0.9, a reliable text classification component has been implemented and evaluated. A final outlook towards a cognitive computing insurance assistant is given at the end.
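
    The modelling core is conventional and easy to sketch: tf-idf features feeding a maximum-entropy (logistic regression) classifier, wrapped for multi-label output. A scikit-learn stand-in with made-up toy documents (not the insurer's data, the German corpus, or the authors' exact pipeline):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import MultiLabelBinarizer

        docs = ["the claim payout is late again",
                "your advisor was unfriendly on the phone",
                "premium increased without any notice",
                "thank you for the quick settlement"]
        labels = [{"dissatisfaction", "delay"},
                  {"dissatisfaction", "service"},
                  {"dissatisfaction", "pricing"},
                  set()]

        mlb = MultiLabelBinarizer()
        Y = mlb.fit_transform(labels)

        # MaxEnt = (multinomial) logistic regression; one binary model per label.
        clf = make_pipeline(TfidfVectorizer(),
                            OneVsRestClassifier(LogisticRegression(max_iter=1000)))
        clf.fit(docs, Y)
        pred = clf.predict(["settlement is late and nobody answers the phone"])
        print(mlb.inverse_transform(pred))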

  15. Anatomical and computed tomographic analysis of the transcochlear and endoscopic transclival approaches to the petroclival region.

    Science.gov (United States)

    Mason, Eric; Van Rompaey, Jason; Carrau, Ricardo; Panizza, Benedict; Solares, C Arturo

    2014-03-01

    Advances in the field of skull base surgery aim to maximize anatomical exposure while minimizing patient morbidity. The petroclival region of the skull base presents numerous challenges for surgical access due to the complex anatomy. The transcochlear approach to the region provides adequate access; however, the resection involved sacrifices hearing and results in at least a grade 3 facial palsy. An endoscopic endonasal approach could potentially avoid negative patient outcomes while providing a desirable surgical window in a select patient population. Cadaveric study. Endoscopic access to the petroclival region was achieved through an endonasal approach. For comparison, a transcochlear approach to the clivus was performed. Different facets of the dissections, such as bone removal volume and exposed surface area, were computed using computed tomography analysis. The endoscopic endonasal approach provided a sufficient corridor to the petroclival region with significantly less bone removal and nearly equivalent exposure of the surgical target, thus facilitating the identification of the relevant anatomy. The lateral approach allowed for better exposure from a posterolateral direction until the inferior petrosal sinus; however, the endonasal approach avoided labyrinthine/cochlear destruction and facial nerve manipulation while providing an anteromedial viewpoint. The endonasal approach also avoided external incisions and cosmetic deficits. The endonasal approach required significant sinonasal resection. Endoscopic access to the petroclival region is a feasible approach. It potentially avoids hearing loss, facial nerve manipulation, and cosmetic damage. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  16. Strong Plasmon-Phonon Splitting and Hybridization in 2D Materials Revealed through a Self-Energy Approach

    DEFF Research Database (Denmark)

    Settnes, Mikkel; Saavedra, J. R. M.; Thygesen, Kristian Sommer

    2017-01-01

    splitting due to this coupling, resulting in a characteristic avoided crossing scheme. We base our results on a computationally efficient approach that includes many-body interactions through the electron self-energy. We specify this formalism for a description of plasmons based upon a tight...... nanotriangles with varied size, where we predict remarkable peak splittings and other radical modifications in the spectra due to plasmon interactions with intrinsic optical phonons. Our method is equally applicable to other 2D materials and provides a simple approach for investigating coupling of plasmons

  17. A Social Network Approach to Provisioning and Management of Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning

    2011-01-01

    This paper proposes a social network approach to the provisioning and management of cloud computing services termed Opportunistic Cloud Computing Services (OCCS), for enterprises; and presents the research issues that need to be addressed for its implementation. We hypothesise that OCCS...... will facilitate the adoption process of cloud computing services by enterprises. OCCS deals with the concept of enterprises taking advantage of cloud computing services to meet their business needs without having to pay or paying a minimal fee for the services. The OCCS network will be modelled and implemented...... as a social network of enterprises collaborating strategically for the provisioning and consumption of cloud computing services without entering into any business agreements. We conclude that it is possible to configure current cloud service technologies and management tools for OCCS but there is a need...

  18. Medium-term generation programming in competitive environments: a new optimisation approach for market equilibrium computing

    International Nuclear Information System (INIS)

    Barquin, J.; Centeno, E.; Reneses, J.

    2004-01-01

    The paper proposes a model to represent medium-term hydro-thermal operation of electrical power systems in deregulated frameworks. The model objective is to compute the oligopolistic market equilibrium point in which each utility maximises its profit, based on other firms' behaviour. This problem is not an optimisation one. The main contribution of the paper is to demonstrate that, nevertheless, under some reasonable assumptions, it can be formulated as an equivalent minimisation problem. A computer program has been coded by using the proposed approach. It is used to compute the market equilibrium of a real-size system. (author)

  19. A Crisis Management Approach To Mission Survivability In Computational Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Aleksander Byrski

    2010-01-01

    Full Text Available In this paper we present a biologically-inspired approach for mission survivability (considered as the capability of fulfilling a task such as computation) that allows the system to be aware of the possible threats or crises that may arise. This approach uses the notion of resources used by living organisms to control their populations. We present the concept of energetic selection in agent-based evolutionary systems as well as the means to manipulate the configuration of the computation according to the crises or user’s specific demands.

  20. On the sighting of unicorns: A variational approach to computing invariant sets in dynamical systems

    Science.gov (United States)

    Junge, Oliver; Kevrekidis, Ioannis G.

    2017-06-01

    We propose to compute approximations to invariant sets in dynamical systems by minimizing an appropriate distance between a suitably selected finite set of points and its image under the dynamics. We demonstrate, through computational experiments, that this approach can successfully converge to approximations of (maximal) invariant sets of arbitrary topology, dimension, and stability, such as, e.g., saddle type invariant sets with complicated dynamics. We further propose to extend this approach by adding a Lennard-Jones type potential term to the objective function, which yields more evenly distributed approximating finite point sets, and illustrate the procedure through corresponding numerical experiments.
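
    A small sketch of the variational idea under simplifying assumptions (the Hénon map, a plain quasi-Newton optimizer, and our own nearest-point distance; the Lennard-Jones spreading term is omitted): minimize, over a finite point set X, the distance between f(X) and X.

        import numpy as np
        from scipy.optimize import minimize

        def henon(p, a=1.4, b=0.3):
            x, y = p[:, 0], p[:, 1]
            return np.column_stack([1 - a * x**2 + y, b * x])

        def objective(flat, n=30):
            """Sum over images f(x_i) of the squared distance to the nearest x_j."""
            X = flat.reshape(n, 2)
            F = henon(X)
            d2 = ((F[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise distances
            return d2.min(axis=1).sum()

        rng = np.random.default_rng(5)
        X0 = rng.uniform(-1, 1, size=(30, 2))
        res = minimize(objective, X0.ravel(), method="L-BFGS-B")
        print(res.fun)   # near zero: the point set is (almost) invariant under f

    Without a spreading term the points may all collapse onto a fixed point, itself a (trivial) invariant set, which is exactly why the authors add the Lennard-Jones potential to keep the approximating set evenly distributed.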

  1. Theoretical triangulation as an approach for revealing the complexity of a classroom discussion

    NARCIS (Netherlands)

    van Drie, J.; Dekker, R.

    2013-01-01

    In this paper we explore the value of theoretical triangulation as a methodological approach for the analysis of classroom interaction. We analyze an excerpt of a whole-class discussion in history from three theoretical perspectives: interactivity of the discourse, conceptual level raising and

  2. A Computer-Aided FPS-Oriented Approach for Construction Briefing

    Institute of Scientific and Technical Information of China (English)

    Xiaochun Luo; Qiping Shen

    2008-01-01

    Function performance specification (FPS) is one of the value management (VM) techniques developed for the explicit statement of optimum product definition. This technique is widely used in software engineering and the manufacturing industry, and has proved successful in product-defining tasks. This paper describes an FPS-oriented approach for construction briefing, which is critical to the successful delivery of construction projects. Three techniques, i.e., the function analysis system technique, shared space, and a computer-aided toolkit, are incorporated into the proposed approach. A computer-aided toolkit is developed to facilitate the implementation of FPS in the briefing processes. This approach can facilitate systematic and efficient identification, clarification, and representation of client requirements in trial runs. The limitations of the approach and future research work are also discussed at the end of the paper.

  3. Linking Individual Learning Styles to Approach-Avoidance Motivational Traits and Computational Aspects of Reinforcement Learning.

    Directory of Open Access Journals (Sweden)

    Kristoffer Carl Aberg

    Full Text Available Learning how to gain rewards (approach learning) and avoid punishments (avoidance learning) is fundamental for everyday life. While individual differences in approach and avoidance learning styles have been related to genetics and aging, the contribution of personality factors, such as traits, remains undetermined. Moreover, little is known about the computational mechanisms mediating differences in learning styles. Here, we used a probabilistic selection task with positive and negative feedbacks, in combination with computational modelling, to show that individuals displaying better approach (vs. avoidance) learning scored higher on measures of approach (vs. avoidance) trait motivation, but, paradoxically, also displayed reduced learning speed following positive (vs. negative) outcomes. These data suggest that learning different types of information depend on associated reward values and internal motivational drives, possibly determined by personality traits.

  4. Linking Individual Learning Styles to Approach-Avoidance Motivational Traits and Computational Aspects of Reinforcement Learning

    Science.gov (United States)

    Carl Aberg, Kristoffer; Doell, Kimberly C.; Schwartz, Sophie

    2016-01-01

    Learning how to gain rewards (approach learning) and avoid punishments (avoidance learning) is fundamental for everyday life. While individual differences in approach and avoidance learning styles have been related to genetics and aging, the contribution of personality factors, such as traits, remains undetermined. Moreover, little is known about the computational mechanisms mediating differences in learning styles. Here, we used a probabilistic selection task with positive and negative feedbacks, in combination with computational modelling, to show that individuals displaying better approach (vs. avoidance) learning scored higher on measures of approach (vs. avoidance) trait motivation, but, paradoxically, also displayed reduced learning speed following positive (vs. negative) outcomes. These data suggest that learning different types of information depend on associated reward values and internal motivational drives, possibly determined by personality traits. PMID:27851807

  5. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2012-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao’s garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce...... a number of novel techniques for relating the outputs and inputs of OTs in a larger construction....

  6. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  7. Cross-species transcriptomic approach reveals genes in hamster implantation sites.

    Science.gov (United States)

    Lei, Wei; Herington, Jennifer; Galindo, Cristi L; Ding, Tianbing; Brown, Naoko; Reese, Jeff; Paria, Bibhash C

    2014-12-01

    The mouse model has greatly contributed to understanding molecular mechanisms involved in the regulation of progesterone (P4) plus estrogen (E)-dependent blastocyst implantation process. However, little is known about contributory molecular mechanisms of the P4-only-dependent blastocyst implantation process that occurs in species such as hamsters, guinea pigs, rabbits, pigs, rhesus monkeys, and perhaps humans. We used the hamster as a model of P4-only-dependent blastocyst implantation and carried out cross-species microarray (CSM) analyses to reveal differentially expressed genes at the blastocyst implantation site (BIS), in order to advance the understanding of molecular mechanisms of implantation. Upregulation of 112 genes and downregulation of 77 genes at the BIS were identified using a mouse microarray platform, while use of the human microarray revealed 62 up- and 38 down-regulated genes at the BIS. Excitingly, a sizable number of genes (30 up- and 11 down-regulated genes) were identified as a shared pool by both CSMs. Real-time RT-PCR and in situ hybridization validated the expression patterns of several up- and down-regulated genes identified by both CSMs at the hamster and mouse BIS to demonstrate the merit of CSM findings across species, in addition to revealing genes specific to hamsters. Functional annotation analysis found that genes involved in the spliceosome, proteasome, and ubiquitination pathways are enriched at the hamster BIS, while genes associated with tight junction, SAPK/JNK signaling, and PPARα/RXRα signaling are repressed at the BIS. Overall, this study provides a pool of genes and evidence of their participation in up- and down-regulated cellular functions/pathways at the hamster BIS. © 2014 Society for Reproduction and Fertility.

  8. A trait-based approach reveals the feeding selectivity of a small endangered Mediterranean fish

    OpenAIRE

    Rodriguez-Lozano, Pablo; Verkaik, Iraima; Maceda Veiga, Alberto; Monroy, Mario; de Sostoa, Adolf; Rieradevall, Maria; Prat, Narcis

    2016-01-01

    Functional traits are growing in popularity in modern ecology, but feeding studies remain primarily rooted in a taxonomic-based perspective. However, consumers do not have any reason to select their prey using a taxonomic criterion, and prey assemblages are variable in space and time, which makes taxon-based studies assemblage-specific. To illustrate the benefits of the trait-based approach to assessing food choice, we studied the feeding ecology of the endangered freshwater fish Bar...

  9. Changes in bone macro- and microstructure in diabetic obese mice revealed by high resolution microfocus X-ray computed tomography

    Science.gov (United States)

    Kerckhofs, G.; Durand, M.; Vangoitsenhoven, R.; Marin, C.; van der Schueren, B.; Carmeliet, G.; Luyten, F. P.; Geris, L.; Vandamme, K.

    2016-10-01

    High resolution microfocus X-ray computed tomography (HR-microCT) was employed to characterize the structural alterations of the cortical and trabecular bone in a mouse model of obesity-driven type 2 diabetes (T2DM). C57Bl/6J mice were randomly assigned for 14 weeks to either a control diet-fed (CTRL) or a high fat diet (HFD)-fed group developing obesity, hyperglycaemia and insulin resistance. The HFD group showed an increased trabecular thickness and a decreased trabecular number compared to CTRL animals. Midshaft tibia intracortical porosity was assessed at two spatial image resolutions. At 2 μm scale, no change was observed in the intracortical structure. At 1 μm scale, a decrease in the cortical vascular porosity of the HFD bone was evidenced. The study of a group of 8 week old animals corresponding to animals at the start of the diet challenge revealed that the decreased vascular porosity was T2DM-dependent and not related to the ageing process. Our results offer an unprecedented ultra-characterization of the T2DM compromised skeletal micro-architecture and highlight an unrevealed T2DM-related decrease in the cortical vascular porosity, potentially affecting the bone health and fragility. Additionally, it provides some insights into the technical challenge facing the assessment of the rodent bone structure using HR-microCT imaging.

  10. Changes in bone macro- and microstructure in diabetic obese mice revealed by high resolution microfocus X-ray computed tomography

    Science.gov (United States)

    Kerckhofs, G.; Durand, M.; Vangoitsenhoven, R.; Marin, C.; Van der Schueren, B.; Carmeliet, G.; Luyten, F. P.; Geris, L.; Vandamme, K.

    2016-01-01

    High resolution microfocus X-ray computed tomography (HR-microCT) was employed to characterize the structural alterations of the cortical and trabecular bone in a mouse model of obesity-driven type 2 diabetes (T2DM). C57Bl/6J mice were randomly assigned for 14 weeks to either a control diet-fed (CTRL) or a high fat diet (HFD)-fed group developing obesity, hyperglycaemia and insulin resistance. The HFD group showed an increased trabecular thickness and a decreased trabecular number compared to CTRL animals. Midshaft tibia intracortical porosity was assessed at two spatial image resolutions. At 2 μm scale, no change was observed in the intracortical structure. At 1 μm scale, a decrease in the cortical vascular porosity of the HFD bone was evidenced. The study of a group of 8 week old animals corresponding to animals at the start of the diet challenge revealed that the decreased vascular porosity was T2DM-dependent and not related to the ageing process. Our results offer an unprecedented ultra-characterization of the T2DM compromised skeletal micro-architecture and highlight an unrevealed T2DM-related decrease in the cortical vascular porosity, potentially affecting the bone health and fragility. Additionally, it provides some insights into the technical challenge facing the assessment of the rodent bone structure using HR-microCT imaging. PMID:27759061

  11. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  12. Can Computers Be Used for Whole Language Approaches to Reading and Language Arts?

    Science.gov (United States)

    Balajthy, Ernest

    Holistic approaches to the teaching of reading and writing, most notably the Whole Language movement, reject the philosophy that language skills can be taught. Instead, holistic teachers emphasize process, and they structure the students' classroom activities to be rich in language experience. Computers can be used as tools for whole language…

  13. How people learn while playing serious games: A computational modelling approach

    NARCIS (Netherlands)

    Westera, Wim

    2017-01-01

    This paper proposes a computational modelling approach for investigating the interplay of learning and playing in serious games. A formal model is introduced that allows for studying the details of playing a serious game under diverse conditions. The dynamics of player action and motivation is based

  14. Relationships among Taiwanese Children's Computer Game Use, Academic Achievement and Parental Governing Approach

    Science.gov (United States)

    Yeh, Duen-Yian; Cheng, Ching-Hsue

    2016-01-01

    This study examined the relationships among children's computer game use, academic achievement and parental governing approach to propose probable answers for the doubts of Taiwanese parents. 355 children (ages 11-14) were randomly sampled from 20 elementary schools in a typically urbanised county in Taiwan. Questionnaire survey (five questions)…

  15. A Computer-Based Game That Promotes Mathematics Learning More than a Conventional Approach

    Science.gov (United States)

    McLaren, Bruce M.; Adams, Deanne M.; Mayer, Richard E.; Forlizzi, Jodi

    2017-01-01

    Excitement about learning from computer-based games has been palpable in recent years and has led to the development of many educational games. However, there are relatively few sound empirical studies in the scientific literature that have shown the benefits of learning mathematics from games as opposed to more traditional approaches. The…

  16. Computer simulation of HTGR fuel microspheres using a Monte-Carlo statistical approach

    International Nuclear Information System (INIS)

    Hedrick, C.E.

    1976-01-01

    The concept and computational aspects of a Monte-Carlo statistical approach in relating structure of HTGR fuel microspheres to the uranium content of fuel samples have been verified. Results of the preliminary validation tests and the benefits to be derived from the program are summarized

  17. A computational approach to evaluate the androgenic affinity of iprodione, procymidone, vinclozolin and their metabolites.

    Directory of Open Access Journals (Sweden)

    Corrado Lodovico Galli

    Full Text Available Our research is aimed at devising and assessing a computational approach to evaluate the affinity of endocrine active substances (EASs) and their metabolites towards the ligand binding domain (LBD) of the androgen receptor (AR) in three distantly related species: human, rat, and zebrafish. We computed the affinity for all the selected molecules following a computational approach based on molecular modelling and docking. Three different classes of molecules with well-known endocrine activity (iprodione, procymidone, vinclozolin, and a selection of their metabolites) were evaluated. Our approach was demonstrated useful as the first step of chemical safety evaluation since ligand-target interaction is a necessary condition for exerting any biological effect. Moreover, a different sensitivity concerning AR LBD was computed for the tested species (rat being the least sensitive of the three). This evidence suggests that, in order not to over-/under-estimate the risks connected with the use of a chemical entity, further in vitro and/or in vivo tests should be carried out only after an accurate evaluation of the most suitable cellular system or animal species. The introduction of in silico approaches to evaluate hazard can accelerate discovery and innovation with a lower economic effort than with a fully wet strategy.

  18. A computational approach to evaluate the androgenic affinity of iprodione, procymidone, vinclozolin and their metabolites.

    Science.gov (United States)

    Galli, Corrado Lodovico; Sensi, Cristina; Fumagalli, Amos; Parravicini, Chiara; Marinovich, Marina; Eberini, Ivano

    2014-01-01

    Our research is aimed at devising and assessing a computational approach to evaluate the affinity of endocrine active substances (EASs) and their metabolites towards the ligand binding domain (LBD) of the androgen receptor (AR) in three distantly related species: human, rat, and zebrafish. We computed the affinity for all the selected molecules following a computational approach based on molecular modelling and docking. Three different classes of molecules with well-known endocrine activity (iprodione, procymidone, vinclozolin, and a selection of their metabolites) were evaluated. Our approach was demonstrated useful as the first step of chemical safety evaluation since ligand-target interaction is a necessary condition for exerting any biological effect. Moreover, a different sensitivity concerning AR LBD was computed for the tested species (rat being the least sensitive of the three). This evidence suggests that, in order not to over-/under-estimate the risks connected with the use of a chemical entity, further in vitro and/or in vivo tests should be carried out only after an accurate evaluation of the most suitable cellular system or animal species. The introduction of in silico approaches to evaluate hazard can accelerate discovery and innovation with a lower economic effort than with a fully wet strategy.

  19. Approach to Computer Implementation of Mathematical Model of 3-Phase Induction Motor

    Science.gov (United States)

    Pustovetov, M. Yu.

    2018-03-01

    This article discusses the development of a computer model of an induction motor based on the mathematical model in a three-phase stator reference frame. It uses an approach that allows two methods to be combined during preparation of the computer model: visual circuit programming (in the form of electrical schematics) and logical programming (in the form of block diagrams). The approach enables easy integration of the induction motor model into more complex models of electrical complexes and systems. The developed computer model gives the user access to the beginning and the end of the winding of each of the three phases of the stator and rotor. This property is particularly important when considering asymmetric modes of operation or when the motor is powered by special semiconductor converter circuitry.
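
    For orientation, a phase-coordinate (three-phase stator reference frame) induction machine model of the kind described here is typically built on the following voltage and flux relations, given in generic textbook notation rather than the author's exact formulation:

```latex
% Generic phase-coordinate induction machine relations (textbook form):
v_{s,abc} = R_s\, i_{s,abc} + \frac{d\psi_{s,abc}}{dt}, \qquad
\psi_{s,abc} = L_{ss}\, i_{s,abc} + L_{sr}(\theta_r)\, i_{r,abc}
```

    Staying in phase coordinates, instead of applying a dq transformation, preserves the individual winding terminals, which is why asymmetric operation and converter supply are straightforward to represent.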

  20. Combinatorial computational chemistry approach to the design of metal catalysts for deNOx

    International Nuclear Information System (INIS)

    Endou, Akira; Jung, Changho; Kusagaya, Tomonori; Kubo, Momoji; Selvam, Parasuraman; Miyamoto, Akira

    2004-01-01

    Combinatorial chemistry is an efficient technique for the synthesis and screening of a large number of compounds. Recently, we introduced the combinatorial approach to computational chemistry for catalyst design and proposed a new method called ''combinatorial computational chemistry''. In the present study, we have applied this combinatorial computational chemistry approach to the design of precious metal catalysts for deNOx. As the first step of the screening of the metal catalysts, we studied Rh, Pd, Ag, Ir, Pt, and Au clusters regarding their adsorption properties towards the NO molecule. It was demonstrated that the energetically most stable adsorption state of NO on the Ir model cluster was irrespective of both the shape and the number of atoms comprising the model clusters

  1. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    Science.gov (United States)

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
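
    For orientation, the quantity all of these approaches ultimately target is the standard binding constant and its associated free energy; in textbook statistical-thermodynamic form (not the paper's specific derivation):

```latex
% Binding constant vs. standard binding free energy (textbook form),
% with C° the standard concentration (1 M):
\Delta G^{\circ}_{\mathrm{bind}} = -RT \ln K, \qquad
K = \frac{[PL]\, C^{\circ}}{[P]\,[L]}
```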

  2. Thermodynamic study of 2-aminothiazole and 2-aminobenzothiazole: Experimental and computational approaches

    International Nuclear Information System (INIS)

    Silva, Ana L.R.; Monte, Manuel J.S.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2014-01-01

    Highlights: • Combustion of 2-aminothiazole and 2-aminobenzothiazole by rotating bomb calorimetry. • Enthalpies of sublimation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation calculated from high-level MO calculations. • Gas-phase enthalpies of formation estimated from G3(MP2)//B3LYP approach. - Abstract: This work reports an experimental and computational thermochemical study of two aminothiazole derivatives, namely 2-aminothiazole and 2-aminobenzothiazole. The standard (p° = 0.1 MPa) molar energies of combustion of these compounds were measured by rotating bomb combustion calorimetry. The standard molar enthalpies of sublimation, at T = 298.15 K, were derived from the temperature dependence of the vapor pressures of these compounds, measured by the Knudsen-effusion technique and from high temperature Calvet microcalorimetry. Combining these experimental results enabled the calculation of the standard molar enthalpies of formation in the gaseous state, at T = 298.15 K, for the compounds studied. The corresponding standard Gibbs free energies of formation in crystalline and gaseous phases were also derived, allowing the analysis of their stability in these phases. We have also estimated the gas-phase enthalpies of formation from high-level molecular orbital calculations at the G3(MP2)//B3LYP level of theory, the estimates revealing very good agreement with the experimental ones. The importance of some stabilizing electronic interactions occurring in the title molecules has been studied and quantitatively evaluated through Natural Bond Orbital (NBO) analysis of the corresponding wavefunctions, and their Nucleus Independent Chemical Shifts (NICS) parameters have been calculated in order to rationalize the effect of electronic delocalization upon stability

  3. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    Science.gov (United States)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    of simulation path at various scales. The newly implemented search method for kriging reduces the computational cost from an exponential dependence with regard to the grid size in the original algorithm to a linear relationship, as each neighboring search becomes independent from the grid size. For the considered examples, our results show a sevenfold reduction in run time for each additional realization when a constant simulation path is used. The traditional criticism that constant path techniques introduce a bias to the simulations was explored and our findings do indeed reveal a minor reduction in the diversity of the simulations. This bias can, however, be largely eliminated by changing the path type at different scales through the use of the multi-grid approach. Finally, we show that adapting the aggregation weight at each scale considered in our multi-grid approach allows for reproducing both the variogram and histogram, and the spatial trend of the underlying data.
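
    The claimed complexity reduction is easy to see in a toy setting: with a constant path, each node's conditioning neighbours, and hence its kriging weights, are fixed, so they can be precomputed once and amortized over all realizations. A minimal 1-D sketch of this idea (our illustration with a hypothetical exponential covariance, not the authors' code):

```python
import numpy as np

# Toy 1-D sketch of constant-path sequential Gaussian simulation: the
# neighbour searches and simple-kriging weights are computed once for the
# fixed path, so every additional realization only costs the sampling loop.
rng = np.random.default_rng(0)
n, k, corr_len = 200, 8, 10.0
x = np.arange(n, dtype=float)
cov = lambda h: np.exp(-np.abs(h) / corr_len)     # exponential covariance, unit sill

path = rng.permutation(n)                         # the constant simulation path

plans = []                                        # one-off precomputation
for i, node in enumerate(path):
    prev = path[:i]
    nbr = prev[np.argsort(np.abs(x[prev] - x[node]))[:k]]
    if nbr.size == 0:
        plans.append((node, nbr, None, 1.0))      # first node: unconditional
        continue
    C = cov(x[nbr][:, None] - x[nbr][None, :])    # neighbour-neighbour covariance
    c = cov(x[nbr] - x[node])                     # neighbour-target covariance
    w = np.linalg.solve(C, c)                     # simple-kriging weights
    plans.append((node, nbr, w, 1.0 - w @ c))     # plus kriging variance

def realization():                                # each extra one is cheap
    z = np.empty(n)
    for node, nbr, w, var in plans:
        mean = 0.0 if w is None else w @ z[nbr]
        z[node] = mean + np.sqrt(max(var, 1e-12)) * rng.normal()
    return z

fields = [realization() for _ in range(5)]        # reuse of the precomputed plans
```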

  4. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has been achieved in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for research on miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or from different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach (MIM166, MIM171 and MIM159/319), the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that will have to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2011-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao's garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce a number of novel techniques for relating the outputs and inputs of OTs in a larger construction. We also report on an implementation of this approach, which shows that our protocol is more efficient than any previous one: For big enough circuits, we can evaluate more than 20000 Boolean gates per second...

  6. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    Science.gov (United States)

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several gigavoxels), this computational burden has prevented their widespread adoption. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.
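
    The memory-saving idea can be sketched in one dimension: solve a coarsened version of the linear system first, then prolongate the coarse solution as a warm start for a few fine-grid iterations. A toy illustration with a stand-in forward operator (ours; the paper's operator is the CT projector and its update scheme may differ):

```python
import numpy as np

# Toy 1-D sketch of a multiresolution iterative scheme: run cheap iterations
# on a coarsened system, then prolongate the result as a warm start on the
# fine grid, so only a few full-size (memory-hungry) iterations are needed.
rng = np.random.default_rng(1)
n = 128
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)
A /= A.sum(axis=1, keepdims=True)            # stand-in forward operator (blur)
x_true = (np.abs(i - n // 2) < 20).astype(float)
y = A @ x_true + 0.01 * rng.normal(size=n)   # simulated measurements

def landweber(A, y, x0, iters, step=1.0):
    x = x0.copy()
    for _ in range(iters):
        x += step * A.T @ (y - A @ x)        # gradient step on ||Ax - y||^2
    return x

R = np.kron(np.eye(n // 2), [[0.5, 0.5]])    # restriction: average pairs
P = np.kron(np.eye(n // 2), [[1.0], [1.0]])  # prolongation: repeat values
A_c = R @ A @ P                              # coarsened operator: half the size
x_c = landweber(A_c, R @ y, np.zeros(n // 2), iters=200)
x = landweber(A, y, P @ x_c, iters=50)       # few fine-grid iterations suffice
```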

  7. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    Science.gov (United States)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

    This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act in producing the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely, fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.

  8. Wavelets-Computational Aspects of Sterian Realistic Approach to Uncertainty Principle in High Energy Physics: A Transient Approach

    Directory of Open Access Journals (Sweden)

    Cristian Toma

    2013-01-01

    Full Text Available This study presents wavelets-computational aspects of Sterian-realistic approach to uncertainty principle in high energy physics. According to this approach, one cannot make a device for the simultaneous measuring of the canonical conjugate variables in reciprocal Fourier spaces. However, such aspects regarding the use of conjugate Fourier spaces can be also noticed in quantum field theory, where the position representation of a quantum wave is replaced by momentum representation before computing the interaction in a certain point of space, at a certain moment of time. For this reason, certain properties regarding the switch from one representation to another in these conjugate Fourier spaces should be established. It is shown that the best results can be obtained using wavelets aspects and support macroscopic functions for computing (i) wave-train nonlinear relativistic transformation, (ii) reflection/refraction with a constant shift, (iii) diffraction considered as interaction with a null phase shift without annihilation of associated wave, (iv) deflection by external electromagnetic fields without phase loss, and (v) annihilation of associated wave-train through fast and spatially extended phenomena according to uncertainty principle.

  9. A Single-Cell Biochemistry Approach Reveals PAR Complex Dynamics during Cell Polarization.

    Science.gov (United States)

    Dickinson, Daniel J; Schwager, Francoise; Pintard, Lionel; Gotta, Monica; Goldstein, Bob

    2017-08-21

    Regulated protein-protein interactions are critical for cell signaling, differentiation, and development. For the study of dynamic regulation of protein interactions in vivo, there is a need for techniques that can yield time-resolved information and probe multiple protein binding partners simultaneously, using small amounts of starting material. Here we describe a single-cell protein interaction assay. Single-cell lysates are generated at defined time points and analyzed using single-molecule pull-down, yielding information about dynamic protein complex regulation in vivo. We established the utility of this approach by studying PAR polarity proteins, which mediate polarization of many animal cell types. We uncovered striking regulation of PAR complex composition and stoichiometry during Caenorhabditis elegans zygote polarization, which takes place in less than 20 min. PAR complex dynamics are linked to the cell cycle by Polo-like kinase 1 and govern the movement of PAR proteins to establish polarity. Our results demonstrate an approach to study dynamic biochemical events in vivo. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Single-Granule-Level Approach Reveals Ecological Heterogeneity in an Upflow Anaerobic Sludge Blanket Reactor.

    Directory of Open Access Journals (Sweden)

    Kyohei Kuroda

    Full Text Available Upflow anaerobic sludge blanket (UASB) reactor has served as an effective process to treat industrial wastewater such as purified terephthalic acid (PTA) wastewater. For optimal UASB performance, balanced ecological interactions between syntrophs, methanogens, and fermenters are critical. However, much of the interactions remain unclear because UASB have been studied at a "macro"-level perspective of the reactor ecosystem. In reality, such reactors are composed of a suite of granules, each forming individual micro-ecosystems treating wastewater. Thus, typical approaches may be oversimplifying the complexity of the microbial ecology and granular development. To identify critical microbial interactions at both macro- and micro- level ecosystem ecology, we perform community and network analyses on 300 PTA-degrading granules from a lab-scale UASB reactor and two full-scale reactors. Based on MiSeq-based 16S rRNA gene sequencing of individual granules, different granule-types co-exist in both full-scale reactors regardless of granule size and reactor sampling depth, suggesting that distinct microbial interactions occur in different granules throughout the reactor. In addition, we identify novel networks of syntrophic metabolic interactions in different granules, perhaps caused by distinct thermodynamic conditions. Moreover, unseen methanogenic relationships (e.g. "Candidatus Aminicenantes" and Methanosaeta) are observed in UASB reactors. In total, we discover unexpected microbial interactions in granular micro-ecosystems supporting UASB ecology and treatment through a unique single-granule level approach.

  11. A Single-Granule-Level Approach Reveals Ecological Heterogeneity in an Upflow Anaerobic Sludge Blanket Reactor

    Science.gov (United States)

    Mei, Ran; Narihiro, Takashi; Bocher, Benjamin T. W.; Yamaguchi, Takashi; Liu, Wen-Tso

    2016-01-01

    Upflow anaerobic sludge blanket (UASB) reactor has served as an effective process to treat industrial wastewater such as purified terephthalic acid (PTA) wastewater. For optimal UASB performance, balanced ecological interactions between syntrophs, methanogens, and fermenters are critical. However, much of the interactions remain unclear because UASB have been studied at a “macro”-level perspective of the reactor ecosystem. In reality, such reactors are composed of a suite of granules, each forming individual micro-ecosystems treating wastewater. Thus, typical approaches may be oversimplifying the complexity of the microbial ecology and granular development. To identify critical microbial interactions at both macro- and micro- level ecosystem ecology, we perform community and network analyses on 300 PTA–degrading granules from a lab-scale UASB reactor and two full-scale reactors. Based on MiSeq-based 16S rRNA gene sequencing of individual granules, different granule-types co-exist in both full-scale reactors regardless of granule size and reactor sampling depth, suggesting that distinct microbial interactions occur in different granules throughout the reactor. In addition, we identify novel networks of syntrophic metabolic interactions in different granules, perhaps caused by distinct thermodynamic conditions. Moreover, unseen methanogenic relationships (e.g. “Candidatus Aminicenantes” and Methanosaeta) are observed in UASB reactors. In total, we discover unexpected microbial interactions in granular micro-ecosystems supporting UASB ecology and treatment through a unique single-granule level approach. PMID:27936088

  12. A Linear Dynamical Systems Approach to Streamflow Reconstruction Reveals History of Regime Shifts in Northern Thailand

    Science.gov (United States)

    Nguyen, Hung T. T.; Galelli, Stefano

    2018-03-01

    Catchment dynamics is not often modeled in streamflow reconstruction studies; yet, the streamflow generation process depends on both catchment state and climatic inputs. To explicitly account for this interaction, we contribute a linear dynamic model, in which streamflow is a function of both catchment state (i.e., wet/dry) and paleoclimatic proxies. The model is learned using a novel variant of the Expectation-Maximization algorithm, and it is used with a paleo drought record—the Monsoon Asia Drought Atlas—to reconstruct 406 years of streamflow for the Ping River (northern Thailand). Results for the instrumental period show that the dynamic model has higher accuracy than conventional linear regression; all performance scores improve by 45-497%. Furthermore, the reconstructed trajectory of the state variable provides valuable insights about the catchment history—e.g., regime-like behavior—thereby complementing the information contained in the reconstructed streamflow time series. The proposed technique can replace linear regression, since it only requires information on streamflow and climatic proxies (e.g., tree-rings, drought indices); furthermore, it is capable of readily generating stochastic streamflow replicates. With a marginal increase in computational requirements, the dynamic model brings more desirable features and value to streamflow reconstructions.
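
    The model described here couples a hidden catchment state with climatic proxies and is learned with an Expectation-Maximization variant; the workhorse underneath such linear-Gaussian state-space models is the textbook Kalman recursion, sketched below with hypothetical placeholder matrices (not the fitted Ping River model):

```python
import numpy as np

# Generic linear-Gaussian state-space (Kalman) filter -- the standard
# machinery a model of this kind builds on. All matrices are placeholders.
def kalman_filter(y, u, A, B, C, D, Q, R, x0, P0):
    """State x_t = A x_{t-1} + B u_t + w_t;  observation y_t = C x_t + D u_t + v_t."""
    x, P, filtered = x0, P0, []
    I = np.eye(len(x0))
    for t in range(len(y)):
        x, P = A @ x + B @ u[t], A @ P @ A.T + Q          # predict
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)      # Kalman gain
        x = x + K @ (y[t] - C @ x - D @ u[t])             # update with proxy input
        P = (I - K @ C) @ P
        filtered.append(x.copy())
    return np.array(filtered)

# Tiny demo: a scalar catchment "wetness" state driven by a climate proxy.
rng = np.random.default_rng(3)
T = 50
u = rng.normal(size=(T, 1))                               # proxy series
y = rng.normal(size=(T, 1))                               # streamflow series
states = kalman_filter(y, u,
                       A=np.array([[0.9]]), B=np.array([[0.3]]),
                       C=np.array([[1.0]]), D=np.array([[0.2]]),
                       Q=np.array([[0.1]]), R=np.array([[0.1]]),
                       x0=np.zeros(1), P0=np.eye(1))
```

    In an EM fit, a forward pass of this kind (together with a backward smoother) supplies the expected states for the E-step, after which the M-step re-estimates A, B, C, D, Q and R.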

  13. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Science.gov (United States)

    Vostokin, Sergei; Artamonov, Yuriy; Tsarev, Daniil

    2018-03-01

    This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of "on-demand" access; (b) source code deployment management; (c) high-performance computing programs development automation. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  14. A Model for the Acceptance of Cloud Computing Technology Using DEMATEL Technique and System Dynamics Approach

    Directory of Open Access Journals (Sweden)

    Seyyed Mohammad Zargar

    2018-03-01

    Full Text Available Cloud computing is a new method to provide computing resources and increase computing power in organizations. Despite the many benefits this method offers, it has not been universally adopted because of some obstacles, including security issues, which have become a concern for IT managers in organizations. In this paper, the general definition of cloud computing is presented. In addition, having reviewed previous studies, the researchers identified the variables affecting technology acceptance and, especially, cloud computing technology acceptance. Then, using the DEMATEL technique, the degree to which each variable influences, and is influenced by, the others was determined. The researchers also designed a model to show the existing dynamics in cloud computing technology using a system dynamics approach. The validity of the model was confirmed through standard evaluation methods for dynamic models using the VENSIM software. Finally, based on different conditions of the proposed model, a variety of scenarios were designed. Then, the implementation of these scenarios was simulated within the proposed model. The results showed that any increase in data security, government support and user training can lead to an increase in the adoption and use of cloud computing technology.
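
    DEMATEL itself reduces to a few matrix operations: normalize the expert direct-influence matrix, form the total-relation matrix T = N(I - N)^(-1), and read each factor's prominence (row sum plus column sum of T) and net cause/effect role (row sum minus column sum). A minimal sketch with a hypothetical factor set (our illustration, not the paper's survey data):

```python
import numpy as np

# Minimal DEMATEL sketch. D is a hypothetical expert direct-influence
# matrix among four factors (0 = no influence, 4 = very strong).
factors = ["security", "gov support", "training", "adoption"]
D = np.array([[0, 3, 2, 4],
              [1, 0, 2, 3],
              [2, 1, 0, 3],
              [1, 1, 1, 0]], dtype=float)

# Normalize so the total-relation series converges, then sum it in closed form.
N = D / max(D.sum(axis=0).max(), D.sum(axis=1).max())
T = N @ np.linalg.inv(np.eye(len(D)) - N)        # total relation matrix

prominence = T.sum(axis=1) + T.sum(axis=0)       # D + R: overall importance
net_effect = T.sum(axis=1) - T.sum(axis=0)       # D - R: net cause (+) / effect (-)
for f, p, r in zip(factors, prominence, net_effect):
    print(f"{f:12s} prominence={p:5.2f} net influence={r:+5.2f}")
```

    Factors with positive net influence act as causes (candidate levers such as support or training), while negative values mark effects (such as adoption).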

  15. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Directory of Open Access Journals (Sweden)

    Vostokin Sergei

    2018-03-01

    Full Text Available This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of “on-demand” access; (b) source code deployment management; (c) high-performance computing programs development automation. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  16. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to the introduction into pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to the comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform of constructivist learning process, thus enabling learners’ experimentation with the provided programming models, obtaining learners’ competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing the programming models, and the message passing interface (MPI) and OpenMP parallelization tools, have been chosen for implementation.
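
    The study implements its models in C with MPI and OpenMP; purely to illustrate the multiphase-queueing view of a pipeline, the sketch below (ours, in Python for brevity) chains stages through queues so that successive items overlap in time:

```python
import threading, queue

# Each pipeline phase consumes items from one queue and feeds the next;
# a None marker propagates shutdown through the stages.
def stage(fn, q_in, q_out):
    while (item := q_in.get()) is not None:
        q_out.put(fn(item))
    q_out.put(None)

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
phases = [(lambda x: x + 1, q0, q1),       # phase 1: increment
          (lambda x: x * x, q1, q2)]       # phase 2: square
workers = [threading.Thread(target=stage, args=p) for p in phases]
for w in workers:
    w.start()
for i in range(5):                          # feed the pipeline
    q0.put(i)
q0.put(None)

results = []
while (r := q2.get()) is not None:
    results.append(r)
print(results)                              # [1, 4, 9, 16, 25]
```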

  17. An Efficient Approach for Fast and Accurate Voltage Stability Margin Computation in Large Power Grids

    Directory of Open Access Journals (Sweden)

    Heng-Yi Su

    2016-11-01

    Full Text Available This paper proposes an efficient approach for the computation of voltage stability margin (VSM) in a large-scale power grid. The objective is to accurately and rapidly determine the load power margin which corresponds to voltage collapse phenomena. The proposed approach is based on the impedance match-based technique and the model-based technique. It combines the Thevenin equivalent (TE) network method with cubic spline extrapolation technique and the continuation technique to achieve fast and accurate VSM computation for a bulk power grid. Moreover, the generator Q limits are taken into account for practical applications. Extensive case studies carried out on Institute of Electrical and Electronics Engineers (IEEE) benchmark systems and the Taiwan Power Company (Taipower, Taipei, Taiwan) system are used to demonstrate the effectiveness of the proposed approach.
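
    The curve-fitting ingredient can be illustrated on a toy two-bus system: sample a few stable operating points of the PV curve, fit a cubic, and extrapolate to its stationary point (the "nose") to estimate the loadability margin without tracing power-flow solutions all the way to collapse. A sketch under these simplifying assumptions (ours; the paper combines this with Thevenin-equivalent identification and a continuation method):

```python
import numpy as np

# Toy two-bus sketch: estimate the PV-curve nose (maximum loadability) by
# extrapolating a cubic fit of a few stable operating points.
E, X = 1.0, 0.2                                   # hypothetical Thevenin source
P_of_V = lambda V: V * np.sqrt(E**2 - V**2) / X   # unity-power-factor load

V_samples = np.linspace(0.86, 0.99, 8)            # measured operating points
coeffs = np.polyfit(V_samples, P_of_V(V_samples), 3)

crit = np.roots(np.polyder(coeffs))               # stationary points of the fit
crit = crit[np.isreal(crit)].real
curv = np.polyval(np.polyder(coeffs, 2), crit)
V_nose = crit[(crit > 0) & (crit < E) & (curv < 0)][0]   # the fitted maximum
print(f"estimated margin {np.polyval(coeffs, V_nose):.2f}"
      f" vs true {E**2 / (2 * X):.2f} at V = {V_nose:.2f}")
```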

  18. New quantitative approaches reveal the spatial preference of nuclear compartments in mammalian fibroblasts.

    Science.gov (United States)

    Weston, David J; Russell, Richard A; Batty, Elizabeth; Jensen, Kirsten; Stephens, David A; Adams, Niall M; Freemont, Paul S

    2015-03-06

    The nuclei of higher eukaryotic cells display compartmentalization and certain nuclear compartments have been shown to follow a degree of spatial organization. To date, the study of nuclear organization has often involved simple quantitative procedures that struggle with both the irregularity of the nuclear boundary and the problem of handling replicate images. Such studies typically focus on inter-object distance, rather than spatial location within the nucleus. The concern of this paper is the spatial preference of nuclear compartments, for which we have developed statistical tools to quantitatively study and explore nuclear organization. These tools combine replicate images to generate 'aggregate maps' which represent the spatial preferences of nuclear compartments. We present two examples of different compartments in mammalian fibroblasts (WI-38 and MRC-5) that demonstrate new knowledge of spatial preference within the cell nucleus. Specifically, the spatial preference of RNA polymerase II is preserved across normal and immortalized cells, whereas PML nuclear bodies exhibit a change in spatial preference from avoiding the centre in normal cells to exhibiting a preference for the centre in immortalized cells. In addition, we show that SC35 splicing speckles are excluded from the nuclear boundary and localize throughout the nucleoplasm and in the interchromatin space in non-transformed WI-38 cells. This new methodology is thus able to reveal the effect of large-scale perturbation on spatial architecture and preferences that would not be obvious from single cell imaging.

  19. A proteomic style approach to characterize a grass mix product reveals potential immunotherapeutic benefit.

    Science.gov (United States)

    Bullimore, Alan; Swan, Nicola; Alawode, Wemimo; Skinner, Murray

    2011-09-01

    Grass allergy immunotherapies often consist of a mix of different grass extracts, each containing several proteins of different physiochemical properties; however, the subtle contributions of each protein are difficult to elucidate. This study aimed to identify and characterize the group 1 and 5 allergens in a 13-grass extract and to standardize the extraction method. The grass pollens were extracted in isolation and pooled and also in combination and analyzed using a variety of techniques including enzyme-linked immunosorbent assay, liquid chromatography-mass spectrometry, and sodium dodecyl sulfate-polyacrylamide gel electrophoresis. Gold-staining and IgE immunoblotting revealed a high degree of homology of protein bands between the 13 species and the presence of a densely stained doublet at 25-35 kD along with protein bands at approximately 12.5, 17, and 50 kD. The doublet from each grass species demonstrated a high level of group 1 and 5 interspecies homology. However, there were a number of bands unique to specific grasses consistent with evolutionary change and indicative that a grass mix immunotherapeutic could be considered broad spectrum. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis and IgE immunoblotting showed all 13 grasses share a high degree of homology, particularly in terms of group 1 and 5 allergens. IgE and IgG enzyme-linked immunosorbent assay potencies were shown to be independent of extraction method.

  20. A computational approach to discovering the functions of bacterial phytochromes by analysis of homolog distributions

    Directory of Open Access Journals (Sweden)

    Lamparter Tilman

    2006-03-01

    bacterial phytochromes in ammonium assimilation and amino acid metabolism. Conclusion: It was possible to identify several proteins that might share common functions with bacterial phytochromes by the co-distribution approach. This computational approach might also be helpful in other cases.

  1. Vertebral Pneumaticity in the Ornithomimosaur Archaeornithomimus (Dinosauria: Theropoda) Revealed by Computed Tomography Imaging and Reappraisal of Axial Pneumaticity in Ornithomimosauria.

    Directory of Open Access Journals (Sweden)

    Akinobu Watanabe

    Full Text Available Among extant vertebrates, pneumatization of postcranial bones is unique to birds, with few known exceptions in other groups. Through reduction in bone mass, this feature is thought to benefit flight capacity in modern birds, but its prevalence in non-avian dinosaurs of variable sizes has generated competing hypotheses on the initial adaptive significance of postcranial pneumaticity. To better understand the evolutionary history of postcranial pneumaticity, studies have surveyed its distribution among non-avian dinosaurs. Nevertheless, the degree of pneumaticity in the basal coelurosaurian group Ornithomimosauria remains poorly known, despite their potential to greatly enhance our understanding of the early evolution of pneumatic bones along the lineage leading to birds. Historically, the identification of postcranial pneumaticity in non-avian dinosaurs has been based on examination of external morphology, and few studies thus far have focused on the internal architecture of pneumatic structures inside the bones. Here, we describe the vertebral pneumaticity of the ornithomimosaur Archaeornithomimus with the aid of X-ray computed tomography (CT) imaging. Complementary examination of external and internal osteology reveals (1) highly pneumatized cervical vertebrae with an elaborate configuration of interconnected chambers within the neural arch and the centrum; (2) anterior dorsal vertebrae with pneumatic chambers inside the neural arch; (3) apneumatic sacral vertebrae; and (4) a subset of proximal caudal vertebrae with limited pneumatic invasion into the neural arch. Comparisons with other theropod dinosaurs suggest that ornithomimosaurs primitively exhibited a plesiomorphic theropod condition for axial pneumaticity that was extended among later taxa, such as Archaeornithomimus and the large-bodied Deinocheirus. This finding corroborates the notion that evolutionary increases in vertebral pneumaticity occurred in parallel among independent lineages of bird

  2. Alteration of cystatin C in cerebrospinal fluid of patients with sciatica revealed by a proteomical approach

    International Nuclear Information System (INIS)

    Liu, X.; Zeng, B.; Xu, J.

    2005-01-01

    To better understand the pathophysiological mechanisms underlying sciatica induced by lumbar intervertebral disk herniation and to ascertain the protein that presents with the most observable changes in the cerebrospinal fluid (CSF) of patients with sciatica. We conducted the study in the Key Laboratory of Shanghai 6th People's Hospital, Shanghai Jiaotong University, Shanghai, People's Republic of China, during the period June 2004 to March 2005. In 2 separate experiments, we carried out the study involving the CSF of sciatica patients (the case group) and the CSF of otherwise healthy volunteers (the control group). We utilized a proteomical analysis to compare the samples of 10 patients with sciatica with 10 volunteers in the control group. We individually separated each group's CSF by 2-dimensional gel electrophoresis. We analyzed the harvested gel images with PD Quest 2D-gel software (Bio-Rad) to ascertain the differential proteins between the 2 groups. We based the enzyme-linked immunosorbent assay (ELISA) experiment, which followed, on the results of the first experiment. We found that 15 of the protein spots in the CSF differed appreciably, to varying degrees, between the 2 groups, and identification by LC-MS/MS revealed that the most significant disparity was with cystatin C. The result of the ELISA experiment confirmed a considerable decrease in the level of cystatin C (p<0.01) in the patients with sciatica. In the CSF of patients with sciatica, the level of cystatin C decreased markedly, indicating that it may play an important role in the pathophysiological processes of sciatica. (author)

  3. University Students’ Reflections on Representations in Genetics and Stereochemistry Revealed by a Focus Group Approach

    Directory of Open Access Journals (Sweden)

    Inger Edfors

    2015-05-01

    Full Text Available Genetics and organic chemistry are areas of science that students regard as difficult to learn. Part of this difficulty is derived from the disciplines having representations as part of their discourses. In order to optimally support students’ meaning-making, teachers need to use representations to structure the meaning-making experience in thoughtful ways that consider the variation in students’ prior knowledge. Using a focus group setting, we explored 43 university students’ reasoning on representations in introductory chemistry and genetics courses. Our analysis of eight focus group discussions revealed how students can construct somewhat bewildered relations with discipline-specific representations. The students stated that they preferred familiar representations, but without asserting the meaning-making affordances of those representations. Also, the students were highly aware of the affordances of certain representations, but nonetheless chose not to use those representations in their problem solving. We suggest that an effective representation is one that, to some degree, is familiar to the students, but at the same time is challenging and not too closely related to “the usual one”. The focus group discussions led the students to become more aware of their own and others’ ways of interpreting different representations. Furthermore, feedback from the students’ focus group discussions enhanced the teachers’ awareness of the students’ prior knowledge and limitations in students’ representational literacy. Consequently, we posit that a focus group setting can be used in a university context to promote both student meaning-making and teacher professional development in a fruitful way.

  4. Groundwater Recharge Processes Revealed By Multi-Tracers Approach in a Headwater, North China Plain

    Science.gov (United States)

    Sakakibara, K.; Tsujimura, M.; Song, X.; Zhang, J.

    2014-12-01

    Groundwater recharge variation in space and time is crucial for effective water management, especially in arid/semi-arid regions. In order to reveal the comprehensive groundwater recharge processes in a catchment with a large topographical relief and seasonal hydrological variations, intensive field surveys were conducted four times in different seasons in the Wangkuai watershed, Taihang Mountains, which is a main groundwater recharge zone of the North China Plain. The groundwater, spring, stream water and lake water were sampled, and inorganic solute constituents and stable isotopes of oxygen-18 and deuterium were determined for all water samples. Also, the stream flow rate was observed under steady-state conditions. The stable isotopic compositions, silica and bicarbonate concentrations in the groundwater show values close to those in the surface water, suggesting that the main groundwater recharge occurs from surface water at the mountain-plain transitional zone throughout the year. Also, the deuterium and oxygen-18 in the Wangkuai reservoir and the groundwater in the vicinity of the reservoir show higher values, suggesting that the reservoir water, affected by evaporation, plays an important role in groundwater recharge in the alluvial plain. For specifying the groundwater recharge area and quantifying the groundwater recharge rate from the reservoir, an inversion analysis and a simple mixing model were applied in the Wangkuai watershed using stable isotopes of oxygen-18 and deuterium. The model results show that groundwater recharge occurs dominantly at altitudes from 357 m to 738 m, corresponding to the mountain-plain transitional zone, and the groundwater recharge rate from the Wangkuai reservoir is estimated to be 2.4% of the total groundwater recharge in the Wangkuai watershed.
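
    The simple mixing model invoked here is, in its generic two-end-member textbook form (our notation, not necessarily the authors' exact formulation), a single mass-balance ratio per conservative tracer:

```latex
% Two-end-member mixing: fraction of reservoir-derived water in a
% groundwater sample, for a conservative tracer such as delta-18O:
f_{\mathrm{res}} = \frac{\delta_{\mathrm{gw}} - \delta_{\mathrm{stream}}}
                        {\delta_{\mathrm{res}} - \delta_{\mathrm{stream}}}
```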

  5. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. The proposed CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. We proposed the CAD method, whose principal idea is based on the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, was not found to be very fast compared to the others. To alleviate this issue, we proposed an adaptive selection approach using a fast deinterlacing algorithm rather than using only the CAD algorithm. The proposed hybrid approach of switching between the conventional schemes (LA and MELA) and our CAD was proposed to reduce the overall computational load. A reliable condition for switching the schemes was presented after a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.
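
    The selection idea is easy to prototype: interpolate each missing line cheaply where the field is locally flat, and fall back to an edge-directed interpolation where it is detailed. A simplified sketch (ours; the activity measure, threshold and per-region methods are stand-ins for the paper's MSE and CPU-time criteria):

```python
import numpy as np

# Region-adaptive deinterlacing sketch: cheap line averaging in flat regions,
# an edge-directed variant in detailed ones.
def deinterlace(field, thresh=10.0):
    """field: the even rows (0, 2, 4, ...) of a frame; odd rows are rebuilt."""
    h, w = field.shape
    out = np.zeros((2 * h - 1, w))
    out[::2] = field
    for r in range(1, 2 * h - 1, 2):
        up, dn = out[r - 1], out[r + 1]
        if np.var(up - dn) < thresh:             # plain region: line averaging
            out[r] = 0.5 * (up + dn)
        else:                                    # detailed region: edge-directed
            for c in range(w):
                cands = [(abs(up[c + d] - dn[c - d]),
                          0.5 * (up[c + d] + dn[c - d]))
                         for d in (-1, 0, 1)
                         if 0 <= c + d < w and 0 <= c - d < w]
                out[r, c] = min(cands)[1]        # interpolate along the best edge
    return out

frame = deinterlace(np.arange(20.0).reshape(4, 5))
```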

  6. Empirical evidence for musical syntax processing? Computer simulations reveal the contribution of auditory short-term memory

    Directory of Open Access Journals (Sweden)

    Emmanuel eBigand

    2014-06-01

    Full Text Available During the last decade, it has been argued that (1) music processing involves syntactic representations similar to those observed in language, and (2) that music and language share similar syntactic-like processes and neural resources. This claim is important for understanding the origin of music and language abilities and, furthermore, it has clinical implications. The Western musical system, however, is rooted in psychoacoustic properties of sound, and this is not the case for linguistic syntax. Accordingly, musical syntax processing could be parsimoniously understood as an emergent property of auditory memory rather than a property of abstract processing similar to linguistic processing. To support this view, we simulated numerous empirical studies that investigated the processing of harmonic structures, using a model based on the accumulation of sensory information in auditory memory. The simulations revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods as well as with developmental and cross-cultural approaches can be accounted for by the auditory memory model. This led us to question whether current research on musical syntax can really be compared with linguistic processing. Our simulation also raises methodological and theoretical challenges to study musical syntax while disentangling the confounded low-level sensory influences. In order to investigate syntactic abilities in music comparable to language, research should preferentially use musical material with structures that circumvent the tonal effect exerted by psychoacoustic properties of sounds.

  7. Empirical evidence for musical syntax processing? Computer simulations reveal the contribution of auditory short-term memory.

    Science.gov (United States)

    Bigand, Emmanuel; Delbé, Charles; Poulin-Charronnat, Bénédicte; Leman, Marc; Tillmann, Barbara

    2014-01-01

    During the last decade, it has been argued that (1) music processing involves syntactic representations similar to those observed in language, and (2) that music and language share similar syntactic-like processes and neural resources. This claim is important for understanding the origin of music and language abilities and, furthermore, it has clinical implications. The Western musical system, however, is rooted in psychoacoustic properties of sound, and this is not the case for linguistic syntax. Accordingly, musical syntax processing could be parsimoniously understood as an emergent property of auditory memory rather than a property of abstract processing similar to linguistic processing. To support this view, we simulated numerous empirical studies that investigated the processing of harmonic structures, using a model based on the accumulation of sensory information in auditory memory. The simulations revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods as well as with developmental and cross-cultural approaches can be accounted for by the auditory memory model. This led us to question whether current research on musical syntax can really be compared with linguistic processing. Our simulation also raises methodological and theoretical challenges to study musical syntax while disentangling the confounded low-level sensory influences. In order to investigate syntactic abilities in music comparable to language, research should preferentially use musical material with structures that circumvent the tonal effect exerted by psychoacoustic properties of sounds.

  9. Doxycycline hinders phenylalanine fibril assemblies revealing a potential novel therapeutic approach in phenylketonuria.

    Science.gov (United States)

    De Luigi, Ada; Mariani, Alessandro; De Paola, Massimiliano; Re Depaolini, Andrea; Colombo, Laura; Russo, Luca; Rondelli, Valeria; Brocca, Paola; Adler-Abramovich, Lihi; Gazit, Ehud; Del Favero, Elena; Cantù, Laura; Salmona, Mario

    2015-10-29

    A new paradigm for the aetiopathology of phenylketonuria suggests the presence of amyloid-like assemblies in the brains of transgenic mouse models and patients with phenylketonuria, possibly shedding light on the selective cognitive deficit associated with this disease. Paralleling the amyloidogenic route that identifies different stages of peptide aggregation, corresponding to different levels of toxicity, we experimentally address, for the first time, the physico-chemical properties of phenylalanine aggregates via Small- and Wide-Angle X-ray Scattering and Atomic Force Microscopy. Results are consistent with the presence of well-structured, aligned fibres generated by millimolar concentrations of phenylalanine. Moreover, the amyloid-modulating agent doxycycline affects the local structure of phenylalanine aggregates, preventing the formation of well-ordered crystalline structures. Phenylalanine assemblies prove toxic in vitro to immortalized cell lines and primary neuronal cells. Furthermore, these assemblies also cause dendritic sprouting alterations and synaptic protein impairment in neurons. Doxycycline counteracts these toxic effects, suggesting an approach for the development of future innovative non-dietary preventive therapies.

  10. Recruitment and retention of rural general practitioners: a marketing approach reveals new possibilities.

    Science.gov (United States)

    Hemphill, Elizabeth; Dunn, Steve; Barich, Hayley; Infante, Rebecca

    2007-12-01

    This paper repositions the challenge of attracting and retaining rural GPs in a marketing context as a new focus for future research and policy development. Case study with a mixed design of surveys of GPs and medical students and in-depth interviews with GPs, medical students, regional-division administrators and GP recruitment agents. GP recruitment and retention in the Limestone Coast region of South Australia. Twenty-seven Limestone Coast (LC) GPs; a random sample of medical students from Adelaide University, Adelaide University Rural Health Society and Flinders University; snowball sampling of two adjacent rural regions (20 GPs); and administrators from LC and adjacent regions and GP recruitment agencies in Adelaide. Drawing from marketing theory, the creative suggestion of 'promotion of the practice and not the region' offers a means of GP recruitment and retention for structured succession planning for rural general practices. Structural attempts to broaden the GP market with overseas recruitment have done little to improve full-time equivalent GP levels. Market segmentation and market orientation offer a new emphasis on value exchange between the corporation (the practice), customer (GPs) and competition (all practices) to influence future mobility. A marketing orientation to the GP challenge emphasises individuals' perceptions of value, GP expectations and practice offerings. Failure to acknowledge the benefits of this marketing approach means that solutions such as those developed in the Limestone Coast region are unlikely. Research is now required to define GP satisfaction and value for the long-term viability of general practices.

  11. Allergen relative abundance in several wheat varieties as revealed via a targeted quantitative approach using MS.

    Science.gov (United States)

    Rogniaux, Hélène; Pavlovic, Marija; Lupi, Roberta; Lollier, Virginie; Joint, Mathilde; Mameri, Hamza; Denery, Sandra; Larré, Colette

    2015-05-01

    Food allergy has become a major health issue in developed countries; there is therefore an urgent need for analytical methods able to detect and quantify specific allergens in complex food matrices with good sensitivity and reliability. In this paper, we present a targeted MS/MS approach to compare the relative abundance of the major recognized wheat allergens in the salt-soluble (albumin/globulin) fraction of wheat grains. Twelve allergens were quantified in seven wheat varieties, selected from three Triticum species: T. aestivum (bread wheat), T. durum (durum wheat), and T. monococcum. The allergens were monitored from one or two proteotypic peptides, and their relative abundance was deduced from the intensity of one fragment measured in MS/MS. Whereas the abundance of some of the targeted allergens was quite stable across genotypes, others, such as the alpha-amylase inhibitors, showed clear differences according to wheat species, in accordance with the results of earlier functional studies. This study enriches the scarce knowledge available on allergen content in wheat genotypes and brings new perspectives for food safety and plant breeding. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
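
    The heart of PopSizeABC is standard ABC rejection sampling. A minimal Python sketch follows, with an invented one-parameter model and a toy summary statistic standing in for the folded allele frequency spectrum and LD bins; none of this is the published code.

```python
# ABC rejection sketch: draw parameters from a prior, simulate summaries,
# keep the draws whose summaries are closest to the observed ones.
import numpy as np

rng = np.random.default_rng(0)

def simulate_summary(ne, n_snps=500):
    """Toy stand-in for the real summaries (folded AFS + LD bins):
    a heterozygosity-like statistic whose mean grows with Ne."""
    theta = 4.0 * ne * 1e-8
    return rng.exponential(theta, n_snps).mean()

observed = simulate_summary(10_000)              # pretend this came from data
priors = rng.uniform(1_000, 50_000, 20_000)      # draws from the Ne prior
sims = np.array([simulate_summary(ne) for ne in priors])

# Rejection step: keep the 0.5% of draws closest to the observation.
dist = np.abs(sims - observed)
accepted = priors[dist <= np.quantile(dist, 0.005)]
print(f"posterior mean Ne ~ {accepted.mean():.0f}")
```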

  13. Multi-Analytical Approach Reveals Potential Microbial Indicators in Soil for Sugarcane Model Systems.

    Directory of Open Access Journals (Sweden)

    Acacio Aparecido Navarrete

    Full Text Available This study focused on the effects of organic and inorganic amendments and straw retention on the microbial biomass (MB) and taxonomic groups of bacteria in sugarcane-cultivated soils in a greenhouse mesocosm experiment monitored for gas emissions and chemical factors. The experiment consisted of combinations of synthetic nitrogen (N), vinasse (V; a liquid waste from ethanol production), and sugarcane-straw blankets. Increases in CO2-C and N2O-N emissions were identified shortly after the addition of both N and V to the soils, thus increasing MB nitrogen (MB-N) and decreasing MB carbon (MB-C) in the N+V-amended soils and altering soil chemical factors that were correlated with the MB. Across 57 soil metagenomic datasets, Actinobacteria (31.5%), Planctomycetes (12.3%), Deltaproteobacteria (12.3%), Alphaproteobacteria (12.0%) and Betaproteobacteria (11.1%) were the most dominant bacterial groups during the experiment. Differences in relative abundance of metagenomic sequences were mainly revealed for Acidobacteria, Actinobacteria, Gammaproteobacteria and Verrucomicrobia with regard to N+V fertilization and straw retention. Differential abundances in bacterial groups were confirmed using 16S rRNA gene-targeted phylum-specific primers for real-time PCR analysis in all soil samples, whose results were in accordance with sequence data, except for Gammaproteobacteria. Actinobacteria were more responsive to straw retention, with Rubrobacterales, Bifidobacteriales and Actinomycetales related to the chemical factors of N+V-amended soils. Acidobacteria subgroup 7 and Opitutae, a verrucomicrobial class, were related to the chemical factors of soils without straw retention as a surface blanket. Taken together, the results showed that MB-C and MB-N responded to changes in soil chemical factors and CO2-C and N2O-N emissions, especially for N+V-amended soils. The results also indicated that several taxonomic groups of bacteria, such as Acidobacteria, Actinobacteria and

  14. Multi-Analytical Approach Reveals Potential Microbial Indicators in Soil for Sugarcane Model Systems

    Science.gov (United States)

    Navarrete, Acacio Aparecido; Diniz, Tatiana Rosa; Braga, Lucas Palma Perez; Silva, Genivaldo Gueiros Zacarias; Franchini, Julio Cezar; Rossetto, Raffaella; Edwards, Robert Alan; Tsai, Siu Mui

    2015-01-01

    This study focused on the effects of organic and inorganic amendments and straw retention on the microbial biomass (MB) and taxonomic groups of bacteria in sugarcane-cultivated soils in a greenhouse mesocosm experiment monitored for gas emissions and chemical factors. The experiment consisted of combinations of synthetic nitrogen (N), vinasse (V; a liquid waste from ethanol production), and sugarcane-straw blankets. Increases in CO2-C and N2O-N emissions were identified shortly after the addition of both N and V to the soils, thus increasing MB nitrogen (MB-N) and decreasing MB carbon (MB-C) in the N+V-amended soils and altering soil chemical factors that were correlated with the MB. Across 57 soil metagenomic datasets, Actinobacteria (31.5%), Planctomycetes (12.3%), Deltaproteobacteria (12.3%), Alphaproteobacteria (12.0%) and Betaproteobacteria (11.1%) were the most dominant bacterial groups during the experiment. Differences in relative abundance of metagenomic sequences were mainly revealed for Acidobacteria, Actinobacteria, Gammaproteobacteria and Verrucomicrobia with regard to N+V fertilization and straw retention. Differential abundances in bacterial groups were confirmed using 16S rRNA gene-targeted phylum-specific primers for real-time PCR analysis in all soil samples, whose results were in accordance with sequence data, except for Gammaproteobacteria. Actinobacteria were more responsive to straw retention with Rubrobacterales, Bifidobacteriales and Actinomycetales related to the chemical factors of N+V-amended soils. Acidobacteria subgroup 7 and Opitutae, a verrucomicrobial class, were related to the chemical factors of soils without straw retention as a surface blanket. Taken together, the results showed that MB-C and MB-N responded to changes in soil chemical factors and CO2-C and N2O-N emissions, especially for N+V-amended soils. The results also indicated that several taxonomic groups of bacteria, such as Acidobacteria, Actinobacteria and

  15. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC failure using a statistical approach based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) of a crosstalk-induced current is established, taking into account uncertainties in the input parameters that influence interference levels in the context of transmission lines. The study allowed us to evaluate the probability of failure of the induced current using reliability methods with a relatively low computational cost compared to Monte Carlo simulation. (authors)
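
    For intuition, a brute-force Monte Carlo estimate of such an exceedance probability is easy to write down; the point of the reliability methods used in the paper is to obtain the same quantity far more cheaply. A sketch with an invented crosstalk surrogate and invented parameter distributions:

```python
# Monte Carlo estimate of a probability of failure, P(I > threshold).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Uncertain line parameters (hypothetical distributions).
height = rng.normal(1.5e-2, 2e-3, n)      # wire height above ground [m]
load_z = rng.normal(50.0, 5.0, n)         # termination impedance [ohm]

# Crude surrogate for the crosstalk-induced current [A].
i_induced = 1e-3 * (height / 1.5e-2) * (50.0 / load_z)

threshold = 1.2e-3
p_fail = np.mean(i_induced > threshold)
print(f"P(I > {threshold:g} A) ~ {p_fail:.4f}")
```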

  16. Stochastic approach for round-off error analysis in computing: application to signal processing algorithms

    International Nuclear Information System (INIS)

    Vignes, J.

    1986-01-01

    Any result of an algorithm executed on a computer contains an error resulting from floating-point round-off error propagation. Furthermore, signal processing algorithms are generally performed on data that themselves contain errors. The permutation-perturbation method, also known as CESTAC (contrôle et estimation stochastique des arrondis de calcul), is a very efficient practical method for evaluating these errors and, consequently, for estimating the exact number of significant decimal digits of any result computed by an algorithm. The stochastic approach of this method, its probabilistic proof, and the perfect agreement between its theoretical and practical aspects are described in this paper.
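
    The idea can be caricatured in a few lines: run the same computation a few times under random last-bit perturbations (standing in for random rounding) and read the number of reliable digits off the spread of the results. A toy sketch, not the CESTAC implementation:

```python
# Estimate significant decimal digits from the spread of perturbed runs.
import math
import random

def perturb(x, ulps=1):
    """Randomly flip the sign of a last-bit perturbation of x."""
    return x * (1.0 + random.choice((-1, 1)) * ulps * 2.0**-52)

def unstable_sum(xs):
    s = 0.0
    for x in xs:
        s += perturb(x)            # each operand randomly perturbed
    return s

data = [1.0 / k for k in range(1, 100_000)]
runs = [unstable_sum(data) for _ in range(3)]          # N = 3 runs, as in CESTAC
mean = sum(runs) / len(runs)
std = math.sqrt(sum((r - mean) ** 2 for r in runs) / (len(runs) - 1))
digits = math.log10(abs(mean) / std) if std else 15    # exact significant digits
print(f"result ~ {mean:.15g}, ~{digits:.1f} significant decimal digits")
```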

  17. Information Technology Service Management with Cloud Computing Approach to Improve Administration System and Online Learning Performance

    Directory of Open Access Journals (Sweden)

    Wilianto Wilianto

    2015-10-01

    Full Text Available This work discusses the development of information technology service management using a cloud computing approach to improve the performance of the administration system and online learning at STMIK IBBI Medan, Indonesia. The network topology is modeled and simulated for system administration and online learning. The same network topology is developed in cloud computing using the Amazon AWS architecture. The model is designed and simulated using Riverbed Academic Edition Modeler to obtain values for the following parameters: delay, load, CPU utilization, and throughput. The simulation results are as follows. For network topology 1 without cloud computing, the average delay is 54 ms, load 110,000 bits/s, CPU utilization 1.1%, and throughput 440 bits/s. With cloud computing, the average delay is 45 ms, load 2,800 bits/s, CPU utilization 0.03%, and throughput 540 bits/s. For network topology 2 without cloud computing, the average delay is 39 ms, load 3,500 bits/s, CPU utilization 0.02%, and database server throughput 1,400 bits/s. With cloud computing, the average delay is 26 ms, load 5,400 bits/s, CPU utilization 0.0001% for the email server, 0.001% for the FTP server, and 0.0002% for the HTTP server, with throughput of 85 bits/s for the email server, 100 bits/s for the FTP server, and 95 bits/s for the HTTP server. Thus, the delay, load, and CPU utilization decrease, while the throughput increases. Information technology service management with a cloud computing approach therefore delivers better performance.

  18. A Social Network Approach Reveals Associations between Mouse Social Dominance and Brain Gene Expression

    Science.gov (United States)

    So, Nina; Franks, Becca; Lim, Sean; Curley, James P.

    2015-01-01

    Modelling complex social behavior in the laboratory is challenging and requires analyses of dyadic interactions occurring over time in a physically and socially complex environment. In the current study, we approached the analyses of complex social interactions in group-housed male CD1 mice living in a large vivarium. Intensive observations of social interactions during a 3-week period indicated that male mice form a highly linear and steep dominance hierarchy that is maintained by fighting and chasing behaviors. Individual animals were classified as dominant, sub-dominant or subordinate according to their David’s Scores and I&SI ranking. Using a novel dynamic temporal Glicko rating method, we ascertained that the dominance hierarchy was stable across time. Using social network analyses, we characterized the behavior of individuals within 66 unique relationships in the social group. We identified two individual network metrics, Kleinberg’s Hub Centrality and Bonacich’s Power Centrality, as accurate predictors of individual dominance and power. Comparing across behaviors, we establish that agonistic, grooming and sniffing social networks possess their own distinctive characteristics in terms of density, average path length, reciprocity, out-degree centralization and out-closeness centralization. Though grooming ties between individuals were largely independent of other social networks, sniffing relationships were highly predictive of the directionality of agonistic relationships. Individual variation in dominance status was associated with brain gene expression, with more dominant individuals having higher levels of corticotropin releasing factor mRNA in the medial and central nuclei of the amygdala and the medial preoptic area of the hypothalamus, as well as higher levels of hippocampal glucocorticoid receptor and brain-derived neurotrophic factor mRNA. This study demonstrates the potential and significance of combining complex social housing and intensive
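
    Both named metrics are available in standard graph libraries. A toy sketch using networkx, with Katz centrality used as a close stand-in for Bonacich's power centrality (networkx does not ship a function under that exact name):

```python
# Hub and Katz centrality on a toy directed "wins" graph, where an edge
# u -> v means u dominated v in an agonistic interaction.
import networkx as nx

wins = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "D"), ("B", "D"), ("C", "D")]
G = nx.DiGraph(wins)

hubs, authorities = nx.hits(G)             # Kleinberg's hub/authority scores
katz = nx.katz_centrality(G, alpha=0.1)    # Katz-Bonacich centrality

for node in sorted(G):
    print(f"{node}: hub={hubs[node]:.3f}, katz={katz[node]:.3f}")
```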

  19. Revealing the biotechnological potential of Delftia sp. JD2 by a genomic approach

    Directory of Open Access Journals (Sweden)

    María A. Morel

    2016-04-01

    Full Text Available Delftia sp. JD2 is a chromium-resistant bacterium that reduces Cr(VI) to Cr(III), accumulates Pb(II), produces the phytohormone indole-3-acetic acid and siderophores, and increases the plant-growth performance of rhizobia in co-inoculation experiments. We aimed to analyze the biotechnological potential of JD2 using a genomic approach. JD2 has a genome of 6.76 Mb, with 6,051 predicted protein-coding sequences and 93 RNA genes (tRNA and rRNA). The indole-acetamide pathway was identified as responsible for the synthesis of indole-3-acetic acid. The genetic information involved in chromium resistance (the gene cluster chrBACF) was found. At least 40 putative genes encoding TonB-dependent receptors, probably involved in the utilization of siderophores and biopolymers, were identified, together with genes for the synthesis, maturation, export and uptake of pyoverdine and the acquisition of Fe-pyochelin and Fe-enterobactin. The information also suggests that JD2 produces polyhydroxybutyrate, a carbon-reserve polymer commonly used for manufacturing petrochemical-free bioplastics. In addition, JD2 may degrade lignin-derived aromatic compounds to 2-pyrone-4,6-dicarboxylate, a molecule used in the bio-based polymer industry. Finally, a comparative genomic analysis of JD2, Delftia sp. Cs1-4 and Delftia acidovorans SPH-1 is also discussed. The present work provides insights into the physiology and genetics of a microorganism with many potential uses in biotechnology.

  20. Hybrid sequencing approach applied to human fecal metagenomic clone libraries revealed clones with potential biotechnological applications.

    Science.gov (United States)

    Džunková, Mária; D'Auria, Giuseppe; Pérez-Villarroya, David; Moya, Andrés

    2012-01-01

    Natural environments represent an incredible source of microbial genetic diversity. Discovery of novel biomolecules involves biotechnological methods that often require the design and implementation of biochemical assays to screen clone libraries. However, when an assay is applied to thousands of clones, one may end up with very few positive clones, which, in most cases, have to be "domesticated" for downstream characterization and application; this makes screening both laborious and expensive. The negative clones, which are not detected by the selected assay, may also have biotechnological potential, but unfortunately they remain unexplored. Knowledge of the clone sequences provides important clues about the potential biotechnological applications of the clones in the library; however, sequencing clones one by one would be very time-consuming and expensive. In this study, we characterized the first metagenomic clone library from the feces of a healthy human volunteer, using a method based on 454 pyrosequencing coupled with clone-by-clone Sanger end-sequencing. Instead of sequencing whole individual clones, we sequenced 358 clones in a pool. The medium-large insert (7-15 kb) cloning strategy allowed us to assemble these clones correctly and to assign the clone ends, maintaining the link between the position of a living clone in the library and the annotated contig from the 454 assembly. Finally, we found several open reading frames (ORFs) with previously described potential medical applications. The proposed approach allows planning ad hoc biochemical assays for the clones of interest, and the appropriate sub-cloning strategy for gene expression in suitable vectors/hosts.

  1. Hybrid sequencing approach applied to human fecal metagenomic clone libraries revealed clones with potential biotechnological applications.

    Directory of Open Access Journals (Sweden)

    Mária Džunková

    Full Text Available Natural environments represent an incredible source of microbial genetic diversity. Discovery of novel biomolecules involves biotechnological methods that often require the design and implementation of biochemical assays to screen clone libraries. However, when an assay is applied to thousands of clones, one may end up with very few positive clones, which, in most cases, have to be "domesticated" for downstream characterization and application; this makes screening both laborious and expensive. The negative clones, which are not detected by the selected assay, may also have biotechnological potential, but unfortunately they remain unexplored. Knowledge of the clone sequences provides important clues about the potential biotechnological applications of the clones in the library; however, sequencing clones one by one would be very time-consuming and expensive. In this study, we characterized the first metagenomic clone library from the feces of a healthy human volunteer, using a method based on 454 pyrosequencing coupled with clone-by-clone Sanger end-sequencing. Instead of sequencing whole individual clones, we sequenced 358 clones in a pool. The medium-large insert (7-15 kb) cloning strategy allowed us to assemble these clones correctly and to assign the clone ends, maintaining the link between the position of a living clone in the library and the annotated contig from the 454 assembly. Finally, we found several open reading frames (ORFs) with previously described potential medical applications. The proposed approach allows planning ad hoc biochemical assays for the clones of interest, and the appropriate sub-cloning strategy for gene expression in suitable vectors/hosts.

  2. A Social Network Approach Reveals Associations between Mouse Social Dominance and Brain Gene Expression.

    Directory of Open Access Journals (Sweden)

    Nina So

    Full Text Available Modelling complex social behavior in the laboratory is challenging and requires analyses of dyadic interactions occurring over time in a physically and socially complex environment. In the current study, we approached the analyses of complex social interactions in group-housed male CD1 mice living in a large vivarium. Intensive observations of social interactions during a 3-week period indicated that male mice form a highly linear and steep dominance hierarchy that is maintained by fighting and chasing behaviors. Individual animals were classified as dominant, sub-dominant or subordinate according to their David's Scores and I&SI ranking. Using a novel dynamic temporal Glicko rating method, we ascertained that the dominance hierarchy was stable across time. Using social network analyses, we characterized the behavior of individuals within 66 unique relationships in the social group. We identified two individual network metrics, Kleinberg's Hub Centrality and Bonacich's Power Centrality, as accurate predictors of individual dominance and power. Comparing across behaviors, we establish that agonistic, grooming and sniffing social networks possess their own distinctive characteristics in terms of density, average path length, reciprocity, out-degree centralization and out-closeness centralization. Though grooming ties between individuals were largely independent of other social networks, sniffing relationships were highly predictive of the directionality of agonistic relationships. Individual variation in dominance status was associated with brain gene expression, with more dominant individuals having higher levels of corticotropin releasing factor mRNA in the medial and central nuclei of the amygdala and the medial preoptic area of the hypothalamus, as well as higher levels of hippocampal glucocorticoid receptor and brain-derived neurotrophic factor mRNA. This study demonstrates the potential and significance of combining complex social housing

  3. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presents a computational intelligence approach to minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work comprises NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function estimating the NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air (PA) velocities and six levels of secondary air (SA) velocities were regulated by particle swarm optimization (PSO) so as to achieve low-NOx combustion. To reduce the time required, a more flexible stopping condition was used to improve the computational efficiency without loss of quality in the optimization results. The results showed that the proposed approach provides an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO is that its computational cost, typically less than 25 s on a PC, is much lower than that required for ACO. This means the proposed approach is more applicable to online and real-time NOx emissions minimization in actual power plant boilers.
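
    A bare-bones version of the optimization loop is sketched below, with a toy quadratic standing in for the trained SVR predictor of NOx; the bounds, coefficients and swarm settings are invented:

```python
# Particle swarm optimization over normalized air-velocity settings.
import numpy as np

rng = np.random.default_rng(2)
dim = 10                       # e.g. 4 PA + 6 SA velocity settings
lo, hi = 0.0, 1.0              # normalized velocity bounds

def nox_surrogate(x):          # placeholder for the trained SVR predictor
    return 270.0 + 130.0 * np.mean((x - 0.35) ** 2)

n, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, (n, dim)); v = np.zeros((n, dim))
pbest = x.copy()
pbest_f = np.apply_along_axis(nox_surrogate, 1, x)
g = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = np.clip(x + v, lo, hi)
    f = np.apply_along_axis(nox_surrogate, 1, x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    g = pbest[pbest_f.argmin()].copy()

print(f"predicted NOx at optimum ~ {nox_surrogate(g):.1f} ppm")
```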

  4. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

    Full Text Available People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces depends on the dimensions of both the person and the objects she/he operates. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
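
    Steps (1) and (2) amount to computing pairwise obstacle clearances and merging obstacles whose gap is narrower than the user dimension. A minimal sketch using shapely for the distances and union-find for the grouping (the geometry and the 0.8 m user width are invented):

```python
# Group obstacles whose mutual clearance is smaller than the user dimension.
from shapely.geometry import Polygon

user_width = 0.8  # metres needed by the person plus carried object

obstacles = [
    Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
    Polygon([(1.3, 0), (2.3, 0), (2.3, 1), (1.3, 1)]),  # 0.3 m gap: too narrow
    Polygon([(5, 0), (6, 0), (6, 1), (5, 1)]),          # far away: own group
]

parent = list(range(len(obstacles)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]; i = parent[i]
    return i

# (1) minimum distance between every obstacle pair;
# (2) union groups whose gap is narrower than the user dimension.
for i in range(len(obstacles)):
    for j in range(i + 1, len(obstacles)):
        if obstacles[i].distance(obstacles[j]) < user_width:
            parent[find(i)] = find(j)

groups = {}
for i in range(len(obstacles)):
    groups.setdefault(find(i), []).append(i)
print(list(groups.values()))   # -> [[0, 1], [2]]
```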

  5. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    Science.gov (United States)

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs that regulate gene expression by binding the 3' UTR of target mRNAs and that evolve at different rates in different plant species. In this study we used a homology-based computational approach and identified 27 mature miRNAs for the first time from this biomedically important plant. A phylogenetic tree built from binary presence/absence data for the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogenetic analysis of the miRNAs resolved the above difficulties in miRNA-based phylogeny. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported miRNAs showed sequence conservation in their mature forms, and detailed phylogenetic analysis of the pre-miRNA sequences revealed genus-specific segregation of clusters. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. A direct approach to fault-tolerance in measurement-based quantum computation via teleportation

    International Nuclear Information System (INIS)

    Silva, Marcus; Danos, Vincent; Kashefi, Elham; Ollivier, Harold

    2007-01-01

    We discuss a simple variant of the one-way quantum computing model (Raussendorf R and Briegel H-J 2001 Phys. Rev. Lett. 86 5188), called the Pauli measurement model, where measurements are restricted to be along the eigenbases of the Pauli X and Y operators, while qubits can be initially prepared both in the |+_{π/4}⟩ := (1/√2)(|0⟩ + e^{iπ/4}|1⟩) state and the usual |+⟩ := (1/√2)(|0⟩ + |1⟩) state. We prove the universality of this quantum computation model, and establish a standardization procedure which permits all entanglement and state preparation to be performed at the beginning of the computation. This leads us to develop a direct approach to fault-tolerance by simple transformations of the entanglement graph and preparation operations, while error correction is performed naturally via syndrome-extracting teleportations.

  7. Computational Approach for Studying Optical Properties of DNA Systems in Solution

    DEFF Research Database (Denmark)

    Nørby, Morten Steen; Svendsen, Casper Steinmann; Olsen, Jógvan Magnus Haugaard

    2016-01-01

    In this paper we present a study of the methodological aspects regarding calculations of optical properties for DNA systems in solution. Our computational approach is built upon a fully polarizable QM/MM/Continuum model within a damped linear response theory framework. In this approach the environment is given a highly advanced description in terms of the electrostatic potential through the polarizable embedding model. Furthermore, bulk solvent effects are included in an efficient manner through a conductor-like screening model. With the aim of reducing the computational cost we develop a set of averaged partial charges and distributed isotropic dipole-dipole polarizabilities for DNA suitable for describing the classical region in ground-state and excited-state calculations. Calculations of the UV-spectrum of the 2-aminopurine optical probe embedded in a DNA double helical structure are presented.

  8. Approach and tool for computer animation of fields in electrical apparatus

    International Nuclear Information System (INIS)

    Miltchev, Radoslav; Yatchev, Ivan S.; Ritchie, Ewen

    2002-01-01

    The paper presents a technical approach and post-processing tool for creating and displaying computer animation. The approach enables the handling of two- and three-dimensional physical field results obtained from finite element software, and the display of movement processes in electrical apparatus simulations. The main goal of this work is to extend the auxiliary features built into general-purpose CAD software working in the Windows environment. Different storage techniques were examined and the one employing image capturing was chosen. The developed tool provides the benefits of independent visualisation, the creation of scenarios, and facilities for exporting animations in common file formats for distribution on different computer platforms. It also provides a valuable educational tool. (Author)

  9. An efficient approach for computing the geometrical optics field reflected from a numerically specified surface

    Science.gov (United States)

    Mittra, R.; Rushdi, A.

    1979-01-01

    An approach for computing the geometrical optics field reflected from a numerically specified surface is presented. The approach includes the step of deriving the specular point and begins by computing the rays reflected off the surface at the points where the surface coordinates, as well as the partial derivatives (or, equivalently, the direction of the normal), are numerically specified. Then, a cluster of three adjacent rays is chosen to define a 'mean ray' and the divergence factor associated with this mean ray. Finally, the amplitude, phase, and vector direction of the reflected field at a given observation point are derived by associating this point with the nearest mean ray and determining its position relative to that ray.
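
    The first step is elementary vector geometry: at a surface sample with known normal n, an incident direction d reflects to r = d - 2(d·n)n; the divergence factor is then obtained from how a cluster of three such rays spreads. A minimal numpy sketch of the reflection step:

```python
# Reflect an incident ray direction off a surface point with known normal.
import numpy as np

d = np.array([0.0, -1.0, -1.0]); d /= np.linalg.norm(d)   # incident direction
n = np.array([0.0, 0.0, 1.0])                             # surface normal

r = d - 2.0 * np.dot(d, n) * n                             # reflected direction
print(r)   # -> [ 0.    -0.707  0.707] (approximately)
```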

  10. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

    A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamics (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses Monte Carlo simulation to treat random and systematic error, but alternative statistical approaches are permitted by the program design.

  11. A Context-Aware Ubiquitous Learning Approach for Providing Instant Learning Support in Personal Computer Assembly Activities

    Science.gov (United States)

    Hsu, Ching-Kun; Hwang, Gwo-Jen

    2014-01-01

    Personal computer assembly courses have been recognized as being essential in helping students understand computer structure as well as the functionality of each computer component. In this study, a context-aware ubiquitous learning approach is proposed for providing instant assistance to individual students in the learning activity of a…

  12. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts to enable discovery and more efficient development of Ni-based structural materials and coatings. The project goal was reached through an integrated computation-prediction and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram) modeling, and experimental investigations of compositions relevant to Ni-based superalloys and coatings in terms of oxide layer growth and microstructure stability. The developed description covers composition ranges typical for coating alloys and hence allows the prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools required to meet the increasing demand for strong, ductile and environmentally protective coatings. Specifically, a suitable thermodynamic description of the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from the experiments aided in the development of a thermodynamic modeling automation tool, ESPEI/pycalphad, for more rapid discovery and development of new materials.

  13. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses a distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
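
    The multi-key idea can be illustrated without Hadoop: a single map pass emits intermediate keys prefixed with an algorithm identifier, so one shuffle serves several algorithms at once. A pure-Python toy, not the MRPack implementation:

```python
# One map pass feeding two "algorithms" via compound (algorithm, key) keys.
from collections import defaultdict

records = ["to be or not to be", "to see or not to see"]

def mapper(record):
    for word in record.split():
        yield ("wordcount", word), 1          # algorithm 1: word count
    yield ("linelen", "chars"), len(record)   # algorithm 2: total characters

intermediate = defaultdict(list)
for rec in records:
    for key, value in mapper(rec):
        intermediate[key].append(value)       # shuffle on the compound key

results = {key: sum(vals) for key, vals in intermediate.items()}
for (algo, key), total in sorted(results.items()):
    print(algo, key, total)
```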

  14. A Stochastic Approach for Blurred Image Restoration and Optical Flow Computation on Field Image Sequence

    Institute of Scientific and Technical Information of China (English)

    Gao Wen; Chen Xilin

    1997-01-01

    The blur in target images caused by camera vibration due to robot motion or hand shaking, and by objects moving in the background scene, is difficult to deal with in computer vision systems. In this paper, the authors study the relation model between motion and blur in the case of object motion existing in video image sequences, and work out a practical computation algorithm for both motion analysis and blurred image restoration. Combining general optical flow and stochastic processes, the paper presents an approach by which the motion velocity can be calculated from blurred images. On the other hand, the blurred image can also be restored using the obtained motion information. To overcome the small-motion limitation of general optical flow computation, a multiresolution optical flow algorithm based on MAP estimation is proposed. For restoring the blurred image, an iterative algorithm and the obtained motion velocity are used. The experiments show that the proposed approach works well for both motion velocity computation and blurred image restoration.
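
    As a self-contained illustration of restoring a motion-blurred image when the point-spread function is known, the sketch below uses Richardson-Lucy deconvolution from scikit-image; this is a generic iterative restoration algorithm in the same spirit as, but not identical to, the authors' method (the num_iter argument is the current scikit-image spelling; older releases used iterations):

```python
# Blur a test image with a horizontal motion PSF, then deconvolve it.
import numpy as np
from scipy.signal import convolve2d
from skimage.restoration import richardson_lucy

rng = np.random.default_rng(3)
image = rng.random((64, 64))

psf = np.zeros((1, 9)); psf[0, :] = 1.0 / 9.0   # 9-pixel horizontal motion blur
blurred = convolve2d(image, psf, mode="same", boundary="wrap")

restored = richardson_lucy(blurred, psf, num_iter=30)
print(np.abs(restored - image).mean())           # mean restoration error
```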

  15. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    Science.gov (United States)

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationship (SAR) and optimize the hits, mine the privileged fragments and design focused libraries. Besides, computational approaches were also applied to study protein-ligand interactions mechanisms and in natural product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  16. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.

  17. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  18. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time optimal control of dynamic systems while also considering the constraints to which these systems may be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.
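
    One common event-driven variant triggers a re-solve only when the measured state drifts too far from the model prediction, otherwise replaying the previously computed input sequence. A toy scalar sketch under that assumption (the plant, thresholds, and the placeholder "solver" are all invented):

```python
# Event-driven MPC: re-optimize only on significant model/plant mismatch.
import numpy as np

def solve_mpc(x):                  # placeholder for the full MPC optimization
    return [-0.5 * x] * 5          # toy 5-step input plan

a, b = 0.9, 1.0                    # scalar plant: x+ = a*x + b*u + noise
rng = np.random.default_rng(4)
x, x_pred, plan, solves = 5.0, 5.0, [], 0

for k in range(50):
    if not plan or abs(x - x_pred) > 0.2:      # event: prediction error too big
        plan, solves = solve_mpc(x), solves + 1
        x_pred = x
    u = plan.pop(0)
    x_pred = a * x_pred + b * u                # nominal model prediction
    x = a * x + b * u + rng.normal(0.0, 0.05)  # true (noisy) plant

print(f"{solves} MPC solves in 50 steps, final |x| = {abs(x):.3f}")
```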

  19. A Systematic Protein Refolding Screen Method using the DGR Approach Reveals that Time and Secondary TSA are Essential Variables.

    Science.gov (United States)

    Wang, Yuanze; van Oosterwijk, Niels; Ali, Ameena M; Adawy, Alaa; Anindya, Atsarina L; Dömling, Alexander S S; Groves, Matthew R

    2017-08-24

    Refolding of proteins derived from inclusion bodies is very promising as it can provide a reliable source of target proteins of high purity. However, inclusion body-based protein production is often limited by the lack of techniques for the detection of correctly refolded protein, so refolding conditions are mostly selected by trial and error, a time-consuming process. In this study, we use the latest developments in the differential scanning fluorimetry-guided refolding approach as an analytical method to detect correctly refolded protein. We describe a systematic buffer screen comprising a 96-well primary pH-refolding screen in conjunction with a secondary additive screen. Our research demonstrates that this approach can be applied to determine refolding conditions for several proteins. In addition, it revealed which "helper" molecules, such as arginine, and which additives are essential. Four different proteins, HA-RBD, MDM2, IL-17A and PD-L1, were used to validate our refolding approach. Our systematic protocol evaluates the impact of the "helper" molecules, pH, buffer system and time on the protein refolding process in a high-throughput fashion. Finally, we demonstrate that refolding time and a secondary thermal shift assay buffer screen are critical factors for improving refolding efficiency.

  20. From computer-assisted intervention research to clinical impact: The need for a holistic approach.

    Science.gov (United States)

    Ourselin, Sébastien; Emberton, Mark; Vercauteren, Tom

    2016-10-01

    The early days of the field of medical image computing (MIC) and computer-assisted intervention (CAI), when publishing a strong self-contained methodological algorithm was enough to produce impact, are over. As a community, we now have a substantial responsibility to translate our scientific progress into improved patient care. In the field of computer-assisted interventions, the emphasis is also shifting from the mere use of well-known established imaging modalities and position trackers to the design and combination of innovative sensing, elaborate computational models and fine-grained clinical workflow analysis to create devices with unprecedented capabilities. The barriers to translating such devices in the complex and understandably heavily regulated surgical and interventional environment can seem daunting. Whether we leave the translation task mostly to our industrial partners or welcome, as researchers, an important share of it is up to us. We argue that embracing the complexity of surgical and interventional sciences is mandatory for the evolution of the field. Being able to do so requires large-scale infrastructure and a critical mass of expertise that very few research centres have. In this paper, we emphasise the need for a holistic approach to computer-assisted interventions where clinical, scientific, engineering and regulatory expertise are combined as a means of moving towards clinical impact. To ensure that the breadth of infrastructure and expertise required for translational computer-assisted intervention research does not lead to a situation where the field advances only thanks to a handful of exceptionally large research centres, we also advocate that solutions need to be designed to lower the barriers to entry. Inspired by fields such as particle physics and astronomy, we claim that centralised very large innovation centres with state-of-the-art technology and health technology assessment capabilities backed by core support staff and open

  1. Targeted intervention: Computational approaches to elucidate and predict relapse in alcoholism.

    Science.gov (United States)

    Heinz, Andreas; Deserno, Lorenz; Zimmermann, Ulrich S; Smolka, Michael N; Beck, Anne; Schlagenhauf, Florian

    2017-05-01

    Alcohol use disorder (AUD), and addiction in general, is characterized by failures of choice resulting in repeated drug intake despite severe negative consequences. Behavioral change is hard to accomplish, and relapse after detoxification is common; it can be promoted by consumption of small amounts of alcohol as well as by exposure to alcohol-associated cues or stress. While the environmental factors contributing to relapse have long been identified, the underlying psychological and neurobiological mechanisms on which those factors act are to date incompletely understood. Based on the reinforcing effects of drugs of abuse, animal experiments showed that drug, cue and stress exposure affect Pavlovian and instrumental learning processes, which can increase the salience of drug cues and promote habitual drug intake. In humans, computational approaches can help to quantify changes in key learning mechanisms during the development and maintenance of alcohol dependence, e.g. by using sequential decision making in combination with computational modeling to elucidate individual differences in model-free versus more complex, model-based learning strategies and their neurobiological correlates, such as prediction error signaling in fronto-striatal circuits. Computational models can also help to explain how alcohol-associated cues trigger relapse: mechanisms such as Pavlovian-to-instrumental transfer can quantify the degree to which Pavlovian conditioned stimuli facilitate approach behavior, including alcohol seeking and intake. By using generative models of behavioral and neural data, computational approaches can help to quantify individual differences in the psychophysiological mechanisms that underlie the development and maintenance of AUD and thus promote targeted intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
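
    The simplest of the model-free learning rules fitted in such studies is the Rescorla-Wagner update, in which a cue's value is nudged by a reward prediction error on every trial. A toy sketch with invented numbers:

```python
# Rescorla-Wagner value learning driven by trial-by-trial prediction errors.
import numpy as np

rng = np.random.default_rng(7)
alpha, value = 0.1, 0.0               # learning rate, initial cue value

for trial in range(100):
    reward = rng.binomial(1, 0.8)     # cue is followed by reward on 80% of trials
    delta = reward - value            # prediction error (fronto-striatal signal)
    value += alpha * delta

print(f"learned cue value ~ {value:.2f} (true reward rate 0.8)")
```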

  2. Fourier-based approach to interpolation in single-slice helical computed tomography

    International Nuclear Information System (INIS)

    La Riviere, Patrick J.; Pan Xiaochuan

    2001-01-01

    It has recently been shown that longitudinal aliasing can be a significant and detrimental presence in reconstructed single-slice helical computed tomography (CT) volumes. This aliasing arises because the directly measured data in helical CT are generally undersampled by a factor of at least 2 in the longitudinal direction and because the exploitation of the redundancy of fanbeam data acquired over 360° to generate additional longitudinal samples does not automatically eliminate the aliasing. In this paper we demonstrate that for pitches near 1 or lower, the redundant fanbeam data, when used properly, can provide sufficient information to satisfy a generalized sampling theorem and thus to eliminate aliasing. We develop and evaluate a Fourier-based algorithm, called 180FT, that accomplishes this. As background we present a second Fourier-based approach, called 360FT, that makes use only of the directly measured data. Both Fourier-based approaches exploit the fast Fourier transform and the Fourier shift theorem to generate from the helical projection data a set of fanbeam sinograms corresponding to equispaced transverse slices. Slice-by-slice reconstruction is then performed by use of two-dimensional fanbeam algorithms. The proposed approaches are compared to their counterparts based on the use of linear interpolation - the 360LI and 180LI approaches. The aliasing suppression property of the 180FT approach is a clear advantage of the approach and represents a step toward the desirable goal of achieving uniform longitudinal resolution properties in reconstructed helical CT volumes
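
    The workhorse behind these algorithms is the discrete Fourier shift theorem: a band-limited periodic signal can be resampled at a fractional longitudinal offset exactly, without linear interpolation. A small numpy demonstration on a toy signal, not CT data:

```python
# Fractional-sample shift via the FFT and the Fourier shift theorem.
import numpy as np

n = 64
z = np.arange(n)
signal = np.sin(2 * np.pi * 3 * z / n)        # toy band-limited projection row

shift = 0.37                                  # fractional-sample shift
freqs = np.fft.fftfreq(n)                     # frequencies in cycles/sample
shifted = np.fft.ifft(np.fft.fft(signal) * np.exp(-2j * np.pi * freqs * shift)).real

exact = np.sin(2 * np.pi * 3 * (z - shift) / n)
print(np.abs(shifted - exact).max())          # ~1e-15: exact for band-limited data
```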

  3. Allele Age Under Non-Classical Assumptions is Clarified by an Exact Computational Markov Chain Approach.

    Science.gov (United States)

    De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason

    2017-09-19

    Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to Ne = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
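
    The computational core is standard absorbing-Markov-chain algebra on a sparse matrix: restrict the transition matrix to the transient states, Q, and solve (I - Q)t = 1. The sketch below does this for expected absorption times of a small neutral Wright-Fisher chain; the paper's allele-age quantities additionally condition on the currently observed frequency.

```python
# Expected absorption times of a neutral Wright-Fisher chain via a sparse solve.
import numpy as np
from scipy.sparse import identity, csr_matrix
from scipy.sparse.linalg import spsolve
from scipy.stats import binom

N = 200                                        # haploid population size
j = np.arange(1, N)                            # transient allele counts 1..N-1
P = binom.pmf(np.arange(N + 1)[None, :], N, j[:, None] / N)
Q = csr_matrix(P[:, 1:N])                      # transitions among transient states

t = spsolve((identity(N - 1) - Q).tocsc(), np.ones(N - 1))
print(f"expected absorption time from 1 copy: {t[0]:.1f} generations")
```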

  4. Computational Approaches for Prediction of Pathogen-Host Protein-Protein Interactions

    Directory of Open Access Journals (Sweden)

    Esmaeil eNourani

    2015-02-01

    Full Text Available Infectious diseases are still among the major and most prevalent health problems, mostly because of the drug resistance of novel variants of pathogens. Molecular interactions between pathogens and their hosts are the key part of the infection mechanisms. Novel antimicrobial therapeutics to fight drug resistance are only possible given a thorough understanding of pathogen-host interaction (PHI) systems. Existing databases, which contain experimentally verified PHI data, suffer from a scarcity of reported interactions due to the technically challenging and time-consuming process of experiments. This has motivated many researchers to address the problem by proposing computational approaches for the analysis and prediction of PHIs. The computational methods primarily utilize sequence information, protein structure and known interactions. Classic machine learning techniques are used when there are sufficient known interactions to be used as training data; in the opposite case, transfer and multitask learning methods are preferred. Here, we present an overview of these computational approaches for PHI prediction, discussing their weaknesses and abilities, with future directions.
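
    For the "classic machine learning" route, a hypothetical mini-example: represent a pathogen-host protein pair by simple amino acid composition features and train an off-the-shelf classifier. All sequences and labels below are randomly invented, so the fitted model is meaningless; only the feature construction is the point.

```python
# Pair features from amino acid composition, fed to a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

AAS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    return np.array([seq.count(a) / len(seq) for a in AAS])

def pair_features(pathogen_seq, host_seq):
    return np.concatenate([composition(pathogen_seq), composition(host_seq)])

rng = np.random.default_rng(8)
random_seq = lambda n: "".join(rng.choice(list(AAS), n))

# Invented training pairs: label 1 = interacting, 0 = not interacting.
X = np.array([pair_features(random_seq(80), random_seq(80)) for _ in range(200)])
y = rng.integers(0, 2, 200)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:3]))
```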

  5. Distributional and Knowledge-Based Approaches for Computing Portuguese Word Similarity

    Directory of Open Access Journals (Sweden)

    Hugo Gonçalo Oliveira

    2018-02-01

    Full Text Available Identifying similar and related words is not only key in natural language understanding but is also a suitable task for assessing the quality of computational resources that organise the words and meanings of a language, compiled by different means. This paper, which aims to be a reference for those interested in computing word similarity in Portuguese, presents several approaches to this task and is motivated by the recent availability of state-of-the-art distributional models of Portuguese words, which add to the several lexical knowledge bases (LKBs) for this language that have been available for a longer time. These resources were exploited to answer word similarity tests, which also became available for Portuguese only recently. We conclude that there are several valid approaches to this task, but no single one outperforms all the others in every test. Distributional models seem to capture relatedness better, while LKBs are better suited to computing genuine similarity; in general, however, better results are obtained when knowledge from different sources is combined.
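
    The distributional side of such comparisons reduces to cosine similarity between word vectors. A toy example with invented co-occurrence counts; real evaluations use corpus-trained Portuguese embeddings:

```python
# Cosine similarity between toy distributional word vectors.
import numpy as np

# Rows are invented co-occurrence counts with four context words.
vectors = {
    "gato":     np.array([3.0, 0.0, 9.0, 4.0]),   # cat
    "cachorro": np.array([4.0, 1.0, 8.0, 6.0]),   # dog
    "bolo":     np.array([7.0, 9.0, 0.0, 0.0]),   # cake
}

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cos(vectors["gato"], vectors["cachorro"]))  # high: related words
print(cos(vectors["gato"], vectors["bolo"]))      # low: unrelated words
```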

  6. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Full Text Available Computer vision-based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex, and the identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open-source software. The source code for the data analysis is written in R. The equations to calculate image descriptors have also been provided.
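
    A minimal Python analogue of the pipeline (the paper's own analysis code is in R), run on synthetic blobs rather than rosette photographs: segment, extract region descriptors, then summarize shape variation with PCA:

```python
# Segmentation -> shape descriptors -> PCA, on synthetic elliptical "rosettes".
import numpy as np
from skimage.draw import ellipse
from skimage.measure import label, regionprops
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
rows = []
for _ in range(20):                               # 20 synthetic "rosettes"
    img = np.zeros((100, 100), dtype=np.uint8)
    rr, cc = ellipse(50, 50, rng.integers(15, 40), rng.integers(15, 40))
    img[rr, cc] = 1
    p = regionprops(label(img))[0]                # segmentation -> one region
    rows.append([p.area, p.perimeter, p.eccentricity, p.solidity])

X = StandardScaler().fit_transform(np.array(rows))
pca = PCA(n_components=2).fit(X)
print(pca.explained_variance_ratio_)              # shape variance per component
```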

  7. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large-scale computational models. By using the principles of the Efficient Subspace Method (ESM), low-rank approximations of the first- and higher-order derivatives can be calculated with greatly reduced computational resources. The output obtained from nuclear reactor calculations typically has a much smaller numerical rank than the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined, according to ESM, by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined, and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium-cooled reactors. (authors)
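
    The rank-finding step can be illustrated by sampling the model response at random inputs and reading the effective rank off the singular values; here a toy linear model with a hidden rank of 3 stands in for the reactor code, and plain function evaluations stand in for AD derivatives:

```python
# Estimate the effective rank of a model from random samples via SVD.
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 200))  # hidden rank-3 model
f = lambda x: A @ x                                        # 500 outputs, 200 inputs

samples = np.column_stack([f(rng.normal(size=200)) for _ in range(10)])
s = np.linalg.svd(samples, compute_uv=False)
rank = int(np.sum(s > 1e-8 * s[0]))
print(f"effective rank ~ {rank}")   # -> 3: only 3 pseudo variables are needed
```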

  8. New Approaches to the Computer Simulation of Amorphous Alloys: A Review.

    Science.gov (United States)

    Valladares, Ariel A; Díaz-Celaya, Juan A; Galván-Colín, Jonathan; Mejía-Mendoza, Luis M; Reyes-Retana, José A; Valladares, Renela M; Valladares, Alexander; Alvarez-Ramirez, Fernando; Qu, Dongdong; Shen, Jun

    2011-04-13

    In this work we review our new methods to computer-generate amorphous atomic topologies of several binary alloys: SiH, SiN, CN; binary systems based on group IV elements like SiC; the GeSe₂ chalcogenide; aluminum-based systems: AlN and AlSi; and the CuZr amorphous alloy. We use an ab initio approach based on density functionals and computationally thermally-randomized periodically-continued cells with at least 108 atoms. The computational thermal process to generate the amorphous alloys is the undermelt-quench approach, or one of its variants, which consists of linearly heating the samples to just below their melting (or liquidus) temperatures, and then linearly cooling them afterwards. These processes are carried out from initial crystalline conditions using short and long time steps. We find that a time step four times the default is adequate for most of the simulations. Radial distribution functions (partial and total) are calculated and compared whenever possible with experimental results, and the agreement is very good. For some materials we report studies of the effect of the topological disorder on their electronic and vibrational densities of states and on their optical properties.
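
    The radial distribution functions mentioned above are a short computation once final atomic positions are available. The sketch below is a generic total g(r) for a cubic periodic cell with the minimum-image convention; the paper additionally computes partial RDFs per element pair, which this simplification omits.

```python
# A minimal sketch of a total radial distribution function g(r) for a cubic cell.
import numpy as np

def rdf(positions, box, nbins=100, rmax=None):
    n = len(positions)
    rmax = rmax or box / 2
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                     # minimum-image convention
    r = np.linalg.norm(d, axis=-1)[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=nbins, range=(0.0, rmax))
    rho = n / box ** 3                               # number density
    shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    g = hist / (shell * rho * n / 2)                 # normalise by ideal-gas pairs
    return 0.5 * (edges[1:] + edges[:-1]), g

rng = np.random.default_rng(0)
r_mid, g = rdf(rng.uniform(0.0, 10.0, (108, 3)), box=10.0)  # 108 random "atoms"
print(g[:5])
```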

  9. A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro [Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy and Department of Oncology, University of Calgary and Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)

    2012-06-15

    Purpose: To investigate and validate the clinical feasibility of using half-value layer (HVL) and peak tube potential (kVp) for characterizing a kilovoltage (kV) source spectrum for the purpose of computing kV x-ray dose accrued from imaging procedures. To use this approach to characterize a Varian® On-Board Imager® (OBI) source and perform experimental validation of a novel in-house hybrid dose computation algorithm for kV x-rays. Methods: We characterized the spectrum of an imaging kV x-ray source using the HVL and the kVp as the sole beam quality identifiers, using third-party freeware Spektr to generate the spectra. We studied the sensitivity of our dose computation algorithm to uncertainties in the beam's HVL and kVp by systematically varying these spectral parameters. To validate our approach experimentally, we characterized the spectrum of a Varian® OBI system by measuring the HVL using a Farmer-type Capintec ion chamber (0.06 cc) in air and compared dose calculations using our computationally validated in-house kV dose calculation code to measured percent depth-dose and transverse dose profiles for 80, 100, and 125 kVp open beams in a homogeneous phantom and a heterogeneous phantom comprising tissue, lung, and bone equivalent materials. Results: The sensitivity analysis of the beam quality parameters (i.e., HVL, kVp, and field size) on dose computation accuracy shows that typical measurement uncertainties in the HVL and kVp (±0.2 mm Al and ±2 kVp, respectively) source characterization parameters lead to dose computation errors of less than 2%. Furthermore, for an open beam with no added filtration, HVL variations affect dose computation accuracy by less than 1% for a 125 kVp beam when field size is varied from 5 × 5 cm² to 40 × 40 cm². The central axis depth dose calculations and experimental measurements for the 80, 100, and 125 kVp energies agreed within
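
    The HVL-based characterization itself is numerically light: given a candidate spectrum, find the aluminium thickness that halves the air kerma, and adjust the spectrum model until the computed HVL matches the measurement. The sketch below shows that forward computation with an illustrative three-bin spectrum and made-up coefficients; it is not the Spektr toolkit's API.

```python
# A minimal sketch of computing the HVL of a known spectrum. The three-bin
# spectrum and all coefficients below are illustrative numbers only.
import numpy as np
from scipy.optimize import brentq

E        = np.array([30.0, 60.0, 90.0])     # photon energies, keV
phi      = np.array([1.0, 2.0, 0.5])        # relative fluence per bin
mu_al    = np.array([0.30, 0.08, 0.05])     # Al linear attenuation, 1/mm
muen_air = np.array([0.015, 0.003, 0.002])  # air mass energy-absorption coefficient

def air_kerma(fluence):
    """Air kerma up to a constant: fluence-weighted energy absorption."""
    return np.sum(fluence * E * muen_air)

def hvl_mm_al():
    k0 = air_kerma(phi)
    half = lambda t: air_kerma(phi * np.exp(-mu_al * t)) - 0.5 * k0
    return brentq(half, 0.0, 50.0)           # thickness that halves the kerma

print(f"HVL = {hvl_mm_al():.2f} mm Al")
```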

  10. A Crafts-Oriented Approach to Computing in High School: Introducing Computational Concepts, Practices, and Perspectives with Electronic Textiles

    Science.gov (United States)

    Kafai, Yasmin B.; Lee, Eunkyoung; Searle, Kristin; Fields, Deborah; Kaplan, Eliot; Lui, Debora

    2014-01-01

    In this article, we examine the use of electronic textiles (e-textiles) for introducing key computational concepts and practices while broadening perceptions about computing. The starting point of our work was the design and implementation of a curriculum module using the LilyPad Arduino in a pre-AP high school computer science class. To…

  11. DIRProt: a computational approach for discriminating insecticide resistant proteins from non-resistant proteins.

    Science.gov (United States)

    Meher, Prabina Kumar; Sahu, Tanmaya Kumar; Banchariya, Anjali; Rao, Atmakuri Ramakrishna

    2017-03-24

    Insecticide resistance is a major challenge for insect pest control programs in the fields of crop protection and human and animal health. Resistance to different insecticides is conferred by proteins encoded by certain classes of insect genes. To distinguish insecticide resistant proteins from non-resistant proteins, no computational tool has been available to date. Thus, the development of such a computational tool will be helpful in predicting insecticide resistant proteins, which can be targeted for developing appropriate insecticides. Five different feature sets, viz. amino acid composition (AAC), di-peptide composition (DPC), pseudo amino acid composition (PAAC), composition-transition-distribution (CTD) and auto-correlation function (ACF), were used to map the protein sequences into numeric feature vectors. The encoded numeric vectors were then used as input to a support vector machine (SVM) for classification of insecticide resistant and non-resistant proteins. Higher accuracies were obtained with the RBF kernel than with other kernels. Further, accuracies were observed to be higher for the DPC feature set as compared to the others. The proposed approach achieved an overall accuracy of >90% in discriminating resistant from non-resistant proteins. Further, the two classes of resistant proteins, i.e., detoxification-based and target-based, were discriminated from non-resistant proteins with >95% accuracy. Besides, >95% accuracy was also observed for discrimination of proteins involved in detoxification- and target-based resistance mechanisms. The proposed approach not only outperformed the Blastp, PSI-Blast and Delta-Blast algorithms, but also achieved >92% accuracy when assessed using an independent dataset of 75 insecticide resistant proteins. This paper presents the first computational approach for discriminating insecticide resistant proteins from non-resistant proteins. Based on the proposed approach, an online prediction server DIRProt has
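
    Of the five feature sets, amino acid composition is the simplest to reproduce, and it pairs directly with the RBF-kernel SVM described. The sketch below uses toy sequences and untuned hyperparameters; it shows the encoding-plus-classifier pattern, not the settings behind the reported accuracies.

```python
# A minimal sketch of the AAC encoding + RBF-SVM pattern; sequences, labels and
# hyperparameters are illustrative, not the paper's data or tuned values.
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"

def aac(seq):
    """20-dimensional amino acid composition vector."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AA])

sequences = ["MKTAYIAKQR", "GAVLILVMFW", "MSDQEKNNEQ", "PPGFSPFRAA"]  # toy data
labels    = [1, 1, 0, 0]                                             # 1 = resistant

X = np.array([aac(s) for s in sequences])
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, labels)
print(clf.predict(X))
```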

  12. Soft computing approach to 3D lung nodule segmentation in CT.

    Science.gov (United States)

    Badura, P; Pietka, E

    2014-10-01

    This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm: mask generation. Its main goal is to handle some specific types of nodules connected to the pleura or vessels. It consists of basic image processing operations as well as dedicated routines for the specific cases of nodules. The evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release, the LIDC-IDRI (Image Database Resource Initiative) database. Copyright © 2014 Elsevier Ltd. All rights reserved.
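
    The FC branch rests on a standard construction: a path's strength is its weakest link (the minimum affinity along it), and each pixel's connectedness to the seed is the strength of its strongest path, computable with a Dijkstra-style sweep. The sketch below is a 2D toy with a simple intensity affinity; the paper's affinity terms and the evolutionary optimization of seeds are not reproduced.

```python
# A minimal 2D sketch of fuzzy connectedness from a single seed point.
import heapq
import numpy as np

def affinity(a, b):
    """Simple intensity-based affinity in (0, 1]; the paper uses richer terms."""
    return 1.0 / (1.0 + abs(float(a) - float(b)))

def fuzzy_connectedness(img, seed):
    conn = np.zeros(img.shape)
    conn[seed] = 1.0
    heap = [(-1.0, seed)]                   # max-heap via negated strengths
    while heap:
        strength, (r, c) = heapq.heappop(heap)
        strength = -strength
        if strength < conn[r, c]:
            continue                        # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]:
                cand = min(strength, affinity(img[r, c], img[nr, nc]))
                if cand > conn[nr, nc]:
                    conn[nr, nc] = cand
                    heapq.heappush(heap, (-cand, (nr, nc)))
    return conn                             # threshold this map to segment

img = np.full((32, 32), 100.0)
img[8:24, 8:24] = 180.0                     # bright "nodule" on a dark background
conn = fuzzy_connectedness(img, (16, 16))   # seed inside the object
print((conn > 0.5).sum(), "pixels above threshold")
```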

  13. Analysis of problem solving on project based learning with resource based learning approach computer-aided program

    Science.gov (United States)

    Kuncoro, K. S.; Junaedi, I.; Dwijanto

    2018-03-01

    This study aimed to reveal the effectiveness of Project Based Learning with a Resource Based Learning approach in a computer-aided program, and analyzed problem-solving abilities in terms of problem-solving steps based on Polya's stages. The research method used was a mixed method with a sequential explanatory design. The subjects of this research were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with a computer program, was less precise in writing the final conclusion, and could not reflect on the problem-solving process using Polya's steps. The W-BPS (Weak Bottom Problem Solving) subject failed to meet almost all of the problem-solving indicators and could not correctly construct the initial completion table, so the completion phase following Polya's steps was constrained.

  14. Tailor-made Design of Chemical Blends using Decomposition-based Computer-aided Approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Manan, Zainuddin Abd.; Gernaey, Krist

    Computer-aided techniques are an efficient approach to solving chemical product design problems such as the design of blended liquid products (chemical blending). In chemical blending, one tries to find the best candidate, which satisfies the product targets defined in terms of desired product attributes (properties). The design problem is decomposed into two stages. The first stage investigates mixture stability, where all unstable mixtures are eliminated and the stable blend candidates are retained for further testing. In the second stage, the blend candidates have to satisfy a ranked set of target properties. In this way, the systematic computer-aided technique first establishes the search space and then narrows it down in subsequent steps until a small number of feasible and promising candidates remain; experimental work may then be conducted to verify whether any or all of the candidates satisfy the targets.

  15. Solubility of magnetite in high temperature water and an approach to generalized solubility computations

    International Nuclear Information System (INIS)

    Dinov, K.; Ishigure, K.; Matsuura, C.; Hiroishi, D.

    1993-01-01

    Magnetite solubility in pure water was measured at 423 K in a fully teflon-covered autoclave system. A fairly good agreement was found to exist between the experimental data and calculation results obtained from the thermodynamical model, based on the assumption of Fe₃O₄ dissolution and Fe₂O₃ deposition reactions. A generalized thermodynamical approach to solubility computations under complex conditions, on the basis of minimization of the total system Gibbs free energy, was proposed. The forms of the chemical equilibria were obtained for various initially defined systems and successfully justified by the subsequent computations. A [Fe³⁺]_T–[Fe²⁺]_T phase diagram was introduced as a tool for systematic understanding of the magnetite dissolution phenomena in pure water and under oxidizing and reducing conditions. (orig.)

  16. A Hybrid Autonomic Computing-Based Approach to Distributed Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Abhishek Bhatia

    2015-03-01

    Full Text Available Distributed constraint satisfaction problems (DisCSPs) are among the widely endeavored problems using agent-based simulation. Fernandez et al. formulated the sensor and mobile tracking problem as a DisCSP, known as SensorDCSP. In this paper, we adopt a customized ERE (environment, reactive rules and entities) algorithm for the SensorDCSP, which is otherwise proven to be a computationally intractable problem. An amalgamation of the autonomy-oriented computing (AOC) based algorithm (ERE) and a genetic algorithm (GA) provides an early solution of the modeled DisCSP. Incorporation of the GA into ERE facilitates auto-tuning of the simulation parameters, thereby leading to an early solution of constraint satisfaction. This study further contributes a model, built in the NetLogo simulation environment, to infer the efficacy of the proposed approach.

  17. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing High Throughput Computing applications, common in High Energy Physics, work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  18. Functional gene profiling through metaRNAseq approach reveals diet-dependent variation in rumen microbiota of buffalo (Bubalus bubalis).

    Science.gov (United States)

    Hinsu, Ankit T; Parmar, Nidhi R; Nathani, Neelam M; Pandit, Ramesh J; Patel, Anand B; Patel, Amrutlal K; Joshi, Chaitanya G

    2017-04-01

    Recent advances in next generation sequencing technology have enabled analysis of complex microbial communities from the genome to the transcriptome level. In the present study, a metatranscriptomic approach was applied to elucidate the functionally active bacteria and their biological processes in the rumen of buffalo (Bubalus bubalis) adapted to different dietary treatments. Buffaloes were adapted to diets containing 50:50, 75:25 and 100:0 forage to concentrate ratios, each for 6 weeks, before ruminal content sample collection. Metatranscriptomes from rumen fiber-adherent and fiber-free active bacteria were sequenced using the Ion Torrent PGM platform, followed by annotation using the MG-RAST server and the CAZymes (Carbohydrate Active Enzymes) analysis toolkit. In all samples Bacteroidetes was the most abundant phylum, followed by Firmicutes. Functional analysis using the KEGG Orthology database revealed Metabolism as the most abundant category at level 1, within which Carbohydrate metabolism was dominant. Diet treatments also exerted significant differences in the proportion of enzymes involved in metabolic pathways for VFA production. Carbohydrate Active Enzyme (CAZy) analysis revealed an abundance of genes encoding glycoside hydrolases, with the highest representation of the GH13 CAZy family in all samples. The findings provide an overview of the activities occurring in the rumen as well as the active bacterial population and the changes occurring under different dietary treatments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Omics approaches on fresh-cut lettuce reveal global molecular responses to sodium hypochlorite and peracetic acid treatment.

    Science.gov (United States)

    Daddiego, Loretta; Bianco, Linda; Capodicasa, Cristina; Carbone, Fabrizio; Dalmastri, Claudia; Daroda, Lorenza; Del Fiore, Antonella; De Rossi, Patrizia; Di Carli, Mariasole; Donini, Marcello; Lopez, Loredana; Mengoni, Alessio; Paganin, Patrizia; Perrotta, Gaetano; Bevivino, Annamaria

    2018-01-01

    Lettuce is a leafy vegetable that is extensively commercialized as a ready-to-eat product because of its widespread use in human nutrition as salad. It is well known that washing treatments can severely affect the quality and shelf-life of ready-to-eat vegetables. The study presented here evaluated the effect of two washing procedures on fresh-cut lettuce during storage. An omics approach was applied to reveal global changes at the molecular level induced by peracetic acid washing in comparison with sodium hypochlorite treatment. Microbiological analyses were also performed to quantify total bacterial abundance and composition. The study revealed wide metabolic alterations induced by the two sanitizers. In particular, transcriptomic and proteomic analyses pointed out a number of transcripts and proteins differentially accumulated in response to peracetic acid washing, mainly occurring on the first day of storage. In parallel, a different microbiota composition and a significant reduction in total bacterial load following washing were also observed. The results provide useful information for the fresh-cut industry to select an appropriate washing procedure preserving fresh-like attributes as much as possible during storage of the end product. Molecular evidence indicated peracetic acid to be a valid alternative to sodium hypochlorite as a sanitizer solution. © 2017 Society of Chemical Industry.

  20. A ChIP-chip approach reveals a novel role for transcription factor IRF1 in the DNA damage response.

    Science.gov (United States)

    Frontini, Mattia; Vijayakumar, Meeraa; Garvin, Alexander; Clarke, Nicole

    2009-03-01

    IRF1 is a transcription factor that regulates key processes in the immune system and in tumour suppression. To gain further insight into IRF1's role in these processes, we searched for new target genes by performing chromatin immunoprecipitation coupled to a CpG island microarray (ChIP-chip). Using this approach we identified 202 new IRF1-binding sites with high confidence. Functional categorization of the target genes revealed a surprising cadre of new roles that can be linked to IRF1. One of the major functional categories was the DNA damage response pathway. In order to further validate our findings, we show that IRF1 can regulate the mRNA expression of a number of the DNA damage response genes in our list. In particular, we demonstrate that the mRNA and protein levels of the DNA repair protein BRIP1 [Fanconi anemia gene J (FANC J)] are upregulated after IRF1 over-expression. We also demonstrate that knockdown of IRF1 by siRNA results in loss of BRIP1 expression, abrogation of BRIP1 foci after DNA interstrand crosslink (ICL) damage and hypersensitivity to the DNA crosslinking agent, melphalan; a characteristic phenotype of FANC J cells. Taken together, our data provides a more complete understanding of the regulatory networks controlled by IRF1 and reveals a novel role for IRF1 in regulating the ICL DNA damage response.

  2. Computed tomography angiography reveals stenosis and aneurysmal dilation of an aberrant right subclavian artery causing systemic blood pressure misreading in an old Pekinese dog.

    Science.gov (United States)

    Kim, Jaehwan; Eom, Kidong; Yoon, Hakyoung

    2017-06-16

    A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph.

  3. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

    In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues are examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention, for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo-matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization, and in the use of the modified census transformation for stereo matching along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. What is promising is that the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
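
    The stereo-matching ingredient is easiest to see in its unmodified form: the census transform encodes each pixel's local intensity ordering as a bit string, and the matching cost at a candidate disparity is the Hamming distance between left and right codes. The sketch below implements the plain transform on numpy images (the paper uses a modified variant, which is not reproduced here); np.roll wraps at borders, which a real implementation would mask out.

```python
# A minimal sketch of the census transform and Hamming matching cost.
import numpy as np

def census(img, w=3):
    """Per-pixel bit string: neighbours brighter than the centre map to 1."""
    out = np.zeros(img.shape, dtype=np.uint64)
    r = w // 2
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (shifted > img).astype(np.uint64)
    return out

def hamming_cost(cl, cr, d):
    """Hamming distance between left codes and right codes shifted by disparity d."""
    x = cl ^ np.roll(cr, d, axis=1)
    bits = np.unpackbits(x.view(np.uint8).reshape(*x.shape, 8), axis=-1)
    return bits.sum(axis=-1)

rng = np.random.default_rng(0)
L = rng.integers(0, 255, (20, 30)).astype(float)
R = np.roll(L, -3, axis=1)                  # synthetic right view
costs = {d: hamming_cost(census(L), census(R), d).mean() for d in range(6)}
print(min(costs, key=costs.get))            # lowest mean cost at the true shift, 3
```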

  5. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    Science.gov (United States)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization such as Carbon Capture and Storage (CCS) requires an integrated risk assessment approach that considers the coupled processes involved, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Owing to the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. In this way, the iterative parameter exchange between the multiphase and mechanical simulators is avoided, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure can reduce the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.

  6. Computer assisted collimation gamma camera: A new approach to imaging contaminated tissues

    International Nuclear Information System (INIS)

    Quartuccio, M.; Franck, D.; Pihet, P.; Begot, S.; Jeanguillaume, C.

    2000-01-01

    Measurement systems with the capability of imaging tissues contaminated with radioactive materials would find relevant applications in medical physics research and possibly in health physics. The latter in particular depends critically on the performance achieved for sensitivity and spatial resolution. An original approach of computer assisted collimation gamma camera (French acronym CACAO) which could meet suitable characteristics has been proposed elsewhere. CACAO requires detectors with high spatial resolution. The present work was aimed at investigating the application of the CACAO principle on a laboratory testing bench using silicon detectors made of small pixels. (author)

  8. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, integrating these failures into a system model to estimate the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in seismic design of nuclear power plants

  9. Tensor Voting A Perceptual Organization Approach to Computer Vision and Machine Learning

    CERN Document Server

    Mordohai, Philippos

    2006-01-01

    This lecture presents research on a general framework for perceptual organization that was conducted mainly at the Institute for Robotics and Intelligent Systems of the University of Southern California. It is not written as a historical account of the work, since the presentation does not follow chronological order. It aims at presenting an approach to a wide range of problems in computer vision and machine learning that is data-driven, local and requires a minimal number of assumptions. The tensor voting framework combines these properties and provides a unified perceptual organization approach.

  10. Parallel computations of molecular dynamics trajectories using the stochastic path approach

    Science.gov (United States)

    Zaloj, Veaceslav; Elber, Ron

    2000-06-01

    A novel protocol to parallelize molecular dynamics trajectories is discussed and tested on a cluster of PCs running the NT operating system. The new technique does not propagate the solution in small time steps, but uses instead a global optimization of a functional of the whole trajectory. The new approach is especially attractive for parallel and distributed computing and its advantages (and disadvantages) are presented. Two numerical examples are discussed: (a) A conformational transition in a solvated dipeptide, and (b) The R→T conformational transition in solvated hemoglobin.

  11. Interpretation of computed tomography imaging of the eye and orbit. A systematic approach

    Directory of Open Access Journals (Sweden)

    Naik Milind

    2002-01-01

    Full Text Available Computed tomography (CT has revolutionised the diagnosis and management of ocular and orbital diseases. The use of thin sections with multiplanar scanning (axial, coronal and sagittal planes and the possibility of three-dimensional reconstruction permits thorough evaluation. To make the most of this technique, users must familiarize themselves with the pertinent CT principles and terminology. The diagnostic yield is optimal when the ophthalmologist and radiologist collaborate in the radiodiagnostic workup. In this article we describe a systematic approach to the interpretation of ocular and orbital CT scans.

  12. Approach for discrimination and quantification of electroactive species: kinetics difference revealed by higher harmonics of Fourier transformed sinusoidal voltammetry.

    Science.gov (United States)

    Fang, Yishan; Huang, Xinjian; Wang, Lishi

    2015-01-06

    Discrimination and quantification of electroactive species are traditionally realized by a potential difference, which is mainly determined by thermodynamics. However, the resolution of this approach is limited to tens of millivolts. In this paper, we describe an application of Fourier transformed sinusoidal voltammetry (FT-SV) that provides a new approach for the discrimination and quantitative evaluation of electroactive species, especially thermodynamically similar ones. Numerical simulation indicates that the electron transfer kinetics difference between electroactive species can be revealed by the phase angle of higher order harmonics of FT-SV, and the difference can be amplified order by order. Thus, even a very subtle kinetics difference can be amplified to be distinguishable at a certain order of harmonics. This method was verified with structurally similar ferrocene derivatives chosen as model systems. Although these molecules have very close redox potentials, they could be distinguished at higher order harmonics. The results demonstrated the feasibility and reliability of the method. It was also implied that the combination of the traditional thermodynamic method and this kinetics method can form a two-dimensionally resolved detection method, and that it has the potential to extend the resolution of voltammetric techniques to a new level.
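
    The signal-processing core — reading amplitude and phase off individual harmonics of the current response — is a short FFT exercise. The sketch below drives a toy odd nonlinearity with a sinusoidal perturbation and extracts the first odd harmonics; it illustrates the analysis principle only, not the electrochemical model simulated in the paper.

```python
# A minimal sketch: harmonic amplitudes and phases of a nonlinear current response.
import numpy as np

fs, f0, T = 10_000.0, 10.0, 1.0            # sampling rate (Hz), excitation, duration
t = np.arange(0.0, T, 1.0 / fs)
e = 0.05 * np.sin(2 * np.pi * f0 * t)      # sinusoidal potential perturbation (V)
rng = np.random.default_rng(0)
i = np.tanh(e / 0.0257) + 1e-3 * rng.normal(size=t.size)  # toy nonlinear current

spec = np.fft.rfft(i)
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

for n in (1, 3, 5):                        # odd harmonics of an odd nonlinearity
    k = np.argmin(np.abs(freqs - n * f0))  # FFT bin of the n-th harmonic
    print(f"harmonic {n}: |I| = {np.abs(spec[k]):.3g}, "
          f"phase = {np.angle(spec[k]):.3f} rad")
```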

  13. Computational approaches in the design of synthetic receptors – A review

    Energy Technology Data Exchange (ETDEWEB)

    Cowen, Todd, E-mail: tc203@le.ac.uk; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as “plastic antibodies” – high affinity robust synthetic receptors which can be optimally designed, and produced for a much reduced cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the published relevant literature since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller–Plesset, etc.) or DFT) and the purpose for which they were used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of MIP polymerisation reaction. The further advances in molecular modelling and computational design of synthetic receptors in particular will have serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. - Highlights: • A review of computational modelling in the design of molecularly imprinted polymers. • Target analytes and method of analysis for the vast majority of recent articles. • Explanations are given of all the popular and emerging techniques used in design. • Highlighted examples of sophisticated analysis of imprinted polymer systems.

  15. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to get multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem. This transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  16. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows, to name just a few. These systems may process data at various security levels, but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state of the art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  17. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    Computer aided techniques form an efficient approach to solve chemical product design problems such as the design of blended liquid products (chemical blending). In chemical blending, one tries to find the best candidate, which satisfies the product targets defined in terms of desired product attributes. This work presents a design methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a Mixed Integer Nonlinear Programming (MINLP) model where the objective is to find the optimal blended gasoline or diesel product subject to the types of chemicals and their compositions, with a set of desired target properties of the blended product as design constraints. This blend design problem is solved using a decomposition approach, which eliminates infeasible and/or redundant candidates gradually through a hierarchy of (property) model-based constraints.
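
    The decomposition idea — prune the candidate space with cheap property constraints before any expensive stage — can be sketched with linear mixing rules. Every number below (components, property values, target windows) is illustrative, not the paper's data or property models.

```python
# A minimal sketch of early candidate elimination with ideal (linear) mixing rules.
import itertools

components = {  # name: (vapour pressure in kPa, density in kg/L) -- made-up values
    "ethanol":  (16.0, 0.789),
    "butanol":  (2.2, 0.810),
    "gasoline": (60.0, 0.740),
}

def mix(props, fracs):
    """Linear mixing rule for each property."""
    return tuple(sum(f * p[i] for f, p in zip(fracs, props)) for i in range(2))

feasible = []
for (na, pa), (nb, pb) in itertools.combinations(components.items(), 2):
    for xa in (0.1, 0.2, 0.3, 0.4, 0.5):
        vp, dens = mix([pa, pb], [xa, 1.0 - xa])
        if 35.0 <= vp <= 65.0 and 0.72 <= dens <= 0.78:  # target property window
            feasible.append((na, nb, xa, round(vp, 1), round(dens, 3)))
print(feasible)  # survivors would go on to stability tests and verification
```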

  18. COMPUTER EVALUATION OF SKILLS FORMATION QUALITY IN THE IMPLEMENTATION OF COMPETENCE-BASED APPROACH TO LEARNING

    Directory of Open Access Journals (Sweden)

    Vitalia A. Zhuravleva

    2014-01-01

    Full Text Available The article deals with the problem of effectively organizing skills formation as an important part of the competence-based approach in education, implemented via educational standards of the new generation. The solution of the problem suggests using computer tools to assess the quality of skills formation, based on the proposed model of the problem. This paper proposes an approach to creating an assessment model of the level of skills formation in knowledge management systems, based on mathematical modeling methods. Attention is paid to the evaluation strategy and assessment technology, which is based on the use of rules of fuzzy mathematics. An algorithmic implementation of the proposed model for evaluating the quality of skills development is shown as well.

  19. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies and issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments was running WACCM on different VMs on the Google Compute Engine (GCE) and making a comparison with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput clearly shows that the SC performs better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach where experiments were limited by the available hardware resources to one where they are limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on

  20. Modular Approaches to Earth Science Scientific Computing: 3D Electromagnetic Induction Modeling as an Example

    Science.gov (United States)

    Tandon, K.; Egbert, G.; Siripunvaraporn, W.

    2003-12-01

    We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme, and also to reuse the components for a variety of problems in earth science computing, however diverse they might be. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in the specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding a feature such as this is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
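
    A minimal sketch of the modular pattern described: the driver depends only on an abstract forward-model interface, so a boundary-condition perturbation becomes a swap of one object rather than a change to the inversion code. The class names and the placeholder physics below are hypothetical.

```python
# A minimal sketch of the modular, object-oriented pattern; the "physics" here is a
# placeholder, not the staggered-grid EM induction solver.
from abc import ABC, abstractmethod
import numpy as np

class ForwardModel(ABC):
    @abstractmethod
    def predict(self, conductivity: np.ndarray) -> np.ndarray: ...

class EMInduction(ForwardModel):
    def __init__(self, boundary_values: np.ndarray):
        self.boundary = boundary_values          # uncertain regional structure

    def predict(self, conductivity):
        # stand-in for the iterative staggered-grid solve
        return np.cumsum(conductivity) + self.boundary.mean()

def misfit_under(models, m, data):
    """Compare data misfits under alternative boundary assumptions."""
    return [float(np.linalg.norm(fm.predict(m) - data)) for fm in models]

m = np.linspace(1.0, 2.0, 8)                     # toy conductivity model
data = EMInduction(np.zeros(4)).predict(m)       # "observed" data
print(misfit_under([EMInduction(np.zeros(4)),
                    EMInduction(np.full(4, 0.1))], m, data))
```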

  1. Revealing fatigue damage evolution in unidirectional composites for wind turbine blades using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    In the presented work, a lab-source x-ray computed tomography system (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre failure. Larger test volumes (relative to other comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of x-ray computed tomography to zoom into regions of interest non-destructively, the fatigue damage evolution in a repeatedly ex-situ fatigue loaded test specimen has been followed. On this basis, recommendations for improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made, and thereby it will be possible to lower the cost of energy for wind-energy-based electricity.

  2. Computer-aided detection of masses in digital tomosynthesis mammography: Comparison of three approaches

    International Nuclear Information System (INIS)

    Chan Heangping; Wei Jun; Zhang Yiheng; Helvie, Mark A.; Moore, Richard H.; Sahiner, Berkman; Hadjiiski, Lubomir; Kopans, Daniel B.

    2008-01-01

    The authors are developing a computer-aided detection (CAD) system for masses on digital breast tomosynthesis mammograms (DBT). Three approaches were evaluated in this study. In the first approach, mass candidate identification and feature analysis are performed in the reconstructed three-dimensional (3D) DBT volume. A mass likelihood score is estimated for each mass candidate using a linear discriminant analysis (LDA) classifier. Mass detection is determined by a decision threshold applied to the mass likelihood score. A free response receiver operating characteristic (FROC) curve that describes the detection sensitivity as a function of the number of false positives (FPs) per breast is generated by varying the decision threshold over a range. In the second approach, prescreening of mass candidate and feature analysis are first performed on the individual two-dimensional (2D) projection view (PV) images. A mass likelihood score is estimated for each mass candidate using an LDA classifier trained for the 2D features. The mass likelihood images derived from the PVs are backprojected to the breast volume to estimate the 3D spatial distribution of the mass likelihood scores. The FROC curve for mass detection can again be generated by varying the decision threshold on the 3D mass likelihood scores merged by backprojection. In the third approach, the mass likelihood scores estimated by the 3D and 2D approaches, described above, at the corresponding 3D location are combined and evaluated using FROC analysis. A data set of 100 DBT cases acquired with a GE prototype system at the Breast Imaging Laboratory in the Massachusetts General Hospital was used for comparison of the three approaches. The LDA classifiers with stepwise feature selection were designed with leave-one-case-out resampling. In FROC analysis, the CAD system for detection in the DBT volume alone achieved test sensitivities of 80% and 90% at average FP rates of 1.94 and 3.40 per breast, respectively. With the
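
    Generating the FROC curve from merged likelihood scores is mechanical: sweep a decision threshold and, at each value, record the detection sensitivity against the number of false positives per breast. The sketch below uses synthetic scores; in the study these would come from the LDA classifiers.

```python
# A minimal sketch of FROC points from detection likelihood scores (synthetic data).
import numpy as np

def froc(tp_scores, fp_scores, n_breasts, thresholds):
    """tp_scores: best score per true mass; fp_scores: false-detection scores."""
    tp, fp = np.asarray(tp_scores), np.asarray(fp_scores)
    return [(np.sum(fp >= t) / n_breasts, np.mean(tp >= t)) for t in thresholds]

rng = np.random.default_rng(1)
tp = rng.normal(2.0, 1.0, 90)        # likelihoods of detected true masses
fp = rng.normal(0.0, 1.0, 500)       # likelihoods of false-positive candidates
for fp_rate, sens in froc(tp, fp, n_breasts=100, thresholds=np.linspace(-1, 4, 6)):
    print(f"{fp_rate:.2f} FPs/breast -> sensitivity {sens:.2f}")
```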

  3. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have become aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop this new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and provide no useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experimental environment. Finally, "Fixed Cost" is identified as the optimal policy under a stable market environment. The case study helps to clarify the workflow of applying the approach and provides valuable decision support for industry.

  4. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of the performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and that alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundancy components and optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangean multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches to various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches combining GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments on examples of various reliability optimization problems using the hybrid GA approach
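
    A minimal GA for the redundancy-allocation form of the problem — choose the number of parallel components per subsystem to maximise series-system reliability under a cost budget — is sketched below. All numbers are illustrative; the hybrid variants surveyed add fuzzy goals and other search techniques on top of this skeleton.

```python
# A minimal GA sketch for redundancy allocation in a series-parallel system.
import random

P = [0.90, 0.85, 0.95]        # component reliability per subsystem (illustrative)
C = [3.0, 4.0, 2.0]           # component cost per subsystem (illustrative)
BUDGET, NMAX = 30.0, 5

def reliability(n):
    """Series system of parallel blocks with n[i] redundant components each."""
    r = 1.0
    for p, k in zip(P, n):
        r *= 1.0 - (1.0 - p) ** k
    return r

def fitness(n):
    cost = sum(c * k for c, k in zip(C, n))
    return reliability(n) if cost <= BUDGET else 0.0   # infeasible -> zero fitness

random.seed(0)
pop = [[random.randint(1, NMAX) for _ in P] for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents, children = pop[:20], []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        child = [random.choice(genes) for genes in zip(a, b)]  # uniform crossover
        if random.random() < 0.3:                              # mutation
            child[random.randrange(len(child))] = random.randint(1, NMAX)
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
print(best, round(reliability(best), 4))
```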

  5. Soft and hard computing approaches for real-time prediction of currents in a tide-dominated coastal area

    Digital Repository Service at National Institute of Oceanography (India)

    Charhate, S.B.; Deo, M.C.; SanilKumar, V.

    Owing to the complex real sea conditions, such methods may not always yield satisfactory results. This paper discusses a few alternative approaches based on the soft computing tools of artificial neural networks (ANNs) and genetic programming (GP...

  6. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba; Bajic, Vladimir B.

    2016-01-01

    Feature selection methods can not only decrease the computational time and cost, but also improve the classification performance. Among the different approaches to feature selection, however, most suffer from several problems such as lack of robustness and validation issues. Here, we
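
    One common form of clustering-based selection groups correlated genes and keeps a single representative per cluster, shrinking the feature space before classification. The sketch below is a generic k-means version of that idea with synthetic data; it is not necessarily the authors' exact procedure.

```python
# A minimal sketch of clustering-based gene selection with k-means (synthetic data).
import numpy as np
from sklearn.cluster import KMeans

def cluster_select(X, n_clusters=10, seed=0):
    """X: samples-by-genes matrix; returns indices of representative genes."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels = km.fit_predict(X.T)                  # cluster genes, not samples
    keep = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        dist = np.linalg.norm(X.T[members] - km.cluster_centers_[c], axis=1)
        keep.append(members[np.argmin(dist)])     # gene closest to the centroid
    return np.array(keep)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 500))                    # 40 samples, 500 genes
print(cluster_select(X)[:5])                      # representative gene indices
```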

  7. A Computer-Assisted Personalized Approach in an Undergraduate Plant Physiology Class

    Science.gov (United States)

    Artus, Nancy N.; Nadler, Kenneth D.

    1999-01-01

    We used Computer-Assisted Personalized Approach (CAPA), a networked teaching and learning tool that generates computer individualized homework problem sets, in our large-enrollment introductory plant physiology course. We saw significant improvement in student examination performance with regular homework assignments, with CAPA being an effective and efficient substitute for hand-graded homework. Using CAPA, each student received a printed set of similar but individualized problems of a conceptual (qualitative) and/or quantitative nature with quality graphics. Because each set of problems is unique, students were encouraged to work together to clarify concepts but were required to do their own work for credit. Students could enter answers multiple times without penalty, and they were able to obtain immediate feedback and hints until the due date. These features increased student time on task, allowing higher course standards and student achievement in a diverse student population. CAPA handles routine tasks such as grading, recording, summarizing, and posting grades. In anonymous surveys, students indicated an overwhelming preference for homework in CAPA format, citing several features such as immediate feedback, multiple tries, and on-line accessibility as reasons for their preference. We wrote and used more than 170 problems on 17 topics in introductory plant physiology, cataloging them in a computer library for general access. Representative problems are compared and discussed. PMID:10198076

  8. Computational investigation of fluid flow and heat transfer of an economizer by porous medium approach

    Science.gov (United States)

    Babu, C. Rajesh; Kumar, P.; Rajamohan, G.

    2017-07-01

    Computation of fluid flow and heat transfer in an economizer is simulated by a porous medium approach, with plain tubes having a horizontal in-line arrangement and cross flow arrangement in a coal-fired thermal power plant. The economizer is a thermal mechanical device that captures waste heat from the thermal exhaust flue gases through heat transfer surfaces to preheat boiler feed water. In order to evaluate the fluid flow and heat transfer on tubes, a numerical analysis of heat transfer performance is carried out on a 110 t/h MCR (Maximum continuous rating) boiler unit. In this study, thermal performance is investigated through computational fluid dynamics (CFD) simulation using ANSYS FLUENT. The fouling factor ε and the overall heat transfer coefficient ψ are employed to evaluate the fluid flow and heat transfer. The model demands significant computational detail for geometric modeling, grid generation, and numerical calculations to evaluate the thermal performance of an economizer. The simulation results show that the overall heat transfer coefficient of 37.76 W/(m2K) and the economizer coil-side pressure drop of 0.2 kg/cm2 are found to be in conformity with the tolerable limits when compared with existing industrial economizer data.

  9. A computer-assisted personalized approach in an undergraduate plant physiology class

    Science.gov (United States)

    Artus; Nadler

    1999-04-01

    We used Computer-Assisted Personalized Approach (CAPA), a networked teaching and learning tool that generates computer individualized homework problem sets, in our large-enrollment introductory plant physiology course. We saw significant improvement in student examination performance with regular homework assignments, with CAPA being an effective and efficient substitute for hand-graded homework. Using CAPA, each student received a printed set of similar but individualized problems of a conceptual (qualitative) and/or quantitative nature with quality graphics. Because each set of problems is unique, students were encouraged to work together to clarify concepts but were required to do their own work for credit. Students could enter answers multiple times without penalty, and they were able to obtain immediate feedback and hints until the due date. These features increased student time on task, allowing higher course standards and student achievement in a diverse student population. CAPA handles routine tasks such as grading, recording, summarizing, and posting grades. In anonymous surveys, students indicated an overwhelming preference for homework in CAPA format, citing several features such as immediate feedback, multiple tries, and on-line accessibility as reasons for their preference. We wrote and used more than 170 problems on 17 topics in introductory plant physiology, cataloging them in a computer library for general access. Representative problems are compared and discussed.

  10. Highly efficient separation materials created by computational approach. For the separation of lanthanides and actinides

    International Nuclear Information System (INIS)

    Goto, Masahiro; Uezu, Kazuya; Aoshima, Atsushi; Koma, Yoshikazu

    2002-05-01

    In this study, efficient separation materials have been created by the computational approach. Based on the computational calculation, novel organophosphorus extractants, which have two functional moieties in the molecular structure, were developed for the recycle system of transuranium elements using liquid-liquid extraction. Furthermore, molecularly imprinted resins were prepared by the surface-imprint polymerization technique. Through this research project, we obtained two principal results: 1) design of novel extractants by the computational approach, and 2) preparation of highly selective resins by the molecular imprinting technique. The synthesized extractants showed extremely high extractability for rare earth metals compared to that of commercially available extractants. The results of extraction equilibrium suggested that the structural effect of extractants is one of the key factors to enhance the selectivity and extractability in rare earth extractions. Furthermore, a computational analysis was carried out to evaluate the extraction properties for the extraction of rare earth metals by the synthesized extractants. The computer simulation was shown to be very useful for designing new extractants. The new concept of connecting functional moieties with a spacer is very useful and is a promising method to develop novel extractants for the treatment of nuclear fuel. In the second part, we proposed a novel molecular imprinting technique (surface template polymerization) for the separation of lanthanides and actinides. A surface-templated resin is prepared by an emulsion polymerization using an ion-binding (host) monomer, a resin matrix-forming monomer and the target Nd(III) metal ion. A host monomer which has amphiphilic nature forms a complex with a metal ion at the interface, and the complex remains as it is. After the matrix is polymerized, the coordination structure is 'imprinted' at the resin interface. Adsorption of Nd(III) and La(III) ions onto the

  11. Tuneable resolution as a systems biology approach for multi-scale, multi-compartment computational models.

    Science.gov (United States)

    Kirschner, Denise E; Hunt, C Anthony; Marino, Simeone; Fallahi-Sichani, Mohammad; Linderman, Jennifer J

    2014-01-01

    The use of multi-scale mathematical and computational models to study complex biological processes is becoming increasingly productive. Multi-scale models span a range of spatial and/or temporal scales and can encompass multi-compartment (e.g., multi-organ) models. Modeling advances are enabling virtual experiments to explore and answer questions that are problematic to address in the wet-lab. Wet-lab experimental technologies now allow scientists to observe, measure, record, and analyze experiments focusing on different system aspects at a variety of biological scales. We need the technical ability to mirror that same flexibility in virtual experiments using multi-scale models. Here we present a new approach, tuneable resolution, which can begin providing that flexibility. Tuneable resolution involves fine- or coarse-graining existing multi-scale models at the user's discretion, allowing adjustment of the level of resolution specific to a question, an experiment, or a scale of interest. Tuneable resolution expands options for revising and validating mechanistic multi-scale models, can extend the longevity of multi-scale models, and may increase computational efficiency. The tuneable resolution approach can be applied to many model types, including differential equation, agent-based, and hybrid models. We demonstrate our tuneable resolution ideas with examples relevant to infectious disease modeling, illustrating key principles at work. © 2014 The Authors. WIREs Systems Biology and Medicine published by Wiley Periodicals, Inc.

  12. Integrative computational approach for genome-based study of microbial lipid-degrading enzymes.

    Science.gov (United States)

    Vorapreeda, Tayvich; Thammarongtham, Chinae; Laoteng, Kobkul

    2016-07-01

    Lipid-degrading or lipolytic enzymes have gained enormous attention in academic and industrial sectors. Several efforts are underway to discover new lipase enzymes from a variety of microorganisms with particular catalytic properties to be used for extensive applications. In addition, various tools and strategies have been implemented to unravel the functional relevance of the versatile lipid-degrading enzymes for special purposes. This review highlights the study of microbial lipid-degrading enzymes through an integrative computational approach. The identification of putative lipase genes from microbial genomes and metagenomic libraries using homology-based mining is discussed, with an emphasis on sequence analysis of conserved motifs and enzyme topology. Molecular modelling of three-dimensional structure on the basis of sequence similarity is shown to be a potential approach for exploring the structural and functional relationships of candidate lipase enzymes. The perspectives on a discriminative framework of cutting-edge tools and technologies, including bioinformatics, computational biology, functional genomics and functional proteomics, intended to facilitate rapid progress in understanding lipolysis mechanism and to discover novel lipid-degrading enzymes of microorganisms are discussed.
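
    A minimal sketch of the motif-based mining step described above, with hypothetical toy sequences (a real screen would scan a FASTA file of translated ORFs): candidate lipases are flagged by the conserved G-X-S-X-G pentapeptide around the catalytic serine.

        # Scan protein sequences for the canonical lipase/esterase
        # G-X-S-X-G motif; sequences below are hypothetical toy data.
        import re

        MOTIF = re.compile(r"G.S.G")   # conserved pentapeptide pattern

        sequences = {
            "candidate_1": "MKVLAAGHSLGGALATV",   # hypothetical
            "candidate_2": "MSTQPLNNARTWQK",      # hypothetical
        }

        for name, seq in sequences.items():
            for m in MOTIF.finditer(seq):
                print(f"{name}: motif {m.group()} at position {m.start() + 1}")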

  13. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Science.gov (United States)

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  14. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Sonia Yassa

    2013-01-01

    Full Text Available We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
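
    For the scheduling problem described in the two records above, here is a minimal sketch of how a particle swarm can trade schedule quality against energy under DVFS; the task lengths and the simple cubic power model are assumptions for illustration, not the authors' hybrid PSO. Each task is decoded to a (machine, frequency level) pair and the fitness is a weighted sum of makespan and energy.

        # Minimal PSO sketch for energy-aware task-to-machine mapping
        # with DVFS levels; all numbers are hypothetical.
        import random

        TASKS = [4.0, 2.0, 7.0, 3.0, 5.0]      # task lengths (a.u.)
        MACHINES = 2
        FREQS = [1.0, 0.8, 0.6]                # normalized DVFS frequencies
        W_TIME, W_ENERGY = 0.5, 0.5            # objective weights

        def decode(pos):
            # two genes per task in [0,1): machine index and DVFS level
            return [(int(pos[2*i] * MACHINES) % MACHINES,
                     FREQS[int(pos[2*i+1] * len(FREQS)) % len(FREQS)])
                    for i in range(len(TASKS))]

        def fitness(pos):
            loads = [0.0] * MACHINES
            energy = 0.0
            for length, (m, f) in zip(TASKS, decode(pos)):
                t = length / f                 # slower clock -> longer runtime
                loads[m] += t
                energy += t * f**3             # P ~ f^3 under voltage scaling
            return W_TIME * max(loads) + W_ENERGY * energy

        def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            dim = 2 * len(TASKS)
            X = [[random.random() for _ in range(dim)] for _ in range(n)]
            V = [[0.0] * dim for _ in range(n)]
            P = [x[:] for x in X]              # personal bests
            g = min(P, key=fitness)            # global best
            for _ in range(iters):
                for i in range(n):
                    for d in range(dim):
                        V[i][d] = (w * V[i][d]
                                   + c1 * random.random() * (P[i][d] - X[i][d])
                                   + c2 * random.random() * (g[d] - X[i][d]))
                        X[i][d] = min(max(X[i][d] + V[i][d], 0.0), 0.999)
                    if fitness(X[i]) < fitness(P[i]):
                        P[i] = X[i][:]
                g = min(P, key=fitness)
            return decode(g), fitness(g)

        print(pso())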

  15. Morphology control in polymer blend fibers—a high throughput computing approach

    Science.gov (United States)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential for time dependent processing conditions to obtain desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.
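
    A minimal sketch of the high-throughput exploration loop described above; simulate_morphology is a hypothetical placeholder for the expensive morphology simulation, and the toy rule inside it merely illustrates how swept parameters map to morphology classes in a phase-diagram-like table.

        # Sweep blend ratio and evaporation rate, classify each run,
        # and tabulate the labels into a phase-diagram-like grid.
        import itertools

        blend_ratios = [0.2, 0.35, 0.5, 0.65, 0.8]   # PS fraction in the blend
        evap_rates = [0.1, 0.5, 1.0, 2.0]            # evaporation rate (a.u.)

        def simulate_morphology(ratio, rate):
            # hypothetical stand-in for the expensive phase-field solve;
            # a toy rule plays the role of the real morphology classifier
            return "lamellar" if abs(ratio - 0.5) < 0.2 and rate < 1.0 else "droplet"

        phase_diagram = {(r, e): simulate_morphology(r, e)
                         for r, e in itertools.product(blend_ratios, evap_rates)}
        for (r, e), label in sorted(phase_diagram.items()):
            print(f"ratio={r:.2f}  evap={e:.1f}  ->  {label}")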

  16. The Educator's Approach to Media Training and Computer Games within Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer game playing as part of the leisure time of school-children and deals with the significance of media training in leisure time. At first it specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excessive computer game playing and means of prevention. The paper deals with the educator's personality and the importance of ...

  17. Creating the computer player: an engaging and collaborative approach to introduce computational thinking by combining ‘unplugged’ activities with visual programming

    Directory of Open Access Journals (Sweden)

    Anna Gardeli

    2017-11-01

    Full Text Available Ongoing research is being conducted on appropriate course design, practices and teacher interventions for improving the efficiency of computer science and programming courses in K-12 education. The trend is towards a more constructivist problem-based learning approach. Computational thinking, which refers to formulating and solving problems in a form that can be efficiently processed by a computer, raises an important educational challenge. Our research aims to explore possible ways of enriching computer science teaching with a focus on development of computational thinking. We have prepared and evaluated a learning intervention for introducing computer programming to children between 10 and 14 years old; this involves students working in groups to program the behavior of the computer player of a well-known game. The programming process is split into two parts. First, students design a high-level version of their algorithm during an ‘unplugged’ pen & paper phase, and then they encode their solution as an executable program in a visual programming environment. Encouraging evaluation results have been achieved regarding the educational and motivational value of the proposed approach.

  18. Exploring the Unfolding Pathway of Maltose Binding Proteins: An Integrated Computational Approach

    KAUST Repository

    Guardiani, Carlo; Marino, Daniele Di; Tramontano, Anna; Chinappi, Mauro; Cecconi, Fabio

    2014-01-01

    © 2014 American Chemical Society. Recent single-molecule force spectroscopy experiments on the Maltose Binding Proteins (MBPs) identified four stable structural units, termed unfoldons, that resist mechanical stress and determine the intermediates of the unfolding pathway. In this work, we analyze the topological origin and the dynamical role of the unfoldons using an integrated approach which combines a graph-theoretical analysis of the interaction network of the MBP native-state with steered molecular dynamics simulations. The topological analysis of the native state, while revealing the structural nature of the unfoldons, provides a framework to interpret the MBP mechanical unfolding pathway. Indeed, the experimental pathway can be effectively predicted by means of molecular dynamics simulations with a simple topology-based and low-resolution model of the MBP. The results obtained from the coarse-grained approach are confirmed and further refined by all-atom molecular dynamics.

  19. Exploring the Unfolding Pathway of Maltose Binding Proteins: An Integrated Computational Approach

    KAUST Repository

    Guardiani, Carlo

    2014-09-09

    © 2014 American Chemical Society. Recent single-molecule force spectroscopy experiments on the Maltose Binding Proteins (MBPs) identified four stable structural units, termed unfoldons, that resist mechanical stress and determine the intermediates of the unfolding pathway. In this work, we analyze the topological origin and the dynamical role of the unfoldons using an integrated approach which combines a graph-theoretical analysis of the interaction network of the MBP native-state with steered molecular dynamics simulations. The topological analysis of the native state, while revealing the structural nature of the unfoldons, provides a framework to interpret the MBP mechanical unfolding pathway. Indeed, the experimental pathway can be effectively predicted by means of molecular dynamics simulations with a simple topology-based and low-resolution model of the MBP. The results obtained from the coarse-grained approach are confirmed and further refined by all-atom molecular dynamics.
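
    For the graph-theoretical step described in the two records above, here is a minimal sketch with toy coordinates (not the authors' pipeline): build a residue interaction network from C-alpha positions with a distance cutoff, then rank residues by betweenness centrality, a proxy for links between rigid, unfoldon-like structural units.

        # Build a residue contact network and rank residues by betweenness.
        import itertools
        import math
        import networkx as nx

        # hypothetical C-alpha coordinates (residue_id -> (x, y, z))
        coords = {1: (0.0, 0.0, 0.0), 2: (3.8, 0.0, 0.0), 3: (7.6, 0.0, 0.0),
                  4: (7.6, 3.8, 0.0), 5: (3.8, 3.8, 0.0), 6: (0.0, 3.8, 0.0)}
        CUTOFF = 6.0   # Angstrom contact cutoff (a common heuristic choice)

        G = nx.Graph()
        G.add_nodes_from(coords)
        for i, j in itertools.combinations(coords, 2):
            if math.dist(coords[i], coords[j]) <= CUTOFF:
                G.add_edge(i, j)

        # residues with high betweenness sit on paths between dense clusters
        for res, bc in sorted(nx.betweenness_centrality(G).items(),
                              key=lambda kv: -kv[1]):
            print(res, round(bc, 3))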

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  1. Efficient approach to compute melting properties fully from ab initio with application to Cu

    Science.gov (United States)

    Zhu, Li-Fang; Grabowski, Blazej; Neugebauer, Jörg

    2017-12-01

    Applying thermodynamic integration within an ab initio-based free-energy approach is a state-of-the-art method to calculate melting points of materials. However, the high computational cost and the reliance on a good reference system for calculating the liquid free energy have so far hindered a general application. To overcome these challenges, we propose the two-optimized references thermodynamic integration using Langevin dynamics (TOR-TILD) method in this work by extending the two-stage upsampled thermodynamic integration using Langevin dynamics (TU-TILD) method, which was originally developed to obtain anharmonic free energies of solids, to the calculation of liquid free energies. The core idea of TOR-TILD is to fit two empirical potentials to the energies from density functional theory based molecular dynamics runs for the solid and the liquid phase and to use these potentials as reference systems for thermodynamic integration. Because the empirical potentials closely reproduce the ab initio system in the relevant part of the phase space, the convergence of the thermodynamic integration is very rapid. Therefore, the proposed approach significantly improves the computational efficiency while preserving the required accuracy. As a test case, we apply TOR-TILD to fcc Cu, computing not only the melting point but also various other melting properties, such as the entropy and enthalpy of fusion and the volume change upon melting. The generalized gradient approximation (GGA) with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional and the local-density approximation (LDA) are used. Using both functionals gives a reliable ab initio confidence interval for the melting point, the enthalpy of fusion, and the entropy of fusion.
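
    A minimal sketch of the thermodynamic-integration step that such reference potentials accelerate: for a linear mixing U(λ) = (1−λ)U_ref + λU_target, the free-energy difference is ΔF = ∫₀¹ ⟨U_target − U_ref⟩_λ dλ. The ensemble averages below are hypothetical stand-ins for MD output, and the trapezoidal quadrature is an assumption for illustration.

        # Trapezoidal-rule estimate of a thermodynamic-integration
        # free-energy difference; dU values are hypothetical MD averages.
        import numpy as np

        lam = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        # hypothetical <U_target - U_reference>_lambda values in eV/atom
        dU = np.array([0.052, 0.047, 0.041, 0.038, 0.036])

        delta_F = np.trapz(dU, lam)   # free-energy difference, eV/atom
        print(f"Delta F = {delta_F:.4f} eV/atom")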

  2. Vector and parallel computing on the IBM ES/3090, a powerful approach to solving problems in the utility industry

    International Nuclear Information System (INIS)

    Bellucci, V.J.

    1990-01-01

    This paper describes IBM's approach to parallel computing using the IBM ES/3090 computer. Parallel processing concepts were discussed, including their advantages, potential performance improvements and limitations. Particular applications and capabilities of the IBM ES/3090 were presented, along with preliminary results from some utilities on the application of parallel processing to the simulation of system reliability, air pollution models, and power network dynamics.

  3. Experimental/Computational Approach to Accommodation Coefficients and its Application to Noble Gases on Aluminum Surface (Preprint)

    Science.gov (United States)

    2009-02-03

    computational approach to accommodation coefficients and its application to noble gases on aluminum surface. Nathaniel Selden, University of Southern California, Los Angeles. FIG. 5: Experimental and computed radiometric force for argon (left), xenon

  4. A Novel Goal-Oriented Approach for Training Older Adult Computer Novices: Beyond the Effects of Individual-Difference Factors.

    Science.gov (United States)

    Hollis-Sawyer, Lisa A.; Sterns, Harvey L.

    1999-01-01

    Spreadsheet training using either a goal-oriented or a verbal persuasion approach was given to 106 computer novices aged 50-89. Goal orientation achieved more changes in computer attitudes, efficacy, and proficiency. Intellectual ability and personality dimensions did not affect results. (SK)

  5. Epithelial invasion outcompetes hypha development during Candida albicans infection as revealed by an image-based systems biology approach.

    Science.gov (United States)

    Mech, Franziska; Wilson, Duncan; Lehnert, Teresa; Hube, Bernhard; Thilo Figge, Marc

    2014-02-01

    Candida albicans is the most common opportunistic fungal pathogen of the human mucosal flora, frequently causing infections. The fungus is responsible for invasive infections in immunocompromised patients that can lead to sepsis. The yeast to hypha transition and invasion of host-tissue represent major determinants in the switch from benign colonizer to invasive pathogen. A comprehensive understanding of the infection process requires analyses at the quantitative level. Utilizing fluorescence microscopy with differential staining, we obtained images of C. albicans undergoing epithelial invasion during a time course of 6 h. An image-based systems biology approach, combining image analysis and mathematical modeling, was applied to quantify the kinetics of hyphae development, hyphal elongation, and epithelial invasion. The automated image analysis facilitates high-throughput screening and provided quantities that allow for the time-resolved characterization of the morphological and invasive state of fungal cells. The interpretation of these data was supported by two mathematical models, a kinetic growth model and a kinetic transition model, that were developed using differential equations. The kinetic growth model describes the increase in hyphal length and revealed that hyphae undergo mass invasion of epithelial cells following primary hypha formation. We also provide evidence that epithelial cells stimulate the production of secondary hyphae by C. albicans. Based on the kinetic transition model, the route of invasion was quantified in the state space of non-invasive and invasive fungal cells depending on their number of hyphae. This analysis revealed that the initiation of hyphae formation represents an ultimate commitment to invasive growth and suggests that in vivo, the yeast to hypha transition must be under exquisitely tight negative regulation to avoid the transition from commensal to pathogen invading the epithelium. © 2013 International Society for
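
    A minimal sketch of a kinetic growth model of the type described above, assuming a logistic form for hyphal length; the parameter values are hypothetical, not the fitted ones from the study.

        # Logistic kinetic growth model: hyphal length L(t) grows at
        # rate K and saturates at L_MAX; parameters are hypothetical.
        import numpy as np
        from scipy.integrate import solve_ivp

        K, L_MAX = 1.2, 45.0    # growth rate (1/h), saturation length (um)

        def dLdt(t, L):
            return K * L * (1.0 - L / L_MAX)

        sol = solve_ivp(dLdt, (0.0, 6.0), [2.0],   # 6 h time course
                        t_eval=np.linspace(0.0, 6.0, 13))
        for t, L in zip(sol.t, sol.y[0]):
            print(f"t = {t:4.1f} h   L = {L:6.2f} um")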

  6. A novel dendrochronological approach reveals drivers of carbon sequestration in tree species of riparian forests across spatiotemporal scales.

    Science.gov (United States)

    Rieger, Isaak; Kowarik, Ingo; Cherubini, Paolo; Cierjacks, Arne

    2017-01-01

    Aboveground carbon (C) sequestration in trees is important in global C dynamics, but reliable techniques for its modeling in highly productive and heterogeneous ecosystems are limited. We applied an extended dendrochronological approach to disentangle the functioning of drivers from the atmosphere (temperature, precipitation), the lithosphere (sedimentation rate), the hydrosphere (groundwater table, river water level fluctuation), the biosphere (tree characteristics), and the anthroposphere (dike construction). Carbon sequestration in aboveground biomass of riparian Quercus robur L. and Fraxinus excelsior L. was modeled (1) over time using boosted regression tree analysis (BRT) on cross-datable trees characterized by equal annual growth ring patterns and (2) across space using a subsequent classification and regression tree analysis (CART) on cross-datable and not cross-datable trees. While C sequestration of cross-datable Q. robur responded to precipitation and temperature, cross-datable F. excelsior also responded to a low Danube river water level. However, CART revealed that C sequestration over time is governed by tree height and parameters that vary over space (magnitude of fluctuation in the groundwater table, vertical distance to mean river water level, and longitudinal distance to upstream end of the study area). Thus, a uniform response to climatic drivers of aboveground C sequestration in Q. robur was only detectable in trees of an intermediate height class and in taller trees (>21.8m) on sites where the groundwater table fluctuated little (≤0.9m). The detection of climatic drivers and the river water level in F. excelsior depended on sites at lower altitudes above the mean river water level (≤2.7m) and along a less dynamic downstream section of the study area. Our approach indicates unexploited opportunities for understanding the interplay of different environmental drivers in aboveground C sequestration. Results may support species-specific and

  7. Revealing the control of migratory fueling: An integrated approach combining laboratory and field studies in northern wheatears Oenanthe oenanthe

    Directory of Open Access Journals (Sweden)

    Franz BAIRLEIN,Volker DIERSCHKE, Julia DELINGAT, Cas EIKENAAR, Ivan MAGGINI, Marc BULTE, Heiko SCHMALJOHANN

    2013-06-01

    Full Text Available Migratory birds rely on fueling prior to migratory flights. Fueling in migrants is controlled by intrinsic as well as extrinsic factors. From captive studies we have started understanding the internal mechanisms controlling bird migration. Field studies have demonstrated the effects of external factors, such as food availability, weather, competitors, parasites or diseases, on the stopover behavior of migrants. However, an integrated approach is still missing to study coherently how the innate migration program interacts with the varying environmental cues and to estimate the contribution of the innate migration program and the environment to realized migration. The northern wheatear Oenanthe oenanthe offers a unique opportunity for integrated studies. It breeds across almost the whole Holarctic with just a “gap” between eastern Canada and Alaska. All breeding populations overwinter in sub-Saharan Africa, which makes the northern wheatear one of the longest-distance migratory songbirds, with extraordinarily long non-stop flights across oceans. It is a nocturnal migrant which travels without parental or social aid/guidance. Thus, young birds rely entirely on endogenous mechanisms of timing, route selection and fueling on their first outbound migration. By establishing indoor housing under controlled conditions, the endogenous control mechanisms of northern wheatear migration could be revealed. At the same time, environmental factors controlling fueling could be investigated in the field. On migration wheatears occur in a variety of habitats with sparse vegetation where their stopover behavior could be quantitatively studied in the light of “optimal migration” theory by the use of remote balances, radio-tagging and even experimentally manipulated food availability. The present paper summarizes our approach to understand the control of migration in northern wheatears by combining field and laboratory studies at various spatial and temporal

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  11. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of the state-of-the-art resource discovery techniques rely on the static resource attributes during resource selection. However, the matching resources based on the static resource attributes may not be the most appropriate resources for the execution of user applications because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable resources. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for discovery of grid resources by using P2P formalism. The proposed approach considers multiple resource attributes for decision making of resource selection and provides the best suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top ranked resources, which are communicated to the requesting super-peer. The second phase of our proposed methodology applies an integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from different super-peers. The pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
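
    A minimal sketch of the first-phase SAW (Simple Additive Weighting) ranking described above, with hypothetical attribute values and weights: benefit attributes are normalized by the column maximum, cost attributes by the column minimum, and the weighted sums rank the resources.

        # SAW ranking of candidate resources; data are hypothetical.
        import numpy as np

        # rows: candidate resources; columns: CPU cores (benefit),
        # RAM in GB (benefit), current load (cost -- lower is better)
        X = np.array([[8.0, 16.0, 0.7],
                      [4.0, 32.0, 0.2],
                      [16.0, 8.0, 0.9]])
        weights = np.array([0.4, 0.3, 0.3])
        benefit = np.array([True, True, False])

        # benefit columns: x / max(column); cost columns: min(column) / x
        norm = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
        scores = (norm * weights).sum(axis=1)
        print("SAW scores:", scores.round(3), "best resource:", int(scores.argmax()))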

  12. Systems Bioinformatics: increasing precision of computational diagnostics and therapeutics through network-based approaches.

    Science.gov (United States)

    Oulas, Anastasis; Minadakis, George; Zachariou, Margarita; Sokratous, Kleitos; Bourdakou, Marilena M; Spyrou, George M

    2017-11-27

    Systems Bioinformatics is a relatively new approach, which lies in the intersection of systems biology and classical bioinformatics. It focuses on integrating information across different levels using a bottom-up approach as in systems biology with a data-driven top-down approach as in bioinformatics. The advent of omics technologies has provided the stepping-stone for the emergence of Systems Bioinformatics. These technologies provide a spectrum of information ranging from genomics, transcriptomics and proteomics to epigenomics, pharmacogenomics, metagenomics and metabolomics. Systems Bioinformatics is the framework in which systems approaches are applied to such data, setting the level of resolution as well as the boundary of the system of interest and studying the emerging properties of the system as a whole rather than the sum of the properties derived from the system's individual components. A key approach in Systems Bioinformatics is the construction of multiple networks representing each level of the omics spectrum and their integration in a layered network that exchanges information within and between layers. Here, we provide evidence on how Systems Bioinformatics enhances computational therapeutics and diagnostics, hence paving the way to precision medicine. The aim of this review is to familiarize the reader with the emerging field of Systems Bioinformatics and to provide a comprehensive overview of its current state-of-the-art methods and technologies. Moreover, we provide examples of success stories and case studies that utilize such methods and tools to significantly advance research in the fields of systems biology and systems medicine. © The Author 2017. Published by Oxford University Press.

  13. Metabolomics approach reveals metabolic disorders and potential biomarkers associated with the developmental toxicity of tetrabromobisphenol A and tetrachlorobisphenol A

    Science.gov (United States)

    Ye, Guozhu; Chen, Yajie; Wang, Hong-Ou; Ye, Ting; Lin, Yi; Huang, Qiansheng; Chi, Yulang; Dong, Sijun

    2016-10-01

    Tetrabromobisphenol A and tetrachlorobisphenol A are halogenated bisphenol A (H-BPA) compounds and have raised concerns about their adverse effects on the development of fetuses and infants; however, the molecular mechanisms are unclear, and related metabolomics studies are limited. Accordingly, a metabolomics study based on gas chromatography-mass spectrometry was employed to elucidate the molecular developmental toxicology of H-BPA using the marine medaka (Oryzias melastigma) embryo model. Here, we revealed decreased synthesis of nucleosides, amino acids and lipids, and disruptions in the TCA (tricarboxylic acid) cycle, glycolysis and lipid metabolism, thus inhibiting the developmental processes of embryos exposed to H-BPA. Unexpectedly, we observed enhanced neural activity accompanied by lactate accumulation and accelerated heart rates, due to an increase in dopamine pathway activity and a decrease in inhibitory neurotransmitters, following H-BPA exposure. Notably, disorders of the neural system, and disruptions in glycolysis, the TCA cycle, nucleoside metabolism, lipid metabolism, glutamate and aspartate metabolism induced by H-BPA exposure were heritable. Furthermore, lactate and dopa were identified as potential biomarkers of the developmental toxicity of H-BPA and related genetic effects. This study has demonstrated that the metabolomics approach is a useful tool for obtaining comprehensive and novel insights into the molecular developmental toxicity of environmental pollutants.

  14. A holistic approach to dissecting SPARC family protein complexity reveals FSTL-1 as an inhibitor of pancreatic cancer cell growth.

    Science.gov (United States)

    Viloria, Katrina; Munasinghe, Amanda; Asher, Sharan; Bogyere, Roberto; Jones, Lucy; Hill, Natasha J

    2016-11-25

    SPARC is a matricellular protein that is involved in both pancreatic cancer and diabetes. It belongs to a wider family of proteins that share structural and functional similarities. Relatively little is known about this extended family, but evidence of regulatory interactions suggests the importance of a holistic approach to their study. We show that Hevin, SPOCKs, and SMOCs are strongly expressed within islets, ducts, and blood vessels, suggesting important roles for these proteins in the normal pancreas, while FSTL-1 expression is localised to the stromal compartment reminiscent of SPARC. In direct contrast to SPARC, however, FSTL-1 expression is reduced in pancreatic cancer. Consistent with this, FSTL-1 inhibited pancreatic cancer cell proliferation. The complexity of SPARC family proteins is further revealed by the detection of multiple cell-type specific isoforms that arise due to a combination of post-translational modification and alternative splicing. Identification of splice variants lacking a signal peptide suggests the existence of novel intracellular isoforms. This study underlines the importance of addressing the complexity of the SPARC family and provides a new framework to explain their controversial and contradictory effects. We also demonstrate for the first time that FSTL-1 suppresses pancreatic cancer cell growth.

  15. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

    The mini-incision subvastus approach preserves the soft tissue of the knee. Its advantages include reduced blood loss, reduced pain, self-rehabilitation and faster recovery. However, whether it provides the improved visualization, component alignment, and blood preservation needed to achieve better outcomes and prevent early failure of Total Knee Arthroplasty (TKA) has been debated. Computer navigation has been introduced to improve alignment and reduce blood loss. The purpose of this study was to evaluate the short-term outcomes of the combination of computer-assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups, the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated with flexion contracture). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [the mean (range)] in group 1 : group 2 were respectively as follows: the incision length [10.88 (8-13): 11.92 (10-14)], the operation time [118 (111.88-125.12): 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100): 95.25 (90-105) degrees] and extension [1.75 (0-5): 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48): 520 (503.46-636.54) ml] and blood transfusion [1 (0-1) unit in both groups], tibiofemoral angle preoperative [varus = 4 (varus 0-10): varus = 17.14 (varus 15.7-18.5) degrees], tibiofemoral angle postoperative [valgus = 1.38 (valgus 0-4): valgus = 2.85 (valgus 2.1-3.5) degrees], tibiofemoral angle outlier (85% both

  16. An image processing approach to computing distances between RNA secondary structures dot plots

    Directory of Open Access Journals (Sweden)

    Sapiro Guillermo

    2009-02-01

    Full Text Available Abstract Background Computing the distance between two RNA secondary structures can contribute to understanding the functional relationship between them. When used repeatedly, such a procedure may lead to finding a query RNA structure of interest in a database of structures. Several methods are available for computing distances between RNAs represented as strings or graphs, but none utilize the RNA representation with dot plots. Since dot plots are essentially digital images, there is a clear motivation to devise an algorithm for computing the distance between dot plots based on image processing methods. Results We have developed a new metric dubbed 'DoPloCompare', which compares two RNA structures. The method is based on comparing dot plot diagrams that represent the secondary structures. Motivated by image processing, the distance between two diagrams is based on a combination of histogram correlations and a geometrical distance measure. We introduce, describe, and illustrate the procedure by two applications that utilize this metric on RNA sequences. The first application is the RNA design problem, where the goal is to find the nucleotide sequence for a given secondary structure. Examples where our proposed distance measure outperforms others are given. The second application locates peculiar point mutations that induce significant structural alterations relative to the wild type predicted secondary structure. The approach reported in the past to solve this problem was tested on several RNA sequences with known secondary structures to affirm their prediction, as well as on a data set of ribosomal pieces. These pieces were computationally cut from a ribosome for which an experimentally derived secondary structure is available, and for each piece the prediction conveys similarity to the experimental result. Our newly proposed distance measure shows benefit in this problem as well when compared to standard methods used for assessing
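
    A minimal sketch of the histogram-correlation ingredient of such a distance, treating two dot plots as grayscale images; this is an illustration of the idea on random stand-in matrices, not the published DoPloCompare implementation.

        # Histogram-correlation distance between two dot-plot "images".
        import numpy as np

        def histogram_distance(dp1, dp2, bins=32):
            h1, _ = np.histogram(dp1, bins=bins, range=(0.0, 1.0), density=True)
            h2, _ = np.histogram(dp2, bins=bins, range=(0.0, 1.0), density=True)
            corr = np.corrcoef(h1, h2)[0, 1]
            return 1.0 - corr      # 0 = identical histograms, 2 = anti-correlated

        rng = np.random.default_rng(0)
        a = rng.random((60, 60))   # stand-ins for base-pair probability matrices
        b = np.clip(a + 0.05 * rng.standard_normal((60, 60)), 0.0, 1.0)
        print(histogram_distance(a, a), histogram_distance(a, b))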

  17. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States); Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry

    2016-11-28

    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage, which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging, and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And

  18. A computationally inexpensive CFD approach for small-scale biomass burners equipped with enhanced air staging

    International Nuclear Information System (INIS)

    Buchmayr, M.; Gruber, J.; Hargassner, M.; Hochenauer, C.

    2016-01-01

    Highlights: • Time efficient CFD model to predict biomass boiler performance. • Boundary conditions for numerical modeling are provided by measurements. • Tars in the product from primary combustion were considered. • Simulation results were validated by experiments on a real-scale reactor. • Very good accordance between experimental and simulation results. - Abstract: Computational Fluid Dynamics (CFD) is an upcoming technique for optimization and as a part of the design process of biomass combustion systems. An accurate simulation of biomass combustion can only be provided with high computational effort so far. This work presents an accurate, time-efficient CFD approach for small-scale biomass combustion systems equipped with enhanced air staging. The model can handle the high amount of biomass tars in the primary combustion product at very low primary air ratios. Gas-phase combustion in the freeboard was performed by the Steady Flamelet Model (SFM) together with a detailed heptane combustion mechanism. The advantage of the SFM is that complex combustion chemistry can be taken into account at low computational effort because only two additional transport equations have to be solved to describe the chemistry in the reacting flow. Boundary conditions for primary combustion product composition were obtained from the fuel bed by experiments. The fuel bed data were used as fuel inlet boundary condition for the gas-phase combustion model. The numerical and experimental investigations were performed for different operating conditions and varying wood-chip moisture on a specially designed real-scale reactor. The numerical predictions were validated with experimental results and a very good agreement was found. With the presented approach, accurate results can be provided within 24 h using a standard Central Processing Unit (CPU) consisting of six cores. Case studies e.g. for combustion geometry improvement can be realized effectively due to the short calculation

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  20. Replicated Computations Results (RCR) report for “A holistic approach for collaborative workload execution in volunteer clouds”

    DEFF Research Database (Denmark)

    Vandin, Andrea

    2018-01-01

    “A Holistic Approach for Collaborative Workload Execution in Volunteer Clouds” [3] proposes a novel approach to task scheduling in volunteer clouds. Volunteer clouds are decentralized cloud systems based on collaborative task execution, where clients voluntarily share their own unused computational...

  1. Unexpected Regularity in Swimming Behavior of Clausocalanus furcatus Revealed by a Telecentric 3D Computer Vision System.

    Directory of Open Access Journals (Sweden)

    Giuseppe Bianco

    Full Text Available Planktonic copepods display a large repertoire of motion behaviors in a three-dimensional environment. Two-dimensional video observations demonstrated that the small copepod Clausocalanus furcatus, one of the most widely distributed calanoids at low to medium latitudes, presented a unique swimming behavior that was continuous and fast and followed notably convoluted trajectories. Furthermore, previous observations indicated that the motion of C. furcatus resembled a random process. We characterized the swimming behavior of this species in three-dimensional space using a video system equipped with telecentric lenses, which allow tracking of zooplankton without the distortion errors inherent in common lenses. Our observations revealed unexpected regularities in the behavior of C. furcatus that appear primarily in the horizontal plane and could not have been identified in previous observations based on lateral views. Our results indicate that the swimming behavior of C. furcatus is based on a limited repertoire of basic kinematic modules but exhibits greater plasticity than previously thought.

  2. A Nonlinear Dynamic Approach Reveals a Long-Term Stroke Effect on Cerebral Blood Flow Regulation at Multiple Time Scales

    Science.gov (United States)

    Hu, Kun; Lo, Men-Tzung; Peng, Chung-Kang; Liu, Yanhui; Novak, Vera

    2012-01-01

    Cerebral autoregulation (CA) is an important vascular control mechanism responsible for relatively stable cerebral blood flow despite changes of systemic blood pressure (BP). Impaired CA may leave brain tissue unprotected against potentially harmful effects of BP fluctuations. It is generally accepted that CA is less effective or even inactive at frequencies >∼0.1 Hz. Without any physiological foundation, this concept is based on studies that quantified the coupling between BP and cerebral blood flow velocity (BFV) using transfer function analysis. This traditional analysis assumes stationary oscillations with constant amplitude and period, and may be unreliable or even invalid for analysis of nonstationary BP and BFV signals. In this study we propose a novel computational tool for CA assessment that is based on nonlinear dynamic theory without the assumption of stationary signals. Using this method, we studied BP and BFV recordings collected from 39 patients with chronic ischemic infarctions and 40 age-matched non-stroke subjects during baseline resting conditions. The active CA function in non-stroke subjects was associated with an advanced phase in BFV oscillations compared to BP oscillations at frequencies from ∼0.02 to 0.38 Hz. The phase shift was reduced in stroke patients even at ≥6 months after stroke, and the reduction was consistent at all tested frequencies and in both stroke and non-stroke hemispheres. These results provide strong evidence that CA may be active in a much wider frequency region than previously believed and that the altered multiscale CA in different vascular territories following stroke may have important clinical implications for post-stroke recovery. Moreover, the stroke effects on multiscale cerebral blood flow regulation could not be detected by transfer function analysis, suggesting that nonlinear approaches without the assumption of stationarity are more sensitive for the assessment of the coupling of nonstationary
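
    A minimal sketch of one nonstationarity-tolerant way to estimate a BP-to-BFV phase shift on synthetic signals; the published analysis uses a more elaborate multiscale decomposition, so this Hilbert-transform version only illustrates the idea, and all signal parameters are hypothetical.

        # Instantaneous phase difference between two signals via the
        # analytic signal; synthetic sinusoids stand in for BP and BFV.
        import numpy as np
        from scipy.signal import hilbert

        fs = 10.0                                   # sampling rate, Hz
        t = np.arange(0.0, 60.0, 1.0 / fs)
        bp = np.sin(2 * np.pi * 0.1 * t)            # synthetic pressure wave
        bfv = np.sin(2 * np.pi * 0.1 * t + 0.6)     # flow leading by 0.6 rad

        phase_bp = np.unwrap(np.angle(hilbert(bp)))
        phase_bfv = np.unwrap(np.angle(hilbert(bfv)))
        shift = np.mean(phase_bfv - phase_bp)
        print(f"mean BFV-BP phase shift: {shift:.2f} rad (expected ~0.6)")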

  3. Some surprises and paradoxes revealed by inverse problem approach and notion about qualitative solutions of Schroedinger equations 'in mind'

    International Nuclear Information System (INIS)

    Zakhariev, B.N.; Chabanov, V.M.

    2008-01-01

    It was an important examination to give a review talk at the previous Conference on Inverse Quantum Scattering (1996, Lake Balaton) about computer visualization of this science in front of its founding fathers, B.M. Levitan and V.A. Marchenko. We have achieved a new understanding that the discovered main rules of transformations of a single wave function bump, e.g., for the ground bound states of one-dimensional quantum systems, are applicable to any state of any potential with an arbitrary number of bumps, from finite to unlimited ones, such as scattering states and bound states embedded in the continuum. It appeared that we need only to repeat the rule mentally the necessary number of times. That utmost simplification and unification of the physical notion of spectral, scattering and decay control for any potential received praise from B.M. Levitan at the conference and was a mighty stimulus for our further research. After that we have written both Russian (2002) and improved English editions of 'Submissive Quantum Mechanics. New Status of the Theory in Inverse Problem Approach' (appeared at the very end of 2007). This book was written to correct the present defect in quantum education throughout the world. Recently the quantum IP intuition helped us to discover a new concept of permanent wave resonance with potential spatial oscillations. This means the constant wave swinging frequency on the whole energy intervals of spectral forbidden zones destroying physical solutions and deepening the theory of waves in periodic potentials. It also shows the other side of strengthening the fundamentally important magic structures. A 'new language' of wave bending will be presented to enrich our quantum intuition, e.g., the paradoxical effective attraction of barriers and repulsion of wells in multichannel systems, etc. (author)

  4. Towards electromechanical computation: An alternative approach to realize complex logic circuits

    KAUST Repository

    Hafiz, Md Abdullah Al; Kosuru, Lakshmoji; Younis, Mohammad I.

    2016-01-01

    Electromechanical computing based on micro/nano resonators has recently attracted significant attention. However, full implementation of this technology has been hindered by the difficulty in realizing complex logic circuits. We report here an alternative approach to realize complex logic circuits based on multiple MEMS resonators. As case studies, we report the construction of a single-bit binary comparator, a single-bit 4-to-2 encoder, and parallel XOR/XNOR and AND/NOT logic gates. Toward this, several microresonators are electrically connected and their resonance frequencies are tuned through an electrothermal modulation scheme. The microresonators operating in the linear regime do not require large excitation forces, and work at room temperature and at modest air pressure. This study demonstrates that by reconfiguring the same basic building block, tunable resonator, several essential complex logic functions can be achieved.

  6. Computational Modelling Approaches on Epigenetic Factors in Neurodegenerative and Autoimmune Diseases and Their Mechanistic Analysis

    Directory of Open Access Journals (Sweden)

    Afroza Khanam Irin

    2015-01-01

    Full Text Available Neurodegenerative and autoimmune diseases have unclear aetiologies, but increasing evidence points to a combination of genetic and epigenetic alterations that predispose to the development of disease. This review examines the major milestones in epigenetics research in the context of disease, along with the various computational approaches developed in recent decades to unravel new epigenetic modifications. However, few studies systematically link genetic and epigenetic alterations of DNA to the aetiology of diseases. In this work, we demonstrate how disease-related epigenetic knowledge can be systematically captured and integrated with heterogeneous information into a functional context using the Biological Expression Language (BEL). This novel methodology, based on BEL, enables us to integrate epigenetic modifications such as DNA methylation or histone acetylation into a specific disease network. As an example, we depict the integration of epigenetic and genetic factors in a functional context specific to Parkinson's disease (PD) and Multiple Sclerosis (MS).
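
    For readers unfamiliar with BEL, the statements it captures are subject-relation-object triples over biological entities. The Python sketch below assembles a toy network from invented BEL-style statements; the gene name GENE1, the gmod(Me) methylation modifier, and the triples themselves are illustrative placeholders, not content from the paper.

      # Hypothetical BEL-style statements (subject, relationship, object).
      statements = [
          # methylation of a gene lowering its transcript level (illustrative)
          ('g(HGNC:GENE1, gmod(Me))', 'decreases', 'r(HGNC:GENE1)'),
          ('r(HGNC:GENE1)', 'increases', 'p(HGNC:GENE1)'),
          ('p(HGNC:GENE1)', 'association', 'path(MESH:"Multiple Sclerosis")'),
      ]

      # Collect the triples into a toy directed network keyed by subject term.
      network = {}
      for subject, relation, obj in statements:
          network.setdefault(subject, []).append((relation, obj))

      for subject, edges in network.items():
          for relation, obj in edges:
              print(subject, relation, obj)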

  7. Cloud computing approaches for prediction of ligand binding poses and pathways.

    Science.gov (United States)

    Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S

    2015-01-22

    We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.

  8. Hurricane Sandy Economic Impacts Assessment: A Computable General Equilibrium Approach and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Boero, Riccardo [Los Alamos National Laboratory; Edwards, Brian Keith [Los Alamos National Laboratory

    2017-08-07

    Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.

  9. Security approaches in using tablet computers for primary data collection in clinical research.

    Science.gov (United States)

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  10. A dynamic fail-safe approach to the design of computer-based safety systems

    International Nuclear Information System (INIS)

    Smith, I.C.; Miller, M.

    1994-01-01

    For over 30 years AEA Technology has carried out research and development in the field of nuclear instrumentation and protection systems. Throughout the course of this extensive period of research and development the dominant theme has been the achievement of fully fail-safe designs. These are defined as designs in which the failure of any single component will result in the unit output reverting to a demand for trip action status. At an early stage it was recognized that the use of dynamic rather than static logic could ease the difficulties inherent in achieving a fail-safe design. The first dynamic logic systems coupled logic elements magnetically. The paper outlines the evolution from these early concepts of a dynamic fail-safe approach to the design of computer-based safety systems. Details are given of collaboration between AEA Technology and Duke Power Co. to mount an ISAT™ demonstration at Duke's Oconee Nuclear Power Station
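
    The dynamic-logic principle described above, that a healthy channel must keep changing state so that any stuck-at failure is read as a demand for trip action, can be sketched in a few lines. This toy Python illustration is only a conceptual aid and bears no relation to the actual ISAT implementation.

      # A healthy channel toggles every cycle; a failed one sticks at a level.
      def dynamic_output(healthy: bool, clock: int) -> int:
          return clock % 2 if healthy else 0

      # Fail-safe interpretation: absence of alternation demands a trip.
      def demands_trip(samples: list) -> bool:
          return len(set(samples)) < 2

      history = [dynamic_output(True, t) for t in range(8)]
      print(demands_trip(history))    # False: the channel is alive
      stuck = [dynamic_output(False, t) for t in range(8)]
      print(demands_trip(stuck))      # True: stuck output reverts to a trip demand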

  11. A convolutional neural network approach to calibrating the rotation axis for X-ray computed tomography.

    Science.gov (United States)

    Yang, Xiaogang; De Carlo, Francesco; Phatak, Charudatta; Gürsoy, Doğa

    2017-03-01

    This paper presents an algorithm to calibrate the center-of-rotation for X-ray tomography by using a machine learning approach, the Convolutional Neural Network (CNN). The algorithm shows excellent accuracy from the evaluation of synthetic data with various noise ratios. It is further validated with experimental data of four different shale samples measured at the Advanced Photon Source and at the Swiss Light Source. The results are as good as those determined by visual inspection and show better robustness than conventional methods. CNN also has great potential for reducing or removing other artifacts caused by instrument instability, detector non-linearity, etc. An open-source toolbox, which integrates the CNN methods described in this paper, is freely available through GitHub at tomography/xlearn and can be easily integrated into existing computational pipelines available at various synchrotron facilities. Source code, documentation and information on how to contribute are also provided.
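
    The published method ships as the xlearn toolbox; the sketch below only illustrates the general shape of such a classifier in Keras, with invented layer sizes and placeholder data standing in for patches reconstructed with correct versus deliberately shifted rotation axes.

      import numpy as np
      import tensorflow as tf

      # Tiny CNN scoring a reconstructed slice patch as well-centered or not;
      # the architecture is illustrative, not the one used by xlearn.
      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(64, 64, 1)),
          tf.keras.layers.Conv2D(16, 3, activation='relu'),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(32, 3, activation='relu'),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(1, activation='sigmoid'),
      ])
      model.compile(optimizer='adam', loss='binary_crossentropy')

      # Placeholder training data; in practice, patches reconstructed with the
      # correct center (label 1) and with shifted centers (label 0) would be used.
      x = np.random.rand(32, 64, 64, 1).astype('float32')
      y = np.random.randint(0, 2, size=(32, 1))
      model.fit(x, y, epochs=1, verbose=0)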

  12. Bridging computational approaches to speech production: The semantic–lexical–auditory–motor model (SLAM)

    Science.gov (United States)

    Hickok, Gregory

    2017-01-01

    Speech production is studied from both psycholinguistic and motor-control perspectives, with little interaction between the approaches. We assessed the explanatory value of integrating psycholinguistic and motor-control concepts for theories of speech production. By augmenting a popular psycholinguistic model of lexical retrieval with a motor-control-inspired architecture, we created a new computational model to explain speech errors in the context of aphasia. Comparing the model fits to picture-naming data from 255 aphasic patients, we found that our new model improves fits for a theoretically predictable subtype of aphasia: conduction. We discovered that the improved fits for this group were a result of strong auditory-lexical feedback activation, combined with weaker auditory-motor feedforward activation, leading to increased competition from phonologically related neighbors during lexical selection. We discuss the implications of our findings with respect to other extant models of lexical retrieval. PMID:26223468

  13. Systems approach to modeling the Token Bucket algorithm in computer networks

    Directory of Open Access Journals (Sweden)

    Ahmed N. U.

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
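
    To make the policing mechanism concrete, here is a minimal Python token bucket; the rate, capacity, and packet size are illustrative, and the paper's dynamic multiplexor model is not reproduced.

      import time

      class TokenBucket:
          """Tokens accrue at `rate` per second up to `capacity`; a packet
          conforms if enough tokens are available when it arrives."""

          def __init__(self, rate, capacity):
              self.rate = rate
              self.capacity = capacity
              self.tokens = capacity
              self.last = time.monotonic()

          def conforms(self, packet_bits):
              now = time.monotonic()
              self.tokens = min(self.capacity,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= packet_bits:
                  self.tokens -= packet_bits
                  return True       # packet passes the policer
              return False          # packet is dropped or marked

      bucket = TokenBucket(rate=1_000_000, capacity=8_000)  # 1 Mb/s, 8 kb burst
      print(bucket.conforms(1500 * 8))                      # one 1500-byte packet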

  14. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque® drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  15. Inpainting approaches to fill in detector gaps in phase contrast computed tomography

    Science.gov (United States)

    Brun, F.; Delogu, P.; Longo, R.; Dreossi, D.; Rigon, L.

    2018-01-01

    Photon counting semiconductor detectors in radiation imaging present attractive properties, such as high efficiency, low noise, and energy sensitivity. The very complex electronics limits the sensitive area of current devices to a few square cm. This disadvantage is often compensated by tiling a larger matrix with an adequate number of detector units, but this usually results in non-negligible insensitive gaps between two adjacent modules. When considering the case of Computed Tomography (CT), these gaps lead to degraded reconstructed images with severe streak and ring artifacts. This work presents two digital image processing solutions to fill in these gaps for the specific case of synchrotron radiation x-ray parallel beam phase contrast CT. Although not demonstrated here with experimental data, other CT modalities, such as spectral and cone-beam CT and other geometries, might benefit from the presented approaches.
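
    The simplest such gap-filling strategy, 1-D linear interpolation across the dead columns of the sinogram, can be sketched directly. This is a generic illustration with placeholder data, not necessarily either of the two solutions evaluated in the paper.

      import numpy as np

      def inpaint_gaps(sinogram, gap_cols):
          """Fill dead-module columns by linear interpolation along the
          detector axis, row by row (i.e., per projection angle)."""
          filled = sinogram.astype(float).copy()
          cols = np.arange(sinogram.shape[1])
          good = ~np.isin(cols, gap_cols)
          for row in filled:
              row[~good] = np.interp(cols[~good], cols[good], row[good])
          return filled

      sino = np.random.rand(180, 256)               # placeholder projections
      sino[:, 120:128] = 0                          # simulate an inter-module gap
      repaired = inpaint_gaps(sino, np.arange(120, 128))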

  16. Screening of photosynthetic pigments for herbicidal activity with a new computational molecular approach.

    Science.gov (United States)

    Krishnaraj, R Navanietha; Chandran, Saravanan; Pal, Parimal; Berchmans, Sheela

    2013-12-01

    There is immense interest among researchers in identifying new herbicides that are effective against weeds without harming the environment. In this work, photosynthetic pigments are used as the ligands to predict their herbicidal activity. The enzyme 5-enolpyruvylshikimate-3-phosphate (EPSP) synthase is a good target for herbicides. Homology modeling of the target enzyme is done using Modeller 9.11 and the model is validated. Docking studies were performed with the AutoDock Vina algorithm to predict the binding of the natural pigments β-carotene, chlorophyll a, chlorophyll b, phycoerythrin and phycocyanin to the target. β-carotene, phycoerythrin and phycocyanin have higher binding energies, indicating the herbicidal activity of these pigments. This work reports a procedure to screen herbicides with a computational molecular approach. These pigments may serve as potential bioherbicides in the future.
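
    Docking of this kind is typically driven from the AutoDock Vina command line; a hedged sketch of one such invocation is shown below, with placeholder file names and an assumed search box, since the study's actual grid parameters are not given here.

      import subprocess

      # Receptor: homology model of EPSP synthase; ligand: a prepared pigment.
      # All file names and box coordinates are placeholders.
      subprocess.run([
          'vina',
          '--receptor', 'epsp_synthase.pdbqt',
          '--ligand', 'beta_carotene.pdbqt',
          '--center_x', '10.0', '--center_y', '12.5', '--center_z', '-3.0',
          '--size_x', '24', '--size_y', '24', '--size_z', '24',
          '--exhaustiveness', '8',
          '--out', 'beta_carotene_docked.pdbqt',
      ], check=True)   # poses and binding affinities land in the --out file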

  17. Computational Approach to Annotating Variants of Unknown Significance in Clinical Next Generation Sequencing.

    Science.gov (United States)

    Schulz, Wade L; Tormey, Christopher A; Torres, Richard

    2015-01-01

    Next generation sequencing (NGS) has become a common technology in the clinical laboratory, particularly for the analysis of malignant neoplasms. However, most mutations identified by NGS are variants of unknown clinical significance (VOUS). Although the approach to define these variants differs by institution, software algorithms that predict variant effect on protein function may be used. However, these algorithms commonly generate conflicting results, potentially adding uncertainty to interpretation. In this review, we examine several computational tools used to predict whether a variant has clinical significance. In addition to describing the role of these tools in clinical diagnostics, we assess their efficacy in analyzing known pathogenic and benign variants in hematologic malignancies. Copyright © by the American Society for Clinical Pathology (ASCP).

  18. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential sanitary effects related to electromagnetic fields exposure raise public concerns, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.

  19. Multiproxy approach revealing climate and cultural changes during the last 26kyrs in south-central Chile

    Science.gov (United States)

    Abarzua, Ana M.; Jarpa, Leonora; Martel, Alejandra; Vega, Rodrigo; Pino, Mario

    2010-05-01

    Multiproxy approach from Purén-Lumaco Valley (38°S) describes the paleoenvironmental history of south-central Chile during the Last Glacial Maximum (LGM). Three sediment cores and several AMS 14C dates were used to build complete pollen, diatom, chironomid, and sedimentological records demonstrating the existence of a large, non-profundal paleolake between 25 and 20 kyr BP. Evidence includes laminated silty-clay sediments (lacustrine rhythmites) associated with the presence of the mineral siderite (FeCO3), besides biological proxies such as Fragilaria construens and Stauroforma inermes (planktonic diatoms) and Dicrotendipes sp. and the Tanytarsini tribe (littoral chironomids). The pollen assemblage reveals the first glacial refuge of Araucaria araucana forests in the lowlands during the LGM. The lake drained abruptly into a swamp/bog at 12 kyr BP and was colonized by Myrtaceae wet forest; this evidence suggests a dry/warm climate period in the early Holocene of south-central Chile. Later, the sediments indicate variable lacustrine levels and an increase in charcoal particles associated with current climatic conditions. The pollen spectrum, dominated by Myrtaceae and Nothofagus, contrasts with a strongly disturbed present-day landscape. Today the Purén-Lumaco valley constitutes a complex peat-bog system dominated by exotic grasses and forest species (Triticum aestivum, Pinus radiata and Eucalyptus spp.). Archaeological records in the area document human occupation from ca. 7 kyr BP. The most notable archaeological feature of the valley is the kuel, a Mapuche earthen mound; the presence and extension of almost 300 kuel in the valley reflect the social and economic development of the area, and partly explain why the region was the major area of resistance against the Spanish colonizers during the XVI-XVII centuries. The archaeological findings also reveal the presence of maize pollen (Zea mays) in the inhabitants' food consumption. The influence of climate and human impact in

  20. Computational approaches to screen candidate ligands with anti- Parkinson's activity using R programming.

    Science.gov (United States)

    Jayadeepa, R M; Niveditha, M S

    2012-01-01

    It is estimated that by 2050 over 100 million people will be affected by Parkinson's disease (PD). We propose various computational approaches to screen suitable candidate ligands with anti-Parkinson's activity from phytochemicals. Five different types of dopamine receptors have been identified in the brain, D1-D5. Dopamine receptor D3 was selected as the target receptor. The D3 receptor exists in areas of the brain outside the basal ganglia, such as the limbic system, and thus may play a role in the cognitive and emotional changes noted in Parkinson's disease. A ligand library of 100 molecules with reported anti-Parkinson's activity was collected through a literature survey. Nature is the best combinatorial chemist and possibly has answers to all diseases of mankind; the failure of some synthetic drugs and their side effects have prompted many researchers to return to ancient healing methods that use herbal medicines. Hence, candidate ligands with anti-Parkinson's activity were selected from herbal sources through the literature survey. Lipinski's rules were applied to screen suitable molecules for the study; the resulting 88 molecules were energy minimized and subjected to docking using AutoDock Vina. The top eleven molecules were ranked according to the docking scores generated by AutoDock Vina. The commercial drug ropinirole was docked in the same way and compared with the scores of the 11 phytochemicals, and the screened molecules were subjected to toxicity analysis to verify the toxic properties of the phytochemicals. R programming was applied to remove bias from the top eleven molecules. Using cluster analysis and a confusion matrix, two phytochemicals, rosmarinic acid and ginkgolide A, were computationally selected for further studies on Parkinson's disease.
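
    The de-biasing step described above pairs a clustering of the docking scores with a confusion matrix against an independent criterion. A minimal scikit-learn sketch of that idea follows (the paper used R); all scores and labels are invented placeholders.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import confusion_matrix

      # Placeholder docking scores (kcal/mol) for eleven top-ranked molecules.
      scores = np.array([-9.1, -8.8, -8.7, -8.5, -8.4, -8.2,
                         -8.0, -7.9, -7.8, -7.7, -7.6])
      clusters = KMeans(n_clusters=2, n_init=10, random_state=0) \
          .fit_predict(scores.reshape(-1, 1))   # cluster labels are arbitrary 0/1

      # Cross-tabulate cluster membership against an independent binary call,
      # e.g. passing the toxicity screen (placeholder values).
      tox_pass = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0])
      print(confusion_matrix(tox_pass, clusters))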

  1. Computer Assisted REhabilitation (CARE) Lab: A novel approach towards Pediatric Rehabilitation 2.0.

    Science.gov (United States)

    Olivieri, Ivana; Meriggi, Paolo; Fedeli, Cristina; Brazzoli, Elena; Castagna, Anna; Roidi, Marina Luisa Rodocanachi; Angelini, Lucia

    2018-01-01

    Pediatric Rehabilitation therapists have always worked using a variety of off-the-shelf or custom-made objects and devices, more recently including computer based systems. These Information and Communication Technology (ICT) solutions vary widely in complexity, from easy-to-use interactive videogame consoles originally intended for entertainment purposes to sophisticated systems specifically developed for rehabilitation. This paper describes the principles underlying an innovative "Pediatric Rehabilitation 2.0" approach, based on the combination of suitable ICT solutions and traditional rehabilitation, which has been progressively refined while building up and using a computer-assisted rehabilitation laboratory. These principles are thus summarized in the acronym EPIQ, to account for the terms Ecological, Personalized, Interactive and Quantitative. The paper also presents the laboratory, which has been designed to meet the children's rehabilitation needs and to empower therapists in their work. The laboratory is equipped with commercial hardware and specially developed software called VITAMIN: a virtual reality platform for motor and cognitive rehabilitation.

  2. A six step approach for developing computer based assessment in medical education.

    Science.gov (United States)

    Hassanien, Mohammed Ahmed; Al-Hayani, Abdulmoneam; Abu-Kamer, Rasha; Almazrooa, Adnan

    2013-01-01

    Assessment, which entails the systematic evaluation of student learning, is an integral part of any educational process. Computer-based assessment (CBA) techniques provide a valuable resource to students seeking to evaluate their academic progress through instantaneous, personalized feedback. CBA reduces examination, grading and reviewing workloads and facilitates training. This paper describes a six-step approach for developing CBA in higher education and evaluates student perceptions of computer-based summative assessment at the College of Medicine, King Abdulaziz University. A set of questionnaires was distributed to 341 third-year medical students (161 female and 180 male) immediately after examinations in order to assess the adequacy of the system for the exam program. The respondents expressed high satisfaction with this first Saudi experience of CBA for final examinations. However, about 50% of them would have preferred a pilot CBA before its formal application, and many did not recommend its use for future examinations. Both male and female respondents reported that the range of advantages offered by CBA outweighed the disadvantages. Further studies are required to monitor the extended employment of CBA technology for larger classes and for a variety of subjects at universities.

  3. An ordinal approach to computing with words and the preference-aversion model

    DEFF Research Database (Denmark)

    Franco de los Rios, Camilo Andres; Rodríguez, J. Tinguaro; Montero, Javier

    2014-01-01

    Computing with words (CWW) explores the brain's ability to handle and evaluate perceptions through language, i.e., by means of the linguistic representation of information and knowledge. On the other hand, standard preference structures examine decision problems through the decomposition of the preference predicate into the simpler situations of strict preference, indifference and incomparability. Hence, following the distinctive cognitive/neurological features for perceiving positive and negative stimuli in separate regions of the brain, we consider two separate and opposite poles of preference and aversion, and obtain an extended preference structure named the Preference-Aversion (P-A) structure. In this way, examining the meaning of words under an ordinal scale and using CWW's methodology, we are able to formulate the P-A model under a simple and purely linguistic approach to decision making.

  4. Computational identification of binding energy hot spots in protein-RNA complexes using an ensemble approach.

    Science.gov (United States)

    Pan, Yuliang; Wang, Zixiang; Zhan, Weihua; Deng, Lei

    2018-05-01

    Identifying RNA-binding residues, especially energetically favored hot spots, can provide valuable clues for understanding the mechanisms and functional importance of protein-RNA interactions. Yet, the limited availability of experimentally recognized energy hot spots in protein-RNA crystal structures makes it difficult to develop empirical identification approaches. Computational prediction of RNA-binding hot spot residues is still in its infancy. Here, we describe a computational method, PrabHot (Prediction of protein-RNA binding hot spots), that can effectively detect hot spot residues on protein-RNA binding interfaces using an ensemble of conceptually different machine learning classifiers. Residue interaction network features and new solvent exposure characteristics are combined and selected for classification with the Boruta algorithm. In particular, two new reference datasets (benchmark and independent) have been generated containing 107 hot spots from 47 known protein-RNA complex structures. In 10-fold cross-validation on the training dataset, PrabHot achieves promising performance with an AUC of 0.86 and a sensitivity of 0.78, significantly better than the pioneering RNA-binding hot spot prediction method HotSPRing. We also demonstrate the capability of our method on the independent test dataset, where it again performs competitively. The PrabHot webserver is freely available at http://denglab.org/PrabHot/. leideng@csu.edu.cn. Supplementary data are available at Bioinformatics online.
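
    The ensemble idea is straightforward to sketch: several conceptually different classifiers vote on each interface residue. The scikit-learn example below is a generic stand-in with placeholder features and labels; PrabHot's actual classifiers, features, and Boruta selection step are described in the paper.

      import numpy as np
      from sklearn.ensemble import (RandomForestClassifier,
                                    GradientBoostingClassifier, VotingClassifier)
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((107, 20))        # placeholder residue descriptors
      y = rng.integers(0, 2, 107)      # 1 = hot spot, 0 = not (placeholders)

      ensemble = VotingClassifier(
          estimators=[('rf', RandomForestClassifier(n_estimators=200, random_state=0)),
                      ('gb', GradientBoostingClassifier(random_state=0)),
                      ('lr', LogisticRegression(max_iter=1000))],
          voting='soft')               # average the predicted probabilities
      print(cross_val_score(ensemble, X, y, cv=10, scoring='roc_auc').mean())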

  5. A Brief Review of Computer-Assisted Approaches to Rational Design of Peptide Vaccines

    Directory of Open Access Journals (Sweden)

    Ashesh Nandy

    2016-05-01

    Full Text Available The growing incidence of new viral diseases and increasingly frequent viral epidemics have strained therapeutic and preventive measures; the high mutability of viral genes puts additional strain on developmental efforts. Given the high cost and time requirements for developing new drugs, vaccines remain a viable alternative, but there too the traditional techniques of live-attenuated or inactivated vaccines carry the danger of allergenic and other adverse reactions. Peptide vaccines have, over the last several years, come to be seen as more appropriate alternatives: they are economically affordable, require less development time, and hold the promise of multi-valent dosages. Developments in bioinformatics, proteomics, immunogenomics, structural biology and other sciences have spurred the growth of vaccinomics, in which computer-assisted approaches serve to identify suitable peptide targets for the eventual development of vaccines. In this mini-review we give a brief overview of some of the recent trends in computer-assisted vaccine development, with emphasis on the primary selection procedures for probable peptide candidates.

  6. A simplified computational fluid-dynamic approach to the oxidizer injector design in hybrid rockets

    Science.gov (United States)

    Di Martino, Giuseppe D.; Malgieri, Paolo; Carmicino, Carmine; Savino, Raffaele

    2016-12-01

    Fuel regression rate in hybrid rockets is non-negligibly affected by the oxidizer injection pattern. In this paper, a simplified computational approach developed in an attempt to optimize the oxidizer injector design is discussed. Numerical simulations of the thermo-fluid-dynamic field in a hybrid rocket are carried out with a commercial solver to investigate several injection configurations, with the aim of increasing the fuel regression rate and minimizing consumption unevenness, while still favoring the establishment of flow recirculation at the motor head end; such recirculation, generated with an axial nozzle injector, has been demonstrated to promote combustion stability as well as higher efficiency and regression rate. All computations have been performed on the configuration of a lab-scale hybrid rocket motor available at the propulsion laboratory of the University of Naples under typical operating conditions. After a preliminary comparison between the two baseline limiting cases of an axial subsonic nozzle injector and uniform injection through the prechamber, a parametric analysis has been carried out by varying the oxidizer jet flow divergence angle, as well as the grain port diameter and the oxidizer mass flux, to study the effect of flow divergence on the heat transfer distribution over the fuel surface. Some experimental firing test data are presented and, under the hypothesis that fuel regression rate and surface heat flux are proportional, the measured fuel consumption axial profiles are compared with the predicted surface heat flux, showing fairly good agreement, which validates the employed design approach. Finally, an optimized injector design is proposed.

  7. A computational design approach for virtual screening of peptide interactions across K+ channel families

    Directory of Open Access Journals (Sweden)

    Craig A. Doupnik

    2015-01-01

    Full Text Available Ion channels represent a large family of membrane proteins, many of which are well-established targets in pharmacotherapy. The 'druggability' of heteromeric channels comprised of different subunits remains obscure, due largely to a lack of channel-specific probes necessary to delineate their therapeutic potential in vivo. Our initial studies reported here investigated the family of inwardly rectifying potassium (Kir) channels, given the availability of high-resolution crystal structures for the eukaryotic constitutively active Kir2.2 channel. We describe a 'limited' homology modeling approach that can yield chimeric Kir channels having an outer vestibule structure representing nearly any known vertebrate or invertebrate channel. These computationally derived channel structures were tested in silico for docking to NMR structures of tertiapin (TPN), a 21-amino-acid peptide found in bee venom. TPN is a highly selective and potent blocker of the epithelial rat Kir1.1 channel, but does not block human or zebrafish Kir1.1 channel isoforms. Our Kir1.1 channel-TPN docking experiments recapitulated published in vitro findings for TPN-sensitive and TPN-insensitive channels. Additionally, in silico site-directed mutagenesis identified 'hot spots' within the channel outer vestibule that mediate energetically favorable docking scores and correlate with sites previously identified by in vitro thermodynamic mutant-cycle analysis. These 'proof-of-principle' results establish a framework for virtual screening of re-engineered peptide toxins for interactions with computationally derived Kir channels that currently lack channel-specific blockers. When coupled with electrophysiological validation, this virtual screening approach may accelerate the drug discovery process and can be readily applied to other ion channel families where high-resolution structures are available.

  8. Neural computational modeling reveals a major role of corticospinal gating of central oscillations in the generation of essential tremor

    Directory of Open Access Journals (Sweden)

    Hong-en Qu

    2017-01-01

    Full Text Available Essential tremor, also referred to as familial tremor, is an autosomal dominant genetic disease and the most common movement disorder. It typically involves a postural and motor tremor of the hands, head or other part of the body. Essential tremor is driven by a central oscillation signal in the brain. However, the corticospinal mechanisms involved in the generation of essential tremor are unclear. Therefore, in this study, we used a neural computational model that includes both monosynaptic and multisynaptic corticospinal pathways interacting with a propriospinal neuronal network. A virtual arm model is driven by the central oscillation signal to simulate tremor activity behavior. Cortical descending commands are classified as alpha or gamma through monosynaptic or multisynaptic corticospinal pathways, which converge respectively on alpha or gamma motoneurons in the spinal cord. Several scenarios are evaluated based on the central oscillation signal passing down to the spinal motoneurons via each descending pathway. The simulated behaviors are compared with clinical essential tremor characteristics to identify the corticospinal pathways responsible for transmitting the central oscillation signal. A propriospinal neuron with strong cortical inhibition performs a gating function in the generation of essential tremor. Our results indicate that the propriospinal neuronal network is essential for relaying the central oscillation signal and the production of essential tremor.

  10. X-Ray Computed Tomography Reveals the Response of Root System Architecture to Soil Texture

    Science.gov (United States)

    Rogers, Eric D.; Monaenkova, Daria; Mijar, Medhavinee; Goldman, Daniel I.

    2016-01-01

    Root system architecture (RSA) impacts plant fitness and crop yield by facilitating efficient nutrient and water uptake from the soil. A better understanding of the effects of soil on RSA could improve crop productivity by matching roots to their soil environment. We used x-ray computed tomography to perform a detailed three-dimensional quantification of changes in rice (Oryza sativa) RSA in response to the physical properties of a granular substrate. We characterized the RSA of eight rice cultivars in five different growth substrates and determined that RSA is the result of interactions between genotype and growth environment. We identified cultivar-specific changes in RSA in response to changing growth substrate texture. The cultivar Azucena exhibited low RSA plasticity in all growth substrates, whereas cultivar Bala root depth was a function of soil hardness. Our imaging techniques provide a framework to study RSA in different growth environments, the results of which can be used to improve root traits with agronomic potential. PMID:27208237

  11. Studies to reveal the nature of interactions between catalase and curcumin using computational methods and optical techniques.

    Science.gov (United States)

    Mofidi Najjar, Fayezeh; Ghadari, Rahim; Yousefi, Reza; Safari, Naser; Sheikhhasani, Vahid; Sheibani, Nader; Moosavi-Movahedi, Ali Akbar

    2017-02-01

    Curcumin is an important antioxidant compound, and is widely reported as an effective component for reducing complications of many diseases. However, the detailed mechanisms of its activity remain poorly understood. We found that curcumin can significantly increase the catalase activity of BLC (bovine liver catalase). The mechanism of curcumin action was investigated using a computational method. We suggest that curcumin may activate BLC by modifying the bottleneck of its narrow channel. Molecular dynamics simulation data showed that placing curcumin on the enzyme structure can increase the size of the bottleneck in the narrow channel of BLC and more readily allow access of substrate to the active site. Because of the increased distance between the bottleneck amino acids in the presence of curcumin, the substrate entrance space increased from 250 Å³ to 440 Å³. In addition, an increase in the intrinsic fluorescence emission of BLC in the presence of curcumin demonstrated changes in the tertiary structure of catalase and the possibility of reduced quenching. We also used circular dichroism (CD) spectropolarimetry to determine how curcumin may alter the enzyme secondary structure. Catalase spectra in the presence of various concentrations of curcumin showed an increase in α-helix content. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. A Novel Interaction Between the TLR7 and a Colchicine Derivative Revealed Through a Computational and Experimental Study

    Directory of Open Access Journals (Sweden)

    Francesco Gentile

    2018-02-01

    Full Text Available The Toll-Like Receptor 7 (TLR7) is an endosomal membrane receptor involved in the innate immune response. Its best-known small-molecule activators are imidazoquinoline derivatives such as imiquimod (R-837) and resiquimod (R-848). Recently, an interaction between R-837 and the colchicine binding site of tubulin was reported. To investigate the possibility of an interaction between structural analogues of colchicine and the TLR7, a recent computational model of the dimeric form of the TLR7 receptor was used to evaluate a possible interaction with a colchicine derivative called CR42-24, which is active as a tubulin polymerization inhibitor. The estimated binding energies of this molecule with respect to the TLR7 receptor were comparable to those of known binders reported in a previous study. Binding to the TLR7 was further assessed by introducing genetic modifications into the TLR7 gene in cancer cell lines and exposing them to the compound. A negative shift of the IC50 value in terms of cell growth was observed in cell lines carrying the mutated TLR7 gene. The reported study suggests a possible interaction between TLR7 and a colchicine derivative, which can be explored for the rational design of new drugs acting on this receptor by using a colchicine scaffold for additional modifications.

  13. A computational approach to achieve situational awareness from limited observations of a complex system

    Science.gov (United States)

    Sherwin, Jason

    human activities. Nevertheless, since it is not constrained by computational details, the study of situational awareness provides a unique opportunity to approach complex tasks of operation from an analytical perspective. In other words, with SA, we get to see how humans observe, recognize and react to complex systems on which they exert some control. Reconciling this perspective on complexity with complex systems research, it might be possible to further our understanding of complex phenomena if we can probe the anatomical mechanisms by which we, as humans, do it naturally. At this unique intersection of two disciplines, a hybrid approach is needed. So in this work, we propose just such an approach. In particular, this research proposes a computational approach to the situational awareness (SA) of complex systems. Here we propose to implement certain aspects of situational awareness via a biologically-inspired machine-learning technique called Hierarchical Temporal Memory (HTM). In doing so, we will use either simulated or actual data to create and to test computational implementations of situational awareness. This will be tested in two example contexts, one being more complex than the other. The ultimate goal of this research is to demonstrate a possible approach to analyzing and understanding complex systems. By using HTM and carefully developing techniques to analyze the SA formed from data, it is believed that this goal can be obtained.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  16. Linked functional network abnormalities during intrinsic and extrinsic activity in schizophrenia as revealed by a data-fusion approach.

    Science.gov (United States)

    Hashimoto, Ryu-Ichiro; Itahashi, Takashi; Okada, Rieko; Hasegawa, Sayaka; Tani, Masayuki; Kato, Nobumasa; Mimura, Masaru

    2018-01-01

    Abnormalities in functional brain networks in schizophrenia have been studied by examining intrinsic and extrinsic brain activity under various experimental paradigms. However, the identified patterns of abnormal functional connectivity (FC) vary depending on the adopted paradigms. Thus, it is unclear whether and how these patterns are inter-related. In order to assess relationships between abnormal patterns of FC during intrinsic activity and those during extrinsic activity, we adopted a data-fusion approach and applied partial least square (PLS) analyses to FC datasets from 25 patients with chronic schizophrenia and 25 age- and sex-matched normal controls. For the input to the PLS analyses, we generated a pair of FC maps during the resting state (REST) and the auditory deviance response (ADR) from each participant using the common seed region in the left middle temporal gyrus, which is a focus of activity associated with auditory verbal hallucinations (AVHs). PLS correlation (PLS-C) analysis revealed that patients with schizophrenia have significantly lower loadings of a component containing positive FCs in default-mode network regions during REST and a component containing positive FCs in the auditory and attention-related networks during ADR. Specifically, loadings of the REST component were significantly correlated with the severities of positive symptoms and AVH in patients with schizophrenia. The co-occurrence of such altered FC patterns during REST and ADR was replicated using PLS regression, wherein FC patterns during REST are modeled to predict patterns during ADR. These findings provide an integrative understanding of altered FCs during intrinsic and extrinsic activity underlying core schizophrenia symptoms.
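
    At its core, PLS correlation extracts paired components that maximize the covariance between the two FC datasets, which reduces to a singular value decomposition of their cross-covariance. A minimal numpy sketch with placeholder matrix sizes, not the study's actual data dimensions:

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((50, 400))   # REST FC features, subjects x connections
      Y = rng.standard_normal((50, 400))   # ADR FC features (placeholder sizes)

      # PLS-C: SVD of the cross-covariance of the column-centered blocks.
      Xc, Yc = X - X.mean(0), Y - Y.mean(0)
      U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)

      # Subject-wise loadings on the first paired component; group differences
      # in such loadings are what the reported analysis tests.
      rest_loadings = Xc @ U[:, 0]
      adr_loadings = Yc @ Vt[0]
      print(np.corrcoef(rest_loadings, adr_loadings)[0, 1])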

  17. Tensor Analysis Reveals Distinct Population Structure that Parallels the Different Computational Roles of Areas M1 and V1.

    Science.gov (United States)

    Seely, Jeffrey S; Kaufman, Matthew T; Ryu, Stephen I; Shenoy, Krishna V; Cunningham, John P; Churchland, Mark M

    2016-11-01

    Cortical firing rates frequently display elaborate and heterogeneous temporal structure. One often wishes to compute quantitative summaries of such structure (a basic example is the frequency spectrum) and compare them with model-based predictions. The advent of large-scale population recordings affords the opportunity to do so in new ways, with the hope of distinguishing between potential explanations for why responses vary with time. We introduce a method that assesses a basic but previously unexplored form of population-level structure: when data contain responses across multiple neurons, conditions, and times, they are naturally expressed as a third-order tensor. We examined tensor structure for multiple datasets from primary visual cortex (V1) and primary motor cortex (M1). All V1 datasets were 'simplest' (there were relatively few degrees of freedom) along the neuron mode, while all M1 datasets were simplest along the condition mode. These differences could not be inferred from surface-level response features. Formal considerations suggest why tensor structure might differ across modes. For idealized linear models, structure is simplest across the neuron mode when responses reflect external variables, and simplest across the condition mode when responses reflect population dynamics. This same pattern was present for existing models that seek to explain motor cortex responses. Critically, only dynamical models displayed tensor structure that agreed with the empirical M1 data. These results illustrate that tensor structure is a basic feature of the data. For M1 the tensor structure was compatible with only a subset of existing models.
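
    The notion of a mode being 'simplest' can be made concrete by unfolding the neurons x conditions x times tensor along each mode and comparing how concentrated the singular value spectrum is. The numpy sketch below uses random placeholder data and a participation-ratio summary; it illustrates the idea rather than the paper's exact analysis.

      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.standard_normal((100, 36, 50))   # neurons x conditions x times

      def unfold(tensor, mode):
          """Mode-n unfolding: move `mode` to the front, flatten the rest."""
          return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

      for name, mode in [('neuron', 0), ('condition', 1)]:
          s = np.linalg.svd(unfold(data, mode), compute_uv=False)
          lam = s ** 2
          effective_rank = lam.sum() ** 2 / (lam ** 2).sum()  # participation ratio
          print(name, 'mode effective rank ~ %.1f' % effective_rank)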

  19. Effects of clinically relevant MPL mutations in the transmembrane domain revealed at the atomic level through computational modeling.

    Science.gov (United States)

    Lee, Tai-Sung; Kantarjian, Hagop; Ma, Wanlong; Yeh, Chen-Hsiung; Giles, Francis; Albitar, Maher

    2011-01-01

    Mutations in the thrombopoietin receptor (MPL) may activate relevant pathways and lead to chronic myeloproliferative neoplasms (MPNs). The mechanisms of MPL activation remain elusive because of a lack of experimental structures. Modern computational biology techniques were utilized to explore the mechanisms of MPL protein activation due to various mutations. Transmembrane (TM) domain predictions, homology modeling, ab initio protein structure prediction, and molecular dynamics (MD) simulations were used to build structural dynamic models of wild-type and four clinically observed mutants of MPL. The simulation results suggest that S505 and W515 are important in keeping the TM domain in its correct position within the membrane. Mutations at either of these two positions cause movement of the TM domain, altering the conformation of the nearby intracellular domain in unexpected ways, and may cause the unwanted constitutive activation of MPL's kinase partner, JAK2. Our findings represent the first full-scale molecular dynamics simulations of the wild-type and clinically observed mutants of the MPL protein, a critical element of the MPL-JAK2-STAT signaling pathway. In contrast to usual explanations for the activation mechanism that are based on the relative translational movement between rigid domains of MPL, our results suggest that mutations within the TM region could result in conformational changes including tilt and rotation (azimuthal) angles along the membrane axis. Such changes may significantly alter the conformation of the adjacent and intrinsically flexible intracellular domain. Hence, caution should be exercised when interpreting experimental evidence based on rigid models of cytokine receptors or similar systems.

  20. Insights into cellulase-lignin non-specific binding revealed by computational redesign of the surface of green fluorescent protein.

    Science.gov (United States)

    Haarmeyer, Carolyn N; Smith, Matthew D; Chundawat, Shishir P S; Sammond, Deanne; Whitehead, Timothy A

    2017-04-01

    lignin-binding cellulases by either rational design or by computational screening of genomic databases. Biotechnol. Bioeng. 2017;114: 740-750. © 2016 Wiley Periodicals, Inc.

  1. Efficient Machine Learning Approach for Optimizing Scientific Computing Applications on Emerging HPC Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Kamesh [Old Dominion Univ., Norfolk, VA (United States)

    2017-05-01

    the parallel implementation challenges of such irregular applications on different HPC architectures. In particular, we use supervised learning to predict the computation structure and use it to address the control-flow and memory-access irregularities in the parallel implementation of such applications on GPUs, Xeon Phis, and heterogeneous architectures composed of multi-core CPUs with GPUs or Xeon Phis. We use numerical simulation of charged particle beam dynamics as a motivating example throughout the dissertation to present our new approach, though it should be equally applicable to a wide range of irregular applications. The machine learning approach presented here uses predictive analytics and forecasting techniques to adaptively model and track the irregular memory access pattern at each time step of the simulation in order to anticipate the future memory access pattern. Access pattern forecasts can then be used to formulate optimization decisions during application execution, improving the performance of the application at a future time step based on observations from earlier time steps. In heterogeneous architectures, forecasts can also be used to improve the memory performance and resource utilization of all the processing units to deliver good aggregate performance. We used these optimization techniques and the anticipation strategy to design a cache-aware, memory-efficient parallel algorithm to address the irregularities in the parallel implementation of charged particle beam dynamics simulation on different HPC architectures. Experimental results using a diverse mix of HPC architectures show that our approach of using an anticipation strategy is effective in maximizing data reuse, ensuring workload balance, minimizing branch and memory divergence, and improving resource utilization.

  2. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches

    KAUST Repository

    Jiang, Hanlun

    2016-12-06

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.
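
    The MSM step of such a pipeline reduces, in its simplest form, to counting transitions between discretized conformational states at a chosen lag time and row-normalizing. A minimal numpy sketch with a placeholder discrete trajectory (the state definitions, lag time, and state count are illustrative):

      import numpy as np

      def estimate_msm(dtraj, n_states, lag):
          """Row-stochastic transition matrix from a discretized trajectory."""
          counts = np.zeros((n_states, n_states))
          for i, j in zip(dtraj[:-lag], dtraj[lag:]):
              counts[i, j] += 1
          return counts / counts.sum(axis=1, keepdims=True)

      rng = np.random.default_rng(0)
      dtraj = rng.integers(0, 3, size=10_000)      # placeholder state sequence
      T = estimate_msm(dtraj, n_states=3, lag=10)

      # Stationary distribution: the left eigenvector of T with eigenvalue 1.
      w, v = np.linalg.eig(T.T)
      pi = np.real(v[:, np.argmax(np.real(w))])
      print(T.round(3), (pi / pi.sum()).round(3))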

  3. Signal Amplification Technique (SAT): an approach for improving resolution and reducing image noise in computed tomography

    International Nuclear Information System (INIS)

    Phelps, M.E.; Huang, S.C.; Hoffman, E.J.; Plummer, D.; Carson, R.

    1981-01-01

    Spatial resolution improvements in computed tomography (CT) have been limited by the large and unique error propagation properties of this technique. The desire to provide maximum image resolution has resulted in the use of reconstruction filter functions designed to produce tomographic images with resolution as close as possible to the intrinsic detector resolution. Thus, many CT systems produce images with excessive noise, with the system resolution determined by the detector resolution rather than the reconstruction algorithm. CT is a rigorous mathematical technique which applies increasing amplification to increasing spatial frequencies in the measured data. This mathematical approach to spatial frequency amplification cannot distinguish between signal and noise, and therefore both are amplified equally. We report here a method in which tomographic resolution is improved by using very small detectors to selectively amplify the signal and not the noise. Thus, this approach is referred to as the signal amplification technique (SAT). SAT can provide dramatic improvements in image resolution without increases in statistical noise or dose, because increases in the cutoff frequency of the reconstruction algorithm are not required to improve image resolution. Alternatively, in cases where image counts are low, such as in rapid dynamic or receptor studies, statistical noise can be reduced by lowering the cutoff frequency while still maintaining the best possible image resolution. A possible system design for a positron CT system with SAT is described.
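
    A hedged numerical illustration of the noise-amplification point above (not the SAT method itself): filtered backprojection weights each spatial frequency f by |f| up to a cutoff, so raising the cutoff amplifies signal and noise alike. The projection, noise level, and cutoffs below are invented.

    ```python
    import numpy as np

    def ramp_filter(projection, cutoff):
        """Apply a |f| ramp filter, zeroed above `cutoff` (cycles/sample)."""
        f = np.fft.fftfreq(len(projection))
        H = np.abs(f) * (np.abs(f) <= cutoff)
        return np.real(np.fft.ifft(np.fft.fft(projection) * H))

    rng = np.random.default_rng(2)
    x = np.linspace(-1, 1, 512)
    signal = np.exp(-x**2 / 0.02)          # idealized projection of a small object
    noisy = signal + 0.05 * rng.standard_normal(x.size)

    for cutoff in (0.1, 0.25, 0.5):
        resid = ramp_filter(noisy, cutoff) - ramp_filter(signal, cutoff)
        print(f"cutoff={cutoff:4.2f}  noise std after filtering: {resid.std():.4f}")
    # The residual noise grows with the cutoff frequency: the filter cannot
    # separate signal from noise, which motivates amplifying the signal at
    # the detector instead.
    ```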

  4. A unified approach to linking experimental, statistical and computational analysis of spike train data.

    Directory of Open Access Journals (Sweden)

    Liang Meng

    Full Text Available A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state-space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that were consistent with observed spiking data (and included the model that generated the data), but we had yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons and correctly identify the dynamics of a slow, intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach--linking statistical, computational, and experimental neuroscience--provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.
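
    A minimal bootstrap particle filter sketch of the estimation pattern described above (illustrative only, not the paper's biophysical model): track a slowly varying latent current from noisy observations by propagating particles, weighting them by likelihood, and resampling. The dynamics and noise levels are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    T, N = 200, 1000                                     # time steps, particles
    true_x = np.cumsum(0.05 * rng.standard_normal(T))    # slow latent drift
    y = true_x + 0.5 * rng.standard_normal(T)            # noisy observations

    particles = np.zeros(N)
    estimates = []
    for t in range(T):
        particles += 0.05 * rng.standard_normal(N)       # propagate: random walk
        w = np.exp(-0.5 * ((y[t] - particles) / 0.5) ** 2)  # Gaussian likelihood
        w /= w.sum()
        estimates.append(np.dot(w, particles))           # posterior mean
        idx = rng.choice(N, size=N, p=w)                 # multinomial resampling
        particles = particles[idx]

    rmse = np.sqrt(np.mean((np.array(estimates) - true_x) ** 2))
    print(f"filter RMSE: {rmse:.3f}")
    ```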

  5. A computational modeling approach for the characterization of mechanical properties of 3D alginate tissue scaffolds.

    Science.gov (United States)

    Nair, K; Yan, K C; Sun, W

    2008-01-01

    Scaffold-guided tissue engineering is an innovative approach wherein cells are seeded onto biocompatible and biodegradable materials to form 3-dimensional (3D) constructs that, when implanted in the body, facilitate the regeneration of tissue. Tissue scaffolds act as an artificial extracellular matrix providing an environment conducive to tissue growth. Characterization of scaffold properties is necessary to better understand the underlying processes involved in controlling cell behavior and the formation of functional tissue. We report a computational modeling approach to characterize the mechanical properties of a 3D gel-like biomaterial, specifically a 3D alginate scaffold encapsulated with cells. Alginate's inherent nonlinearity, and the variations arising from minute changes in its concentration and viscosity, make experimental evaluation of its mechanical properties a challenging and time-consuming task. We developed an in silico model to determine the stress-strain relationship of alginate-based scaffolds from experimental data. In particular, we compared the Ogden hyperelastic model to other hyperelastic material models and determined that this model was the most suitable for characterizing the nonlinear behavior of alginate. We further propose a mathematical model that represents the alginate material constants in the Ogden model as a function of concentration and viscosity. This study demonstrates the model's capability to predict the mechanical properties of 3D alginate scaffolds.
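
    A hedged sketch of the model-fitting step named in this record: fit a one-term incompressible Ogden model to uniaxial stress-stretch data with SciPy. The data here are synthetic with invented constants; the paper fits experimental alginate measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ogden_uniaxial(lam, mu, alpha):
        """Nominal (engineering) stress of a one-term incompressible Ogden
        model under uniaxial loading at principal stretch lam."""
        return mu * (lam ** (alpha - 1.0) - lam ** (-alpha / 2.0 - 1.0))

    lam = np.linspace(1.0, 1.4, 20)                  # applied stretch levels
    mu_true, alpha_true = 30.0, 8.0                  # kPa, dimensionless
    stress = ogden_uniaxial(lam, mu_true, alpha_true)
    stress += 2.0 * np.random.default_rng(4).standard_normal(lam.size)

    (mu_fit, alpha_fit), _ = curve_fit(ogden_uniaxial, lam, stress, p0=(10.0, 2.0))
    print(f"mu = {mu_fit:.1f} kPa, alpha = {alpha_fit:.1f}")
    ```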

  6. A computational approach for thermomechanical fatigue life prediction of dissimilarly welded superheater tubes

    Energy Technology Data Exchange (ETDEWEB)

    Krishnasamy, Ram-Kumar; Seifert, Thomas; Siegele, Dieter [Fraunhofer-Institut fuer Werkstoffmechanik (IWM), Freiburg im Breisgau (Germany)

    2010-07-01

    In this paper a computational approach for fatigue life prediction of dissimilarly welded superheater tubes is presented and applied to a dissimilar weld between tubes made of the nickel-base alloy Alloy 617 and the 12% chromium steel VM12. The approach comprises the calculation of the residual stresses in the welded tubes with a multi-pass dissimilar welding simulation, the relaxation of the residual stresses in a post weld heat treatment (PWHT) simulation, and the fatigue life prediction using the remaining residual stresses as the initial condition. A cyclic viscoplasticity model is used to calculate the transient stresses and strains under thermocyclic service loadings. The fatigue life is predicted with a damage parameter which is based on fracture mechanics. The adjustable parameters of the model are determined based on LCF and TMF experiments. The simulations show that the residual stresses that remain after PWHT relax further in the first loading cycles. The predicted fatigue lives depend on the residual stresses and, thus, on the choice of the loading cycle in which the damage parameter is evaluated. If the first loading cycle, where residual stresses are still present, is considered, lower fatigue lives are predicted compared to predictions considering loading cycles with relaxed residual stresses. (orig.)
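
    As a generic fracture-mechanics illustration of how a damage-based life prediction works (explicitly not the authors' specific damage parameter), the sketch below integrates a Paris-type crack growth law cycle by cycle until a critical crack length is reached; all parameter values are invented.

    ```python
    import numpy as np

    # da/dN = C * (dK)^m, with dK = Y * dSigma * sqrt(pi * a).
    C, m_exp, Y = 1e-11, 3.0, 1.12     # C in m/cycle per (MPa*sqrt(m))^m
    d_sigma = 180.0                    # stress range per cycle [MPa]
    a, a_crit = 0.2e-3, 5.0e-3         # initial and critical crack length [m]

    cycles = 0
    while a < a_crit:
        dK = Y * d_sigma * np.sqrt(np.pi * a)   # stress intensity range
        a += C * dK ** m_exp                    # crack growth in this cycle
        cycles += 1

    print(f"predicted life: {cycles} cycles")
    # Residual stresses raise the effective stress range in the first cycles,
    # so evaluating the damage there predicts shorter lives, matching the
    # trend reported above.
    ```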

  7. Fuzzy logic approach to SWOT analysis for economics tasks and example of its computer realization

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2016-07-01

    Full Text Available The article discusses a widely used classic method of analysis, forecasting and decision-making for various economic problems: SWOT analysis. As is well known, it is a qualitative, multi-criteria comparison of the degree of Strength, Weakness, Opportunity and Threat for different kinds of risks, for forecasting market developments, and for assessing the status and development prospects of enterprises, regions, economic sectors, territories, etc. It can also be successfully applied to the evaluation and analysis of various project management tasks - investment, innovation, marketing, development, design, bringing products to market, and so on. However, in practical competitive market and economic conditions there are various uncertainties, ambiguities and kinds of vagueness that make the use of SWOT analysis in its classical sense insufficiently justified and ineffective. For this case, the authors propose to use a fuzzy logic approach and the theory of fuzzy sets for a more adequate representation and post-processing of the assessments in the SWOT analysis. In particular, the mathematical formulation of the corresponding task and the main approaches to its solution are briefly presented. Examples of suitable computer calculations in the specialized software Fuzicalc, for processing and operating on fuzzy input data, are also given. Finally, considerations for the interpretation of the results are presented.
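
    A hedged sketch of the fuzzy-SWOT idea above (not the article's Fuzicalc workflow): expert scores become triangular fuzzy numbers (low, mode, high), each SWOT category is aggregated as a weighted fuzzy sum, and centroid defuzzification yields a crisp score per category. The weights and scores are invented.

    ```python
    import numpy as np

    def tri_weighted_sum(factors):
        """factors: list of (weight, (low, mode, high)) fuzzy scores."""
        agg = np.zeros(3)
        for w, tri in factors:
            agg += w * np.asarray(tri, dtype=float)
        return agg

    def centroid(tri):
        """Centroid defuzzification of a triangular fuzzy number."""
        return tri.sum() / 3.0

    swot = {
        "Strengths":     [(0.6, (6, 8, 9)), (0.4, (5, 7, 8))],
        "Weaknesses":    [(0.7, (4, 6, 8)), (0.3, (2, 3, 5))],
        "Opportunities": [(1.0, (5, 7, 9))],
        "Threats":       [(0.5, (3, 5, 7)), (0.5, (6, 7, 9))],
    }
    for category, factors in swot.items():
        print(f"{category:13s} -> {centroid(tri_weighted_sum(factors)):.2f}")
    ```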

  8. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches.

    Science.gov (United States)

    Jiang, Hanlun; Zhu, Lizhe; Héliou, Amélie; Gao, Xin; Bernauer, Julie; Huang, Xuhui

    2017-01-01

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.

  9. Computational study of the fibril organization of polyglutamine repeats reveals a common motif identified in beta-helices.

    Science.gov (United States)

    Zanuy, David; Gunasekaran, Kannan; Lesk, Arthur M; Nussinov, Ruth

    2006-04-21

    The formation of fibril aggregates by long polyglutamine sequences is assumed to play a major role in neurodegenerative diseases such as Huntington's disease. Here, we model peptides rich in glutamine through a series of molecular dynamics simulations. Starting from a rigid nanotube-like conformation, we have obtained a new conformational template that shares structural features of a tubular helix and of a beta-helix conformational organization. Our new model can be described as a super-helical arrangement of flat beta-sheet segments linked by planar turns or bends. Interestingly, our comprehensive analysis of the Protein Data Bank reveals that this is a common motif in beta-helices (termed beta-bend), although it had not previously been identified. The motif is based on the alternation of beta-sheet and helical conformation as the protein sequence is followed from the N to the C terminus (beta-alpha(R)-beta-polyPro-beta). We further identify this motif in the ssNMR structure of the protofibril of the amyloidogenic peptide Abeta(1-40). The recurrence of the beta-bend suggests a general mode of connecting long parallel beta-sheet segments that would allow the growth of partially ordered fibril structures. The design allows the peptide backbone to change direction with a minimal loss of main-chain hydrogen bonds. The identification of a coherent organization beyond that of the beta-sheet segments in different folds rich in parallel beta-sheets suggests a higher degree of ordered structure in protein fibrils, in agreement with their low solubility and dense molecular packing.

  10. Mathematics revealed

    CERN Document Server

    Berman, Elizabeth

    1979-01-01

    Mathematics Revealed focuses on the principles, processes, operations, and exercises in mathematics. The book first offers information on whole numbers, fractions, and decimals and percents. Discussions focus on measuring length, percent, decimals, numbers as products, addition and subtraction of fractions, mixed numbers and ratios, division of fractions, addition, subtraction, multiplication, and division. The text then examines positive and negative numbers and powers and computation. Topics include division and averages, multiplication, ratios, and measurements, scientific notation and estimation...

  11. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing, together with the increasing trend from single-processor to parallel computer architectures, has driven the adoption of parallel computing. To benefit from parallel computing power, parallel algorithms are usually defined that can be mapped and executed

  12. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available Supplier evaluation and selection is among the most important logistics decisions and has been addressed extensively in supply chain management. The same logistics decision is also important in freight transportation, since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches have been proposed to explore this latter problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion and state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both the behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine the importance weights of the different criteria in supplier evaluation from a buyer's point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and to rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs, while balancing the capacity of potential suppliers to ensure market-clearing mechanisms. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.
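
    A hedged sketch of the two-stage idea in this record, simplified to a single buyer with invented numbers: score suppliers by a weighted multi-criteria utility, then choose a cost-utility optimal allocation subject to supplier capacities via linear programming.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    criteria_weights = np.array([0.5, 0.3, 0.2])    # price, quality, reliability
    # Rows = suppliers, columns = criteria scores normalized to [0, 1].
    scores = np.array([[0.9, 0.6, 0.7],
                       [0.5, 0.9, 0.8],
                       [0.7, 0.7, 0.9]])
    utility = scores @ criteria_weights             # stage 1: rank suppliers
    print("supplier ranking (best first):", np.argsort(utility)[::-1])

    ship_cost = np.array([4.0, 6.0, 5.0])           # per-unit shipping cost
    capacity = np.array([40.0, 60.0, 50.0])         # supplier capacities
    demand = 100.0

    # Stage 2: minimize shipping cost minus a utility bonus, clearing demand.
    c = ship_cost - 10.0 * utility                  # invented trade-off weight
    res = linprog(c, A_eq=np.ones((1, 3)), b_eq=[demand],
                  bounds=[(0, cap) for cap in capacity])
    print("allocation per supplier:", res.x)
    ```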

  13. Electronic nature of zwitterionic alkali metal methanides, silanides and germanides - a combined experimental and computational approach.

    Science.gov (United States)

    Li, H; Aquino, A J A; Cordes, D B; Hase, W L; Krempner, C

    2017-02-01

    Zwitterionic group 14 complexes of the alkali metals of formula [C(SiMe2OCH2CH2OMe)3M] (M-1), [Si(SiMe2OCH2CH2OMe)3M] (M-2) and [Ge(SiMe2OCH2CH2OMe)3M] (M-3), where M = Li, Na or K, have been prepared and structurally characterized, and their electronic nature was investigated by computational methods. Zwitterions M-2 and M-3 were synthesized via reactions of [Si(SiMe2OCH2CH2OMe)4] (2) and [Ge(SiMe2OCH2CH2OMe)4] (3) with MOBut (M = Li, Na or K), respectively, in almost quantitative yields, while M-1 were prepared by deprotonation of [HC(SiMe2OCH2CH2OMe)3] (1) with LiBut, NaCH2Ph and KCH2Ph, respectively. X-ray crystallographic studies and DFT calculations in the gas phase, including calculations of the NPA charges, confirm the zwitterionic nature of these compounds, with the alkali metal cations being rigidly locked and charge-separated from the anion by the internal OCH2CH2OMe donor groups. Natural bond orbital (NBO) analysis and second-order perturbation theory analysis of the NBOs reveal significant hyperconjugative interactions in M-1 through M-3, primarily between the lone pair and the antibonding Si-O orbitals, the extent of which decreases in the order M-1 > M-2 > M-3. The experimental basicities and the calculated gas-phase basicities of M-1 through M-3 reveal the zwitterionic alkali metal methanides M-1 to be significantly stronger bases than the analogous silanides M-2 and germanides M-3.

  14. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...