WorldWideScience

Sample records for computational approach identifies

  1. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, from either serendipitous discoveries or visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, achieving high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating its versatility.
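
    The circular-pattern detection tier lends itself to standard computer-vision tooling. Below is a minimal, hypothetical sketch of that step using OpenCV's circle Hough transform; the file name and all parameter values are illustrative assumptions, not the record's actual pipeline settings.

```python
# Hypothetical sketch: detect ring/arc candidates with the circle Hough
# transform (OpenCV). Parameters are illustrative assumptions.
import cv2
import numpy as np

image = cv2.imread("candidate_lens_cutout.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.medianBlur(image, 5)  # suppress noise before edge detection

circles = cv2.HoughCircles(
    blurred,
    cv2.HOUGH_GRADIENT,
    dp=1.5,        # inverse ratio of accumulator resolution
    minDist=20,    # minimum separation between detected centres
    param1=100,    # upper Canny edge threshold
    param2=30,     # accumulator threshold; lower values find fainter arcs
    minRadius=5,
    maxRadius=50,
)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"ring/arc candidate at ({x}, {y}), radius {r} px")
```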

  2. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria by analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria have been horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome and containing mobility genes that allow them to be integrated into the host genome. In this review, we discuss various pathogenicity-island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.

  3. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard, and their computational complexity makes this research approach difficult using a standard brute-force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored: a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and satisfiability modulo theories (SMT) with corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
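
    Of the three methods, the SMT formulation is the most compact to illustrate. The sketch below encodes the two defining constraints of an identifying code (domination and separation) for a small example graph using the z3 solver's optimization interface; the graph and variable names are illustrative, not the paper's benchmark instances.

```python
# Hypothetical sketch: minimum identifying code of a 4-cycle via z3.
# An identifying code C must dominate every vertex (N[v] intersects C)
# and separate every pair (N[u] cap C != N[v] cap C).
from z3 import Bool, If, Optimize, Or, Sum, is_true, sat

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # the cycle C4

closed = [{v} for v in range(n)]  # closed neighbourhoods N[v]
for u, v in edges:
    closed[u].add(v)
    closed[v].add(u)

c = [Bool(f"c{v}") for v in range(n)]  # c[v]: vertex v is in the code
opt = Optimize()

for v in range(n):  # domination: every vertex sees a code vertex
    opt.add(Or([c[w] for w in closed[v]]))

for u in range(n):  # separation: a code vertex lies in the symmetric
    for v in range(u + 1, n):  # difference of the two neighbourhoods
        opt.add(Or([c[w] for w in closed[u] ^ closed[v]]))

opt.minimize(Sum([If(cv, 1, 0) for cv in c]))
if opt.check() == sat:
    model = opt.model()
    print("minimum identifying code:",
          [v for v in range(n) if is_true(model.evaluate(c[v]))])
```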

  4. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority of these drive tumor progression. We present the results of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  5. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    Science.gov (United States)

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving many kinds of problems and provide promising solutions. Owing to their popularity, these approaches have also been applied to healthcare data to diagnose diseases effectively and obtain better results than traditional approaches. Soft computing approaches can adapt themselves to the problem domain; another strength is a good balance between exploration and exploitation. These aspects make soft computing approaches powerful, reliable, and efficient, and thus well suited to healthcare data. The first objective of this review is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the diseases to which these approaches have been applied. The third objective is to categorize the soft computing approaches used in clinical support systems. The literature shows that a large number of soft computing approaches have been applied to diagnose and predict diseases from healthcare data, including particle swarm optimization, genetic algorithms, artificial neural networks, and support vector machines; a detailed discussion of these approaches is presented in the literature section. This work summarizes the soft computing approaches used in the healthcare domain over the last decade, categorized into five groups by methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems, and case-based systems. The techniques discussed in these categories are summarized in tables, with particular attention to the accuracy rates of the soft computing techniques.

  6. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis, and graphic display in fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration, and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  7. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, followed by examples that illustrate the new procedure.

  8. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it has become clear that mobile cloud computing was established by integrating mobile computing and cloud computing, gaining in both storage space and processing speed. Integrating healthcare applications and services is one of the vast-data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach leads to integrate all of ...

  9. Distributional and Knowledge-Based Approaches for Computing Portuguese Word Similarity

    Directory of Open Access Journals (Sweden)

    Hugo Gonçalo Oliveira

    2018-02-01

    Full Text Available Identifying similar and related words is not only key in natural language understanding but also a suitable task for assessing the quality of computational resources that organise words and meanings of a language, compiled by different means. This paper, which aims to be a reference for those interested in computing word similarity in Portuguese, presents several approaches for this task and is motivated by the recent availability of state-of-the-art distributional models of Portuguese words, which add to several lexical knowledge bases (LKBs) for this language, available for a longer time. The previous resources were exploited to answer word similarity tests, which also became recently available for Portuguese. We conclude that there are several valid approaches for this task, but not one that outperforms all the others in every single test. Distributional models seem to capture relatedness better, while LKBs are better suited for computing genuine similarity, but, in general, better results are obtained when knowledge from different sources is combined.
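
    As a minimal sketch of the distributional side of this task, word similarity can be computed as the cosine between word vectors. The tiny hand-made vectors below are placeholders standing in for a real pre-trained Portuguese model.

```python
# Hypothetical sketch: word similarity as cosine similarity between
# embeddings. The vectors are invented; a real system would load a
# pre-trained Portuguese distributional model.
import numpy as np

embeddings = {
    "carro": np.array([0.9, 0.1, 0.3]),
    "automóvel": np.array([0.85, 0.15, 0.35]),
    "banana": np.array([0.1, 0.9, 0.2]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["carro"], embeddings["automóvel"]))  # high
print(cosine(embeddings["carro"], embeddings["banana"]))     # low
```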

  10. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    Science.gov (United States)

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. Development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models, leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish the activity of materials and can be used to identify the optimal descriptor(s), ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction, and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.

  11. What is computation : An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  12. Cognitive Approaches for Medicine in Cloud Computing.

    Science.gov (United States)

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper will present the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibility of using the meaning-based approach to data description and analysis is proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing aim to support the processes of protecting data against unauthorised takeover, and they serve to enhance data management processes. The outcome of the proposed tasks will be the definition of algorithms for the execution of meaning-based data interpretation processes in secure Cloud Computing. • We propose cognitive methods for data description. • We propose techniques for securing data in Cloud Computing. • The application of cognitive approaches to medicine is described.

  13. Xtalk: a path-based approach for identifying crosstalk between signaling pathways

    Science.gov (United States)

    Tegge, Allison N.; Sharp, Nicholas; Murali, T. M.

    2016-01-01

    Motivation: Cells communicate with their environment via signal transduction pathways. On occasion, the activation of one pathway can produce an effect downstream of another pathway, a phenomenon known as crosstalk. Existing computational methods to discover such pathway pairs rely on simple overlap statistics. Results: We present Xtalk, a path-based approach for identifying pairs of pathways that may crosstalk. Xtalk computes the statistical significance of the average length of multiple short paths that connect receptors in one pathway to the transcription factors in another. By design, Xtalk reports the precise interactions and mechanisms that support the identified crosstalk. We applied Xtalk to signaling pathways in the KEGG and NCI-PID databases. We manually curated a gold standard set of 132 crosstalking pathway pairs and a set of 140 pairs that did not crosstalk, for which Xtalk achieved an area under the receiver operating characteristic curve of 0.65, a 12% improvement over the closest competing approach. The area under the receiver operating characteristic curve varied with the pathway, suggesting that crosstalk should be evaluated on a pathway-by-pathway level. We also analyzed an extended set of 658 pathway pairs in KEGG and a set of more than 7000 pathway pairs in NCI-PID. For the top-ranking pairs, we found substantial support in the literature (81% for KEGG and 78% for NCI-PID). We provide examples of networks computed by Xtalk that accurately recovered known mechanisms of crosstalk. Availability and implementation: The XTALK software is available at http://bioinformatics.cs.vt.edu/~murali/software. Crosstalk networks are available at http://graphspace.org/graphs?tags=2015-bioinformatics-xtalk. Contact: ategge@vt.edu, murali@cs.vt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26400040
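
    The core idea, that shorter-than-expected receptor-to-transcription-factor paths signal crosstalk, can be sketched with a permutation test. The following is an illustrative reimplementation on a random graph, not the published XTALK code, and the node sets are hypothetical.

```python
# Hypothetical sketch of the path-based crosstalk score: average shortest
# path from pathway A's receptors to pathway B's transcription factors,
# compared against a simple permutation null.
import random
import networkx as nx

G = nx.fast_gnp_random_graph(200, 0.05, seed=1)  # stand-in interactome
receptors = [0, 1, 2]   # pathway A receptors (assumed)
tfs = [50, 51]          # pathway B transcription factors (assumed)

def avg_path_len(sources, targets):
    lengths = [nx.shortest_path_length(G, s, t)
               for s in sources for t in targets if nx.has_path(G, s, t)]
    return sum(lengths) / len(lengths) if lengths else float("inf")

observed = avg_path_len(receptors, tfs)

nodes = list(G)
null = []  # null model: random node sets of the same sizes
for _ in range(1000):
    null.append(avg_path_len(random.sample(nodes, len(receptors)),
                             random.sample(nodes, len(tfs))))

p = sum(x <= observed for x in null) / len(null)  # shorter = more crosstalk
print(f"observed avg length {observed:.2f}, empirical p = {p:.3f}")
```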

  14. Error characterization for asynchronous computations: Proxy equation approach

    Science.gov (United States)

    Sallai, Gabriella; Mittal, Ankita; Girimaji, Sharath

    2017-11-01

    Numerical techniques for asynchronous fluid flow simulations are currently under development to enable efficient utilization of massively parallel computers. These numerical approaches attempt to accurately solve the time evolution of transport equations using spatial information at different time levels. The truncation error of asynchronous methods can be divided into two parts: delay-dependent (EA), or asynchronous, error and delay-independent (ES), or synchronous, error. The focus of this study is a specific asynchronous error mitigation technique called the proxy-equation approach. The aim of this study is to examine these errors as a function of the characteristic wavelength of the solution. Mitigation of asynchronous effects requires that the asynchronous error be smaller than the synchronous truncation error. For a simple convection-diffusion equation, proxy-equation error analysis identifies a critical initial wavenumber, λc. At smaller wavenumbers, synchronous errors are larger than asynchronous errors. We examine various approaches to increase the value of λc in order to improve the range of applicability of the proxy-equation approach.

  15. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    Full Text Available The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing cytotoxic T lymphocyte (CTL) vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open source Protein Data File (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.

  16. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Directory of Open Access Journals (Sweden)

    Daniel Durstewitz

    2017-06-01

    Full Text Available The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable the recovery of computational dynamics from neural measurements.
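
    The generative side of the PLRNN state space model is compact enough to sketch. The snippet below simulates latent piecewise-linear dynamics with linear-Gaussian observations; all parameter values are random placeholders, and the paper's EM/Laplace inference procedure is not shown.

```python
# Hypothetical sketch of a PLRNN state space model (generative direction):
# z_t = A z_{t-1} + W max(z_{t-1}, 0) + h + process noise
# x_t = B z_t + observation noise
import numpy as np

rng = np.random.default_rng(0)
dz, dx, T = 3, 10, 500

A = np.diag(rng.uniform(0.6, 0.9, dz))   # diagonal linear part
W = 0.2 * rng.standard_normal((dz, dz))  # piecewise-linear coupling
h = 0.1 * rng.standard_normal(dz)        # bias
B = rng.standard_normal((dx, dz))        # observation matrix

z = np.zeros((T, dz))
x = np.zeros((T, dx))
for t in range(1, T):
    relu = np.maximum(z[t - 1], 0.0)     # the piecewise-linear term
    z[t] = A @ z[t - 1] + W @ relu + h + 0.05 * rng.standard_normal(dz)
    x[t] = B @ z[t] + 0.1 * rng.standard_normal(dx)

print("latent trajectory:", z.shape, "observations:", x.shape)
```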

  17. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  18. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

    Full Text Available People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces relies on the dimensions of a person and her/his operated objects. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
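
    Steps (1) and (2), finding gaps narrower than the user dimension and grouping the obstacles they join, can be sketched with a union-find pass over pairwise obstacle distances. The obstacle shapes and user width below are illustrative assumptions.

```python
# Hypothetical sketch of steps (1)-(2): merge obstacles whose gap is
# too narrow for the user to pass through.
from shapely.geometry import Polygon

user_width = 1.0  # person plus carried object (assumed)
obstacles = [
    Polygon([(0, 0), (1, 0), (1, 1), (0, 1)]),
    Polygon([(1.5, 0), (2.5, 0), (2.5, 1), (1.5, 1)]),  # gap 0.5 < 1.0
    Polygon([(5, 0), (6, 0), (6, 1), (5, 1)]),          # gap 2.5 > 1.0
]

parent = list(range(len(obstacles)))  # union-find forest
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving
        i = parent[i]
    return i

for i in range(len(obstacles)):
    for j in range(i + 1, len(obstacles)):
        if obstacles[i].distance(obstacles[j]) < user_width:
            parent[find(i)] = find(j)  # inaccessible gap: merge groups

groups = {}
for i in range(len(obstacles)):
    groups.setdefault(find(i), []).append(i)
print("obstacle groups:", list(groups.values()))  # [[0, 1], [2]]
```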

  19. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, the three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to get exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method, and exact chatter stability lobes are obtained in only about 10 minutes. Since it is based on the discretization method, the proposed method can be used for different immersion cutting conditions, including low-immersion cutting, and can be directly implemented in the workshop to improve the efficiency of machining parameter selection.

  20. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  1. The node-weighted Steiner tree approach to identify elements of cancer-related signaling pathways.

    Science.gov (United States)

    Sun, Yahui; Ma, Chenkai; Halgamuge, Saman

    2017-12-28

    Cancer constitutes a momentous health burden in our society. Critical information on cancer may be hidden in its signaling pathways. However, even though a large amount of money has been spent on cancer research, some critical information on cancer-related signaling pathways still remains elusive. Hence, new works towards a complete understanding of cancer-related signaling pathways will greatly benefit the prevention, diagnosis, and treatment of cancer. We propose the node-weighted Steiner tree approach to identify important elements of cancer-related signaling pathways at the level of proteins. This new approach has advantages over previous approaches since it is fast in processing large protein-protein interaction networks. We apply this new approach to identify important elements of two well-known cancer-related signaling pathways: PI3K/Akt and MAPK. First, we generate a node-weighted protein-protein interaction network using protein and signaling pathway data. Second, we modify and use two preprocessing techniques and a state-of-the-art Steiner tree algorithm to identify a subnetwork in the generated network. Third, we propose two new metrics to select important elements from this subnetwork. On a commonly used personal computer, this new approach takes less than 2 s to identify the important elements of PI3K/Akt and MAPK signaling pathways in a large node-weighted protein-protein interaction network with 16,843 vertices and 1,736,922 edges. We further analyze and demonstrate the significance of these identified elements to cancer signal transduction by exploring previously reported experimental evidences. Our node-weighted Steiner tree approach is shown to be both fast and effective to identify important elements of cancer-related signaling pathways. Furthermore, it may provide new perspectives into the identification of signaling pathways for other human diseases.
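
    As an illustration of the general idea (not the authors' algorithm), a node-weighted instance can be reduced to an edge-weighted one and handed to networkx's approximate Steiner tree routine. The toy network and the (w(u)+w(v))/2 edge-weight folding are assumptions for the sketch.

```python
# Hypothetical sketch: connect "terminal" proteins in a node-weighted
# PPI network by folding node weights into edge weights and running an
# approximate Steiner tree.
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

node_weight = {"A": 1.0, "B": 5.0, "C": 1.0, "D": 1.0, "E": 1.0}
G = nx.Graph()
for u, v in [("A", "B"), ("B", "C"), ("A", "D"), ("D", "E"), ("E", "C")]:
    # fold node weights into edge weights (one simple heuristic)
    G.add_edge(u, v, weight=(node_weight[u] + node_weight[v]) / 2)

terminals = ["A", "C"]  # e.g. pathway seed proteins
T = steiner_tree(G, terminals, weight="weight")
print("Steiner tree edges:", sorted(T.edges()))  # avoids costly node B
```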

  2. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  3. A multi-criteria decision making approach to identify a vaccine formulation.

    Science.gov (United States)

    Dewé, Walthère; Durand, Christelle; Marion, Sandie; Oostvogels, Lidia; Devaster, Jeanne-Marie; Fourneau, Marc

    2016-01-01

    This article illustrates the use of a multi-criteria decision making approach, based on desirability functions, to identify an appropriate adjuvant composition for an influenza vaccine to be used in the elderly. The proposed adjuvant system contained two main elements: monophosphoryl lipid and α-tocopherol with squalene in an oil/water emulsion. The objective was to elicit a stronger immune response while maintaining an acceptable reactogenicity and safety profile. The study design, the statistical models, the choice of the desirability functions, the computation of the overall desirability index, and the assessment of the robustness of the ranking are all detailed in this manuscript.
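
    A minimal sketch of the desirability-function machinery follows: each response is mapped onto [0, 1] by a Derringer-type desirability function, and the results are combined into an overall index via a geometric mean. The candidate formulations, response values, and ranges are invented for illustration.

```python
# Hypothetical sketch of a desirability-based ranking of formulations.
import numpy as np

def d_larger_is_better(y, lo, hi, s=1.0):
    """Desirability rising from 0 at lo to 1 at hi (e.g. immunogenicity)."""
    return np.clip((y - lo) / (hi - lo), 0, 1) ** s

def d_smaller_is_better(y, lo, hi, s=1.0):
    """Desirability falling from 1 at lo to 0 at hi (e.g. reactogenicity)."""
    return np.clip((hi - y) / (hi - lo), 0, 1) ** s

# Candidate formulations: (immune response, reactogenicity score)
candidates = {"F1": (8.1, 2.0), "F2": (6.5, 1.2), "F3": (9.0, 4.5)}

for name, (immuno, reacto) in candidates.items():
    d1 = d_larger_is_better(immuno, lo=5.0, hi=10.0)
    d2 = d_smaller_is_better(reacto, lo=1.0, hi=5.0)
    D = np.sqrt(d1 * d2)  # overall desirability index (geometric mean)
    print(f"{name}: D = {D:.3f}")
```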

  4. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  5. Identify and rank key factors influencing the adoption of cloud computing for E-Health

    Directory of Open Access Journals (Sweden)

    Javad Shukuhy

    2015-02-01

    Full Text Available Cloud computing, as a new technology with Internet infrastructure and new approaches, can bring significant benefits to providing medical services electronically. Applying this technology in E-Health requires consideration of various factors. The main objective of this study is to identify and rank the factors influencing the adoption of E-Health cloud computing. Based on the Technology-Organization-Environment (TOE) framework and the Human-Organization-Technology fit (HOT-fit) model, 16 sub-factors were identified within four major factors. Through a survey of 60 experts, academics, and practitioners in health information technology, and with the help of the fuzzy analytic hierarchy process, these factors and sub-factors were ranked. Given the newness of this study, no prior domestic or international study in the literature has addressed this number of criteria. The results show that when deciding to adopt cloud computing in E-Health, technological, human, organizational, and environmental factors must be considered, in that order.

  6. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy is illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k_eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should (1) be repeatable; (2) be demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems arise in validating computational methods when very few experiments are available (such as for enriched uranium systems whose principal second isotope is ²³⁶U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments that span all possible ranges of reflection, nuclide concentration, moderation ratio, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
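
    The correlation step of validation can be sketched as simple benchmark statistics: compare calculated k_eff values against known-critical experiments and report the bias with a confidence band. The benchmark numbers below are fabricated for illustration only.

```python
# Hypothetical sketch: bias statistics for broad validation. The k_eff
# values are fabricated; real validation uses benchmark experiments.
import numpy as np
from scipy import stats

k_calc = np.array([0.9968, 1.0012, 0.9985, 0.9991, 1.0004, 0.9973])
k_bench = np.ones_like(k_calc)  # critical benchmarks: k_eff = 1 by design

bias = k_calc - k_bench
n = len(bias)
mean_bias = bias.mean()
s = bias.std(ddof=1)

# 95% confidence interval on the mean bias (Student t, n-1 dof).
half_width = stats.t.ppf(0.975, n - 1) * s / np.sqrt(n)
print(f"mean bias = {mean_bias:+.4f} +/- {half_width:.4f} (95% CI)")
```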

  7. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
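
    The SVD step of the described pipeline is straightforward to sketch: decompose a (motif x condition) activity matrix so that each singular component groups motifs with coordinated activity. The matrix below is random placeholder data standing in for MARA-inferred motif activities.

```python
# Hypothetical sketch: SVD of a motif-activity matrix to find independent
# regulatory programs. Data are random placeholders.
import numpy as np

rng = np.random.default_rng(42)
activities = rng.standard_normal((50, 12))  # 50 motifs x 12 conditions

# Center each motif's activity profile, then decompose.
centered = activities - activities.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

var_explained = S**2 / np.sum(S**2)
print("variance explained by first 3 components:", var_explained[:3])
top_motifs = np.argsort(np.abs(U[:, 0]))[::-1][:5]
print("motifs loading most strongly on component 1:", top_motifs)
```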

  8. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  9. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Full Text Available Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using perceptions of the students about the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  10. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement-noise into account and is not restricted to any particular noise-structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems--ranging from systems governed by ordinary differential equations to partial differential equations--and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
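
    In the same spirit (though not the paper's estimator), one can probe practical identifiability by estimating the mutual information between a parameter and a noisy observable with a kNN-based estimator. The toy model y = theta^2 + noise is an assumption for illustration; near-zero information would flag the parameter as practically unidentifiable.

```python
# Hypothetical sketch: information between a parameter and a noisy
# measurement, via scikit-learn's kNN-based MI estimator.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
theta = rng.uniform(-1, 1, size=(5000, 1))               # prior samples
y = theta[:, 0] ** 2 + 0.05 * rng.standard_normal(5000)  # noisy observable

mi = mutual_info_regression(theta, y, n_neighbors=5)
print(f"estimated MI(theta; y) = {mi[0]:.3f} nats")
# Here |theta| is informative but the sign of theta is not identifiable
# from y, which a direct regression fit would not reveal as cleanly.
```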

  11. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  12. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in technology have helped high-throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data on the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze these data to identify informational tags. This need in turn challenges computer scientists to offer matching hardware and software infrastructure while managing varying degrees of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud Computing with parallel computing is certainly expanding its footprint in the life sciences community. Speed, efficiency and cost effectiveness have made Cloud Computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would fit best to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  13. Analytical and computational approaches to define the Aspergillus niger secretome

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Adrian; Butler, Gregory D.; Powlowski, Justin; Panisko, Ellen A.; Baker, Scott E.

    2009-03-01

    We used computational and mass spectrometric approaches to characterize the Aspergillus niger secretome. The 11,200 gene models predicted in the genome of A. niger strain ATCC 1015 were the data source for the analysis. Depending on the computational methods used, 691 to 881 proteins were predicted to be secreted proteins. We cultured A. niger in six different media and analyzed the extracellular proteins produced using mass spectrometry. A total of 222 proteins were identified, with 39 proteins expressed under all six conditions and 74 proteins expressed under only one condition. The secreted proteins identified by mass spectrometry were used to guide the correction of about 20 gene models. Additional analysis focused on extracellular enzymes of interest for biomass processing. Of the 63 glycoside hydrolases predicted to be capable of hydrolyzing cellulose, hemicellulose or pectin, 94% of the exo-acting enzymes and only 18% of the endo-acting enzymes were experimentally detected.

  14. Distributed design approach in persistent identifiers systems

    Science.gov (United States)

    Golodoniuc, Pavel; Car, Nicholas; Klump, Jens

    2017-04-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. To achieve large-scale decentralisation

  15. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking and practice.

  16. Quantum Computing: a Quantum Group Approach

    OpenAIRE

    Wang, Zhenghan

    2013-01-01

    There is compelling theoretical evidence that quantum physics will change the face of information science. Exciting progress has been made during the last two decades towards the building of a large scale quantum computer. A quantum group approach stands out as a promising route to this holy grail, and provides hope that we may have quantum computers in our future.

  17. Targeted intervention: Computational approaches to elucidate and predict relapse in alcoholism.

    Science.gov (United States)

    Heinz, Andreas; Deserno, Lorenz; Zimmermann, Ulrich S; Smolka, Michael N; Beck, Anne; Schlagenhauf, Florian

    2017-05-01

    Alcohol use disorder (AUD), and addiction in general, is characterized by failures of choice resulting in repeated drug intake despite severe negative consequences. Behavioral change is hard to accomplish, and relapse after detoxification is common and can be promoted by consumption of small amounts of alcohol as well as exposure to alcohol-associated cues or stress. While those environmental factors contributing to relapse have long been identified, the underlying psychological and neurobiological mechanisms on which those factors act are to date incompletely understood. Based on the reinforcing effects of drugs of abuse, animal experiments have shown that drug, cue and stress exposure affect Pavlovian and instrumental learning processes, which can increase the salience of drug cues and promote habitual drug intake. In humans, computational approaches can help to quantify changes in key learning mechanisms during the development and maintenance of alcohol dependence, e.g. by using sequential decision making in combination with computational modeling to elucidate individual differences in model-free versus more complex, model-based learning strategies and their neurobiological correlates such as prediction error signaling in fronto-striatal circuits. Computational models can also help to explain how alcohol-associated cues trigger relapse: mechanisms such as Pavlovian-to-instrumental transfer can quantify the degree to which Pavlovian conditioned stimuli facilitate approach behavior, including alcohol seeking and intake. By using generative models of behavioral and neural data, computational approaches can help to quantify individual differences in the psychophysiological mechanisms that underlie the development and maintenance of AUD and thus promote targeted intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
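
    One building block named above, trial-by-trial prediction-error learning, can be sketched in a few lines: a Rescorla-Wagner style learner whose learning rate is the kind of free parameter fit to behavioural or neural data. The cue-reward contingencies are invented.

```python
# Hypothetical sketch: Rescorla-Wagner value learning with prediction
# errors (the signal often related to fronto-striatal activity).
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.2  # learning rate (a free parameter one would fit to data)
p_reward = {"cue_A": 0.8, "cue_B": 0.2}

V = {"cue_A": 0.0, "cue_B": 0.0}  # learned cue values
for trial in range(200):
    cue = rng.choice(list(p_reward))
    reward = float(rng.random() < p_reward[cue])
    delta = reward - V[cue]       # prediction error
    V[cue] += alpha * delta       # value update

print({k: round(v, 2) for k, v in V.items()})  # approaches true p(reward)
```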

  18. Machine learning and computer vision approaches for phenotypic profiling.

    Science.gov (United States)

    Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J

    2017-01-02

    With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach. © 2017 Grys et al.

  19. Computational fluid dynamics a practical approach

    CERN Document Server

    Tu, Jiyuan; Liu, Chaoqun

    2018-01-01

    Computational Fluid Dynamics: A Practical Approach, Third Edition, is an introduction to CFD fundamentals and commercial CFD software to solve engineering problems. The book is designed for a wide variety of engineering students new to CFD, and for practicing engineers learning CFD for the first time. Combining an appropriate level of mathematical background, worked examples, computer screen shots, and step-by-step processes, this book walks the reader through modeling and computing, as well as interpreting CFD results. This new edition has been updated throughout, with new content and improved figures, examples and problems.

  20. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regression), theoretical (i.e. analytical), and simulation (i.e. computer) modeling. The common assumption of all three is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling, and a variant of the computer model are presented. (author). 9 refs, 4 figs

  1. Microarray-based cancer prediction using soft computing approach.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, as they are based on decision rules. Our results demonstrate that very simple models may perform well on cancer molecular prediction, and important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.
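
    The screening step can be sketched simply: rank genes by a discriminative score and form a one-gene threshold rule. The sketch below uses a two-sample t-statistic on synthetic placeholder data rather than the paper's rough-set machinery.

```python
# Hypothetical sketch: screen discriminative genes and build a one-gene
# decision rule. The expression matrix is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_genes, n_samples = 2000, 60
labels = np.array([0] * 30 + [1] * 30)  # e.g. normal vs. tumor
X = rng.standard_normal((n_samples, n_genes))
X[labels == 1, 5] += 2.0                # plant one informative gene

t, _ = stats.ttest_ind(X[labels == 0], X[labels == 1], axis=0)
best = int(np.argmax(np.abs(t)))
print("top-ranked gene index:", best)   # recovers gene 5

# One-gene rule: threshold at the midpoint of the two class means.
thr = (X[labels == 0, best].mean() + X[labels == 1, best].mean()) / 2
pred = (X[:, best] > thr).astype(int)
print("training accuracy of single-gene rule:", (pred == labels).mean())
```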

  2. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be handled similarly, using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).

  3. Computed Tomography Fractional Flow Reserve Can Identify Culprit Lesions in Aortoiliac Occlusive Disease Using Minimally Invasive Techniques.

    Science.gov (United States)

    Ward, Erin P; Shiavazzi, Daniele; Sood, Divya; Marsden, Allison; Lane, John; Owens, Erik; Barleben, Andrew

    2017-01-01

    Currently, the gold standard diagnostic examination for significant aortoiliac lesions is angiography. Fractional flow reserve (FFR) has a growing body of literature in coronary artery disease as a minimally invasive diagnostic procedure. Improvements in numerical hemodynamics have allowed for an accurate and minimally invasive approach to estimating FFR, utilizing cross-sectional imaging. We aim to demonstrate a similar approach to aortoiliac occlusive disease (AIOD). A retrospective review evaluated 7 patients with claudication and cross-sectional imaging showing AIOD. FFR was subsequently measured during conventional angiogram with pull-back pressures in a retrograde fashion. To estimate computed tomography (CT) FFR, CT angiography (CTA) image data were analyzed using the SimVascular software suite to create a computational fluid dynamics model of the aortoiliac system. Inlet flow conditions were derived based on cardiac output, while 3-element Windkessel outlet boundary conditions were optimized to match the expected systolic and diastolic pressures, with outlet resistance distributed based on Murray's law. The data were evaluated with a Student's t-test and receiver operating characteristic curve. All patients had evidence of AIOD on CT and FFR was successfully measured during angiography. The modeled data were found to have high sensitivity and specificity between the measured and CT FFR (P = 0.986, area under the curve = 1). The average difference between the measured and calculated FFRs was 0.136, with a range from 0.03 to 0.30. CT FFR successfully identified aortoiliac lesions with significant pressure drops that were identified with angiographically measured FFR. CT FFR has the potential to provide a minimally invasive approach to identify flow-limiting stenosis for AIOD. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  5. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.

  6. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

    Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high gamma activity (80-150Hz) and the phase of lower frequency rhythms (4-30Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the identified SOZ electrodes by our algorithm were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
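    To make the central feature concrete, here is a minimal sketch of a phase-amplitude PLV of the kind described: the locking between the phase of a low-frequency rhythm and the phase of the high-gamma amplitude envelope. The filter design and the synthetic test signal are assumptions, not the paper's preprocessing pipeline.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def bandpass(x, lo, hi, fs, order=4):
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def pac_plv(ecog, fs, low=(4, 30), high=(80, 150)):
        low_phase = np.angle(hilbert(bandpass(ecog, *low, fs)))
        # Amplitude of high gamma, then the phase of that envelope's
        # fluctuation within the low-frequency band.
        env = np.abs(hilbert(bandpass(ecog, *high, fs)))
        env_phase = np.angle(hilbert(bandpass(env, *low, fs)))
        return np.abs(np.mean(np.exp(1j * (low_phase - env_phase))))

    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    # Synthetic channel: high gamma whose amplitude waxes with an 8 Hz rhythm.
    sig = np.sin(2 * np.pi * 8 * t) + \
          (1 + np.sin(2 * np.pi * 8 * t)) * 0.3 * np.sin(2 * np.pi * 100 * t) + \
          0.1 * np.random.randn(t.size)
    print(pac_plv(sig, fs))   # near 1 for a strongly coupled channel
    ```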

  7. A computer model for identifying security system upgrades

    International Nuclear Information System (INIS)

    Lamont, A.

    1988-01-01

    This paper describes a prototype safeguards analysis tool that automatically identifies system weaknesses against an insider adversary and suggests possible upgrades to improve the probability that the adversary will be detected. The tool is based on this premise: as the adversary acts, he or she creates a set of facts that can be detected by safeguards components. Whenever an adversary's planned set of actions creates a set of facts which the security personnel would consider irregular or unusual, we can improve the security system by implementing safeguards that detect those facts. Therefore, an intelligent computer program can suggest upgrades to the facility if we construct a knowledge base that contains information about: (1) the facts created by each possible adversary action, (2) the facts that each possible safeguard can detect, and (3) groups of facts which will be considered irregular whenever they occur together. The authors describe the structure of the knowledge base and show how the above information can be represented in it. They also describe the procedures that a computer program can use to identify missing or weak safeguards and to suggest upgrades.
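    The three-part knowledge base lends itself to a few lines of code. The sketch below is a toy with invented facts, actions, and safeguard names; it illustrates only the matching step, not the original tool.

    ```python
    # Invented facts, actions, and safeguards for illustration only.
    ACTION_FACTS = {
        "remove_material":      {"portal_alarm", "inventory_discrepancy"},
        "enter_vault_offhours": {"offhours_entry_log"},
    }
    SAFEGUARD_DETECTS = {
        "portal_monitor":  {"portal_alarm"},
        "daily_inventory": {"inventory_discrepancy"},
        "access_review":   {"offhours_entry_log"},
    }
    IRREGULAR_GROUPS = [{"offhours_entry_log", "inventory_discrepancy"}]

    def suggest_upgrades(adversary_plan, installed):
        created = set().union(*(ACTION_FACTS[a] for a in adversary_plan))
        detected = set().union(*(SAFEGUARD_DETECTS[s] for s in installed)) \
            if installed else set()
        for group in IRREGULAR_GROUPS:
            # An irregular fact group the plan creates but no installed
            # safeguard can see: suggest safeguards covering those facts.
            if group <= created and not (group & detected):
                yield sorted(s for s, facts in SAFEGUARD_DETECTS.items()
                             if facts & group)

    plan = ["remove_material", "enter_vault_offhours"]
    print(list(suggest_upgrades(plan, installed=["portal_monitor"])))
    # -> [['access_review', 'daily_inventory']]
    ```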

  8. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  9. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified.

  10. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have become aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop this new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and provide no useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experimental environment. Finally, “Fixed Cost” is identified as the optimal policy under a stable market environment. The case study helps to clarify the workflow of applying the approach and provides valuable decision support to industry.

  11. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  12. Matrine Is Identified as a Novel Macropinocytosis Inducer by a Network Target Approach

    Directory of Open Access Journals (Sweden)

    Bo Zhang

    2018-01-01

    Full Text Available Comprehensively understanding the pharmacological functions of natural products is a key issue to be addressed for the discovery of new drugs. Unlike some single-target drugs, natural products always exert diverse therapeutic effects through acting on a “network” that consists of multiple targets, making it necessary to develop a systematic approach, e.g., network pharmacology, to reveal the pharmacological functions of natural products and infer their mechanisms of action. In this work, to identify the “network target” of a natural product, we perform a functional analysis of matrine, a drug marketed in China and extracted from the medicinal herb Ku-Shen (Radix Sophorae Flavescentis). Here, the network target of matrine was first predicted by drugCIPHER, a genome-wide target prediction method. Based on the network target of matrine, we performed a functional gene set enrichment analysis to computationally identify the potential pharmacological functions of matrine, most of which are supported by literature evidence, including the neurotoxicity and neuropharmacological activities of matrine. Furthermore, computational results demonstrated that matrine has the potential to induce macropinocytosis and to regulate ATP metabolism. Our experimental data revealed that the large vesicles induced by matrine are consistent with the typical characteristics of macropinosomes. Our verification results also suggested that matrine could decrease the cellular ATP level. These findings demonstrate the availability and effectiveness of the network target strategy for identifying the comprehensive pharmacological functions of natural products.

  13. Cultural Distance-Aware Service Recommendation Approach in Mobile Edge Computing

    Directory of Open Access Journals (Sweden)

    Yan Li

    2018-01-01

    Full Text Available In the era of big data, traditional computing systems and paradigms are inefficient and even difficult to use. For high-performance big data processing, mobile edge computing is emerging as a complementary framework to cloud computing. In this new computing architecture, services are provided in close proximity to mobile users by servers at the edge of the network. The traditional collaborative filtering recommendation approach focuses only on the similarity extracted from the rating data, which may lead to an inaccurate expression of user preference. In this paper, we propose a cultural distance-aware service recommendation approach which considers not only the similarity but also the local characteristics and preferences of users. Our approach employs the cultural distance to express user preference and combines it with similarity to predict user ratings and recommend the services with higher ratings. In addition, considering the extreme sparsity of the rating data, missing rating prediction based on collaborative filtering is introduced in our approach. The experimental results based on real-world datasets show that our approach outperforms traditional recommendation approaches in terms of the reliability of recommendation.
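    The prediction idea can be sketched as a user-based collaborative filter whose neighbor weights blend rating similarity with a cultural-distance term. The combination rule, the alpha weight, and the toy data below are assumptions, not the paper's exact formula.

    ```python
    import numpy as np

    def predict(ratings, cult_dist, u, item, alpha=0.7):
        """ratings: users x items array with np.nan for missing entries."""
        mean_u = np.nanmean(ratings[u])
        num, den = 0.0, 0.0
        for v in range(ratings.shape[0]):
            if v == u or np.isnan(ratings[v, item]):
                continue
            both = ~np.isnan(ratings[u]) & ~np.isnan(ratings[v])
            if both.sum() < 2:
                continue
            sim = np.corrcoef(ratings[u, both], ratings[v, both])[0, 1]
            if np.isnan(sim):
                continue
            # Closer cultures weigh more: exp(-d) maps a distance into (0, 1].
            w = alpha * sim + (1 - alpha) * np.exp(-cult_dist[u, v])
            num += w * (ratings[v, item] - np.nanmean(ratings[v]))
            den += abs(w)
        return mean_u + num / den if den else mean_u

    R = np.array([[5, 4, np.nan],
                  [5, 3, 4],
                  [1, 2, 5]], dtype=float)
    D = np.array([[0.0, 0.2, 1.5],
                  [0.2, 0.0, 1.4],
                  [1.5, 1.4, 0.0]])
    print(predict(R, D, u=0, item=2))
    ```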

  14. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  15. A comparative approach to closed-loop computation.

    Science.gov (United States)

    Roth, E; Sponberg, S; Cowan, N J

    2014-04-01

    Neural computation is inescapably closed-loop: the nervous system processes sensory signals to shape motor output, and motor output consequently shapes sensory input. Technological advances have enabled neuroscientists to close, open, and alter feedback loops in a wide range of experimental preparations. The experimental capability of manipulating the topology-that is, how information can flow between subsystems-provides new opportunities to understand the mechanisms and computations underlying behavior. These experiments encompass a spectrum of approaches from fully open-loop, restrained preparations to the fully closed-loop character of free behavior. Control theory and system identification provide a clear computational framework for relating these experimental approaches. We describe recent progress and new directions for translating experiments at one level in this spectrum to predictions at another level. Operating across this spectrum can reveal new understanding of how low-level neural mechanisms relate to high-level function during closed-loop behavior. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Morphology control in polymer blend fibers—a high throughput computing approach

    Science.gov (United States)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS: PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for specific morphology class. We finally demonstrate the potential for time dependent processing conditions to get desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.

  17. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
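    The core computational step, scoring 13-residue windows against a position-specific scoring matrix, can be sketched in a few lines. The mini-alignment, the pseudocount scheme, and the sequences below are illustrative assumptions; the study derived its matrices from validated BCL-xL and MCL-1 binders.

    ```python
    import math
    from collections import Counter

    BH3_LIKE = ["IAQELRRIGDEFN",   # BAD-like, illustrative
                "IARHLAQVGDSMD",   # BAK-like, illustrative
                "IGSKLAAMCDDFD"]   # BID-like, illustrative
    AA = "ACDEFGHIKLMNPQRSTVWY"

    def build_pssm(seqs, pseudo=1.0):
        """Log-odds PSSM versus a uniform background, with pseudocounts."""
        pssm = []
        for i in range(len(seqs[0])):
            counts = Counter(s[i] for s in seqs)
            col = {a: math.log(((counts[a] + pseudo) /
                                (len(seqs) + pseudo * len(AA))) * len(AA))
                   for a in AA}
            pssm.append(col)
        return pssm

    def best_window(protein, pssm):
        """Score every 13-residue window; return (score, start) of the best."""
        L = len(pssm)
        return max((sum(pssm[i][protein[j + i]] for i in range(L)), j)
                   for j in range(len(protein) - L + 1))

    pssm = build_pssm(BH3_LIKE)
    print(best_window("MSDQEIAQELRRIGDEFNETYTRRVFA", pssm))
    ```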

  18. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami, with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  19. Computational neuropharmacology: dynamical approaches in drug discovery.

    Science.gov (United States)

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.

  20. Parallel approach to identifying the well-test interpretation model using a neurocomputer

    Science.gov (United States)

    May, Edward A., Jr.; Dagli, Cihan H.

    1996-03-01

    The well test is one of the primary diagnostic and predictive tools used in the analysis of oil and gas wells. In these tests, a pressure recording device is placed in the well and the pressure response is recorded over time under controlled flow conditions. The interpreted results are indicators of the well's ability to flow and the damage done to the formation surrounding the wellbore during drilling and completion. The results are used for many purposes, including reservoir modeling (simulation) and economic forecasting. The first step in the analysis is the identification of the Well-Test Interpretation (WTI) model, which determines the appropriate solution method. Mis-identification of the WTI model occurs due to noise and non-ideal reservoir conditions. Previous studies have shown that a feed-forward neural network using the backpropagation algorithm can be used to identify the WTI model. One of the drawbacks to this approach is, however, training time, which can run into days of CPU time on personal computers. In this paper a similar neural network is applied using both a personal computer and a neurocomputer. Input data processing, network design, and performance are discussed and compared. The results show that the neurocomputer greatly eases the burden of training and allows the network to outperform a similar network running on a personal computer.

  1. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in the digital processing of two-dimensional computed tomography images is to identify the contours of the component elements. This paper presents the joint work of specialists in medicine and in applied mathematics and computer science on developing new algorithms and methods for medical 2D and 3D imaging.

  2. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... Business Information by Computer Sciences Corporation and Its Identified Subcontractors AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: EPA has authorized its contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access information which has...

  3. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    textabstractThis thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  4. Global identifiability of linear compartmental models--a computer algebra algorithm.

    Science.gov (United States)

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation of biological dynamic models. Global identifiability is however difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability) is presented, which combines the topological transfer function method with the Buchberger algorithm, to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general structure compartmental models from general multi input-multi output experiments. Examples of usage of GLOBI to analyze a priori global identifiability of some complex biological compartmental models are provided.
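    The underlying algebra is easy to demonstrate even without GLOBI: the transfer function coefficients form an "exhaustive summary" whose polynomial equations can be handed to the Buchberger algorithm. Below is a minimal two-compartment example using sympy's Gröbner basis routine; the model and symbol names are assumptions chosen for illustration.

    ```python
    import sympy as sp

    k01, k12, k21 = sp.symbols("k01 k12 k21", positive=True)
    phi1, phi2, phi3 = sp.symbols("phi1 phi2 phi3", positive=True)

    # Two-compartment model, input and observation in compartment 1:
    # H(s) = (s + k12) / (s**2 + (k01 + k12 + k21)*s + k01*k12).
    # Equating coefficients to the observed phi's gives the summary:
    eqs = [k12 - phi1,
           k01 + k12 + k21 - phi2,
           k01 * k12 - phi3]

    # A lexicographic Groebner basis "triangularizes" the system; each
    # unknown being solved uniquely means global identifiability.
    print(sp.groebner(eqs, k01, k12, k21, order="lex"))
    print(sp.solve(eqs, [k01, k12, k21], dict=True))
    ```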

  5. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words.

    Science.gov (United States)

    Wang, Bingkun; Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the spurt of online user-generated contents on web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicator of sentiment, sentiment words which convey positive and negative polarity are quite instrumental for sentiment analysis. However, most of the existing methods for identifying polarity of sentiment words only consider the positive and negative polarity by the Cantor set, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve the performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment words datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.

  6. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  7. A Computer-Aided FPS-Oriented Approach for Construction Briefing

    Institute of Scientific and Technical Information of China (English)

    Xiaochun Luo; Qiping Shen

    2008-01-01

    Function performance specification (FPS) is one of the value management (VM) techniques developed for the explicit statement of optimum product definition. This technique is widely used in software engineering and the manufacturing industry, and has proved successful for product definition tasks. This paper describes an FPS-oriented approach for construction briefing, which is critical to the successful delivery of construction projects. Three techniques, i.e., the function analysis system technique, shared space, and a computer-aided toolkit, are incorporated into the proposed approach. A computer-aided toolkit is developed to facilitate the implementation of FPS in the briefing processes. This approach can facilitate the systematic and efficient identification, clarification, and representation of client requirements in trial runs. The limitations of the approach and future research work are also discussed at the end of the paper.

  8. Introducing Computational Approaches in Intermediate Mechanics

    Science.gov (United States)

    Cook, David M.

    2006-12-01

    In the winter of 2003, we at Lawrence University moved Lagrangian mechanics and rigid body dynamics from a required sophomore course to an elective junior/senior course, freeing 40% of the time for computational approaches to ordinary differential equations (trajectory problems, the large amplitude pendulum, non-linear dynamics); evaluation of integrals (finding centers of mass and moment of inertia tensors, calculating gravitational potentials for various sources); and finding eigenvalues and eigenvectors of matrices (diagonalizing the moment of inertia tensor, finding principal axes), and to generating graphical displays of computed results. Further, students begin to use LaTeX to prepare some of their submitted problem solutions. Placed in the middle of the sophomore year, this course provides the background that permits faculty members as appropriate to assign computer-based exercises in subsequent courses. Further, students are encouraged to use our Computational Physics Laboratory on their own initiative whenever that use seems appropriate. (Curricular development supported in part by the W. M. Keck Foundation, the National Science Foundation, and Lawrence University.)

  9. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach are highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  10. A unified approach to linking experimental, statistical and computational analysis of spike train data.

    Directory of Open Access Journals (Sweden)

    Liang Meng

    Full Text Available A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that were consistent with observed spiking data (and included the model that generated the data), but have yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons, and correctly identify the dynamics of a slow, intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach--linking statistical, computational, and experimental neuroscience--provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.
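    For readers unfamiliar with sequential Monte Carlo, here is a stripped-down bootstrap particle filter on a generic scalar state-space model. It conveys only the propagate/weight/resample cycle; the conductance-based neuron models and priors of the study are not reproduced, and all settings below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def particle_filter(obs, n=500, q=0.1, r=0.5):
        parts = rng.normal(0.0, 1.0, n)              # initial particle cloud
        est = []
        for y in obs:
            parts = parts + rng.normal(0.0, q, n)    # propagate: random walk
            logw = -0.5 * ((y - parts) / r) ** 2     # Gaussian observation model
            w = np.exp(logw - logw.max())
            w /= w.sum()
            est.append(np.sum(w * parts))            # posterior-mean estimate
            parts = parts[rng.choice(n, n, p=w)]     # multinomial resampling
        return np.array(est)

    true_state = np.cumsum(rng.normal(0.0, 0.1, 200))
    obs = true_state + rng.normal(0.0, 0.5, 200)
    print(np.abs(particle_filter(obs) - true_state).mean())  # small tracking error
    ```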

  11. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach.

    Science.gov (United States)

    Jordanous, Anna; Keller, Bill

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing a suitable model of how creative behaviour emerges that is based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating achievements of this work and directions for further research.

  12. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  13. Computer-oriented approach to fault-tree construction

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1976-11-01

    A methodology for systematically constructing fault trees for general complex systems is developed and applied, via the Computer Automated Tree (CAT) program, to several systems. A means of representing component behavior by decision tables is presented. The method developed allows the modeling of components with various combinations of electrical, fluid and mechanical inputs and outputs. Each component can have multiple internal failure mechanisms which combine with the states of the inputs to produce the appropriate output states. The generality of this approach allows not only the modeling of hardware, but human actions and interactions as well. A procedure for constructing and editing fault trees, either manually or by computer, is described. The techniques employed result in a complete fault tree, in standard form, suitable for analysis by current computer codes. Methods of describing the system, defining boundary conditions and specifying complex TOP events are developed in order to set up the initial configuration for which the fault tree is to be constructed. The approach used allows rapid modifications of the decision tables and systems to facilitate the analysis and comparison of various refinements and changes in the system configuration and component modeling
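    The decision-table idea is concrete enough to sketch. Below, a hypothetical valve component maps (command input, internal failure mode) pairs to fluid output states, and a helper enumerates the combinations producing an undesired state, which is the elementary step in growing a fault tree from such tables; all state and mode names are invented.

    ```python
    # A miniature decision-table component model: a valve whose fluid output
    # depends on its command input and on internal failure modes.
    VALVE_TABLE = {
        # (command_input, internal_mode) -> fluid_output
        ("open",  "ok"):           "flow",
        ("open",  "stuck_closed"): "no_flow",
        ("open",  "stuck_open"):   "flow",
        ("close", "ok"):           "no_flow",
        ("close", "stuck_open"):   "flow",
        ("close", "stuck_closed"): "no_flow",
    }

    def causes_of(component_table, undesired_output):
        """Enumerate the input/failure combinations producing an output state."""
        return [cond for cond, out in component_table.items()
                if out == undesired_output]

    # TOP event: no flow at the valve outlet.
    print(causes_of(VALVE_TABLE, "no_flow"))
    # -> [('open', 'stuck_closed'), ('close', 'ok'), ('close', 'stuck_closed')]
    ```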

  14. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research in medical imaging applications based on algorithmic and computer-based approaches and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer Aided Diagnosis Techniques.

  17. Examining the Roles of Blended Learning Approaches in Computer-Supported Collaborative Learning (CSCL) Environments: A Delphi Study

    Science.gov (United States)

    So, Hyo-Jeong; Bonk, Curtis J.

    2010-01-01

    In this study, a Delphi method was used to identify and predict the roles of blended learning approaches in computer-supported collaborative learning (CSCL) environments. The Delphi panel consisted of experts in online learning from different geographic regions of the world. This study discusses findings related to (a) pros and cons of blended…

  18. A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro [Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy and Department of Oncology, University of Calgary and Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)

    2012-06-15

    Purpose: To investigate and validate the clinical feasibility of using half-value layer (HVL) and peak tube potential (kVp) for characterizing a kilovoltage (kV) source spectrum for the purpose of computing kV x-ray dose accrued from imaging procedures. To use this approach to characterize a Varian® On-Board Imager® (OBI) source and perform experimental validation of a novel in-house hybrid dose computation algorithm for kV x-rays. Methods: We characterized the spectrum of an imaging kV x-ray source using the HVL and the kVp as the sole beam quality identifiers using third-party freeware Spektr to generate the spectra. We studied the sensitivity of our dose computation algorithm to uncertainties in the beam's HVL and kVp by systematically varying these spectral parameters. To validate our approach experimentally, we characterized the spectrum of a Varian® OBI system by measuring the HVL using a Farmer-type Capintec ion chamber (0.06 cc) in air and compared dose calculations using our computationally validated in-house kV dose calculation code to measured percent depth-dose and transverse dose profiles for 80, 100, and 125 kVp open beams in a homogeneous phantom and a heterogeneous phantom comprising tissue, lung, and bone equivalent materials. Results: The sensitivity analysis of the beam quality parameters (i.e., HVL, kVp, and field size) on dose computation accuracy shows that typical measurement uncertainties in the HVL and kVp (±0.2 mm Al and ±2 kVp, respectively) source characterization parameters lead to dose computation errors of less than 2%. Furthermore, for an open beam with no added filtration, HVL variations affect dose computation accuracy by less than 1% for a 125 kVp beam when field size is varied from 5 × 5 cm² to 40 × 40 cm². The central axis depth dose calculations and experimental measurements for the 80, 100, and 125 kVp energies agreed within
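    To illustrate the relationship the paper exploits, the sketch below computes the HVL implied by a toy spectrum: the aluminum thickness that halves the air kerma. The two-bin spectrum and the attenuation coefficients are rough illustrative values, standing in for Spektr output and tabulated NIST data.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    E = np.array([40.0, 70.0])          # keV bins (illustrative)
    fluence = np.array([0.6, 0.4])      # relative fluence per bin
    mu_al = np.array([1.53, 0.55])      # Al linear attenuation, 1/cm (approx.)
    muen_air = np.array([0.07, 0.03])   # air energy-absorption coeff. (approx.)

    def kerma(t_cm):
        # Air kerma is proportional to sum of fluence * E * (muen/rho),
        # after attenuation through t_cm of aluminum.
        return np.sum(fluence * np.exp(-mu_al * t_cm) * E * muen_air)

    hvl_cm = brentq(lambda t: kerma(t) - 0.5 * kerma(0.0), 0.0, 5.0)
    print(f"HVL = {hvl_cm * 10:.2f} mm Al")
    ```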

  19. A dynamical-systems approach for computing ice-affected streamflow

    Science.gov (United States)

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.

  20. A scalable approach to modeling groundwater flow on massively parallel computers

    International Nuclear Information System (INIS)

    Ashby, S.F.; Falgout, R.D.; Tompson, A.F.B.

    1995-12-01

    We describe a fully scalable approach to the simulation of groundwater flow on a hierarchy of computing platforms, ranging from workstations to massively parallel computers. Specifically, we advocate the use of scalable conceptual models in which the subsurface model is defined independently of the computational grid on which the simulation takes place. We also describe a scalable multigrid algorithm for computing the groundwater flow velocities. We are thus able to leverage both the engineer's time spent developing the conceptual model and the computing resources used in the numerical simulation. We have successfully employed this approach at the LLNL site, where we have run simulations ranging in size from just a few thousand spatial zones (on workstations) to more than eight million spatial zones (on the CRAY T3D)--all using the same conceptual model.

  1. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Science.gov (United States)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs ranging from 80 to 10,240 units are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to gauge the potential computational enhancement through parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET to facilitate the use of the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. Performance is evaluated by comparing computational efficiency and the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.

  2. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  3. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big pictur

  4. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  5. A Model for the Acceptance of Cloud Computing Technology Using DEMATEL Technique and System Dynamics Approach

    Directory of Open Access Journals (Sweden)

    seyyed mohammad zargar

    2018-03-01

    Full Text Available Cloud computing is a new method to provide computing resources and increase computing power in organizations. Despite its many benefits, this method has not been universally adopted because of obstacles, including security issues, that concern IT managers in organizations. In this paper, the general definition of cloud computing is presented. In addition, having reviewed previous studies, the researchers identified the variables affecting technology acceptance and, especially, cloud computing technology acceptance. Then, using the DEMATEL technique, the influence of each variable and its susceptibility to influence were determined. The researchers also designed a model to show the dynamics at work in cloud computing technology using a system dynamics approach. The validity of the model was confirmed through standard evaluation methods for system dynamics models using the VENSIM software. Finally, based on different conditions of the proposed model, a variety of scenarios were designed, and the implementation of these scenarios was simulated within the proposed model. The results showed that any increase in data security, government support, and user training can increase the adoption and use of cloud computing technology.

  6. Computed tomography of the lung. A pattern approach. 2. ed.

    International Nuclear Information System (INIS)

    Verschakelen, Johny A.; Wever, Walter de

    2018-01-01

    Computed Tomography of the Lung: A Pattern Approach aims to enable the reader to recognize and understand the CT signs of lung diseases and diseases with pulmonary involvement as a sound basis for diagnosis. After an introductory chapter, basic anatomy and its relevance to the interpretation of CT appearances is discussed. Advice is then provided on how to approach a CT scan of the lungs, and the different distribution and appearance patterns of disease are described. Subsequent chapters focus on the nature of these patterns, identify which diseases give rise to them, and explain how to differentiate between the diseases. The concluding chapter presents a large number of typical and less typical cases that will help the reader to practice application of the knowledge gained from the earlier chapters. Since the first edition, the book has been adapted and updated, with the inclusion of many new figures and case studies. It will be an invaluable asset both for radiologists and pulmonologists in training and for more experienced specialists wishing to update their knowledge.

  7. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  8. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    Computational meta-analysis can link environmental chemicals to genes and proteins involved in human diseases, thereby elucidating possible etiologies and pathogeneses of non-communicable diseases. We used an integrated computational systems biology approach to examine possible pathogenetic linkages in type 2 diabetes (T2D) through genome-wide associations, disease similarities, and published empirical evidence. Ten environmental chemicals were found to be potentially linked to T2D; the highest scores were observed for arsenic, 2,3,7,8-tetrachlorodibenzo-p-dioxin, hexachlorobenzene

  9. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  10. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    Computer aided techniques form an efficient approach to solve chemical product design problems such as the design of blended liquid products (chemical blending). In chemical blending, one tries to find the best candidate, which satisfies the product targets defined in terms of desired product...... methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a Mixed Integer Nonlinear Programming (MINLP) model where the objective is to find the optimal blended gasoline or diesel product subject to types of chemicals...... and their compositions and a set of desired target properties of the blended product as design constraints. This blend design problem is solved using a decomposition approach, which eliminates infeasible and/or redundant candidates gradually through a hierarchy of (property) model based constraints. This decomposition...

  11. Perturbation approach for nuclear magnetic resonance solid-state quantum computation

    Directory of Open Access Journals (Sweden)

    G. P. Berman

    2003-01-01

    Full Text Available The dynamics of a nuclear-spin quantum computer with a large number (L=1000) of qubits is considered using a perturbation approach. Small parameters are introduced and used to compute the error in an implementation of entanglement between remote qubits, using a sequence of radio-frequency pulses. The error is computed up to different orders of perturbation theory and tested against an exact numerical solution.

  12. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.

    Science.gov (United States)

    Cheng, Ching-An; Huang, Han-Pang

    2016-12-01

    We study the modeling of Lagrangian systems with multiple degrees of freedom. Based on system dynamics, canonical parametric models require ad hoc derivations and sometimes simplification for a computable solution; on the other hand, due to the lack of prior knowledge of the system's structure, modern nonparametric models in machine learning face the curse of dimensionality, especially in learning large systems. In this paper, we bridge this gap by unifying the theories of Lagrangian systems and vector-valued reproducing kernel Hilbert spaces. We reformulate Lagrangian systems with kernels that embed the governing Euler-Lagrange equation--the Lagrangian kernels--and show that these kernels span a subspace capturing the Lagrangian's projection as inverse dynamics. By this property, our model uses only inputs and outputs, as in machine learning, and inherits the structured form of system dynamics, thereby removing the need for mundane derivations for new systems as well as the generalization problem of learning from scratch. In effect, it learns the system's Lagrangian, a simpler task than directly learning the dynamics. As a demonstration, we applied the proposed kernel to identify robot inverse dynamics in simulations and experiments. Our results present a competitive novel approach to identifying Lagrangian systems, despite using only inputs and outputs.

  13. An Augmented Incomplete Factorization Approach for Computing the Schur Complement in Stochastic Optimization

    KAUST Repository

    Petra, Cosmin G.; Schenk, Olaf; Lubin, Miles; Gärtner, Klaus

    2014-01-01

    We present a scalable approach and implementation for solving stochastic optimization problems on high-performance computers. In this work we revisit the sparse linear algebra computations of the parallel solver PIPS with the goal of improving the shared-memory performance and decreasing the time to solution. These computations consist of solving sparse linear systems with multiple sparse right-hand sides and are needed in our Schur-complement decomposition approach to compute the contribution of each scenario to the Schur matrix. Our novel approach uses an incomplete augmented factorization implemented within the PARDISO linear solver and an outer BiCGStab iteration to efficiently absorb pivot perturbations occurring during factorization. This approach is capable of both efficiently using the cores inside a computational node and exploiting sparsity of the right-hand sides. We report on the performance of the approach on high-performance computers when solving stochastic unit commitment problems of unprecedented size (billions of variables and constraints) that arise in the optimization and control of electrical power grids. Our numerical experiments suggest that supercomputers can be efficiently used to solve power grid stochastic optimization problems with thousands of scenarios under the strict "real-time" requirements of power grid operators. To our knowledge, this has not been possible prior to the present work. © 2014 Society for Industrial and Applied Mathematics.
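
    The numerical core, an inexact factorization corrected by an outer Krylov iteration, can be imitated with generic SciPy tools, as in the hedged sketch below. The paper's implementation lives inside PARDISO and handles many sparse right-hand sides in parallel; the matrix here is an arbitrary stand-in.

```python
# Sketch of the core idea: factorize approximately (incomplete LU), then let
# an outer BiCGStab iteration absorb the factorization error.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, bicgstab, spilu

n = 2000
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
A = A + sp.random(n, n, density=1e-3, random_state=0)   # sparse perturbation
b = np.ones(n)                                          # one right-hand side

ilu = spilu(A.tocsc(), drop_tol=1e-4, fill_factor=10)   # incomplete factorization
M = LinearOperator((n, n), matvec=ilu.solve)            # used as preconditioner

x, info = bicgstab(A, b, M=M)                           # outer Krylov iteration
print("converged" if info == 0 else f"info={info}",
      "residual:", np.linalg.norm(A @ x - b))
```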

  14. Cloud Computing - A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location through networks. Cloud computing is gradually replacing the traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in the organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  15. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin; Alkhalifah, Tariq Ali

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes based on successively solving elliptically isotropic traveltime problems. The method shows good accuracy and is very simple to implement.

  16. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ...). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins; hence the need to incorporate data from a large number of conservative species, the uncertainty in source concentrations, and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and re-evaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are water supply network losses (22%), sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), runoff infiltration (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not significantly affect the proportion estimates.
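
    Step (iv), the computation of mixing ratios, amounts to a constrained least-squares mass balance. A minimal sketch follows, with invented end-member concentrations and a weighted row enforcing that the fractions sum to one:

```python
# Constrained least-squares mass balance: find nonnegative mixing fractions m
# (summing to ~1) so that end-member concentrations E reproduce the sample.
# All concentration values here are invented for illustration only.
import numpy as np
from scipy.optimize import nnls

# rows: tracers (e.g. Cl, SO4, d18O); columns: recharge sources
E = np.array([[120.0, 300.0, 10.0, 80.0],     # chloride per source
              [60.0, 150.0, 5.0, 40.0],       # sulfate per source
              [-5.0, -6.5, -8.0, -7.0]])      # delta-18O per source
c_sample = np.array([150.0, 75.0, -6.4])      # measured mixed sample

w = 100.0                                     # weight enforcing sum(m) = 1
A = np.vstack([E, w * np.ones(E.shape[1])])
b = np.append(c_sample, w)

m, _ = nnls(A, b)                             # nonnegative mixing ratios
print("mixing ratios:", np.round(m / m.sum(), 3))
```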

  17. Promises and Pitfalls of Computer-Supported Mindfulness: Exploring a Situated Mobile Approach

    Directory of Open Access Journals (Sweden)

    Ralph Vacca

    2017-12-01

    Computer-supported mindfulness (CSM) is a burgeoning area filled with varied approaches such as mobile apps and EEG headbands. However, many of the approaches focus on providing meditation guidance. The ubiquity of mobile devices may provide new opportunities to support mindfulness practices that are more situated in everyday life. In this paper, a new situated mindfulness approach is explored through a specific mobile app design. Through an experimental design, the approach is compared to traditional audio-based mindfulness meditation, and a mind-wandering control, over a one-week period. The study demonstrates the viability of a situated mobile mindfulness approach for inducing mindfulness states. However, phenomenological aspects of the situated mobile approach suggest both promises and pitfalls for computer-supported mindfulness using a situated approach.

  18. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  19. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  20. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  1. Digging deeper on "deep" learning: A computational ecology approach.

    Science.gov (United States)

    Buscema, Massimo; Sacco, Pier Luigi

    2017-01-01

    We propose an alternative approach to "deep" learning that is based on computational ecologies of structurally diverse artificial neural networks, and on dynamic associative memory responses to stimuli. Rather than focusing on massive computation of many different examples of a single situation, we opt for model-based learning and adaptive flexibility. Cross-fertilization of learning processes across multiple domains is the fundamental feature of human intelligence that must inform "new" artificial intelligence.

  2. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.
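
    Flux balance analysis itself reduces to a linear program: maximize a target flux subject to steady-state stoichiometry and flux bounds. A toy sketch with a made-up three-reaction network (not a plant model) follows:

```python
# Toy flux balance analysis: maximize a "biomass" flux subject to S v = 0 and
# flux bounds, using linear programming. The network is purely illustrative.
import numpy as np
from scipy.optimize import linprog

# metabolites x reactions: uptake -> conversion -> biomass
S = np.array([[1.0, -1.0, 0.0],     # metabolite A: made by v0, used by v1
              [0.0, 1.0, -1.0]])    # metabolite B: made by v1, used by v2
bounds = [(0, 10.0), (0, None), (0, None)]   # nitrogen uptake capped at 10

c = np.array([0.0, 0.0, -1.0])      # maximize v2 (biomass) = minimize -v2
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x, "biomass:", -res.fun)
```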

  3. Gene-based Association Approach Identify Genes Across Stress Traits in Fruit Flies

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Edwards, Stefan McKinnon; Sarup, Pernille Merete

    Identification of genes explaining variation in quantitative traits or genetic risk factors of human diseases requires both good phenotypic and genotypic data, but also efficient statistical methods. Genome-wide association studies may reveal association between phenotypic variation and variation...... approach grouping variants according to gene position, thus lowering the number of statistical tests performed and increasing the probability of identifying genes with small to moderate effects. Using this approach we identify numerous genes associated with different types of stresses in Drosophila...... melanogaster, but also identify common genes that affect the stress traits....

  4. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  5. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. The principal idea of CAD is based on the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, was not found to be very fast compared to the others. To alleviate this issue, we proposed an adaptive selection approach that switches to a fast deinterlacing algorithm rather than using only the CAD algorithm. The proposed hybrid approach of switching between the conventional schemes (LA and MELA) and our CAD was designed to reduce the overall computational load. A reliable condition for switching the schemes was derived after a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.
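
    The selection logic can be sketched schematically: measure local activity, then dispatch the cheapest adequate interpolator. In the hedged Python sketch below, the variance-based complexity measure and the thresholds are placeholders for the paper's MSE/CPU-time criteria, and the covariance-based CAD step is omitted:

```python
# Schematic of CASA-style algorithm selection: a cheap interpolator for plain
# regions, costlier methods as local complexity grows. Thresholds and the
# simplified edge-based rule are illustrative stand-ins.
import numpy as np

def line_average(above, below):
    """LA: average the field lines above and below the missing line."""
    return 0.5 * (above + below)

def edge_line_average(above, below):
    """Simplified edge-based LA: per pixel, interpolate along the direction
    (diagonal-left, vertical, diagonal-right) with the smallest difference."""
    out = np.empty_like(above)
    for i in range(len(above)):
        l, r = max(i - 1, 0), min(i + 1, len(above) - 1)
        cands = [(abs(above[l] - below[r]), 0.5 * (above[l] + below[r])),
                 (abs(above[i] - below[i]), 0.5 * (above[i] + below[i])),
                 (abs(above[r] - below[l]), 0.5 * (above[r] + below[l]))]
        out[i] = min(cands)[1]
    return out

def interpolate_line(above, below, t_plain=5.0, t_medium=25.0):
    complexity = np.var(above - below)          # crude local-activity measure
    if complexity < t_plain:
        return line_average(above, below)       # plain region: LA
    if complexity < t_medium:
        return edge_line_average(above, below)  # medium region: MELA-like
    return edge_line_average(above, below)      # complex region: CAD would go here

field = np.random.default_rng(1).random((10, 32)) * 255.0
print(interpolate_line(field[4], field[5])[:6])
```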

  6. Identifying Green Infrastructure from Social Media and Crowdsourcing- An Image Based Machine-Learning Approach.

    Science.gov (United States)

    Rai, A.; Minsker, B. S.

    2016-12-01

    In this work we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green stormwater infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties, and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition learning methods (template matching, a sliding-window approach, and the random Hough forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding-window method outperformed the other methods and achieved an average F-measure, a combined metric of precision and recall, of 0.78.
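
    A bare-bones version of the sliding-window screening step is sketched below; the green-dominance score is a trivial placeholder for the trained patch classifier, and all parameters are illustrative:

```python
# Bare-bones sliding-window detector: every window is scored by a patch
# classifier and high-scoring windows are kept as candidate GI sites.
import numpy as np

def green_score(patch):
    """Placeholder classifier: fraction of pixels where green dominates."""
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    return np.mean((g > r) & (g > b))

def sliding_window_detect(image, win=32, stride=16, threshold=0.6):
    hits = []
    h, w, _ = image.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            score = green_score(image[y:y + win, x:x + win])
            if score >= threshold:
                hits.append((x, y, score))     # candidate site for survey
    return hits

img = np.random.default_rng(2).random((128, 128, 3))
print(len(sliding_window_detect(img)), "candidate windows")
```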

  7. Energy-aware memory management for embedded multimedia systems a computer-aided design approach

    CERN Document Server

    Balasa, Florin

    2011-01-01

    Energy-Aware Memory Management for Embedded Multimedia Systems: A Computer-Aided Design Approach presents recent computer-aided design (CAD) ideas that address memory management tasks, particularly the optimization of energy consumption in the memory subsystem. It explains how to efficiently implement CAD solutions, including theoretical methods and novel algorithms. The book covers various energy-aware design techniques, including data-dependence analysis techniques, memory size estimation methods, extensions of mapping approaches, and memory banking approaches. It shows how these techniques

  8. A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry

    Science.gov (United States)

    Forster, J.; Entrup, B.

    2017-10-01

    In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and four complaint-specific classes in correspondence documents between insurance clients and an insurance company. A cognitive computing approach combines classical natural language processing methods, machine learning algorithms, and the evaluation of hypotheses. The approach combines a MaxEnt machine learning algorithm with language modelling, tf-idf, and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1-score of 0.9, a reliable text classification component has been implemented and evaluated. A final outlook towards a cognitive computing insurance assistant is given at the end.
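
    The tf-idf-plus-MaxEnt core of such a pipeline is easy to sketch with scikit-learn, whose LogisticRegression is a maximum-entropy classifier; the four-document toy corpus below merely stands in for the 2500 annotated insurance letters:

```python
# Sketch of a tf-idf + MaxEnt text classifier. LogisticRegression is a
# maximum-entropy model; the toy English corpus is illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = ["I am very unhappy with the delayed payout",
        "Thank you for the quick and friendly service",
        "Nobody answers my calls, this is unacceptable",
        "The claim was settled promptly, well done"]
labels = [1, 0, 1, 0]                      # 1 = complaint, 0 = no complaint

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(docs, labels)
print(clf.predict(["why is my claim still not paid"]))
```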

  9. Site-Mutation of Hydrophobic Core Residues Synchronically Poise Super Interleukin 2 for Signaling: Identifying Distant Structural Effects through Affordable Computations

    Directory of Open Access Journals (Sweden)

    Longcan Mei

    2018-03-01

    A superkine variant of interleukin-2 with six site mutations away from the binding interface, developed using the yeast display technique, has previously been characterized as undergoing a distal structural alteration that is responsible for its super-potency; it provides an elegant case study for gaining insight into how allosteric effects can be exploited to achieve desirable protein functions. By examining the dynamic network and the allosteric pathways related to those mutated residues using various computational approaches, we found that nanosecond-timescale all-atom molecular dynamics simulations can identify the dynamic network as efficiently as an ensemble algorithm. The differentiated pathways for the six core residues form a dynamic network that outlines the area of structural alteration. The results demonstrate the potential of using affordable computing power to predict the allosteric structure of mutants in knowledge-based mutagenesis.

  10. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    OpenAIRE

    Grover Kearns

    2010-01-01

    Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting stu...

  11. Computer aided approach for qualitative risk assessment of engineered systems

    International Nuclear Information System (INIS)

    Crowley, W.K.; Arendt, J.S.; Fussell, J.B.; Rooney, J.J.; Wagner, D.P.

    1978-01-01

    This paper outlines a computer aided methodology for determining the relative contributions of various subsystems and components to the total risk associated with an engineered system. Major contributors to overall task risk are identified through comparison of an expected frequency density function with an established risk criterion. Contributions that are inconsistently high are also identified. The results from this analysis are useful for directing efforts for improving system safety and performance. An analysis of uranium hexafluoride handling risk at a gaseous diffusion uranium enrichment plant using a preliminary version of the computer program EXCON is briefly described and illustrated

  12. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Computer vision-based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex, and the identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open-source software. Source code for the data analysis is written using R. The equations to calculate image descriptors have also been provided.
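
    Although the published analysis was done in R, the statistical step, reducing many extracted shape descriptors to a handful of discriminating principal components, can be sketched in Python as follows (the random feature matrix stands in for real image-derived descriptors):

```python
# Sketch of the statistical step: reduce many shape descriptors to a few
# principal components that discriminate ecotypes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
features = rng.random((60, 40))            # 60 rosettes x 40 shape descriptors

X = StandardScaler().fit_transform(features)
pca = PCA(n_components=5).fit(X)           # paper: ~5 PCs capture the variation
print("explained variance:", np.round(pca.explained_variance_ratio_, 3))
scores = pca.transform(X)                  # coordinates used to compare ecotypes
```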

  13. Computational Diagnostic: A Novel Approach to View Medical Data.

    Energy Technology Data Exchange (ETDEWEB)

    Mane, K. K. (Ketan Kirtiraj); Börner, K. (Katy)

    2007-01-01

    A transition from traditional paper-based medical records to electronic health records is largely underway. The use of electronic records offers tremendous potential to personalize patient diagnosis and treatment. In this paper, we discuss a computational diagnostic tool that uses digital medical records to help doctors gain better insight into a patient's medical condition. The paper details different interactive features of the tool which offer the potential to practice evidence-based medicine and advance patient diagnosis practices. The healthcare industry is a constantly evolving domain. Research from this domain is often translated into a better understanding of different medical conditions, and this new knowledge often contributes towards improved diagnosis and treatment solutions for patients. But the healthcare industry lags behind in seeking immediate benefits from the new knowledge, as it still adheres to the traditional paper-based approach to keeping track of medical records. Recently, however, we have noticed a drive that promotes a transition towards the electronic health record (EHR). An EHR stores patient medical records in digital format and offers the potential to replace paper health records. Earlier attempts at an EHR replicated the paper layout on the screen, represented the medical history of a patient in a graphical time-series format, or provided interactive visualization with 2D/3D images generated from an imaging device. But an EHR can be much more than just an 'electronic view' of the paper record or a collection of images from an imaging device. In this paper, we present an EHR called the 'Computational Diagnostic Tool', which provides a novel computational approach to looking at patient medical data. The developed EHR system is knowledge-driven and acts as a clinical decision support tool. The EHR tool provides two visual views of the medical data. Dynamic interaction with the data is supported to help doctors practice evidence-based medicine and make judicious decisions.

  14. D-Wave's Approach to Quantum Computing: 1000-qubits and Counting!

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    In this talk I will describe D-Wave's approach to quantum computing, including the system architecture of our 1000-qubit D-Wave 2X, its programming model, and performance benchmarks. Furthermore, I will describe how the native optimization and sampling capabilities of the quantum processor can be exploited to tackle problems in a variety of fields including medicine, machine learning, physics, and computational finance.

  15. A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.

    Science.gov (United States)

    Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C

    2017-07-01

    Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.

  16. Highly efficient computer algorithm for identifying layer thickness of atomically thin 2D materials

    Science.gov (United States)

    Lee, Jekwan; Cho, Seungwan; Park, Soohyun; Bae, Hyemin; Noh, Minji; Kim, Beom; In, Chihun; Yang, Seunghoon; Lee, Sooun; Seo, Seung Young; Kim, Jehyun; Lee, Chul-Ho; Shim, Woo-Young; Jo, Moon-Ho; Kim, Dohun; Choi, Hyunyong

    2018-03-01

    The fields of layered material research, such as transition-metal dichalcogenides (TMDs), have demonstrated that the optical, electrical and mechanical properties strongly depend on the layer number N. Thus, efficient and accurate determination of N is the most crucial step before the associated device fabrication. The existing experimental technique using an optical microscope is the most widely used way to identify N. However, a critical drawback of this approach is that it relies on extensive laboratory experience to estimate N; it requires a very time-consuming image-searching task assisted by human eyes and secondary measurements such as atomic force microscopy and Raman spectroscopy, which are necessary to confirm N. In this work, we introduce a computer algorithm based on image analysis of a quantized optical contrast. We show that our algorithm can be applied to a wide variety of layered materials, including graphene, MoS2, and WS2, regardless of substrate. The algorithm largely consists of two parts. First, it sets up an appropriate boundary between target flakes and the substrate. Second, to compute N, it automatically calculates the optical contrast using an adaptive RGB estimation process for each target, which results in a matrix of integer Ns and returns a map of N onto the target flake positions. Using conventional desktop computational power, the time taken to display the final N matrix was 1.8 s on average for an image size of 1280 by 960 pixels, and the method obtained a high accuracy of 90% (six estimation errors among 62 samples) when compared to the other methods. To show the effectiveness of our algorithm, we also applied it to TMD flakes transferred onto optically transparent c-axis sapphire substrates and obtained a similar accuracy of 94% (two estimation errors among 34 samples).
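
    The contrast-quantization step can be caricatured in a few lines: compute each pixel's optical contrast against the substrate and round it to the nearest multiple of an assumed per-layer increment. The increment, the synthetic image, and the fixed step below are illustrative; the published algorithm calibrates the contrast adaptively per RGB channel:

```python
# Sketch of contrast quantization: optical contrast against the substrate is
# rounded to an assumed per-layer step to yield an integer layer-count map.
import numpy as np

def layer_map(gray_image, substrate_level, per_layer_step=0.06, max_layers=6):
    contrast = (substrate_level - gray_image) / substrate_level
    n = np.rint(contrast / per_layer_step).astype(int)   # quantize to integer N
    return np.clip(n, 0, max_layers)

rng = np.random.default_rng(4)
substrate = 0.80
img = substrate * np.ones((64, 64)) + 0.005 * rng.standard_normal((64, 64))
img[20:40, 20:40] -= substrate * 0.12                    # a flake ~2 layers thick
N = layer_map(img, substrate)
print("estimated layers in flake region:",
      np.bincount(N[20:40, 20:40].ravel()).argmax())
```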

  17. The soft computing-based approach to investigate allergic diseases: a systematic review.

    Science.gov (United States)

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic, and to investigate their performance in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within the PROSPERO database (CRD42016038894). The search was performed on PubMed and ScienceDirect, covering the period from September 1, 1990, through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performance. We observed promising results, with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.

  18. Identifying niche-mediated regulatory factors of stem cell phenotypic state: a systems biology approach.

    Science.gov (United States)

    Ravichandran, Srikanth; Del Sol, Antonio

    2017-02-01

    Understanding how the cellular niche controls the stem cell phenotype is often hampered due to the complexity of variegated niche composition, its dynamics, and nonlinear stem cell-niche interactions. Here, we propose a systems biology view that considers stem cell-niche interactions as a many-body problem amenable to simplification by the concept of mean field approximation. This enables approximation of the niche effect on stem cells as a constant field that induces sustained activation/inhibition of specific stem cell signaling pathways in all stem cells within heterogeneous populations exhibiting the same phenotype (niche determinants). This view offers a new basis for the development of single cell-based computational approaches for identifying niche determinants, which has potential applications in regenerative medicine and tissue engineering. © 2017 The Authors. FEBS Letters published by John Wiley & Sons Ltd on behalf of Federation of European Biochemical Societies.

  19. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications, while, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (an iterative method) and a frontal solver (a direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is also presented; it automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
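
    The static-analysis solver path is conventional sparse iterative linear algebra; a minimal stand-in using SciPy's conjugate gradient on a 1-D Laplacian (in place of an assembled spine stiffness matrix) looks like this:

```python
# Sketch of the iterative solve K u = f for a static FEM model: conjugate
# gradients on a sparse symmetric positive-definite stiffness matrix.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

n = 5000
K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
f = np.ones(n)                                  # load vector

u, info = cg(K, f)                              # iterative solution of K u = f
print("converged" if info == 0 else f"info={info}",
      "residual:", np.linalg.norm(K @ u - f))
```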

  20. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y=η(θ)+ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
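
    The workflow, fit a cheap emulator to an ensemble of expensive model runs and then run MCMC against the emulator, can be condensed into a toy sketch. Here a polynomial surrogate and a random-walk Metropolis sampler stand in for the paper's statistical response surface and production MCMC, and the "expensive" model is a one-parameter stand-in:

```python
# Emulator-accelerated Bayesian calibration in miniature: surrogate fit to an
# ensemble of runs, then random-walk Metropolis against the surrogate.
import numpy as np

rng = np.random.default_rng(5)
expensive_model = lambda theta: np.sin(3.0 * theta) + theta   # pretend costly

design = np.linspace(-2, 2, 15)                 # ensemble of model runs
runs = expensive_model(design)
coeffs = np.polyfit(design, runs, deg=5)        # polynomial emulator
emulator = lambda theta: np.polyval(coeffs, theta)

y_obs, sigma = 1.2, 0.1                         # synthetic measurement
log_post = lambda t: -0.5 * ((y_obs - emulator(t)) / sigma) ** 2 - 0.5 * t ** 2

theta, chain = 0.0, []
for _ in range(5000):                           # random-walk Metropolis
    prop = theta + 0.3 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
print("posterior mean of theta:", np.mean(chain[1000:]))
```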

  1. 'Omics' approaches in tomato aimed at identifying candidate genes ...

    African Journals Online (AJOL)

    adriana

    2013-12-04

    Dec 4, 2013 ... approaches could be combined in order to identify candidate genes for the genetic control of ascorbic ..... applied to other traits under the complex control of many ... Engineering increased vitamin C levels in ... Chem. Biol. 13:532–538. Giovannucci E, Rimm EB, Liu Y, Stampfer MJ, Willett WC (2002). A.

  2. A computational approach to animal breeding.

    Science.gov (United States)

    Berger-Wolf, Tanya Y; Moore, Cristopher; Saia, Jared

    2007-02-07

    We propose a computational model of mating strategies for controlled animal breeding programs. A mating strategy in a controlled breeding program is a heuristic with some optimization criteria as a goal. Thus, it is appropriate to use the computational tools available for analysis of optimization heuristics. In this paper, we propose the first discrete model of the controlled animal breeding problem and analyse heuristics for two possible objectives: (1) breeding for maximum diversity and (2) breeding a target individual. These two goals are representative of conservation biology and agricultural livestock management, respectively. We evaluate several mating strategies and provide upper and lower bounds for the expected number of matings. While the population parameters may vary and can change the actual number of matings for a particular strategy, the order of magnitude of the number of expected matings and the relative competitiveness of the mating heuristics remains the same. Thus, our simple discrete model of the animal breeding problem provides a novel viable and robust approach to designing and comparing breeding strategies in captive populations.

  3. Computation within the auxiliary field approach

    International Nuclear Information System (INIS)

    Baeurle, S.A.

    2003-01-01

    Recently, the classical auxiliary field methodology has been developed as a new simulation technique for performing calculations within the framework of classical statistical mechanics. Since the approach suffers from a sign problem, a judicious choice of the sampling algorithm, allowing fast statistical convergence and efficient generation of field configurations, is of fundamental importance for a successful simulation. In this paper we focus on the computational aspects of this simulation methodology. We introduce two different types of algorithms: the single-move auxiliary field Metropolis Monte Carlo algorithm and two new classes of force-based algorithms, which enable multiple-move propagation. In addition, to further optimize the sampling, we describe a preconditioning scheme which permits treating each field degree of freedom individually with regard to its evolution through the auxiliary field configuration space. Finally, we demonstrate the validity and assess the competitiveness of these algorithms on a representative practical example. We believe that they may also provide an interesting possibility for enhancing the computational efficiency of other auxiliary field methodologies.

  4. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.

  5. An Objective Approach to Identify Spectral Distinctiveness for Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2013-01-01

    To facilitate the process of developing speech perception, speech-language pathologists have to teach a subject with hearing loss the differences between two syllables by manually enhancing acoustic cues of speech. However, this process is time-consuming and difficult. Thus, this study proposes an objective approach to automatically identify the regions of spectral distinctiveness between two syllables, which can be used for speech-perception training. To accurately represent the characteristics of speech, mel-frequency cepstrum coefficients are selected as analytical parameters. The mismatch between two syllables in the time domain is handled by dynamic time warping. Further, a filter bank is adopted to estimate the components in different frequency bands, which are also represented as mel-frequency cepstrum coefficients. The spectral distinctiveness in different frequency bands is then easily estimated using Euclidean metrics. Finally, a morphological gradient operator is applied to automatically identify the regions of spectral distinctiveness. To evaluate the proposed approach, the identified regions were manipulated and the manipulated syllables were measured by a closed-set speech-perception test. The experimental results demonstrated that the identified regions of spectral distinctiveness are very useful in speech perception, which indeed can help speech-language pathologists in speech-perception training.
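
    The alignment step is standard MFCC extraction followed by dynamic time warping. A hedged sketch using librosa (assumed available), with two synthetic tones standing in for recorded syllables:

```python
# Sketch of the alignment step: extract MFCCs for two syllables and align
# them with DTW before comparing band-wise spectra.
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 0.5, int(0.5 * sr), endpoint=False)
syl_a = np.sin(2 * np.pi * 300 * t).astype(np.float32)   # stand-in syllable 1
syl_b = np.sin(2 * np.pi * 420 * t[: int(0.4 * sr)]).astype(np.float32)

mfcc_a = librosa.feature.mfcc(y=syl_a, sr=sr, n_mfcc=13)
mfcc_b = librosa.feature.mfcc(y=syl_b, sr=sr, n_mfcc=13)

# DTW handles the duration mismatch; wp maps frames of A onto frames of B.
D, wp = librosa.sequence.dtw(X=mfcc_a, Y=mfcc_b, metric="euclidean")
diffs = np.linalg.norm(mfcc_a[:, wp[:, 0]] - mfcc_b[:, wp[:, 1]], axis=0)
print("frames with largest spectral distinctiveness:", np.argsort(diffs)[-5:])
```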

  6. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Science.gov (United States)

    Sang, Yan-Fang; Sun, Fubao; Singh, Vijay P.; Xie, Ping; Sun, Jian

    2018-01-01

    The hydroclimatic process is changing non-monotonically, and identifying its trends is a great challenge. Building on discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961 to 2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and not detected by the Mann-Kendall test. Comparatively, the DWS approach overcame this problem and detected significant non-monotonic trends at 380 stations, which helped in understanding and interpreting the spatiotemporal variability of the hydroclimatic process. Our results suggest that non-monotonic trends in hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed here has the potential for wide use in the hydrological and climate sciences.
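
    The essence of the DWS idea, isolating a possibly non-monotonic trend as the coarse approximation of a discrete wavelet decomposition, can be sketched with PyWavelets; the wavelet, decomposition level, and synthetic series below are illustrative, and the significance test against surrogates is omitted:

```python
# Sketch of trend extraction via the discrete wavelet transform: keep only
# the coarsest approximation coefficients and reconstruct.
import numpy as np
import pywt

rng = np.random.default_rng(6)
years = np.arange(1961, 2014)
series = (0.01 * (years - 1961) ** 1.5
          - 0.15 * np.sin((years - 1961) / 8.0)
          + 0.3 * rng.standard_normal(years.size))   # synthetic hydroclimate

coeffs = pywt.wavedec(series, "db4", level=2)
coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(coeffs_trend, "db4")[: series.size]  # non-monotonic trend
print("trend range:", trend.min(), "to", trend.max())
```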

  7. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  8. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of the different approaches are outlined, and an attempt is made to systematize the research conducted and to classify scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches, and justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when researching problems such as Internet addiction and dependent behavior in general. In the author's view, this dialectical approach integrates the experience of research conducted within the socio-psychological framework and focuses on the inconsistencies observed in the phenomenon of Internet addiction, namely the compensatory nature of Internet activity, whereby people who turn to the Internet are often in a dysfunctional life situation.

  9. Computational Approaches for Prediction of Pathogen-Host Protein-Protein Interactions

    Directory of Open Access Journals (Sweden)

    Esmaeil Nourani

    2015-02-01

    Infectious diseases are still among the major and most prevalent health problems, mostly because of the drug resistance of novel variants of pathogens. Molecular interactions between pathogens and their hosts are a key part of the infection mechanisms. Novel antimicrobial therapeutics to fight drug resistance are only possible given a thorough understanding of pathogen-host interaction (PHI) systems. Existing databases, which contain experimentally verified PHI data, suffer from a scarcity of reported interactions due to the technically challenging and time-consuming process of experiments. This has motivated many researchers to address the problem by proposing computational approaches for the analysis and prediction of PHIs. The computational methods primarily utilize sequence information, protein structure and known interactions. Classic machine learning techniques are used when there are sufficient known interactions to be used as training data; in the opposite case, transfer and multi-task learning methods are preferred. Here, we present an overview of these computational approaches for PHI prediction, discussing their weaknesses and abilities, with future directions.

  10. A computational design approach for virtual screening of peptide interactions across K+ channel families

    Directory of Open Access Journals (Sweden)

    Craig A. Doupnik

    2015-01-01

    Ion channels represent a large family of membrane proteins, with many being well-established targets in pharmacotherapy. The 'druggability' of heteromeric channels comprised of different subunits remains obscure, due largely to a lack of channel-specific probes necessary to delineate their therapeutic potential in vivo. Our initial studies reported here investigated the family of inwardly rectifying potassium (Kir) channels, given the availability of high-resolution crystal structures for the eukaryotic constitutively active Kir2.2 channel. We describe a 'limited' homology modeling approach that can yield chimeric Kir channels having an outer vestibule structure representing nearly any known vertebrate or invertebrate channel. These computationally derived channel structures were tested in silico for 'docking' to NMR structures of tertiapin (TPN), a 21-amino-acid peptide found in bee venom. TPN is a highly selective and potent blocker of the epithelial rat Kir1.1 channel, but does not block human or zebrafish Kir1.1 channel isoforms. Our Kir1.1 channel-TPN docking experiments recapitulated published in vitro findings for TPN-sensitive and TPN-insensitive channels. Additionally, in silico site-directed mutagenesis identified 'hot spots' within the channel outer vestibule that mediate energetically favorable docking scores and correlate with sites previously identified by in vitro thermodynamic mutant-cycle analysis. These 'proof-of-principle' results establish a framework for virtual screening of re-engineered peptide toxins for interactions with computationally derived Kir channels that currently lack channel-specific blockers. When coupled with electrophysiological validation, this virtual screening approach may accelerate the drug discovery process, and it can be readily applied to other ion channel families where high-resolution structures are available.

  11. A Biologically-Based Computational Approach to Drug Repurposing for Anthrax Infection

    Directory of Open Access Journals (Sweden)

    Jane P. F. Bai

    2017-03-01

    Developing drugs to treat the toxic effects of lethal toxin (LT) and edema toxin (ET) produced by B. anthracis is of global interest. We utilized a computational approach to score 474 drugs/compounds for their ability to reverse the toxic effects of anthrax toxins. For each toxin or drug/compound, we constructed an activity network from its differentially expressed genes, molecular targets, and protein interactions. Gene expression profiles of drugs were obtained from the Connectivity Map, and those of anthrax toxins in human alveolar macrophages were obtained from the Gene Expression Omnibus. Drug rankings were based on the ability of a drug/compound's mode of action, in the form of a signaling network, to reverse the effects of anthrax toxins; literature reports were used to verify the top 10 and bottom 10 drugs/compounds identified. Simvastatin and bepridil, with reported in vitro potency for protecting cells from LT and ET toxicities, were computationally ranked fourth and eighth. The other top-10 drugs were fenofibrate, dihydroergotamine, cotinine, amantadine, mephenytoin, sotalol, ifosfamide, and mefloquine; literature mining revealed their potential protective effects against LT and ET toxicities. These drugs are worthy of investigation for their therapeutic benefits and might be used in combination with antibiotics for treating B. anthracis infection.

  12. A Crisis Management Approach To Mission Survivability In Computational Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Aleksander Byrski

    2010-01-01

    In this paper we present a biologically-inspired approach for mission survivability (considered as the capability of fulfilling a task such as computation) that allows the system to be aware of the possible threats or crises that may arise. This approach uses the notion of resources used by living organisms to control their populations. We present the concept of energetic selection in agent-based evolutionary systems, as well as the means to manipulate the configuration of the computation according to the crises or the user's specific demands.

  13. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2012-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao’s garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce...... a number of novel techniques for relating the outputs and inputs of OTs in a larger construction....

  14. Creating the computer player: an engaging and collaborative approach to introduce computational thinking by combining ‘unplugged’ activities with visual programming

    Directory of Open Access Journals (Sweden)

    Anna Gardeli

    2017-11-01

    Ongoing research is being conducted on appropriate course design, practices and teacher interventions for improving the efficiency of computer science and programming courses in K-12 education. The trend is towards a more constructivist, problem-based learning approach. Computational thinking, which refers to formulating and solving problems in a form that can be efficiently processed by a computer, raises an important educational challenge. Our research aims to explore possible ways of enriching computer science teaching with a focus on the development of computational thinking. We have prepared and evaluated a learning intervention for introducing computer programming to children between 10 and 14 years old; this involves students working in groups to program the behavior of the computer player of a well-known game. The programming process is split into two parts. First, students design a high-level version of their algorithm during an 'unplugged' pen-and-paper phase, and then they encode their solution as an executable program in a visual programming environment. Encouraging evaluation results have been achieved regarding the educational and motivational value of the proposed approach.

  15. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and ComputationData Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  16. Algorithmic design of a noise-resistant and efficient closed-loop deep brain stimulation system: A computational approach.

    Directory of Open Access Journals (Sweden)

    Sofia D Karamintziou

    Advances in the field of closed-loop neuromodulation call for analysis and modeling approaches capable of confronting challenges related to the complex neuronal response to stimulation and the presence of strong internal and measurement noise in neural recordings. Here we elaborate on the algorithmic aspects of a noise-resistant closed-loop subthalamic nucleus deep brain stimulation system for advanced Parkinson's disease and treatment-refractory obsessive-compulsive disorder, ensuring remarkable performance in terms of both efficiency and selectivity of stimulation, as well as in terms of computational speed. First, we propose an efficient method, drawn from dynamical systems theory, for the reliable assessment of significant nonlinear coupling between beta and high-frequency subthalamic neuronal activity, as a biomarker for feedback control. Further, we present a model-based strategy through which optimal parameters of stimulation for minimum-energy desynchronizing control of neuronal activity are identified. The strategy integrates stochastic modeling and derivative-free optimization of neural dynamics based on quadratic modeling. On the basis of numerical simulations, we demonstrate the potential of the presented modeling approach to identify, at a relatively low computational cost, stimulation settings potentially associated with a significantly higher degree of efficiency and selectivity compared with stimulation settings determined post-operatively. Our data reinforce the hypothesis that model-based control strategies are crucial for the design of novel stimulation protocols at the backstage of clinical applications.

  17. Algorithmic design of a noise-resistant and efficient closed-loop deep brain stimulation system: A computational approach.

    Science.gov (United States)

    Karamintziou, Sofia D; Custódio, Ana Luísa; Piallat, Brigitte; Polosan, Mircea; Chabardès, Stéphan; Stathis, Pantelis G; Tagaris, George A; Sakas, Damianos E; Polychronaki, Georgia E; Tsirogiannis, George L; David, Olivier; Nikita, Konstantina S

    2017-01-01

    Advances in the field of closed-loop neuromodulation call for analysis and modeling approaches capable of confronting challenges related to the complex neuronal response to stimulation and the presence of strong internal and measurement noise in neural recordings. Here we elaborate on the algorithmic aspects of a noise-resistant closed-loop subthalamic nucleus deep brain stimulation system for advanced Parkinson's disease and treatment-refractory obsessive-compulsive disorder, ensuring remarkable performance in terms of both efficiency and selectivity of stimulation, as well as in terms of computational speed. First, we propose an efficient method, drawn from dynamical systems theory, for the reliable assessment of significant nonlinear coupling between beta and high-frequency subthalamic neuronal activity as a biomarker for feedback control. Further, we present a model-based strategy through which optimal stimulation parameters for minimum-energy desynchronizing control of neuronal activity are identified. The strategy integrates stochastic modeling and derivative-free optimization of neural dynamics based on quadratic modeling. On the basis of numerical simulations, we demonstrate the potential of the presented modeling approach to identify, at a relatively low computational cost, stimulation settings potentially associated with a significantly higher degree of efficiency and selectivity compared with stimulation settings determined post-operatively. Our data reinforce the hypothesis that model-based control strategies are crucial for the design of novel stimulation protocols at the backstage of clinical applications.

  18. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by two different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because the computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data were used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However, the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
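
    Since GLUE is central to the comparison above, a minimal sketch of the procedure may help: Monte Carlo sampling of a toy linear-reservoir model, with Nash-Sutcliffe efficiency as the informal likelihood and a behavioural cut-off. The model, threshold and synthetic data are illustrative, not the STREAM or LEW configurations.

    ```python
    # Minimal GLUE sketch (toy model, not STREAM/LEW): Monte Carlo sampling of a
    # linear-reservoir rainfall-runoff model; parameter sets whose Nash-Sutcliffe
    # efficiency exceeds a threshold are retained as "behavioural".
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate(rain, k, s0=0.0):
        """Linear reservoir: storage S fills with rain, outflow Q = S / k."""
        s, q = s0, np.empty_like(rain)
        for t, p in enumerate(rain):
            s += p
            q[t] = s / k
            s -= q[t]
        return q

    rain = rng.gamma(2.0, 2.0, 200)
    q_obs = simulate(rain, k=8.0) + rng.normal(0, 0.05, 200)  # synthetic "observations"

    def nse(sim, obs):
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    samples = rng.uniform(1.0, 20.0, 5000)               # prior on k
    scores = np.array([nse(simulate(rain, k), q_obs) for k in samples])
    behavioural = samples[scores > 0.7]                  # GLUE acceptance threshold
    print(f"{behavioural.size} behavioural sets, k in "
          f"[{behavioural.min():.2f}, {behavioural.max():.2f}]")
    ```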

  19. A Brief Review of Computer-Assisted Approaches to Rational Design of Peptide Vaccines

    Directory of Open Access Journals (Sweden)

    Ashesh Nandy

    2016-05-01

    Full Text Available The growing incidences of new viral diseases and increasingly frequent viral epidemics have strained therapeutic and preventive measures; the high mutability of viral genes puts additional strains on developmental efforts. Given the high cost and time requirements for new drugs development, vaccines remain as a viable alternative, but there too traditional techniques of live-attenuated or inactivated vaccines have the danger of allergenic reactions and others. Peptide vaccines have, over the last several years, begun to be looked on as more appropriate alternatives, which are economically affordable, require less time for development and hold the promise of multi-valent dosages. The developments in bioinformatics, proteomics, immunogenomics, structural biology and other sciences have spurred the growth of vaccinomics where computer assisted approaches serve to identify suitable peptide targets for eventual development of vaccines. In this mini-review we give a brief overview of some of the recent trends in computer assisted vaccine development with emphasis on the primary selection procedures of probable peptide candidates for vaccine development.

  20. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  1. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Directory of Open Access Journals (Sweden)

    Y.-F. Sang

    2018-01-01

    Full Text Available The hydroclimatic process is changing non-monotonically, and identifying its trends is a great challenge. Building on discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961 to 2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e., the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann–Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed has the potential for wide use in the hydrological and climate sciences.
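
    The significance testing that defines the DWS approach is not reproduced here, but the underlying discrete-wavelet trend extraction can be sketched with the PyWavelets package: decompose the series, zero the detail coefficients, and reconstruct the low-frequency, possibly non-monotonic, trend.

    ```python
    # Sketch of discrete-wavelet trend extraction (the DWS significance test is
    # not reproduced). Assumes the PyWavelets package; series is synthetic.
    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    t = np.arange(512)
    series = 5 * np.sin(2 * np.pi * t / 512) + rng.normal(0, 1, t.size)  # non-monotonic + noise

    wavelet, level = "db4", 5
    coeffs = pywt.wavedec(series, wavelet, level=level)
    # Zero all detail coefficients; keep the approximation -> low-frequency trend.
    trend_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(trend_coeffs, wavelet)[: series.size]

    print("trend range:", trend.min(), trend.max())
    ```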

  2. An Integrative data mining approach to identifying Adverse ...

    Science.gov (United States)

    The Adverse Outcome Pathway (AOP) framework is a tool for making biological connections and summarizing key information across different levels of biological organization to connect biological perturbations at the molecular level to adverse outcomes for an individual or population. Computational approaches to explore and determine these connections can accelerate the assembly of AOPs. By leveraging the wealth of publicly available data covering chemical effects on biological systems, computationally-predicted AOPs (cpAOPs) were assembled via data mining of high-throughput screening (HTS) in vitro data, in vivo data and other disease phenotype information. Frequent Itemset Mining (FIM) was used to find associations between the gene targets of ToxCast HTS assays and disease data from Comparative Toxicogenomics Database (CTD) by using the chemicals as the common aggregators between datasets. The method was also used to map gene expression data to disease data from CTD. A cpAOP network was defined by considering genes and diseases as nodes and FIM associations as edges. This network contained 18,283 gene to disease associations for the ToxCast data and 110,253 for CTD gene expression. Two case studies show the value of the cpAOP network by extracting subnetworks focused either on fatty liver disease or the Aryl Hydrocarbon Receptor (AHR). The subnetwork surrounding fatty liver disease included many genes known to play a role in this disease. When querying the cpAOP
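
    A minimal sketch of the aggregation idea with toy data: a (gene, disease) edge is created when the number of chemicals that both hit the gene target and associate with the disease reaches a support threshold. The dataset contents and threshold below are illustrative; the study mined ToxCast and CTD at far larger scale.

    ```python
    # Sketch of the chemical-aggregated association idea: chemicals act as the
    # common aggregator between gene-target hits and disease links. Toy data.
    from collections import defaultdict
    from itertools import product

    chem_genes = {"chemA": {"AHR", "PPARG"}, "chemB": {"AHR"}, "chemC": {"PPARG", "NR1I2"}}
    chem_disease = {"chemA": {"fatty liver"}, "chemB": {"fatty liver"}, "chemC": {"cholestasis"}}

    support = defaultdict(int)
    for chem in chem_genes.keys() & chem_disease.keys():
        for gene, disease in product(chem_genes[chem], chem_disease[chem]):
            support[(gene, disease)] += 1      # count supporting chemicals

    min_support = 2
    edges = {pair: n for pair, n in support.items() if n >= min_support}
    print(edges)   # {('AHR', 'fatty liver'): 2}
    ```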

  3. Mathematics of shape description a morphological approach to image processing and computer graphics

    CERN Document Server

    Ghosh, Pijush K

    2009-01-01

    Image processing problems are often not well defined because real images are contaminated with noise and other uncertain factors. In Mathematics of Shape Description, the authors take a mathematical approach to address these problems, using the morphological and set-theoretic approach to image processing and computer graphics and presenting a simple shape model based on two basic shape operators called Minkowski addition and decomposition. This book is ideal for professional researchers and engineers in Information Processing, Image Measurement, Shape Description, Shape Representation and Computer Graphics. Post-graduate and advanced undergraduate students in pure and applied mathematics, computer sciences, robotics and engineering will also benefit from this book. Key features: explains the fundamental and advanced relationships between algebraic systems and shape description through the set-theoretic approach; promotes interaction of image processing, geochronology and mathematics in the field of algebraic geometry; P...

  4. Application of artificial neural networks to identify equilibration in computer simulations

    Science.gov (United States)

    Leibowitz, Mitchell H.; Miller, Evan D.; Henry, Michael M.; Jankowski, Eric

    2017-11-01

    Determining which microstates generated by a thermodynamic simulation are representative of the ensemble for which sampling is desired is a ubiquitous, underspecified problem. Artificial neural networks are one type of machine learning algorithm that can provide a reproducible way to apply pattern recognition heuristics to underspecified problems. Here we use the open-source TensorFlow machine learning library and apply it to the problem of identifying which hypothetical observation sequences from a computer simulation are “equilibrated” and which are not. We generate training populations and test populations of observation sequences with embedded linear and exponential correlations. We train a two-neuron artificial neural network to distinguish the correlated and uncorrelated sequences. We find that this simple network is good enough for > 98% accuracy in identifying exponentially decaying energy trajectories from molecular simulations.
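
    A minimal TensorFlow/Keras sketch in the same spirit, assuming synthetic data: sequences with an embedded exponential decay versus pure-noise sequences, classified by a network whose hidden layer has just two neurons. Architecture details beyond "two neurons" are guesses, not the authors' exact setup.

    ```python
    # Sketch: a two-neuron network labels synthetic observation sequences as
    # "relaxing" (exponential decay present, class 0) or "equilibrated"
    # (noise only, class 1). Toy data; not the authors' training populations.
    import numpy as np
    import tensorflow as tf

    rng = np.random.default_rng(0)
    n, length = 2000, 64
    t = np.linspace(0, 1, length)

    decaying = np.exp(-t / rng.uniform(0.1, 0.5, (n, 1))) + rng.normal(0, 0.1, (n, length))
    flat = rng.normal(0, 0.1, (n, length))
    x = np.vstack([decaying, flat]).astype("float32")
    y = np.concatenate([np.zeros(n), np.ones(n)]).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(2, activation="tanh", input_shape=(length,)),  # two neurons
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=10, batch_size=64, validation_split=0.2, verbose=0)
    print("accuracy:", model.evaluate(x, y, verbose=0)[1])
    ```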

  5. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. This approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network. It allowed us to solve our network's current problems successfully. A description of our approach is presented below, along with the most interesting results of our work. (author)

  6. A statistical approach to identify candidate cues for nestmate recognition

    DEFF Research Database (Denmark)

    van Zweden, Jelle; Pontieri, Luigi; Pedersen, Jes Søe

    2014-01-01

    normalization, centroid, and distance calculation is most diagnostic to discriminate between NMR cues and other compounds. We find that using a “global centroid” instead of a “colony centroid” significantly improves the analysis. One reason may be that this new approach, unlike previous ones, provides… than for F. exsecta, possibly due to less than ideal datasets. Nonetheless, some compound sets performed better than others, showing that this approach can be used to identify candidate compounds to be tested in bio-assays, and eventually crack the sophisticated code that governs nestmate recognition…

  7. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    Science.gov (United States)

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline.
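
    The anchor of any such framework is the exact statistical-mechanics relation between the standard binding free energy and the binding constant, ΔG° = RT ln(Kd/c°) with standard concentration c° = 1 M. A small conversion snippet (units and constants only; none of the reviewed approximations are implied):

    ```python
    # Exact relation ΔG° = RT ln(Kd / c°), standard concentration c° = 1 M.
    import math

    R = 1.98720425864083e-3   # gas constant, kcal / (mol K)
    T = 298.15                # temperature, K

    def dg_from_kd(kd_molar, c0=1.0):
        """Standard binding free energy (kcal/mol) from a dissociation constant."""
        return R * T * math.log(kd_molar / c0)

    def kd_from_dg(dg_kcal, c0=1.0):
        return c0 * math.exp(dg_kcal / (R * T))

    print(dg_from_kd(1e-9))       # a 1 nM binder: about -12.3 kcal/mol
    print(kd_from_dg(-12.28))     # back-converts to ~1e-9 M
    ```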

  8. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  9. Approach to Computer Implementation of Mathematical Model of 3-Phase Induction Motor

    Science.gov (United States)

    Pustovetov, M. Yu

    2018-03-01

    This article discusses the development of a computer model of an induction motor based on the mathematical model in a three-phase stator reference frame. It uses an approach that allows two methods to be combined during preparation of the computer model: visual circuit programming (in the form of electrical schematics) and logical programming (in the form of block diagrams). The approach enables easy integration of the induction motor model as part of more complex models of electrical complexes and systems. The developed computer model gives the user access to the beginning and the end of the winding of each of the three phases of the stator and rotor. This property is particularly important when considering asymmetric modes of operation or when the motor is powered by the special circuitry of semiconductor converters.

  10. A model-based approach to identify binding sites in CLIP-Seq data.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Full Text Available Cross-linking immunoprecipitation coupled with high-throughput sequencing (CLIP-Seq) has made it possible to identify the targeting sites of RNA-binding proteins in various cell culture systems and tissue types on a genome-wide scale. Here we present a novel model-based approach (MiClip) to identify high-confidence protein-RNA binding sites from CLIP-seq datasets. This approach assigns a probability score to each potential binding site to help prioritize subsequent validation experiments. The MiClip algorithm has been tested in both HITS-CLIP and PAR-CLIP datasets. In the HITS-CLIP dataset, the signal/noise ratios of miRNA seed motif enrichment produced by the MiClip approach are between 17% and 301% higher than those by the ad hoc method for the top 10 most enriched miRNAs. In the PAR-CLIP dataset, the MiClip approach can identify ∼50% more validated binding targets than the original ad hoc method and two recently published methods. To facilitate the application of the algorithm, we have released an R package, MiClip (http://cran.r-project.org/web/packages/MiClip/index.html), and a public web-based graphical user interface (http://galaxy.qbrc.org/tool_runner?tool_id=mi_clip) for customized analysis.

  11. Approach and tool for computer animation of fields in electrical apparatus

    International Nuclear Information System (INIS)

    Miltchev, Radoslav; Yatchev, Ivan S.; Ritchie, Ewen

    2002-01-01

    The paper presents a technical approach and post-processing tool for creating and displaying computer animation. The approach enables handling of two- and three-dimensional physical field results obtained from finite element software, and the display of movement processes in electrical apparatus simulations. The main goal of this work is to extend the auxiliary features built into general-purpose CAD software working in the Windows environment. Different storage techniques were examined and the one employing image capturing was chosen. The developed tool provides the benefits of independent visualisation, scenario creation, and facilities for exporting animations in common file formats for distribution on different computer platforms. It also provides a valuable educational tool. (author)

  12. Identifying Broadband Rotational Spectra with Neural Networks

    Science.gov (United States)

    Zaleski, Daniel P.; Prozument, Kirill

    2017-06-01

    A typical broadband rotational spectrum may contain several thousand observable transitions, spanning many species. Identifying the individual spectra, particularly when the dynamic range reaches 1,000:1 or even 10,000:1, can be challenging. One approach is to apply automated fitting routines. In this approach, combinations of 3 transitions can be created to form a "triple", which allows fitting of the A, B, and C rotational constants in a Watson-type Hamiltonian. On a standard desktop computer, with a target molecule of interest, a typical AUTOFIT routine takes 2-12 hours depending on the spectral density. A new approach is to utilize machine learning to train a computer to recognize the patterns (frequency spacing and relative intensities) inherent in rotational spectra and to identify the individual spectra in a raw broadband rotational spectrum. Here, recurrent neural networks have been trained to identify different types of rotational spectra and classify them accordingly. Furthermore, early results in applying convolutional neural networks for spectral object recognition in broadband rotational spectra appear promising. Perez et al., "Broadband Fourier transform rotational spectroscopy for structure determination: The water heptamer", Chem. Phys. Lett., 2013, 571, 1-15. Seifert et al., "AUTOFIT, an Automated Fitting Tool for Broadband Rotational Spectra, and Applications to 1-Hexanal", J. Mol. Spectrosc., 2015, 312, 13-21. Bishop, "Neural networks for pattern recognition", Oxford University Press, 1995.

  13. Computational Approach to Annotating Variants of Unknown Significance in Clinical Next Generation Sequencing.

    Science.gov (United States)

    Schulz, Wade L; Tormey, Christopher A; Torres, Richard

    2015-01-01

    Next generation sequencing (NGS) has become a common technology in the clinical laboratory, particularly for the analysis of malignant neoplasms. However, most mutations identified by NGS are variants of unknown clinical significance (VOUS). Although the approach to define these variants differs by institution, software algorithms that predict variant effect on protein function may be used. However, these algorithms commonly generate conflicting results, potentially adding uncertainty to interpretation. In this review, we examine several computational tools used to predict whether a variant has clinical significance. In addition to describing the role of these tools in clinical diagnostics, we assess their efficacy in analyzing known pathogenic and benign variants in hematologic malignancies.

  14. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Full Text Available Traditionally, a top-down method was used to identify prognostic features in cancer research. That is, genes differentially expressed in cancer versus normal tissue were identified to see if they possess survival prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes are selected first and additionally prioritized by their network topology; a small set of features can then be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set; the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues, and compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures, separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach, were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients, the robustness of each was validated in the testing set, and each predictive performance was better than that of gene expression signatures reported previously. Moreover, genes in these two prognostic signatures give some indications for drug repositioning in HCC. Some approved drugs targeting these markers have alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.

  15. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    Directory of Open Access Journals (Sweden)

    William S Beatty

    Full Text Available The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in the migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result
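
    A sketch of the net-displacement method on synthetic data: fit a logistic curve to displacement versus day with SciPy and read the migration midpoint off the inflection point. The curve family and parameter values are illustrative; the study fit nonlinear models to GPS-derived displacements.

    ```python
    # Sketch of the net-displacement method: the inflection point t0 of a
    # logistic fit to displacement-vs-time is the migration midpoint. Toy data.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, L, k, t0):
        return L / (1.0 + np.exp(-k * (t - t0)))

    rng = np.random.default_rng(3)
    days = np.arange(0, 120)
    true = logistic(days, L=1500.0, k=0.25, t0=60.0)          # km displaced
    obs = true + rng.normal(0, 40.0, days.size)               # noisy "GPS" data

    (L, k, t0), _ = curve_fit(logistic, days, obs, p0=[1000.0, 0.1, 50.0])
    # 10%-to-90% rise of a logistic spans ln(81)/k ~ 4.4/k time units.
    print(f"migration midpoint: day {t0:.1f}, duration ~ {4.4 / k:.1f} days")
    ```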

  16. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    Science.gov (United States)

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in the migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted

  17. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    Science.gov (United States)

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs that regulate gene expression by binding the 3' UTR region of target mRNAs, and they evolve at different rates in different plant species. In this study we have utilized a homology-based computational approach and identified 27 mature miRNAs for the first time from this biomedically important plant. Phylogenetic trees built from binary data on the presence/absence of the identified miRNAs proved uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogeny analysis of the miRNAs resolved the above difficulties in the miRNA-based phylogeny approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported miRNAs show sequence conservation in their mature forms, and detailed phylogeny analysis of pre-miRNA sequences revealed genus-specific segregation of clusters.

  18. Sepsis reconsidered: Identifying novel metrics for behavioral landscape characterization with a high-performance computing implementation of an agent-based model.

    Science.gov (United States)

    Cockrell, Chase; An, Gary

    2017-10-07

    Sepsis affects nearly 1 million people in the United States per year, has a mortality rate of 28-50% and requires more than $20 billion a year in hospital costs. Over a quarter century of research has not yielded a single reliable diagnostic test or a directed therapeutic agent for sepsis. Central to this insufficiency is the fact that sepsis remains a clinical/physiological diagnosis representing a multitude of molecularly heterogeneous pathological trajectories. Advances in computational capabilities offered by High Performance Computing (HPC) platforms call for an evolution in the investigation of sepsis to attempt to define the boundaries of traditional research (bench, clinical and computational) through the use of computational proxy models. We present a novel investigatory and analytical approach, derived from how HPC resources and simulation are used in the physical sciences, to identify the epistemic boundary conditions of the study of clinical sepsis via the use of a proxy agent-based model of systemic inflammation. Current predictive models for sepsis use correlative methods that are limited by patient heterogeneity and data sparseness. We address this issue by using an HPC version of a system-level validated agent-based model of sepsis, the Innate Immune Response ABM (IIRABM), as a proxy system in order to identify boundary conditions for the possible behavioral space for sepsis. We then apply advanced analysis derived from the study of Random Dynamical Systems (RDS) to identify novel means for characterizing system behavior and providing insight into the tractability of traditional investigatory methods. The behavior space of the IIRABM was examined by simulating over 70 million sepsis patients for up to 90 days in a sweep across the following parameters: cardio-respiratory-metabolic resilience; microbial invasiveness; microbial toxigenesis; and degree of nosocomial exposure. In addition to using established methods for describing parameter space, we

  19. An approach to identify issues affecting ERP implementation in Indian SMEs

    Directory of Open Access Journals (Sweden)

    Rana Basu

    2012-06-01

    Full Text Available Purpose: The purpose of this paper is to present the findings of a study based on a comprehensive compilation of literature and subsequent analysis of ERP implementation success issues in the context of Indian small and medium-scale enterprises (SMEs). This paper explores the existing literature, highlights issues affecting ERP implementation, and then applies TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) to prioritize the issues affecting successful implementation of ERP. Design/methodology/approach: Based on the literature review, certain issues leading to successful ERP implementation were identified, and Pareto analysis (the 80-20 rule) was applied to extract the key issues. Following extraction of the key issues, a TOPSIS-based survey was carried out in Indian small and medium-scale enterprises. Findings: Based on the literature review, 25 issues were identified; Pareto analysis extracted the key issues, which were then prioritized using the TOPSIS method. Research limitations/implications: Besides the identified issues, there may be other issues that need to be explored. There is scope to enhance this study by taking into consideration different types of industries and by extending the number of respondents. Practical implications: By identifying key issues for SMEs, managers can better prioritize issues to make the implementation process smooth and free of disruption. ERP vendors can take inputs from this study to adapt their implementation approach when targeting small-scale enterprises. Originality/value: No published literature has followed a similar approach to identifying the critical issues affecting ERP in small and mid-sized companies in India or in any other developing economy.
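
    TOPSIS itself is compact enough to sketch in full: normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution. The issue-by-criterion scores and weights below are placeholders, not the survey data of the study.

    ```python
    # Minimal TOPSIS sketch for prioritizing implementation issues. Rows are
    # issues, columns are (benefit) criteria; toy scores and weights.
    import numpy as np

    scores = np.array([[7.0, 8.0, 6.0],      # issue 1
                       [9.0, 5.0, 7.0],      # issue 2
                       [6.0, 9.0, 8.0]])     # issue 3
    weights = np.array([0.5, 0.3, 0.2])

    norm = scores / np.linalg.norm(scores, axis=0)          # vector normalization
    v = norm * weights                                      # weighted matrix
    ideal_best, ideal_worst = v.max(axis=0), v.min(axis=0)  # benefit criteria

    d_best = np.linalg.norm(v - ideal_best, axis=1)
    d_worst = np.linalg.norm(v - ideal_worst, axis=1)
    closeness = d_worst / (d_best + d_worst)                # 1 = best possible

    print("priority order (best first):", np.argsort(-closeness) + 1)
    ```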

  20. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has developed in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for research on miRNA:TM interactions. We divide this article into three main contributions. First, a general overview of research on TMs in plants is presented, with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach (MIM166, MIM171 and MIM159/319), the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of miR159 and miR319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that have to be faced in future ceRNA studies.
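
    A minimal sketch of one ingredient of such predictions: scanning a transcript for target-mimic-like sites, i.e. sequence complementary to the miRNA but with a short bulge opposite miRNA positions 10-11, the hallmark that blocks cleavage. Exact complementarity, the bulge size range and the toy sequences are simplifying assumptions.

    ```python
    # Sketch of a target-mimic motif scan: a TM site pairs with the miRNA but
    # carries a 2-4 nt insertion opposite miRNA positions 10-11. Exact
    # complementarity is required here for simplicity; real TMs allow mismatches.
    import re

    COMP = str.maketrans("AUGC", "UACG")

    def revcomp(rna):
        return rna.translate(COMP)[::-1]

    def tm_pattern(mirna):
        """Regex for a candidate TM site (5'->3') with a central bulge."""
        five, three = mirna[:10], mirna[10:]    # split between positions 10 and 11
        return re.compile(revcomp(three) + "[AUGC]{2,4}" + revcomp(five))

    mir166 = "UCGGACCAGGCUUCAUUCCCC"             # example mature miRNA sequence
    transcript = "AAAGGGGAAUGAAGCUACCUGGUCCGAAAA" # toy transcript fragment
    match = tm_pattern(mir166).search(transcript)
    print(match.group(0) if match else "no TM-like site")
    ```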

  1. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2011-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao's garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce a number of novel techniques for relating the outputs and inputs of OTs in a larger construction. We also report on an implementation of this approach, which shows that our protocol is more efficient than any previous one: for big enough circuits, we can evaluate more than 20000 Boolean gates per second…

  2. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches

    KAUST Repository

    Jiang, Hanlun

    2016-12-06

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.
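
    A minimal sketch of the MSM step on a toy discrete trajectory: count transitions at a lag time, row-normalize to a transition matrix, and read implied timescales off its eigenvalues. Production work would cluster MD data first and use a dedicated package such as PyEMMA or deeptime.

    ```python
    # Minimal MSM sketch: transition counts at lag tau -> row-stochastic matrix
    # -> implied timescales from eigenvalues. Toy 3-state trajectory.
    import numpy as np

    rng = np.random.default_rng(7)
    P_true = np.array([[0.95, 0.04, 0.01],
                       [0.03, 0.95, 0.02],
                       [0.01, 0.04, 0.95]])
    traj = [0]
    for _ in range(100_000):
        traj.append(rng.choice(3, p=P_true[traj[-1]]))
    traj = np.asarray(traj)

    tau = 10
    counts = np.zeros((3, 3))
    np.add.at(counts, (traj[:-tau], traj[tau:]), 1)          # sliding-window counts
    T = counts / counts.sum(axis=1, keepdims=True)           # row-stochastic MSM

    evals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    timescales = -tau / np.log(evals[1:])                    # implied timescales
    print("implied timescales (steps):", np.round(timescales, 1))
    ```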

  3. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches.

    Science.gov (United States)

    Jiang, Hanlun; Zhu, Lizhe; Héliou, Amélie; Gao, Xin; Bernauer, Julie; Huang, Xuhui

    2017-01-01

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.

  4. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework in a smart grid environment, creating a small integrated energy hub that supports real-time computing for handling large volumes of data. A stochastic programming model is developed with the cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results, obtained using a GUI interface and the Gurobi optimizer in MATLAB, show that electricity demand is reduced by creating energy networks in a smart hub approach.
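
    A sketch of the dynamic-programming core of such a DSM scheme, under toy assumptions: one deferrable appliance must run R of T slots, and backward induction minimizes expected cost over a set of random price scenarios. The hub model and Gurobi formulation of the paper are not reproduced.

    ```python
    # Sketch: schedule a deferrable appliance for R of T slots at minimum
    # expected cost, by backward-induction DP over a stochastic price model.
    import numpy as np

    T, R = 12, 4                                  # horizon slots, required run-slots
    price_mean = np.array([3, 3, 4, 6, 8, 9, 9, 8, 6, 5, 4, 3], dtype=float)
    rng = np.random.default_rng(0)
    scenarios = price_mean + rng.normal(0, 0.5, (200, T))
    exp_price = scenarios.mean(axis=0)            # expectation over price scenarios

    INF = float("inf")
    V = np.full((T + 1, R + 1), INF)              # V[t, r]: cost-to-go, r slots left
    V[T, 0] = 0.0
    policy = np.zeros((T, R + 1), dtype=bool)
    for t in range(T - 1, -1, -1):
        for r in range(R + 1):
            wait = V[t + 1, r]
            run = exp_price[t] + V[t + 1, r - 1] if r > 0 else INF
            policy[t, r] = run < wait
            V[t, r] = min(wait, run)

    r, schedule = R, []                           # recover the optimal schedule
    for t in range(T):
        if policy[t, r]:
            schedule.append(t)
            r -= 1
    print("run in slots:", schedule, "expected cost:", V[0, R])
    ```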

  5. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

    In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues are examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention for autonomous inspection of civil infrastructures. The shortfalls of existing approaches in crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo-matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization, and in the use of the modified census transformation for stereo matching along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. What is promising is that the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
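
    Of the components named above, the census transform is easy to sketch: each pixel gets a bit string from brightness comparisons in its neighbourhood, and the matching cost is the Hamming distance between left and right codes. This is the plain census transform; the paper's modified variant and its optimization fusion are not reproduced.

    ```python
    # Sketch: census transform + Hamming matching cost + winner-takes-all
    # disparity, in NumPy. Plain census, 3x3 window; small grayscale images.
    import numpy as np

    def census(img, w=3):
        """Bit string per pixel: which neighbours are brighter than the centre."""
        r = w // 2
        h, wd = img.shape[0] - 2 * r, img.shape[1] - 2 * r
        codes = np.zeros((h, wd), dtype=np.uint64)
        centre = img[r:-r, r:-r]
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if dy == 0 and dx == 0:
                    continue
                shifted = img[r + dy: r + dy + h, r + dx: r + dx + wd]
                codes = (codes << np.uint64(1)) | (shifted > centre).astype(np.uint64)
        return codes

    def disparity(left, right, max_disp=16):
        """Winner-takes-all disparity from Hamming distances of census codes."""
        cl, cr = census(left), census(right)
        h, w = cl.shape
        cost = np.full((h, w, max_disp), 64, dtype=np.uint8)
        for d in range(max_disp):
            xor = cl[:, d:] ^ cr[:, : w - d]
            # popcount via a uint8 view of the 64-bit codes
            ham = np.unpackbits(xor.view(np.uint8), axis=-1).reshape(h, w - d, -1).sum(-1)
            cost[:, d:, d] = ham
        return cost.argmin(axis=2)

    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, (40, 60))
    right = np.roll(left, -3, axis=1)             # synthetic 3-px shift
    print(disparity(left, right)[5:8, 20:25])     # mostly 3 in the interior
    ```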

  6. Uncovering stability mechanisms in microbial ecosystems - combining microcosm experiments, computational modelling and ecological theory in a multidisciplinary approach

    Science.gov (United States)

    Worrich, Anja; König, Sara; Banitz, Thomas; Centler, Florian; Frank, Karin; Kästner, Matthias; Miltner, Anja; Thullner, Martin; Wick, Lukas

    2015-04-01

    Although bacterial degraders in soil are commonly exposed to fluctuating environmental conditions, the functional performance of biodegradation processes can often be maintained by resistance and resilience mechanisms. However, there is still a gap in the mechanistic understanding of the key factors contributing to the stability of such an ecosystem service. We therefore developed an integrated approach combining microcosm experiments, simulation models and ecological theory to directly make use of the strengths of these disciplines. In a continuous interplay, data, hypotheses, and central questions are exchanged between disciplines to initiate new experiments and models and ultimately identify buffer mechanisms and factors providing functional stability. We focus on drying and rewetting cycles in soil ecosystems, which are a major abiotic driver of bacterial activity. Functional recovery of the system was found to depend on different spatial processes in the computational model. In particular, bacterial motility is a prerequisite for biodegradation if either bacteria or substrate are heterogeneously distributed. Hence, laboratory experiments focussing on bacterial dispersal processes were conducted and confirmed this finding also for functional resistance. The results obtained will be incorporated into the model in the next step. Overall, the combination of computational modelling and laboratory experiments identified spatial processes as the main driving force for functional stability in the considered system, and has proved a powerful methodological approach.

  7. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    Science.gov (United States)

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize hits, mine privileged fragments and design focused libraries. Computational approaches have also been applied to study protein-ligand interaction mechanisms and in natural-product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed.

  8. Wavelets-Computational Aspects of Sterian Realistic Approach to Uncertainty Principle in High Energy Physics: A Transient Approach

    Directory of Open Access Journals (Sweden)

    Cristian Toma

    2013-01-01

    Full Text Available This study presents wavelets-computational aspects of the Sterian realistic approach to the uncertainty principle in high energy physics. According to this approach, one cannot make a device for the simultaneous measuring of the canonical conjugate variables in reciprocal Fourier spaces. However, such aspects regarding the use of conjugate Fourier spaces can also be noticed in quantum field theory, where the position representation of a quantum wave is replaced by the momentum representation before computing the interaction in a certain point of space, at a certain moment of time. For this reason, certain properties regarding the switch from one representation to another in these conjugate Fourier spaces should be established. It is shown that the best results can be obtained using wavelets aspects and support macroscopic functions for computing (i) wave-train nonlinear relativistic transformation, (ii) reflection/refraction with a constant shift, (iii) diffraction considered as interaction with a null phase shift without annihilation of the associated wave, (iv) deflection by external electromagnetic fields without phase loss, and (v) annihilation of the associated wave-train through fast and spatially extended phenomena according to the uncertainty principle.

  9. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    Science.gov (United States)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-05-01

    Predicting, understanding, and controlling mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys by simulation studies to replace time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior in dependence of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.

  10. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time-, I/O-, and memory-efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223
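
    The composite-key idea can be sketched without a Hadoop cluster: a plain-Python stand-in in which one map pass emits (algorithm, key) pairs so several analyses share a single scan of the input, and a dispatcher reduces each algorithm's groups. The three toy "algorithms" are illustrative, not MRPack's API.

    ```python
    # Plain-Python stand-in for the multi-algorithm MapReduce idea: composite
    # (algorithm, key) keys let three analyses share one scan of the input.
    from collections import defaultdict

    def mapper(record):
        for w in record.split():
            yield ("wordcount", w), 1              # algorithm 1: word count
        yield ("linecount", "lines"), 1            # algorithm 2: line count
        yield ("charcount", "chars"), len(record)  # algorithm 3: character count

    REDUCERS = {"wordcount": sum, "linecount": sum, "charcount": sum}

    def run_job(records):
        groups = defaultdict(list)
        for rec in records:                        # single scan of the input
            for (algo, key), value in mapper(rec):
                groups[(algo, key)].append(value)  # shuffle on the composite key
        return {k: REDUCERS[k[0]](v) for k, v in groups.items()}

    print(run_job(["to be or not to be", "that is the question"]))
    ```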

  11. Identifying Benefits and risks associated with utilizing cloud computing

    OpenAIRE

    Shayan, Jafar; Azarnik, Ahmad; Chuprat, Suriayati; Karamizadeh, Sasan; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is an emerging computing model where IT and computing operations are delivered as services in a highly scalable and cost-effective manner. Recently, adopting this new model in business has become popular. Companies in diverse sectors intend to leverage cloud computing architecture, platforms and applications in order to gain higher competitive advantages. Like other models, cloud computing brings advantages that attract business, but meanwhile fostering the cloud has led to some ...

  12. Three novel approaches to structural identifiability analysis in mixed-effects models.

    Science.gov (United States)

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models previously not
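
    For the fixed-effects case, the transfer-function idea can be made concrete with SymPy: for a one-compartment model dx/dt = -(k1 + k2)x + u with output y = x/V, only a = k1 + k2 and b = 1/V are observable, so k1 and k2 are not separately identifiable. This is a standard textbook example, not one of the paper's mixed-effects models.

    ```python
    # Transfer-function identifiability sketch: H(s) = (1/V) / (s + k1 + k2),
    # so only a = k1 + k2 and b = 1/V can be recovered from input-output data.
    import sympy as sp

    k1, k2, V, a, b = sp.symbols("k1 k2 V a b", positive=True)
    # Match model parameters to the observable transfer-function coefficients.
    solutions = sp.solve([k1 + k2 - a, 1 / V - b], [k1, k2, V], dict=True)
    print(solutions)   # [{k2: a - k1, V: 1/b}] -> k1 free: not identifiable
    ```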

  13. Combinatorial computational chemistry approach to the design of metal catalysts for deNOx

    International Nuclear Information System (INIS)

    Endou, Akira; Jung, Changho; Kusagaya, Tomonori; Kubo, Momoji; Selvam, Parasuraman; Miyamoto, Akira

    2004-01-01

    Combinatorial chemistry is an efficient technique for the synthesis and screening of a large number of compounds. Recently, we introduced the combinatorial approach to computational chemistry for catalyst design and proposed a new method called 'combinatorial computational chemistry'. In the present study, we have applied this combinatorial computational chemistry approach to the design of precious metal catalysts for deNOx. As the first step of the screening of the metal catalysts, we studied Rh, Pd, Ag, Ir, Pt, and Au clusters regarding their adsorption properties towards the NO molecule. It was demonstrated that the energetically most stable adsorption state of NO on the Ir model cluster was irrespective of both the shape and the number of atoms constituting the model clusters

  14. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    Background: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. Results: We studied the

  15. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    BACKGROUND: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. RESULTS: We studied the

  16. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). Its dependence on subtle parameter changes (such as the response-stimulus interval) usually varies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
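
    A toy illustration of the single-adaptive-threshold idea (not the CISAM itself): residual activation of a repeated target speeds threshold crossing (positive priming), while residual suppression of a former distractor slows it (negative priming). All constants are arbitrary.

    ```python
    # Toy single-threshold accumulator: starting activation encodes the trial
    # history; steps to threshold stand in for reaction time. Not the CISAM.
    def reaction_time(start, threshold=1.0, gain=0.05, max_steps=500):
        """Steps until the target's activation crosses the threshold."""
        act = start
        for step in range(max_steps):
            if act >= threshold:
                return step
            act += gain * (threshold + 0.2 - act)   # leaky rise toward asymptote
        return max_steps

    baseline = reaction_time(start=0.0)
    positive = reaction_time(start=0.4)    # target repeated: residual activation
    negative = reaction_time(start=-0.3)   # target was just inhibited as distractor
    print(f"RT baseline={baseline}, repeated={positive}, after-inhibition={negative}")
    ```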

  17. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  18. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  19. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describing knee physiotherapy that introduce a new dimension, foot loading, into knee axis alignment, producing an improved functional status of the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  20. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existing computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples of solutions to partial differential equations and of applications in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  1. A HOLISTIC APPROACH FOR INSPECTION OF CIVIL INFRASTRUCTURES BASED ON COMPUTER VISION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    C. Stentoumis

    2016-06-01

    In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues are examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention, for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization, and in the use of the modified census transformation for stereo matching along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. What is promising is that the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
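
    As an illustration of the kind of building block involved, a plain (unmodified) census transform can be written in a few lines of NumPy; the window size and the Hamming-distance matching cost below are standard choices, not details taken from the paper:

```python
import numpy as np

def census_transform(img, radius=2):
    """Plain census transform: encode each pixel as a bit string of
    brightness comparisons between its neighbours and the window centre."""
    h, w = img.shape
    pad = np.pad(img, radius, mode='edge')
    bits = np.zeros((h, w), dtype=np.uint64)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = pad[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            bits = (bits << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return bits

def hamming(a, b):
    """Per-pixel Hamming distance between census codes: the usual matching cost."""
    x = (a ^ b).view(np.uint8).reshape(*a.shape, 8)
    return np.unpackbits(x, axis=-1).sum(axis=-1)
```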

  2. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im

  3. Novel approaches to identify protective malaria vaccine candidates

    Directory of Open Access Journals (Sweden)

    Wan Ni Chia

    2014-11-01

    Efforts to develop vaccines against malaria have been the focus of substantial research activities for decades. Several categories of candidate vaccines are currently being developed for protection against malaria, based on antigens corresponding to the pre-erythrocytic, blood-stage or sexual stages of the parasite. Long-lasting sterile protection from Plasmodium falciparum sporozoite challenge has been observed in humans following vaccination with whole parasite formulations, clearly demonstrating that a protective immune response targeting predominantly the pre-erythrocytic stages can develop against malaria. However, most of the vaccine candidates currently being investigated, which are mostly subunit vaccines, have not been able to induce substantial (>50%) protection thus far. This is due to the fact that the antigens responsible for protection against the different parasite stages are still unknown and relevant correlates of protection have remained elusive. For a vaccine to be developed in a timely manner, novel approaches are required. In this article, we review the novel approaches that have been developed to identify the antigens for the development of an effective malaria vaccine.

  4. Alternative approaches for identifying acute systemic toxicity: Moving from research to regulatory testing.

    Science.gov (United States)

    Hamm, Jon; Sullivan, Kristie; Clippinger, Amy J; Strickland, Judy; Bell, Shannon; Bhhatarai, Barun; Blaauboer, Bas; Casey, Warren; Dorman, David; Forsby, Anna; Garcia-Reyero, Natàlia; Gehen, Sean; Graepel, Rabea; Hotchkiss, Jon; Lowit, Anna; Matheson, Joanna; Reaves, Elissa; Scarano, Louis; Sprankle, Catherine; Tunkel, Jay; Wilson, Dan; Xia, Menghang; Zhu, Hao; Allen, David

    2017-06-01

    Acute systemic toxicity testing provides the basis for hazard labeling and risk management of chemicals. A number of international efforts have been directed at identifying non-animal alternatives for in vivo acute systemic toxicity tests. A September 2015 workshop, Alternative Approaches for Identifying Acute Systemic Toxicity: Moving from Research to Regulatory Testing, reviewed the state-of-the-science of non-animal alternatives for this testing and explored ways to facilitate implementation of alternatives. Workshop attendees included representatives from international regulatory agencies, academia, nongovernmental organizations, and industry. Resources identified as necessary for meaningful progress in implementing alternatives included compiling and making available high-quality reference data, training on use and interpretation of in vitro and in silico approaches, and global harmonization of testing requirements. Attendees particularly noted the need to characterize variability in reference data to evaluate new approaches. They also noted the importance of understanding the mechanisms of acute toxicity, which could be facilitated by the development of adverse outcome pathways. Workshop breakout groups explored different approaches to reducing or replacing animal use for acute toxicity testing, with each group crafting a roadmap and strategy to accomplish near-term progress. The workshop steering committee has organized efforts to implement the recommendations of the workshop participants.

  5. Computational Approach for Studying Optical Properties of DNA Systems in Solution

    DEFF Research Database (Denmark)

    Nørby, Morten Steen; Svendsen, Casper Steinmann; Olsen, Jógvan Magnus Haugaard

    2016-01-01

    In this paper we present a study of the methodological aspects regarding calculations of optical properties for DNA systems in solution. Our computational approach is built upon a fully polarizable QM/MM/Continuum model within a damped linear response theory framework. In this approach the environment is given a highly advanced description in terms of the electrostatic potential through the polarizable embedding model. Furthermore, bulk solvent effects are included in an efficient manner through a conductor-like screening model. With the aim of reducing the computational cost we develop a set of averaged partial charges and distributed isotropic dipole-dipole polarizabilities for DNA suitable for describing the classical region in ground-state and excited-state calculations. Calculations of the UV spectrum of the 2-aminopurine optical probe embedded in a DNA double helical structure are presented...

  6. Understanding Gulf War Illness: An Integrative Modeling Approach

    Science.gov (United States)

    2017-10-01

    using a novel mathematical model. The computational biology approach will enable the consortium to quickly identify targets of dysfunction and find computer/mathematical paradigms for the evaluation of treatment strategies, to develop pilot clinical trials on the basis of animal studies, and to test chemical treatments. The immune and autonomic biomarkers will be tested using a computational modeling approach allowing for a

  7. New Approaches to the Computer Simulation of Amorphous Alloys: A Review.

    Science.gov (United States)

    Valladares, Ariel A; Díaz-Celaya, Juan A; Galván-Colín, Jonathan; Mejía-Mendoza, Luis M; Reyes-Retana, José A; Valladares, Renela M; Valladares, Alexander; Alvarez-Ramirez, Fernando; Qu, Dongdong; Shen, Jun

    2011-04-13

    In this work we review our new methods to computationally generate amorphous atomic topologies of several binary alloys: SiH, SiN, CN; binary systems based on group IV elements like SiC; the GeSe2 chalcogenide; aluminum-based systems: AlN and AlSi; and the CuZr amorphous alloy. We use an ab initio approach based on density functionals and computationally thermally-randomized periodically-continued cells with at least 108 atoms. The computational thermal process to generate the amorphous alloys is the undermelt-quench approach, or one of its variants, which consists in linearly heating the samples to just below their melting (or liquidus) temperatures and then linearly cooling them afterwards. These processes are carried out from initial crystalline conditions using short and long time steps. We find that a time step four times the default is adequate for most of the simulations. Radial distribution functions (partial and total) are calculated and compared whenever possible with experimental results, and the agreement is very good. For some materials we report studies of the effect of the topological disorder on their electronic and vibrational densities of states and on their optical properties.

  8. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including through a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
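
    The conservation idea underlying SIFT can be sketched as a per-column entropy score over a multiple sequence alignment; this is a simplification for illustration, not the actual SIFT scoring scheme:

```python
import math
from collections import Counter

def column_conservation(alignment):
    """For each column of a multiple sequence alignment, return a
    conservation score in [0, 1]; low-entropy (highly conserved)
    columns are candidates for functionally intolerant positions."""
    scores = []
    for col in zip(*alignment):
        counts = Counter(c for c in col if c != '-')
        total = sum(counts.values())
        entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
        scores.append(1.0 - entropy / math.log2(20))  # normalise by max entropy over 20 amino acids
    return scores

aln = ["MKTAYIAK", "MKTAHIAR", "MKSAYLAK"]   # toy alignment
print(["%.2f" % s for s in column_conservation(aln)])
```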

  9. Computational Approach to Identify Different Injuries by Firearms.

    Science.gov (United States)

    Costa, Sarah Teixeira; Freire, Alexandre Rodrigues; Matoso, Rodrigo Ivo; Daruge Júnior, Eduardo; Rossi, Ana Cláudia; Prado, Felippe Bevilacqua

    2017-03-01

    Complications arise in the analysis of gunshot wounds to the maxillofacial region when neither the projectile nor the gun is found at the crime scene. We simulated 5- and 15-cm firing distances at a human mandible to investigate the external morphology of entrance wounds based on firing range. The ammunition models, .40-caliber S&W, .380-caliber, and 9 × 19-mm Luger, were constructed with free-form NURBS surfaces. In a dynamic simulation, projectiles were fired against a 3D model of the mandibular body at 5 and 15 cm. All entrance wounds presented an oval aspect. Maximum diameter and von Mises stress values were 16.5 mm and 50.8 MPa, both for the .40-caliber S&W fired at 5 cm. The maximum energy loss was 138.4 J for the .40 S&W fired at 15 cm. In conclusion, the mandible was most affected by the .40-caliber S&W, and morphological differences were observable in holes caused by different incoming projectile calibers fired at different distances.

  10. Identifying bioaccumulative halogenated organic compounds using a nontargeted analytical approach: seabirds as sentinels.

    Directory of Open Access Journals (Sweden)

    Christopher J Millow

    Persistent organic pollutants (POPs) are typically monitored via targeted mass spectrometry, which potentially identifies only a fraction of the contaminants actually present in environmental samples. With new anthropogenic compounds continuously introduced to the environment, novel and proactive approaches that provide a comprehensive alternative to targeted methods are needed in order to more completely characterize the diversity of known and unknown compounds likely to cause adverse effects. Nontargeted mass spectrometry attempts to extensively screen for compounds, providing a feasible approach for identifying contaminants that warrant future monitoring. We employed a nontargeted analytical method using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOF-MS) to characterize halogenated organic compounds (HOCs) in California Black skimmer (Rynchops niger) eggs. Our study identified 111 HOCs; 84 of these compounds were regularly detected via targeted approaches, while 27 were classified as typically unmonitored or unknown. Typically unmonitored compounds of note in bird eggs included tris(4-chlorophenyl)methane (TCPM), tris(4-chlorophenyl)methanol (TCPMOH), triclosan, permethrin, heptachloro-1'-methyl-1,2'-bipyrrole (MBP), as well as four halogenated unknown compounds that could not be identified through database searching or the literature. The presence of these compounds in Black skimmer eggs suggests they are persistent, bioaccumulative, potentially biomagnifying, and maternally transferring. Our results highlight the utility and importance of employing nontargeted analytical tools to assess true contaminant burdens in organisms, as well as demonstrating the value of using environmental sentinels to proactively identify novel contaminants.

  11. Security approaches in using tablet computers for primary data collection in clinical research.

    Science.gov (United States)

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  12. Screening of photosynthetic pigments for herbicidal activity with a new computational molecular approach.

    Science.gov (United States)

    Krishnaraj, R Navanietha; Chandran, Saravanan; Pal, Parimal; Berchmans, Sheela

    2013-12-01

    There is immense interest among researchers in identifying new herbicides that are effective against weeds without affecting the environment. In this work, photosynthetic pigments are used as the ligands to predict their herbicidal activity. The enzyme 5-enolpyruvylshikimate-3-phosphate (EPSP) synthase is a good target for herbicides. Homology modeling of the target enzyme is done using Modeler 9.11 and the model is validated. Docking studies were performed with the AutoDock Vina algorithm to predict the binding of the natural pigments such as β-carotene, chlorophyll a, chlorophyll b, phycoerythrin and phycocyanin to the target. β-carotene, phycoerythrin and phycocyanin have higher binding energies, indicating the herbicidal activity of these pigments. This work reports a procedure to screen herbicides with a computational molecular approach. These pigments will serve as potential bioherbicides in the future.
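
    For illustration, a docking run of this kind with AutoDock Vina can be scripted as below; the file names and search-box coordinates are placeholders, and the receptor and ligand would first have to be prepared in PDBQT format:

```python
import subprocess

# Illustrative AutoDock Vina call (file names and box coordinates are
# assumptions; receptor and ligand must be prepared as PDBQT first).
subprocess.run([
    "vina",
    "--receptor", "epsp_synthase_model.pdbqt",   # homology model of the EPSP synthase target
    "--ligand",   "beta_carotene.pdbqt",         # pigment ligand
    "--center_x", "12.0", "--center_y", "-4.5", "--center_z", "7.3",
    "--size_x", "24", "--size_y", "24", "--size_z", "24",
    "--exhaustiveness", "8",
    "--out", "beta_carotene_docked.pdbqt",
], check=True)
```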

  13. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institute, Moscow (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing a software package based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  15. A computational approach to evaluate the androgenic affinity of iprodione, procymidone, vinclozolin and their metabolites.

    Science.gov (United States)

    Galli, Corrado Lodovico; Sensi, Cristina; Fumagalli, Amos; Parravicini, Chiara; Marinovich, Marina; Eberini, Ivano

    2014-01-01

    Our research is aimed at devising and assessing a computational approach to evaluate the affinity of endocrine active substances (EASs) and their metabolites towards the ligand binding domain (LBD) of the androgen receptor (AR) in three distantly related species: human, rat, and zebrafish. We computed the affinity for all the selected molecules following a computational approach based on molecular modelling and docking. Three different classes of molecules with well-known endocrine activity (iprodione, procymidone, vinclozolin, and a selection of their metabolites) were evaluated. Our approach was demonstrated useful as the first step of chemical safety evaluation since ligand-target interaction is a necessary condition for exerting any biological effect. Moreover, a different sensitivity concerning AR LBD was computed for the tested species (rat being the least sensitive of the three). This evidence suggests that, in order not to over-/under-estimate the risks connected with the use of a chemical entity, further in vitro and/or in vivo tests should be carried out only after an accurate evaluation of the most suitable cellular system or animal species. The introduction of in silico approaches to evaluate hazard can accelerate discovery and innovation with a lower economic effort than with a fully wet strategy.

  16. Fast reactor safety and computational thermo-fluid dynamics approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Shimizu, Takeshi

    1993-01-01

    This article provides a brief description of the safety principle on which liquid metal cooled fast breeder reactors (LMFBRs) are based, and of the roles of computation in safety practice. A number of thermohydraulics models have been developed to date that successfully describe several of the important types of fluid and material motion encountered in the analysis of postulated accidents in LMFBRs. Most of these models use a mixture of implicit and explicit numerical solution techniques in solving a set of conservation equations formulated in Eulerian coordinates, with special techniques included for specific situations. Typical computational thermo-fluid dynamics approaches are discussed, in particular in areas of analysis of the physical phenomena relevant to fuel subassembly thermohydraulics design and those involving description of the motion of molten materials in the core over a large scale. (orig.)

  17. Persistent Identifier Practice for Big Data Management at NCI

    Directory of Open Access Journals (Sweden)

    Jingbo Wang

    2017-04-01

    The National Computational Infrastructure (NCI) manages over 10 PB of research data, which is co-located with the high-performance computer (Raijin) and an HPC-class 3000-core OpenStack cloud system (Tenjin). In support of this integrated High Performance Computing/High Performance Data (HPC/HPD) infrastructure, NCI's data management practices include building catalogues, DOI minting, data curation, data publishing, and data delivery through a variety of data services. The metadata catalogues, DOIs, THREDDS, and Vocabularies all use different Uniform Resource Locator (URL) styles. A Persistent IDentifier (PID) service provides an important utility to manage URLs in a consistent, controlled and monitored manner to support the robustness of our national ‘Big Data’ infrastructure. In this paper we demonstrate NCI's approach of utilising NCI's PID Service to consistently manage its persistent identifiers with various applications.
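
    The PID idea itself is simple to sketch: a stable identifier resolves to a landing URL that can be updated in one place. The example below is a toy illustration (identifiers and URLs are hypothetical), not NCI's actual service:

```python
# Minimal sketch of the PID idea: a stable identifier that resolves to a
# (possibly changing) landing URL. Identifiers and URLs are hypothetical.
PID_TABLE = {
    "pid/dataset/0042": "https://example.org/thredds/catalog/ds0042.html",
}

def resolve(pid):
    """Return the current landing URL for a persistent identifier."""
    return PID_TABLE[pid]

def update_location(pid, new_url):
    """Data moved: update the mapping once, and all published PIDs keep working."""
    PID_TABLE[pid] = new_url
```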

  18. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994). Against invasive measurements, the machine-learning algorithm achieved a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%; the correlation with invasively measured FFR was 0.729. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor.
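
    The surrogate-modelling idea (train a fast regressor on features of synthetic anatomies, with targets computed by the slow physics-based model) can be sketched as follows; the feature set, the toy target function, and the choice of gradient boosting are illustrative assumptions, not the authors' architecture:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for geometric features extracted along a synthetic centerline
# (e.g. reference radius, stenosis severity, lesion length, distance from ostium).
n = 5000
X = rng.uniform(0.0, 1.0, size=(n, 4))

# Stand-in for FFR targets that would come from the expensive CFD model.
y = 1.0 - 0.5 * X[:, 1] ** 2 - 0.1 * X[:, 2] + 0.02 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)

# Once trained, prediction is orders of magnitude faster than running CFD.
print("R^2 on held-out synthetic anatomies:", round(model.score(X_te, y_te), 3))
```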

  19. Self-regulated learning in higher education: strategies adopted by computer programming students when supported by the SimProgramming approach

    Directory of Open Access Journals (Sweden)

    Daniela Pedrosa

    The goal of the SimProgramming approach is to help students overcome their learning difficulties in the transition from entry-level to advanced computer programming, developing an appropriate set of learning strategies. We implemented it at the University of Trás-os-Montes e Alto Douro (Portugal), in two courses (PM3 and PM4) of the bachelor programmes in Informatics Engineering and ICT. We conducted semi-structured interviews with students (n=38) at the end of the courses, to identify the students’ strategies for self-regulation of learning in the assignment. We found that students changed some of their strategies from one course edition to the following one and that the changes are related to the SimProgramming approach. We believe that the changes to the educational approach were appropriate to support the assignment goals. We recommend applying the SimProgramming approach in other educational contexts, to improve educational practices by including techniques to help students in their learning.

  20. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Life cycle assessment (LCA) has been widely used over the last two decades in the design phase to reduce a product’s environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between the design parameters and a product’s environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors, which have significant influence on the product’s environmental impacts, can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed with our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
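
    A minimal version of such a screen is one-at-a-time perturbation of each design parameter, ranking parameters by a normalised sensitivity coefficient; the impact model below is a stand-in for a real LCA inventory calculation:

```python
def environmental_impact(params):
    """Stand-in LCA impact model: replace with a real inventory calculation."""
    return (2.0 * params["panel_area"] + 0.5 * params["copper_mass"]
            + 0.1 * params["energy_use"])

def oat_sensitivity(model, base, delta=0.01):
    """One-at-a-time normalised sensitivity coefficients (elasticities)."""
    f0 = model(base)
    sens = {}
    for name, value in base.items():
        perturbed = dict(base, **{name: value * (1 + delta)})
        sens[name] = ((model(perturbed) - f0) / f0) / delta  # relative change per relative change
    return sens

base = {"panel_area": 1.2, "copper_mass": 0.8, "energy_use": 15.0}
for name, s in sorted(oat_sensitivity(environmental_impact, base).items(),
                      key=lambda kv: -abs(kv[1])):
    print(f"{name}: {s:+.3f}")
```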

  1. How people learn while playing serious games: A computational modelling approach

    NARCIS (Netherlands)

    Westera, Wim

    2017-01-01

    This paper proposes a computational modelling approach for investigating the interplay of learning and playing in serious games. A formal model is introduced that allows for studying the details of playing a serious game under diverse conditions. The dynamics of player action and motivation is based

  2. DIRProt: a computational approach for discriminating insecticide resistant proteins from non-resistant proteins.

    Science.gov (United States)

    Meher, Prabina Kumar; Sahu, Tanmaya Kumar; Banchariya, Anjali; Rao, Atmakuri Ramakrishna

    2017-03-24

    Insecticide resistance is a major challenge for insect pest control programs in the fields of crop protection, and human and animal health. Resistance to different insecticides is conferred by proteins encoded by certain classes of insect genes. To distinguish insecticide resistant proteins from non-resistant proteins, no computational tool has been available to date. Thus, the development of such a computational tool will be helpful in predicting insecticide resistant proteins, which can be targeted for developing appropriate insecticides. Five different feature sets, viz. amino acid composition (AAC), di-peptide composition (DPC), pseudo amino acid composition (PAAC), composition-transition-distribution (CTD) and auto-correlation function (ACF), were used to map the protein sequences into numeric feature vectors. The encoded numeric vectors were then used as input to a support vector machine (SVM) for classification of insecticide resistant and non-resistant proteins. Higher accuracies were obtained with the RBF kernel than with other kernels. Further, accuracies were observed to be higher for the DPC feature set as compared to the others. The proposed approach achieved an overall accuracy of >90% in discriminating resistant from non-resistant proteins. Further, the two classes of resistant proteins, i.e., detoxification-based and target-based, were discriminated from non-resistant proteins with >95% accuracy. Besides, >95% accuracy was also observed for discrimination of proteins involved in detoxification- and target-based resistance mechanisms. The proposed approach not only outperformed the Blastp, PSI-Blast and Delta-Blast algorithms, but also achieved >92% accuracy while assessed using an independent dataset of 75 insecticide resistant proteins. This paper presents the first computational approach for discriminating insecticide resistant proteins from non-resistant proteins. Based on the proposed approach, an online prediction server DIRProt has
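
    The AAC-plus-SVM step can be sketched with scikit-learn as below; the toy sequences and labels are invented, and no attempt is made to reproduce the DIRProt training data or parameter tuning:

```python
import numpy as np
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aac_features(seq):
    """Amino acid composition: a 20-dimensional vector of relative frequencies."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

# Toy stand-ins for resistant (1) and non-resistant (0) protein sequences.
seqs = ["MKVLAAGCCG", "MKKLLVAGGC", "MDDEENNQQS", "MEEDDNNSSQ"]
labels = [1, 1, 0, 0]

X = np.array([aac_features(s) for s in seqs])
clf = SVC(kernel="rbf", gamma="scale").fit(X, labels)   # RBF kernel, as in the paper
print(clf.predict([aac_features("MKVLLAAGGC")]))        # expect class 1
```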

  3. A true minimally invasive approach for cochlear implantation: high accuracy in cranial base navigation through flat-panel-based volume computed tomography.

    Science.gov (United States)

    Majdani, Omid; Bartling, Soenke H; Leinung, Martin; Stöver, Timo; Lenarz, Minoo; Dullin, Christian; Lenarz, Thomas

    2008-02-01

    High-precision intraoperative navigation using high-resolution flat-panel volume computed tomography makes minimally invasive cochlear implant surgery, including cochleostomy, feasible. Conventional cochlear implant surgery is typically performed via mastoidectomy with facial recess to identify and avoid damage to vital anatomic landmarks. To accomplish this procedure via a minimally invasive approach--without performing mastoidectomy--in a precise fashion, image-guided technology is necessary. With such an approach, surgical time and expertise may be reduced, and hearing preservation may be improved. Flat-panel volume computed tomography was used to scan 4 human temporal bones. A drilling channel was planned preoperatively from the mastoid surface to the round window niche, providing a margin of safety to all functionally important structures (e.g., facial nerve, chorda tympani, incus). Postoperatively, computed tomographic imaging and conventional surgical exploration of the drilled route to the cochlea were performed. All 4 specimens showed a cochleostomy located at the scala tympani anterior-inferior to the round window. The chorda tympani was damaged in 1 specimen--this was preoperatively planned, as a narrow facial recess was encountered. Using flat-panel volume computed tomography for image-guided surgical navigation, we were able to perform minimally invasive cochlear implant surgery, defined as a narrow, single-channel mastoidotomy with cochleostomy. Although this finding is preliminary, it is technologically achievable.

  4. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing, the PDE10 inhibitor TP-10 produced a signature suggesting potential antipsychotic activity. This finding is consistent with TP-10's activity in multiple rodent models, which is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to those of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  5. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  6. Computer-aided approach for design of tailor-made blended products

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Woodley, John

    2012-01-01

    A computer-aided methodology has been developed for the design of blended (mixture) products. Through this methodology, it is possible to identify the most suitable chemicals for blending, and “tailor” the blend according to specified product needs (usually product attributes, e.g. performance, as well as regulatory). The product design methodology has four tasks. First, the design problem is defined: the product needs are identified, translated into target properties and the constraints for each target property are defined. Secondly, target property models are retrieved from a property model...

  7. Computer Modeling of Platinum Reforming Reactors

    African Journals Online (AJOL)

    This paper, instead of using a theoretical approach, considers a computer model as a means of assessing the reformate composition for three-stage fixed bed reactors in a platforming unit. This is done by identifying the many possible hydrocarbon transformation reactions that are peculiar to the process unit, identifying the ...

  8. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presents a computational intelligence approach used for minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work included NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function aiming at estimating NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air velocities (PA) and six levels of secondary air velocities (SA) were regulated by using particle swarm optimization (PSO) so as to achieve low-NOx-emissions combustion. To reduce the time demand, a more flexible stopping condition was used to improve the computational efficiency without loss of quality of the optimization results. The results showed that the proposed approach provided an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO was that the computational cost, typically less than 25 s on a PC system, is much less than that required for ACO. This means the proposed approach is more applicable to online and real-time applications for NOx emissions minimization in actual power plant boilers.
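
    The optimisation loop (a PSO search over air-velocity settings against a surrogate NOx model) can be sketched as follows; the quadratic objective stands in for the trained SVR model, and all bounds and coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def nox_surrogate(v):
    """Stand-in for the SVR NOx model over 10 air velocities (4 PA + 6 SA)."""
    return 270.0 + np.sum((v - 0.6 * np.arange(1, 11)) ** 2)

def pso(objective, dim=10, n_particles=30, iters=200, lo=0.0, hi=10.0,
        w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(lo, hi, (n_particles, dim))      # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                   # keep velocities within bounds
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best_v, best_nox = pso(nox_surrogate)
print("optimised NOx estimate:", round(best_nox, 1), "ppm")
```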

  9. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems

  10. Favorable noise uniformity properties of Fourier-based interpolation and reconstruction approaches in single-slice helical computed tomography

    International Nuclear Information System (INIS)

    La Riviere, Patrick J.; Pan Xiaochuan

    2002-01-01

    Volumes reconstructed by standard methods from single-slice helical computed tomography (CT) data have been shown to have noise levels that are highly nonuniform relative to those in conventional CT. These noise nonuniformities can affect low-contrast object detectability and have also been identified as the cause of the zebra artifacts that plague maximum intensity projection (MIP) images of such volumes. While these spatially variant noise levels have their root in the peculiarities of the helical scan geometry, there is also a strong dependence on the interpolation and reconstruction algorithms employed. In this paper, we seek to develop image reconstruction strategies that eliminate or reduce, at its source, the nonuniformity of noise levels in helical CT relative to that in conventional CT. We pursue two approaches, independently and in concert. We argue, and verify, that Fourier-based longitudinal interpolation approaches lead to more uniform noise ratios than do the standard 360LI and 180LI approaches. We also demonstrate that a Fourier-based fan-to-parallel rebinning algorithm, used as an alternative to fanbeam filtered backprojection for slice reconstruction, also leads to more uniform noise ratios, even when making use of the 180LI and 360LI interpolation approaches
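
    Fourier-based interpolation amounts to resampling through zero-padding in the frequency domain; a one-dimensional NumPy sketch of the idea (not the actual longitudinal CT interpolator) is shown below:

```python
import numpy as np

def fourier_interpolate(signal, factor):
    """Upsample a 1D signal by zero-padding its spectrum (sketch; the real
    helical-CT interpolation operates on longitudinal projection samples)."""
    n = len(signal)
    spectrum = np.fft.fft(signal)
    padded = np.zeros(n * factor, dtype=complex)
    half = n // 2
    padded[:half] = spectrum[:half]       # non-negative frequencies
    padded[-half:] = spectrum[-half:]     # negative frequencies
    return np.real(np.fft.ifft(padded)) * factor   # rescale to preserve amplitude

t = np.linspace(0, 1, 16, endpoint=False)
coarse = np.sin(2 * np.pi * 3 * t)
fine = fourier_interpolate(coarse, 4)   # 64 samples, band-limited interpolation
```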

  11. A Novel Approach to Identifying Trajectories of Mobility Change in Older Adults.

    Directory of Open Access Journals (Sweden)

    Rachel E Ward

    To validate trajectories of late-life mobility change using a novel approach designed to overcome the constraints of modest sample size and few follow-up time points. Using clinical reasoning and distribution-based methodology, we identified trajectories of mobility change (Late Life Function and Disability Instrument) across 2 years in 391 participants aged ≥65 years from a prospective cohort study designed to identify modifiable impairments predictive of mobility in late life. We validated our approach using model fit indices and by comparing baseline mobility-related factors between trajectories. Model fit indices confirmed that the optimal number of trajectories was between 4 and 6. Mobility-related factors varied across trajectories, with the most unfavorable values in poor mobility trajectories and the most favorable in high mobility trajectories. These factors included leg strength, trunk extension endurance, knee flexion range of motion, limb velocity, physical performance measures, and the number and prevalence of medical conditions including osteoarthritis and back pain. Our findings support the validity of this approach and may facilitate the investigation of a broader scope of research questions within aging populations of varied sizes and traits.

  12. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on Principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain (not very restrictive) conditions, the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
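
    A toy version of the master-worker reinforcement dynamics can be simulated directly; the payoffs, learning rate and audit mechanics below are invented for illustration and do not reproduce the paper's model:

```python
import random

random.seed(42)

N_WORKERS, ROUNDS, LR = 9, 2000, 0.01
p_honest = [0.5] * N_WORKERS   # each worker's probability of computing truthfully
p_audit = 0.5                  # master's probability of verifying the answers

for _ in range(ROUNDS):
    honest = [random.random() < p for p in p_honest]
    audited = random.random() < p_audit
    for i, h in enumerate(honest):
        # Honest work costs effort; cheating pays off unless caught by an audit.
        payoff = -0.2 if h else (-1.0 if audited else 0.3)
        # Reinforce the tendency that produced the payoff (clipped to [0.01, 1]).
        p_honest[i] = min(1.0, max(0.01, p_honest[i] + LR * payoff * (1 if h else -1)))
    # The master relaxes auditing when the majority is reliable, and vice versa.
    majority_correct = sum(honest) > N_WORKERS / 2
    p_audit = min(1.0, max(0.05, p_audit + LR * (-0.5 if majority_correct else 1.0)))

print("final honesty probabilities:", [round(p, 2) for p in p_honest])
print("final audit probability:", round(p_audit, 2))
```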

  13. A Context-Aware Ubiquitous Learning Approach for Providing Instant Learning Support in Personal Computer Assembly Activities

    Science.gov (United States)

    Hsu, Ching-Kun; Hwang, Gwo-Jen

    2014-01-01

    Personal computer assembly courses have been recognized as being essential in helping students understand computer structure as well as the functionality of each computer component. In this study, a context-aware ubiquitous learning approach is proposed for providing instant assistance to individual students in the learning activity of a…

  14. Design of New Competitive Dengue NS2B/NS3 Protease Inhibitors—A Computational Approach

    Directory of Open Access Journals (Sweden)

    Noorsaadah Abd. Rahman

    2011-02-01

    Dengue is a serious disease which has become a global health burden in the last decade. Currently, there are no approved vaccines or antiviral therapies to combat the disease. The increasing spread and severity of dengue virus infection emphasizes the importance of drug discovery strategies that could efficiently and cost-effectively identify antiviral drug leads for development into potent drugs. To this effect, several computational approaches were applied in this work. Initially, molecular docking studies of reference ligands to the DEN2 NS2B/NS3 serine protease were carried out. These reference ligands consist of reported competitive inhibitors extracted from Boesenbergia rotunda (i.e., 4-hydroxypanduratin A and panduratin A) and three other synthesized panduratin A derivative compounds (i.e., 246DA, 2446DA and 20H46DA). The design of new lead inhibitors was carried out in two stages. In the first stage, the enzyme complexed with the reference ligands was minimized and their complexation energies (i.e., the sum of interaction energy and binding energy) were computed. New compounds as potential dengue inhibitors were then designed by putting various substituents successively on the benzyl ring A of the reference molecule. These substituted benzyl compounds were then computed for their enzyme-ligand complexation energies. New enzyme-ligand complexes exhibiting the lowest complexation energies, closest to the computed energies for the reference compounds, were then chosen for the next stage of manipulation and design, which involved substituting positions 4 and 5 of the benzyl ring A (positions 3 and 4 for 2446DA) with various substituents.

  15. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV-1) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of the six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate, and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41, with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next-generation anti-HIV drugs.

  17. Linking Individual Learning Styles to Approach-Avoidance Motivational Traits and Computational Aspects of Reinforcement Learning

    Science.gov (United States)

    Carl Aberg, Kristoffer; Doell, Kimberly C.; Schwartz, Sophie

    2016-01-01

    Learning how to gain rewards (approach learning) and avoid punishments (avoidance learning) is fundamental for everyday life. While individual differences in approach and avoidance learning styles have been related to genetics and aging, the contribution of personality factors, such as traits, remains undetermined. Moreover, little is known about the computational mechanisms mediating differences in learning styles. Here, we used a probabilistic selection task with positive and negative feedbacks, in combination with computational modelling, to show that individuals displaying better approach (vs. avoidance) learning scored higher on measures of approach (vs. avoidance) trait motivation, but, paradoxically, also displayed reduced learning speed following positive (vs. negative) outcomes. These data suggest that learning different types of information depend on associated reward values and internal motivational drives, possibly determined by personality traits. PMID:27851807
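
    Computational modelling of this kind is commonly implemented as a reinforcement-learning model with separate learning rates for positive and negative prediction errors; a minimal sketch with illustrative parameters (not the authors' fitted model) follows:

```python
import math
import random

random.seed(7)

ALPHA_GAIN, ALPHA_LOSS, BETA = 0.30, 0.10, 3.0   # asymmetric learning rates (illustrative)
Q = {"A": 0.0, "B": 0.0}                         # option A is rewarded 80% of the time, B 20%

def softmax_choice(q, beta=BETA):
    """Softmax action selection over the two options."""
    wa, wb = math.exp(beta * q["A"]), math.exp(beta * q["B"])
    return "A" if random.random() < wa / (wa + wb) else "B"

for _ in range(500):
    choice = softmax_choice(Q)
    reward = 1.0 if random.random() < (0.8 if choice == "A" else 0.2) else -1.0
    delta = reward - Q[choice]                   # reward prediction error
    alpha = ALPHA_GAIN if delta > 0 else ALPHA_LOSS
    Q[choice] += alpha * delta                   # here, faster learning from gains than losses

print({k: round(v, 2) for k, v in Q.items()})    # Q["A"] should dominate
```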

  18. Current Trend Towards Using Soft Computing Approaches to Phase Synchronization in Communication Systems

    Science.gov (United States)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    1999-01-01

    This paper surveys recent advances in communications that utilize soft computing approaches to phase synchronization. Soft computing, as opposed to hard computing, is a collection of complementary methodologies that act in producing the most desirable control, decision, or estimation strategies. Recently, the communications area has explored the use of the principal constituents of soft computing, namely, fuzzy logic, neural networks, and genetic algorithms, for modeling, control, and most recently for the estimation of phase in phase-coherent communications. If the receiver in a digital communications system is phase-coherent, as is often the case, phase synchronization is required. Synchronization thus requires estimation and/or control at the receiver of an unknown or random phase offset.

  19. Decommissioning costing approach based on the standardised list of costing items. Lessons learnt by the OMEGA computer code

    International Nuclear Information System (INIS)

    Daniska, Vladimir; Rehak, Ivan; Vasko, Marek; Ondra, Frantisek; Bezak, Peter; Pritrsky, Jozef; Zachar, Matej; Necas, Vladimir

    2011-01-01

    The document 'A Proposed Standardised List of Items for Costing Purposes' was issued in 1999 by the OECD/NEA, IAEA and European Commission (EC) to promote harmonisation in decommissioning costing. It is a systematic list of decommissioning activities classified in chapters 01 to 11 with three numbered levels. Four cost groups are defined for costs at each level. The document constitutes a standardised matrix of decommissioning activities and cost groups with definitions of the content of the items. Knowing what is behind the items makes the comparison of costs for decommissioning projects transparent. Two approaches are identified for use of the standardised cost structure. The first approach converts the cost data from existing specific cost structures into the standardised cost structure for the purpose of cost presentation. The second approach uses the standardised cost structure as the basis for the cost calculation structure; the calculated cost data are formatted in the standardised cost format directly, and several additional advantages may be identified in this approach. The paper presents the costing methodology based on the standardised cost structure and lessons learnt from the last ten years of implementation of the standardised cost structure as the cost calculation structure in the computer code OMEGA. The code also includes on-line management of decommissioning waste, radioactive decay, evaluation of exposure, and generation and optimisation of the Gantt chart of a decommissioning project, which makes the OMEGA code an effective tool for planning and optimisation of decommissioning processes. (author)
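
    The first approach (converting cost data from a project-specific structure into the standardised structure for presentation) is essentially a mapping exercise; in the sketch below, the item codes follow the chapter/level pattern described above, but the specific entries are invented for illustration:

```python
# Map project-specific cost items to standardised costing-item codes
# (presentation only, as in the first approach described above; the
# codes and item names here are hypothetical).
LOCAL_TO_STANDARD = {
    "reactor vessel segmentation": "04.02.01",
    "concrete surface decontamination": "05.01.03",
    "project management, year 1": "01.01.01",
}

def to_standard_structure(local_costs):
    """Aggregate locally structured costs into the standardised items."""
    out = {}
    for item, cost in local_costs.items():
        code = LOCAL_TO_STANDARD[item]
        out[code] = out.get(code, 0.0) + cost
    return out

print(to_standard_structure({
    "reactor vessel segmentation": 1.2e6,
    "concrete surface decontamination": 4.0e5,
}))
```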

  20. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme as an effective approach to medical diagnosis. Having come to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come...

  1. Classification of methods of conducting computer forensic examinations using a graph theory approach

    OpenAIRE

    Anna Ravilyevna Smolina; Alexander Alexandrovich Shelupanov

    2016-01-01

    Classification of methods of conducting computer forensic examinations using a graph theory approach is proposed. Using this classification, it is possible to accelerate and simplify the search for a suitable method of conducting a computer forensic examination, and to automate the process.

  2. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need of computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where the improvement of the utilisation efficiency is needed. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. Instead, the model-based approach involves the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...

  3. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process

    Directory of Open Access Journals (Sweden)

    El Sawaf Gamal

    2009-07-01

    Full Text Available Abstract Background The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure, and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Results Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by the sequence alignment method applied to the FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with a control HCV sequence from the Core protein, giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows for a direct comparison of domains for the presence of common hydrophobicity patterns, on which the physical interaction would be based. RQA greatly strengthened the reliability of the hypothesis by scoring many cross-recurrences between the FP and CT hydrophobicity patterns, largely outnumbering chance expectations and pointing to putative interaction sites. Intriguingly, mutations in the CT
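
    The RQA step can be illustrated with a small sketch: build smoothed Kyte-Doolittle hydrophobicity profiles for two peptides and count cross-recurrences between their embedded vectors. The peptide fragments, window, embedding and radius below are illustrative, not the study's actual E1 sequences or settings.

        import numpy as np

        # Kyte-Doolittle hydropathy scale (standard published values)
        KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
              'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
              'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
              'Y': -1.3, 'V': 4.2}

        def hydropathy(seq, window=3):
            """Smoothed hydrophobicity profile of a peptide sequence."""
            vals = np.array([KD[a] for a in seq])
            kernel = np.ones(window) / window
            return np.convolve(vals, kernel, mode='valid')

        def cross_recurrence(x, y, embed=3, eps=1.0):
            """Cross-recurrence matrix: entry (i, j) is 1 when the embedded
            vectors at positions i and j lie within eps of each other."""
            ex = np.array([x[i:i+embed] for i in range(len(x) - embed + 1)])
            ey = np.array([y[j:j+embed] for j in range(len(y) - embed + 1)])
            d = np.linalg.norm(ex[:, None, :] - ey[None, :, :], axis=2)
            return (d < eps).astype(int)

        # Illustrative peptide fragments (not the actual HCV E1 domains)
        fp = "AHWGVLAGIAYFSMVGNWAK"
        ct = "LLVLLLFAGVDA"
        crp = cross_recurrence(hydropathy(fp), hydropathy(ct))
        print(crp.sum(), "recurrent point pairs")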

  4. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    Science.gov (United States)

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several gigavoxels), this computational burden has prevented their breakthrough in practice. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphics processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.

  5. Online-Based Approaches to Identify Real Journals and Publishers from Hijacked Ones.

    Science.gov (United States)

    Asadi, Amin; Rahbar, Nader; Asadi, Meisam; Asadi, Fahime; Khalili Paji, Kokab

    2017-02-01

    The aim of the present paper was to introduce some online-based approaches to evaluate scientific journals and publishers and to differentiate them from hijacked ones, regardless of their disciplines. With the advent of open-access journals, many hijacked journals and publishers have deceitfully assumed the mantle of authenticity in order to take advantage of researchers and students. Although these hijacked journals and publishers can be identified by checking their advertisement techniques and their websites, these means do not always lead to identification. There exist certain online-based approaches, such as using the Master Journal List provided by Thomson Reuters, the Scopus database, and the DOI of a paper, to certify the authenticity of a journal or publisher. It is indispensable that inexperienced students and researchers know these methods so as to identify hijacked journals and publishers with a higher probability.
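
    One of the checks described — using the DOI of a paper — can be sketched as a resolution test against the official doi.org resolver; the surrounding interpretation logic is an assumption, not a procedure prescribed by the paper.

        import requests

        def doi_resolves(doi: str) -> bool:
            """True if the DOI resolves at doi.org. A paper from a hijacked
            journal often has no DOI, or one that fails to resolve."""
            resp = requests.get(f"https://doi.org/{doi}",
                                allow_redirects=True, timeout=10)
            return resp.status_code < 400

        # The DOI Handbook's own DOI, used here as a known-good example
        print(doi_resolves("10.1000/182"))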

  6. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
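
    The spreadsheet-level comparison mentioned above reduces, for the ingestion pathway, to multiplying a concentration by an intake rate and a dose coefficient. A sketch with placeholder numbers (not values from the PATT report):

        # Minimal spreadsheet-style ingestion dose estimate, the kind of
        # fundamental-level comparison the subteam performed. All numbers
        # are placeholders, not values from the report.

        C_WATER = 1.0      # radionuclide concentration in water (Bq/L), unit value
        INTAKE = 730.0     # annual water ingestion (L/yr), a common default
        DCF_ING = 2.8e-8   # ingestion dose coefficient (Sv/Bq), placeholder

        annual_dose_sv = C_WATER * INTAKE * DCF_ING
        print(f"annual ingestion dose: {annual_dose_sv:.2e} Sv")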

  7. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Science.gov (United States)

    Trevino, Victor; Cassese, Alberto; Nagy, Zsuzsanna; Zhuang, Xiaodong; Herbert, John; Antczak, Philipp; Clarke, Kim; Davies, Nicholas; Rahman, Ayesha; Campbell, Moray J; Guindani, Michele; Bicknell, Roy; Vannucci, Marina; Falciani, Francesco

    2016-04-01

    The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments have had a profound impact on the biological sciences, since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in-depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell communication networks
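
    The inference strategy itself is not spelled out in the abstract; as a stand-in, a naive cross-cell-type screen can be sketched by correlating gene expression between the two compartments across matched samples. Data and gene names below are synthetic.

        import numpy as np

        def cross_type_links(expr_a, expr_b, genes_a, genes_b, threshold=0.8):
            """Naive cross-talk screen: correlate every gene in cell type A
            with every gene in cell type B across matched samples and keep
            strong pairs. A stand-in for the paper's directional inference."""
            a = (expr_a - expr_a.mean(1, keepdims=True)) / expr_a.std(1, keepdims=True)
            b = (expr_b - expr_b.mean(1, keepdims=True)) / expr_b.std(1, keepdims=True)
            corr = a @ b.T / expr_a.shape[1]        # genes_a x genes_b correlations
            ii, jj = np.where(np.abs(corr) > threshold)
            return [(genes_a[i], genes_b[j], corr[i, j]) for i, j in zip(ii, jj)]

        # Toy data: 5 epithelial genes, 4 carcinoma genes, 30 matched samples
        rng = np.random.default_rng(1)
        ea, eb = rng.normal(size=(5, 30)), rng.normal(size=(4, 30))
        links = cross_type_links(ea, eb, [f"epi{i}" for i in range(5)],
                                 [f"car{j}" for j in range(4)])
        print(len(links), "candidate cross-talk pairs")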

  8. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Directory of Open Access Journals (Sweden)

    Victor Trevino

    2016-04-01

    Full Text Available The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments have had a profound impact on the biological sciences, since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in-depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell

  9. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  10. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Full Text Available Abstract Background Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  11. Classification of methods of conducting computer forensic examinations using a graph theory approach

    Directory of Open Access Journals (Sweden)

    Anna Ravilyevna Smolina

    2016-06-01

    Full Text Available Classification of methods of conducting computer forensic examinations using a graph theory approach is proposed. Using this classification, it is possible to accelerate and simplify the search for a suitable method of conducting a computer forensic examination, and to automate the process.

  12. Computational approaches to screen candidate ligands with anti- Parkinson's activity using R programming.

    Science.gov (United States)

    Jayadeepa, R M; Niveditha, M S

    2012-01-01

    It is estimated that by 2050 over 100 million people will be affected by Parkinson's disease (PD). We propose various computational approaches to screen suitable candidate ligands with anti-Parkinson's activity from phytochemicals. Five different types of dopamine receptors have been identified in the brain, D1-D5. Dopamine receptor D3 was selected as the target receptor. The D3 receptor exists in areas of the brain outside the basal ganglia, such as the limbic system, and thus may play a role in the cognitive and emotional changes noted in Parkinson's disease. A ligand library of 100 molecules with anti-Parkinson's activity was collected through a literature survey. Nature is the best combinatorial chemist and possibly has answers to all diseases of mankind. The failure of some synthetic drugs and their side effects have prompted many researchers to go back to ancient healing methods which use herbal medicines to give relief. Hence, the candidate ligands with anti-Parkinson's activity were selected from herbal sources through a literature survey. Lipinski rules were applied to screen suitable molecules for the study; the resulting 88 molecules were energy minimized and subjected to docking using AutoDock Vina. The top eleven molecules were screened according to the docking score generated by AutoDock Vina. The commercial drug Ropinirole was docked similarly and its score compared with those of the 11 phytochemicals, and the screened molecules were subjected to toxicity analysis to verify the toxic properties of the phytochemicals. R programming was applied to remove bias from the top eleven molecules. Using cluster analysis and a confusion matrix, two phytochemicals were computationally selected, namely rosmarinic acid and ginkgolide A, for further studies on Parkinson's disease.
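
    The Lipinski screening step translates directly into code. A sketch with RDKit (assuming it is installed); the thresholds are the standard rule-of-five limits, and the SMILES string is a non-stereo rendering of rosmarinic acid, one of the two selected phytochemicals.

        from rdkit import Chem
        from rdkit.Chem import Descriptors, Lipinski

        def passes_lipinski(smiles: str) -> bool:
            """Lipinski rule-of-five screen: MW <= 500, logP <= 5,
            H-bond donors <= 5, H-bond acceptors <= 10."""
            mol = Chem.MolFromSmiles(smiles)
            if mol is None:
                return False
            return (Descriptors.MolWt(mol) <= 500
                    and Descriptors.MolLogP(mol) <= 5
                    and Lipinski.NumHDonors(mol) <= 5
                    and Lipinski.NumHAcceptors(mol) <= 10)

        rosmarinic = "OC(=O)C(Cc1ccc(O)c(O)c1)OC(=O)/C=C/c1ccc(O)c(O)c1"
        print(passes_lipinski(rosmarinic))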

  13. Soft computing approach to 3D lung nodule segmentation in CT.

    Science.gov (United States)

    Badura, P; Pietka, E

    2014-10-01

    This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm - mask generation. Its main goal is to process some specific types of nodules connected to the pleura or vessels. It consists of some basic image processing operations as well as dedicated routines for the specific cases of nodules. The evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release - the LIDC-IDRI (Image Database Resource Initiative) database. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Computational approaches in the design of synthetic receptors - A review.

    Science.gov (United States)

    Cowen, Todd; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as "plastic antibodies" - high affinity robust synthetic receptors which can be optimally designed and produced at a much lower cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of the monomer(s)-template ratio and selectivity analysis. In this review the various computational methods are discussed with reference to all the relevant literature published since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller-Plesset, etc.) or DFT) and the purpose for which it was used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of the MIP polymerisation reaction. Further advances in molecular modelling and computational design of synthetic receptors in particular will have a serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    Science.gov (United States)

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  16. Solvent effect on indocyanine dyes: A computational approach

    International Nuclear Information System (INIS)

    Bertolino, Chiara A.; Ferrari, Anna M.; Barolo, Claudia; Viscardi, Guido; Caputo, Giuseppe; Coluccia, Salvatore

    2006-01-01

    The solvatochromic behaviour of a series of indocyanine dyes (Dyes I-VIII) was investigated by quantum chemical calculations. The effect of the polymethine chain length and of the indolenine structure has been satisfactorily reproduced by semiempirical Pariser-Parr-Pople (PPP) calculations. The solvatochromism of 3,3,3',3'-tetramethyl-N,N'-diethylindocarbocyanine iodide (Dye I) has been investigated in depth within the ab initio time-dependent density functional theory (TD-DFT) approach. Dye I undergoes non-polar solvation, and a linear correlation has been identified between absorption shifts and refractive index. Computed absorption λ max values and oscillator strengths obtained by TD-DFT are in good agreement with the experimental data

  17. Identifying technology innovations for marginalized smallholders-A conceptual approach.

    Science.gov (United States)

    Malek, Mohammad Abdul; Gatzweiler, Franz W; Von Braun, Joachim

    2017-05-01

    This paper contributes to the existing literature by providing a theoretical and conceptual background for identifying the idle potential of marginal rural areas and people by means of technological and institutional innovations. The approach follows an ex-ante assessment for identifying suitable technology and institutional innovations for marginalized smallholders in marginal areas, divided into three main parts (mapping, surveying and evaluating) and several steps. Finally, it contributes to the inclusion of marginalized smallholders through an improved way of understanding the interactions between technology needs, farming systems, ecological resources and poverty characteristics in the different segments of the poor, and by linking these insights with productivity-enhancing technologies.

  18. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    International Nuclear Information System (INIS)

    Khan, Junaid Ali; Raja, Muhammad Asif Zahoor; Qureshi, Ijaz Mansoor

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by the pattern search technique. The applicability of this approach ranges from single order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike with other numerical techniques, and with comparable accuracy. With the advent of neuroprocessors and digital signal processors the method becomes particularly interesting due to the expected substantial gains in execution speed. (general)
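
    A minimal version of this scheme for a single first-order problem, y' = -y with y(0) = 1: a one-hidden-layer network defines the trial solution, the unsupervised error is the squared ODE residual on a grid, and a derivative-free direction-set search (Powell) stands in for the paper's GA plus pattern-search hybrid. Architecture and settings are illustrative.

        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.0, 1.0, 20)
        H = 5                                     # hidden units

        def trial(w, t):
            """Trial solution y(t) = 1 + t*N(t; w) satisfies y(0) = 1 exactly."""
            w1, b1, w2 = w[:H], w[H:2*H], w[2*H:]
            z = np.tanh(np.outer(t, w1) + b1)     # hidden layer
            n = z @ w2                            # network output N(t)
            dn = ((1 - z**2) * w1) @ w2           # analytic dN/dt
            return 1 + t * n, n + t * dn          # y, dy/dt

        def residual(w):
            """Unsupervised error: squared residual of y' + y = 0 on the grid."""
            y, dy = trial(w, t)
            return np.sum((dy + y) ** 2)

        rng = np.random.default_rng(0)
        res = minimize(residual, rng.normal(scale=0.5, size=3*H),
                       method="Powell", options={"maxiter": 5000})
        y, _ = trial(res.x, t)
        print("max error vs exp(-t):", np.max(np.abs(y - np.exp(-t))))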

  19. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics) to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
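
    The simplest integration approach the review mentions — statistical correlation analysis — can be sketched by computing Spearman correlations between taxon abundances and metabolite levels over matched samples; the data here are synthetic with one planted association.

        import numpy as np
        from scipy.stats import spearmanr

        # Synthetic data: 8 taxa and 6 metabolites measured in 40 samples
        rng = np.random.default_rng(7)
        taxa = rng.lognormal(size=(40, 8))
        metab = rng.lognormal(size=(40, 6))
        metab[:, 0] += 0.5 * taxa[:, 2]      # plant one real association

        # Spearman correlation of every taxon against every metabolite
        rho, p = spearmanr(taxa, metab)
        cross = rho[:8, 8:]                  # taxa x metabolites block
        i, j = np.unravel_index(np.abs(cross).argmax(), cross.shape)
        print(f"strongest link: taxon {i} ~ metabolite {j}, rho = {cross[i, j]:.2f}")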

  20. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  1. Computer simulation of HTGR fuel microspheres using a Monte-Carlo statistical approach

    International Nuclear Information System (INIS)

    Hedrick, C.E.

    1976-01-01

    The concept and computational aspects of a Monte-Carlo statistical approach to relating the structure of HTGR fuel microspheres to the uranium content of fuel samples have been verified. Results of the preliminary validation tests and the benefits to be derived from the program are summarized
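
    The abstract gives no model details, so the following is only a generic Monte-Carlo sketch of the idea: sample microsphere kernel geometry from an assumed distribution and propagate it to a uranium-content estimate. All dimensions, densities and fractions are placeholders.

        import numpy as np

        # Monte-Carlo estimate of uranium mass per fuel microsphere from
        # sampled kernel geometry. Values are illustrative placeholders.
        rng = np.random.default_rng(42)
        N = 100_000
        kernel_d = rng.normal(350e-6, 10e-6, N)        # kernel diameter (m)
        rho_kernel = 10.5e3                            # kernel density (kg/m^3), assumed
        u_fraction = 0.9                               # uranium mass fraction, assumed

        kernel_vol = np.pi / 6 * kernel_d**3
        u_mass = rho_kernel * kernel_vol * u_fraction  # uranium mass per sphere (kg)
        print(f"mean U per microsphere: {u_mass.mean()*1e6:.3f} mg "
              f"(sd {u_mass.std()*1e6:.3f} mg)")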

  2. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.

  3. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Sonia Yassa

    2013-01-01

    Full Text Available We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
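
    The DVFS compromise described above can be made concrete with the usual dynamic-power relation P ≈ C·V²·f: lowering the voltage/frequency level cuts energy (which scales with V² per cycle) while stretching execution time. The voltage/frequency pairs and capacitance below are illustrative.

        # DVFS trade-off sketch: dynamic power scales roughly with C * V^2 * f,
        # so a lower voltage/frequency pair saves energy but lengthens the
        # schedule. Levels and capacitance are illustrative.
        LEVELS = [(1.2, 2.0e9), (1.0, 1.6e9), (0.8, 1.2e9)]  # (V, Hz)
        C_EFF = 1.0e-9  # effective switched capacitance (F), assumed

        def energy(cycles, volt, freq):
            """Energy (J) and time (s) to run `cycles` CPU cycles at a level."""
            power = C_EFF * volt**2 * freq      # dynamic power (W)
            time = cycles / freq                # execution time (s)
            return power * time, time

        for v, f in LEVELS:
            e, t = energy(4.0e9, v, f)
            print(f"V={v:.1f} V, f={f/1e9:.1f} GHz -> {e:.2f} J in {t:.2f} s")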

  4. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Science.gov (United States)

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  5. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work concerning the development of a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of the software was developed for the identification and analysis of logistics costs in agricultural enterprises.

  6. On the sighting of unicorns: A variational approach to computing invariant sets in dynamical systems

    Science.gov (United States)

    Junge, Oliver; Kevrekidis, Ioannis G.

    2017-06-01

    We propose to compute approximations to invariant sets in dynamical systems by minimizing an appropriate distance between a suitably selected finite set of points and its image under the dynamics. We demonstrate, through computational experiments, that this approach can successfully converge to approximations of (maximal) invariant sets of arbitrary topology, dimension, and stability, such as saddle-type invariant sets with complicated dynamics. We further propose to extend this approach by adding a Lennard-Jones type potential term to the objective function, which yields more evenly distributed approximating finite point sets, and we illustrate the procedure through corresponding numerical experiments.
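
    A small sketch of the variational idea on the Hénon map: choose the point set to minimize the distance between each point's image and the set, plus a Lennard-Jones-style repulsion (only the repulsive term here) to spread the points. The map, set size, weights and the generic quasi-Newton optimizer are illustrative stand-ins for the authors' procedure.

        import numpy as np
        from scipy.optimize import minimize

        def henon(p, a=1.4, b=0.3):
            """Hénon map applied to an (n, 2) array of points."""
            x, y = p[:, 0], p[:, 1]
            return np.column_stack([1 - a * x**2 + y, b * x])

        def objective(flat, n, lj_weight=1e-4):
            p = flat.reshape(n, 2)
            img = henon(p)
            # distance from each image point to its nearest point in the set
            d2 = ((img[:, None, :] - p[None, :, :]) ** 2).sum(-1)
            match = d2.min(axis=1).sum()
            # repulsive r^-6 term to distribute the points more evenly
            pd2 = ((p[:, None, :] - p[None, :, :]) ** 2).sum(-1)
            pd2[np.diag_indices(n)] = np.inf
            lj = (1.0 / pd2**3).sum()
            return match + lj_weight * lj

        n = 50
        rng = np.random.default_rng(3)
        res = minimize(objective, rng.uniform(-1, 1, size=2 * n), args=(n,),
                       method="L-BFGS-B", options={"maxiter": 2000})
        print("residual:", res.fun)   # small residual -> points near an invariant set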

  7. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.

  8. Identifying Mother-Child Interaction Styles Using a Person-Centered Approach.

    Science.gov (United States)

    Nelson, Jackie A; O'Brien, Marion; Grimm, Kevin J; Leerkes, Esther M

    2014-05-01

    Parent-child conflict in the context of a supportive relationship has been discussed as a potentially constructive interaction pattern; the current study is the first to test this using a holistic analytic approach. Interaction styles, defined as mother-child conflict in the context of maternal sensitivity, were identified and described with demographic and stress-related characteristics of families. Longitudinal associations were tested between interaction styles and children's later social competence. Participants included 814 partnered mothers with a first-grade child. Latent profile analysis identified agreeable, dynamic, and disconnected interaction styles. Mothers' intimacy with a partner, depressive symptoms, and authoritarian childrearing beliefs, along with children's later conflict with a best friend and externalizing problems, were associated with group membership. Notably, the dynamic style, characterized by high sensitivity and high conflict, included families who experienced psychological and relational stressors. Findings are discussed with regard to how family stressors shape parent-child interaction patterns.

  9. Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study

    Science.gov (United States)

    Elbarbary, Rafik Said

    2015-01-01

    This study is based on the integrated approach as a conceptual framework to identify, categorize, and rank the key components of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used a four-round Delphi technique to determine the importance rating of each component of the core competencies…

  10. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made. © The Author(s) 2014.

  11. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  12. An Efficient Approach for Fast and Accurate Voltage Stability Margin Computation in Large Power Grids

    Directory of Open Access Journals (Sweden)

    Heng-Yi Su

    2016-11-01

    Full Text Available This paper proposes an efficient approach for the computation of the voltage stability margin (VSM) in a large-scale power grid. The objective is to accurately and rapidly determine the load power margin which corresponds to voltage collapse phenomena. The proposed approach is based on the impedance match-based technique and the model-based technique. It combines the Thevenin equivalent (TE) network method with the cubic spline extrapolation technique and the continuation technique to achieve fast and accurate VSM computation for a bulk power grid. Moreover, the generator Q limits are taken into account for practical applications. Extensive case studies carried out on Institute of Electrical and Electronics Engineers (IEEE) benchmark systems and the Taiwan Power Company (Taipower, Taipei, Taiwan) system are used to demonstrate the effectiveness of the proposed approach.

  13. A Social Network Approach to Provisioning and Management of Cloud Computing Services for Enterprises

    DEFF Research Database (Denmark)

    Kuada, Eric; Olesen, Henning

    2011-01-01

    This paper proposes a social network approach to the provisioning and management of cloud computing services termed Opportunistic Cloud Computing Services (OCCS) for enterprises, and presents the research issues that need to be addressed for its implementation. We hypothesise that OCCS...... will facilitate the adoption process of cloud computing services by enterprises. OCCS deals with the concept of enterprises taking advantage of cloud computing services to meet their business needs without having to pay, or paying only a minimal fee, for the services. The OCCS network will be modelled and implemented...... as a social network of enterprises collaborating strategically for the provisioning and consumption of cloud computing services without entering into any business agreements. We conclude that it is possible to configure current cloud service technologies and management tools for OCCS but there is a need...

  14. Radiological management of blunt polytrauma with computed tomography and angiography: an integrated approach

    Energy Technology Data Exchange (ETDEWEB)

    Kurdziel, J.C.; Dondelinger, R.F.; Hemmer, M.

    1987-01-01

    107 polytraumatized patients who had experienced blunt trauma were worked up at admission with computed tomography of the thorax, abdomen and pelvis, following a computed tomography study of the brain: significant lesions were revealed in 98 (90%) patients. 79 (74%) patients showed trauma to the thorax; in 69 (64%) patients abdominal or pelvic trauma was evidenced. No false positive diagnosis was established. 5 traumatic findings were missed. Emergency angiography was indicated in 3 (3%) patients following the computed tomography examination. 3 other trauma patients were submitted directly to angiography without a computed tomography examination during the period in which this study was completed. Embolization was carried out in 5/6 patients. No thoracotomy was needed. 13 (12%) patients underwent laparotomy following computed tomography. Overall mortality during the hospital stay was 14% (15/107). No patient died from visceral bleeding. Conservative management of blunt polytrauma patients can be advocated in almost 90% of visceral lesions. Computed tomography coupled with angiography and embolization represents an adequate integrated approach to the management of blunt polytrauma patients.

  15. Radiological management of blunt polytrauma with computed tomography and angiography: an integrated approach

    International Nuclear Information System (INIS)

    Kurdziel, J.C.; Dondelinger, R.F.; Hemmer, M.

    1987-01-01

    107 polytraumatized patients who had experienced blunt trauma were worked up at admission with computed tomography of the thorax, abdomen and pelvis, following a computed tomography study of the brain: significant lesions were revealed in 98 (90%) patients. 79 (74%) patients showed trauma to the thorax; in 69 (64%) patients abdominal or pelvic trauma was evidenced. No false positive diagnosis was established. 5 traumatic findings were missed. Emergency angiography was indicated in 3 (3%) patients following the computed tomography examination. 3 other trauma patients were submitted directly to angiography without a computed tomography examination during the period in which this study was completed. Embolization was carried out in 5/6 patients. No thoracotomy was needed. 13 (12%) patients underwent laparotomy following computed tomography. Overall mortality during the hospital stay was 14% (15/107). No patient died from visceral bleeding. Conservative management of blunt polytrauma patients can be advocated in almost 90% of visceral lesions. Computed tomography coupled with angiography and embolization represents an adequate integrated approach to the management of blunt polytrauma patients.

  16. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them all simultaneously due to limited resources. A feasible way is to identify the central and influential indicators and improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, by integrating the evidential reasoning approach and interval 2-tuple linguistic variables, the various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors" and "number of operations/procedures" are significantly influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in the performance evaluation of a healthcare organization. The proposed decision making approach can serve as a reference for healthcare administrators seeking to enhance the performance of their healthcare institutions.
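
    The classical DEMATEL computation at the core of this approach (without the paper's evidential-reasoning and interval 2-tuple extensions) is compact: normalize the direct-relation matrix, form the total-relation matrix T = N(I - N)^-1, and read prominence and relation off its row and column sums. The 4x4 matrix below is illustrative.

        import numpy as np

        def dematel(A):
            """Classical DEMATEL: A is the direct-relation matrix (influence
            of indicator i on j). Returns prominence (D+R) and relation (D-R)."""
            n = A.shape[0]
            N = A / A.sum(axis=1).max()          # normalize by largest row sum
            T = N @ np.linalg.inv(np.eye(n) - N) # total-relation matrix
            D = T.sum(axis=1)                    # influence dispatched
            R = T.sum(axis=0)                    # influence received
            return D + R, D - R

        # Illustrative 4-indicator direct-relation matrix (0-4 influence scores)
        A = np.array([[0, 3, 2, 1],
                      [1, 0, 3, 2],
                      [2, 1, 0, 3],
                      [1, 2, 1, 0]], dtype=float)
        prominence, relation = dematel(A)
        print("prominence (D+R):", prominence.round(2))
        print("relation   (D-R):", relation.round(2))  # > 0: net cause indicator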

  17. Infinitesimal symmetries: a computational approach

    International Nuclear Information System (INIS)

    Kersten, P.H.M.

    1985-01-01

    This thesis is concerned with computational aspects of the determination of infinitesimal symmetries and Lie-Baecklund transformations of differential equations; moreover, some problems are calculated explicitly. A brief introduction to the concepts in the theory of symmetries and Lie-Baecklund transformations relevant for this thesis is given, and the mathematical formalism is briefly reviewed. The jet bundle formulation is chosen, in which, by its algebraic nature, objects can be described very precisely; consequently it is appropriate for implementation. A number of procedures are discussed which make it possible to carry out these computations, which are very extensive in practice, with the help of a computer. The Lie algebras of infinitesimal symmetries of a number of differential equations in Mathematical Physics are established and some of their applications are discussed, namely for the Maxwell equations, the nonlinear diffusion equation, the nonlinear Schroedinger equation, the nonlinear Dirac equations and the self-dual SU(2) Yang-Mills equations. Lie-Baecklund transformations of Burgers' equation, the Classical Boussinesq equation and the Massive Thirring Model are determined. Furthermore, nonlocal Lie-Baecklund transformations of the last equation are derived. (orig.)

  18. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Science.gov (United States)

    Vostokin, Sergei; Artamonov, Yuriy; Tsarev, Daniil

    2018-03-01

    This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of "on-demand" access; (b) source code deployment management; (c) high-performance computing programs development automation. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  19. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
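
    The non-domination test that defines the frontier is straightforward; a quadratic-time sketch over goodness-of-fit vectors (lower = better), with random scores standing in for real calibration-target fits:

        import numpy as np

        def pareto_frontier(gof):
            """Return a mask of input sets on the Pareto frontier. `gof` is an
            (n_inputs, n_targets) array of goodness-of-fit values, lower = better.
            A set is on the frontier if no other set fits all targets as well
            or better and at least one target strictly better."""
            n = gof.shape[0]
            on_front = np.ones(n, dtype=bool)
            for i in range(n):
                dominated_by = ((gof <= gof[i]).all(axis=1) &
                                (gof < gof[i]).any(axis=1))
                dominated_by[i] = False
                if dominated_by.any():
                    on_front[i] = False
            return on_front

        # 1000 candidate input sets scored against 3 calibration targets
        rng = np.random.default_rng(5)
        scores = rng.random((1000, 3))
        mask = pareto_frontier(scores)
        print(mask.sum(), "input sets on the Pareto frontier")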

  20. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in an unstructured format, so it is quite a difficult task for a computer forensic examiner to perform such an analysis. Performing forensic analysis of documents within a limited period of time therefore requires a special approach, such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link and average link, in accordance...
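
    Of the algorithms the survey lists, K-means is the most commonly applied; a minimal sketch with scikit-learn on toy stand-ins for seized documents (texts and settings are illustrative):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans

        # Toy stand-ins for seized documents in a forensic examination
        docs = [
            "invoice payment transfer bank account",
            "bank account wire transfer invoice",
            "meeting schedule project deadline report",
            "project report deadline meeting notes",
        ]

        # K-means on TF-IDF vectors, the baseline the survey mentions
        X = TfidfVectorizer().fit_transform(docs)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
        print(km.labels_)   # e.g. [0 0 1 1]: financial vs. project documents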

  1. A Stochastic Approach for Blurred Image Restoration and Optical Flow Computation on Field Image Sequence

    Institute of Scientific and Technical Information of China (English)

    高文; 陈熙霖

    1997-01-01

    The blur in target images caused by camera vibration due to robot motion or hand shaking, and by object(s) moving in the background scene, is difficult to deal with in a computer vision system. In this paper, the authors study the relation model between motion and blur in the case of object motion existing in a video image sequence, and work out a practical computation algorithm for both motion analysis and blurred image restoration. Combining the general optical flow and stochastic processes, the paper presents an approach by which the motion velocity can be calculated from blurred images. On the other hand, the blurred image can also be restored using the obtained motion information. To address the small-motion limitation of the general optical flow computation, a multiresolution optical flow algorithm based on MAP estimation is proposed. For restoring the blurred image, an iterative algorithm and the obtained motion velocity are used. The experiments show that the proposed approach works well for both motion velocity computation and blurred image restoration.

  2. Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach

    Science.gov (United States)

    Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.

    2017-01-01

    The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area the following year after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat

  3. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  4. Computer science approach to quantum control

    International Nuclear Information System (INIS)

    Janzing, D.

    2006-01-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires the heat to be processed in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  5. Vehicle systems and payload requirements evaluation. [computer programs for identifying launch vehicle system requirements

    Science.gov (United States)

    Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.

    1975-01-01

    Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on development of computer programs and investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program - 1, which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors, is described along with the Interactive Graphic Orbit Selection program, which allows the user to select orbits that satisfy mission requirements and to evaluate the necessary injection accuracy.
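
    The core computation in such a linear error analysis is covariance propagation through a sensitivity matrix. The sketch below is a hypothetical illustration of that step; the Jacobian and dispersion values are invented, and this is not the original EOMP-1 code:

    ```python
    import numpy as np

    # Linear error analysis: propagate input dispersion covariance P_in
    # through the Jacobian J of the trajectory equations: P_out = J P_in J^T.
    J = np.array([[1.0, 0.2, 0.0],    # sensitivities of the orbit-insertion
                  [0.0, 1.1, 0.3],    # state to vehicle/navigation dispersions
                  [0.1, 0.0, 0.9]])   # (assumed values for illustration)

    P_in = np.diag([0.5**2, 0.3**2, 0.1**2])   # dispersion variances (assumed)

    P_out = J @ P_in @ J.T                     # propagated covariance
    print(np.sqrt(np.diag(P_out)))             # 1-sigma injection errors
    ```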

  6. Low rank approach to computing first and higher order derivatives using automatic differentiation

    International Nuclear Information System (INIS)

    Reed, J. A.; Abdel-Khalik, H. S.; Utke, J.

    2012-01-01

    This manuscript outlines a new approach for increasing the efficiency of applying automatic differentiation (AD) to large-scale computational models. By using the principles of the Efficient Subspace Method (ESM), low-rank approximations of the derivatives for first and higher orders can be calculated with minimal computational resources. The output obtained from nuclear reactor calculations typically has a numerical rank much smaller than the number of inputs and outputs. This rank deficiency can be exploited to reduce the number of derivatives that need to be calculated using AD. The effective rank can be determined according to ESM by computing derivatives with AD at random inputs. Reduced or pseudo variables are then defined and new derivatives are calculated with respect to the pseudo variables. Two different AD packages are used: OpenAD and Rapsodia. OpenAD is used to determine the effective rank and the subspace that contains the derivatives. Rapsodia is then used to calculate derivatives with respect to the pseudo variables for the desired order. The overall approach is applied to two simple problems and to MATWS, a safety code for sodium-cooled reactors. (authors)
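
    The rank-revealing step of ESM can be sketched in a few lines: stack Jacobians evaluated at random inputs, take an SVD, and read off the effective subspace. The toy model and the use of finite differences in place of OpenAD/Rapsodia are assumptions made purely for self-containment:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_in = 5

    def model(x):                        # toy response; depends only on x.sum(),
        s = x.sum()                      # so its Jacobian has effective rank 1
        return np.array([np.sin(s), s**2, np.exp(-s)])

    def jacobian_fd(x, h=1e-6):          # finite differences stand in for AD
        f0 = model(x)
        return np.column_stack([(model(x + h * e) - f0) / h
                                for e in np.eye(len(x))])

    # Step 1: evaluate derivatives at a few random inputs and stack them.
    stack = np.vstack([jacobian_fd(rng.standard_normal(n_in)) for _ in range(4)])

    # Step 2: the numerical rank of the stack reveals the input subspace;
    # its leading right singular vectors define the pseudo variables.
    _, s, vt = np.linalg.svd(stack, full_matrices=False)
    rank = int((s > 1e-4 * s[0]).sum())
    pseudo_basis = vt[:rank]             # differentiate only w.r.t. these
    print("effective rank:", rank)       # -> 1 for this toy model
    ```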

  7. A tripartite approach identifies the major sunflower seed albumins.

    Science.gov (United States)

    Jayasena, Achala S; Franke, Bastian; Rosengren, Johan; Mylne, Joshua S

    2016-03-01

    We have used a combination of genomic, transcriptomic, and proteomic approaches to identify the napin-type albumin genes in sunflower and define their contributions to the seed albumin pool. Seed protein content is determined by the expression of what are typically large gene families. A major class of seed storage proteins is the napin-type, water soluble albumins. In this work we provide a comprehensive analysis of the napin-type albumin content of the common sunflower (Helianthus annuus) by analyzing a draft genome, a transcriptome and performing a proteomic analysis of the seed albumin fraction. We show that although sunflower contains at least 26 genes for napin-type albumins, only 15 of these are present at the mRNA level. We found protein evidence for 11 of these but the albumin content of mature seeds is dominated by the encoded products of just three genes. So despite high genetic redundancy for albumins, only a small sub-set of this gene family contributes to total seed albumin content. The three genes identified as producing the majority of sunflower seed albumin are potential future candidates for manipulation through genetics and breeding.

  8. A multi-indicator approach for identifying shoreline sewage pollution hotspots adjacent to coral reefs.

    Science.gov (United States)

    Abaya, Leilani M; Wiegner, Tracy N; Colbert, Steven L; Beets, James P; Carlson, Kaile'a M; Kramer, K Lindsey; Most, Rebecca; Couch, Courtney S

    2018-04-01

    Sewage pollution is contributing to the global decline of coral reefs. Identifying locations where it is entering waters near reefs is therefore a management priority. Our study documented shoreline sewage pollution hotspots in a coastal community with a fringing coral reef (Puakō, Hawai'i) using dye tracer studies, sewage indicator measurements, and a pollution scoring tool. Sewage reached shoreline waters within 9 h to 3 d. Fecal indicator bacteria concentrations were high and variable, and macroalgal δ15N values were indicative of sewage at many stations. Shoreline nutrient concentrations were two times higher than those in upland groundwater. Pollution hotspots were identified with a scoring tool using three sewage indicators. It confirmed known locations of sewage pollution from dye tracer studies. Our study highlights the need for a multi-indicator approach and scoring tool to identify sewage pollution hotspots. This approach will be useful for other coastal communities grappling with sewage pollution. Copyright © 2018 Elsevier Ltd. All rights reserved.
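
    A multi-indicator scoring tool of this kind reduces to a simple rule combining the indicators. The sketch below is hypothetical; the thresholds are invented and the study's actual cutoffs and weighting differ:

    ```python
    # Hypothetical three-indicator pollution scoring tool (illustrative only).
    def sewage_score(fib, d15n, nutrients,
                     fib_cut=100.0, d15n_cut=9.0, nut_cut=2.0):
        """Score a shoreline station: one point per indicator above threshold."""
        score = sum([fib > fib_cut,         # fecal indicator bacteria (CFU/100 mL)
                     d15n > d15n_cut,       # macroalgal delta-15N (per mil)
                     nutrients > nut_cut])  # nutrient enrichment vs. groundwater
        return score                        # 3 = likely hotspot, 0 = clean

    print(sewage_score(fib=350.0, d15n=11.2, nutrients=2.6))  # -> 3
    ```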

  9. Elucidating Ligand-Modulated Conformational Landscape of GPCRs Using Cloud-Computing Approaches.

    Science.gov (United States)

    Shukla, Diwakar; Lawrenz, Morgan; Pande, Vijay S

    2015-01-01

    G-protein-coupled receptors (GPCRs) are a versatile family of membrane-bound signaling proteins. Despite the recent successes in obtaining crystal structures of GPCRs, much needs to be learned about the conformational changes associated with their activation. Furthermore, the mechanism by which ligands modulate the activation of GPCRs has remained elusive. Molecular simulations provide a way of obtaining a detailed atomistic description of GPCR activation dynamics. However, simulating GPCR activation is challenging due to the long timescales involved and the associated challenge of gaining insights from the "Big" simulation datasets. Here, we demonstrate how cloud-computing approaches have been used to tackle these challenges and obtain insights into the activation mechanism of GPCRs. In particular, we review the use of Markov state model (MSM)-based sampling algorithms for sampling milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2-AR. MSMs of agonist- and inverse-agonist-bound β2-AR reveal multiple activation pathways and how ligands function via modulation of the ensemble of activation pathways. We target this ensemble of conformations with computer-aided drug design approaches, with the goal of designing drugs that interact more closely with diverse receptor states, for overall increased efficacy and specificity. We conclude by discussing how cloud-based approaches present a powerful and broadly available tool for routinely studying complex biological systems. © 2015 Elsevier Inc. All rights reserved.
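
    The MSM estimation step itself is compact: count transitions between discretized conformational states at a chosen lag time and row-normalize. The sketch below assumes a toy discretized trajectory; production studies use dedicated packages and much richer featurization:

    ```python
    import numpy as np

    # Minimal MSM sketch: estimate a transition matrix from a discretized
    # trajectory (one state label per frame), at a lag time given in frames.
    def estimate_msm(dtraj, n_states, lag=10):
        counts = np.zeros((n_states, n_states))
        for i, j in zip(dtraj[:-lag], dtraj[lag:]):
            counts[i, j] += 1
        # row-normalize counts into transition probabilities
        return counts / counts.sum(axis=1, keepdims=True)

    dtraj = np.random.default_rng(1).integers(0, 3, size=10_000)  # toy trajectory
    T = estimate_msm(dtraj, n_states=3)
    print(T.round(2))   # slow modes, pathways etc. follow from eig(T)
    ```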

  10. Screening and syndromic approaches to identify gonorrhea and chlamydial infection among women.

    Science.gov (United States)

    Sloan, N L; Winikoff, B; Haberland, N; Coggins, C; Elias, C

    2000-03-01

    The standard diagnostic tools to identify sexually transmitted infections are often expensive and have laboratory and infrastructure requirements that make them unavailable to family planning and primary health-care clinics in developing countries. Therefore, inexpensive, accessible tools that rely on symptoms, signs, and/or risk factors have been developed to identify and treat reproductive tract infections without the need for laboratory diagnostics. Studies were reviewed that used standard diagnostic tests to identify gonorrhea and cervical chlamydial infection among women and that provided adequate information about the usefulness of the tools for screening. Aggregation of the studies' results suggest that risk factors, algorithms, and risk scoring for syndromic management are poor indicators of gonorrhea and chlamydial infection in samples of both low and high prevalence and, consequently, are not effective mechanisms with which to identify or manage these conditions. The development and evaluation of other approaches to identify gonorrhea and chlamydial infections, including inexpensive and simple laboratory screening tools, periodic universal treatment, and other alternatives must be given priority.

  11. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Science.gov (United States)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
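
    The pruning principle behind such indexes can be shown with a single pivot: precomputed pivot distances give the lower bound |d(x,p) - d(q,p)| <= d(x,q), so most candidates can be skipped without an on-line distance computation. This one-pivot sketch is illustrative, not the paper's exact randomized scheme:

    ```python
    # Triangle-inequality pruning with one pivot (illustrative sketch).
    def nn_search(query, database, dist, pivot_dists, pivot):
        dq_p = dist(query, pivot)                  # one on-line distance
        best, best_d = None, float("inf")
        # visit candidates ordered by how close their pivot distance is to dq_p
        order = sorted(range(len(database)),
                       key=lambda i: abs(pivot_dists[i] - dq_p))
        for i in order:
            if abs(pivot_dists[i] - dq_p) >= best_d:
                break                              # lower bound prunes the rest
            d = dist(query, database[i])
            if d < best_d:
                best, best_d = database[i], d
        return best, best_d

    # toy usage with 1-D points and absolute distance
    db = [3.0, 10.0, 14.0, 27.0]
    pivot = 0.0
    pd = [abs(x - pivot) for x in db]              # precomputed off-line
    print(nn_search(11.0, db, lambda a, b: abs(a - b), pd, pivot))  # (10.0, 1.0)
    ```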

  12. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-service (SaaS), Platform-as-a-service (PaaS) and Infrastructure-as-a-service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with the SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC’s FP7 project is reported and an associated blueprint prototype implementation is presented.

  13. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar Sümbül

    2014-11-01

    Full Text Available The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.

  14. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought ... that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast ... to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should

  15. The Effect of Round Window vs Cochleostomy Surgical Approaches on Cochlear Implant Electrode Position: A Flat-Panel Computed Tomography Study.

    Science.gov (United States)

    Jiam, Nicole T; Jiradejvong, Patpong; Pearl, Monica S; Limb, Charles J

    2016-09-01

    The round window insertion (RWI) and cochleostomy approaches are the 2 most common surgical techniques used in cochlear implantation (CI). However, there is no consensus on which approach is ideal for electrode array insertion, in part because visualization of intracochlear electrode position is challenging, so postoperative assessment of intracochlear electrode contact is lacking. To measure and compare electrode array position between RWI and cochleostomy approaches for CI insertion. Retrospective case-comparison study of 17 CI users with Med-El standard-length electrode arrays who underwent flat-panel computed tomography scans after CI surgery at a tertiary referral center. The data was analyzed in October 2015. Flat-panel computed tomography scans were collected between January 1 and August 31, 2013, for 22 electrode arrays. The surgical technique was identified by a combination of operative notes and imaging. Eight cochleae underwent RWI and 14 cochleae underwent cochleostomy approaches anterior and inferior to the round window. Interscalar electrode position and electrode centroid distance to the osseous spiral lamina, lateral bony wall, and central axis of the modiolus. Nine participants were men, and 8, women; the mean age was 54.4 (range, 21-64) years. Electrode position was significantly closer to cochlear neural elements with RWI than cochleostomy approaches. Between the 2 surgical approaches, the RWI technique produced shorter distances between the electrode and the modiolus (mean difference, -0.33 [95% CI, -0.29 to -0.39] mm in the apical electrode; -1.42 [95% CI, -1.24 to -1.57] mm in the basal electrode). This difference, which was most prominent in the first third and latter third of the basal turn, decreased after the basal turn. The RWI approach was associated with an increased likelihood of perimodiolar placement. Opting to use RWI over cochleostomy approaches in CI candidates may position electrodes closer to cochlear neural substrates and

  16. Results of computer assisted mini-incision subvastus approach for total knee arthroplasty.

    Science.gov (United States)

    Turajane, Thana; Larbpaiboonpong, Viroj; Kongtharvonskul, Jatupon; Maungsiri, Samart

    2009-12-01

    The mini-incision subvastus approach preserves the soft tissue of the knee. Its advantages include reduced blood loss, reduced pain, self rehabilitation and faster recovery. However, whether its visualization, component alignment, and blood preservation are sufficient to achieve better outcomes and prevent early failure of the Total Knee Arthroplasty (TKA) has been debatable. Computer navigation has been introduced to improve alignment and blood loss. The purpose of this study was to evaluate the short-term outcomes of the combination of computer-assisted mini-incision subvastus approach for Total Knee Arthroplasty (CMS-TKA). A prospective case series of the initial 80 patients who underwent CMS-TKA from January 2007 to October 2008 was carried out. The patients' conditions were classified into 2 groups, the simple OA knee (varus deformity less than 15 degrees, BMI less than 20%, no associated deformities) and the complex deformity (varus deformity more than 15 degrees, BMI more than 20%, associated with flexion contracture). There were 59 patients in group 1 and 21 patients in group 2. Of the 80 knees, 38 were on the left and 42 on the right. The results of CMS-TKA [the mean (range)] in group 1: group 2 were respectively shown as the incision length [10.88 (8-13): 11.92 (10-14)], the operation time [118 (111.88-125.12): 131 (119.29-143.71) minutes], lateral releases (0 in both groups), postoperative range of motion in flexion [94.5 (90-100): 95.25 (90-105) degrees] and extension [1.75 (0-5): 1.5 (0-5) degrees], blood loss in 24 hours [489.09 (414.7-563.48): 520 (503.46-636.54) ml] and blood transfusion [1 (0-1) unit in both groups], tibiofemoral angle preoperative [varus = 4 (varus 0-10): varus = 17.14 (varus 15.7-18.5) degrees], tibiofemoral angle postoperative [valgus = 1.38 (valgus 0-4): valgus = 2.85 (valgus 2.1-3.5) degrees], tibiofemoral angle outlier (85% both

  17. Computation of External Quality Factors for RF Structures by Means of Model Order Reduction and a Perturbation Approach

    CERN Document Server

    Flisgen, Thomas; van Rienen, Ursula

    2016-01-01

    External quality factors are significant quantities to describe losses via waveguide ports in radio frequency resonators. The current contribution presents a novel approach to determine external quality factors by means of a two-step procedure: first, a state-space model for the lossless radio frequency structure is generated and its model order is reduced. Subsequently, a perturbation method is applied to the reduced model so that external losses are accounted for. The advantage of this approach results from the fact that the challenges in dealing with lossy systems are shifted to the reduced-order model. This significantly reduces computational cost. The present paper provides a short overview of existing methods to compute external quality factors. Then, the novel approach is introduced and validated in terms of accuracy and computational time by means of commercial software.
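
    The two-step structure can be caricatured with a toy linear system: modal truncation of a lossless (skew-symmetric) state matrix, then a low-rank port-loss perturbation applied on the reduced model, from which quality factors are read off the perturbed eigenvalues. Everything here (the system, the coupling vector, and the formula Q = |Im λ| / (2 |Re λ|)) is an assumed stand-in for the paper's actual formulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 200, 6

    S = rng.standard_normal((n, n))
    A = S - S.T                              # skew-symmetric => lossless modes

    w, V = np.linalg.eig(A)
    keep = np.argsort(-np.abs(w.imag))[:k]   # modal truncation: k dominant modes
    Vr = V[:, keep]
    W = np.linalg.pinv(Vr)
    Ar = W @ A @ Vr                          # reduced k x k model

    B = rng.standard_normal((n, 1))          # port coupling (assumed)
    Ar_lossy = Ar - 0.5 * (W @ B) @ (B.T @ Vr)   # loss as a reduced perturbation

    lam = np.linalg.eigvals(Ar_lossy)
    Q = np.abs(lam.imag) / (2 * np.abs(lam.real).clip(min=1e-12))
    print(np.sort(Q))                        # per-mode quality factors
    ```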

  18. Computational approaches for discovery of common immunomodulators in fungal infections: towards broad-spectrum immunotherapeutic interventions.

    Science.gov (United States)

    Kidane, Yared H; Lawrence, Christopher; Murali, T M

    2013-10-07

    Fungi are the second most abundant type of human pathogens. Invasive fungal pathogens are leading causes of life-threatening infections in clinical settings. Toxicity to the host and drug-resistance are two major deleterious issues associated with existing antifungal agents. Increasing a host's tolerance and/or immunity to fungal pathogens has potential to alleviate these problems. A host's tolerance may be improved by modulating the immune system such that it responds more rapidly and robustly in all facets, ranging from the recognition of pathogens to their clearance from the host. An understanding of biological processes and genes that are perturbed during attempted fungal exposure, colonization, and/or invasion will help guide the identification of endogenous immunomodulators and/or small molecules that activate host-immune responses such as specialized adjuvants. In this study, we present computational techniques and approaches using publicly available transcriptional data sets, to predict immunomodulators that may act against multiple fungal pathogens. Our study analyzed data sets derived from host cells exposed to five fungal pathogens, namely, Alternaria alternata, Aspergillus fumigatus, Candida albicans, Pneumocystis jirovecii, and Stachybotrys chartarum. We observed statistically significant associations between host responses to A. fumigatus and C. albicans. Our analysis identified biological processes that were consistently perturbed by these two pathogens. These processes contained both immune response-inducing genes such as MALT1, SERPINE1, ICAM1, and IL8, and immune response-repressing genes such as DUSP8, DUSP6, and SPRED2. We hypothesize that these genes belong to a pool of common immunomodulators that can potentially be activated or suppressed (agonized or antagonized) in order to render the host more tolerant to infections caused by A. fumigatus and C. albicans. Our computational approaches and methodologies described here can now be applied to

  19. Discovery and annotation of small proteins using genomics, proteomics and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xiaohan; Tschaplinski, Timothy J.; Hurst, Gregory B.; Jawdy, Sara; Abraham, Paul E.; Lankford, Patricia K.; Adams, Rachel M.; Shah, Manesh B.; Hettich, Robert L.; Lindquist, Erika; Kalluri, Udaya C.; Gunter, Lee E.; Pennacchio, Christa; Tuskan, Gerald A.

    2011-03-02

    Small proteins (10–200 amino acids (aa) in length) encoded by short open reading frames (sORFs) play important regulatory roles in various biological processes, including tumor progression, stress response, flowering, and hormone signaling. However, ab initio discovery of small proteins has been relatively overlooked. Recent advances in deep transcriptome sequencing make it possible to efficiently identify sORFs at the genome level. In this study, we obtained 2.6 million expressed sequence tag (EST) reads from the Populus deltoides leaf transcriptome and reconstructed full-length transcripts from the EST sequences. We identified an initial set of 12,852 sORFs encoding proteins of 10–200 aa in length. Three computational approaches were then used to enrich for bona fide protein-coding sORFs from the initial sORF set: (1) coding-potential prediction, (2) evolutionary conservation between P. deltoides and other plant species, and (3) gene family clustering within P. deltoides. As a result, a high-confidence sORF candidate set containing 1469 genes was obtained. Analysis of the protein domains, non-protein-coding RNA motifs, sequence length distribution, and protein mass spectrometry data supported this high-confidence sORF set. In the high-confidence sORF candidate set, known protein domains were identified in 1282 genes (higher-confidence sORF candidate set), out of which 611 genes, designated as the highest-confidence candidate sORF set, were supported by proteomics data. Of the 611 highest-confidence candidate sORF genes, 56 were new to the current Populus genome annotation. This study not only demonstrates that there are potential sORF candidates to be annotated in sequenced genomes, but also presents an efficient strategy for discovery of sORFs in species with no genome annotation yet available.
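
    The enrichment logic amounts to a length filter plus evidence voting across the three approaches. A hypothetical miniature follows; the names and the evidence-count rule are illustrative, not the paper's exact criteria:

    ```python
    # Hypothetical sORF filter: keep candidates 10-200 aa long that are
    # supported by at least one of the three evidence sources.
    def confident_sorfs(orfs, coding, conserved, clustered):
        """orfs: dict name -> protein sequence; the three sets name the sORFs
        passing coding-potential, conservation, and family-clustering tests."""
        out = {}
        for name, seq in orfs.items():
            if not (10 <= len(seq) <= 200):
                continue
            evidence = sum([name in coding, name in conserved, name in clustered])
            if evidence >= 1:
                out[name] = (len(seq), evidence)
        return out

    orfs = {"sORF1": "M" * 55, "sORF2": "M" * 300, "sORF3": "M" * 25}
    print(confident_sorfs(orfs, {"sORF1"}, {"sORF1", "sORF3"}, set()))
    ```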

  20. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available Supplier evaluation and selection is among the most important logistics decisions in supply chain management, and it has been addressed extensively. The same logistics decision is also important in freight transportation since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches have been proposed to explore this latter problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion and the state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine importance weights of the different criteria in supplier evaluation from a buyer's point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs while balancing the capacity of potential suppliers to ensure market clearing mechanisms. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.
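
    In miniature, the pipeline computes a weighted utility per supplier and then assigns buyers subject to capacity. The sketch below replaces the ordered-response estimation and the multi-objective program with fixed weights and a greedy assignment, purely for illustration; all data are invented:

    ```python
    # Hedged sketch: weighted utilities rank suppliers, then a greedy
    # capacity-respecting assignment stands in for the optimization model.
    weights = {"price": -0.5, "quality": 0.3, "reliability": 0.2}

    suppliers = {
        "S1": {"price": 8.0, "quality": 7.0, "reliability": 9.0, "capacity": 1},
        "S2": {"price": 6.0, "quality": 6.0, "reliability": 7.0, "capacity": 2},
    }

    def utility(attrs):
        return sum(w * attrs[k] for k, w in weights.items())

    assignment = {}
    for buyer in ["B1", "B2", "B3"]:
        ranked = sorted(suppliers, key=lambda s: -utility(suppliers[s]))
        for s in ranked:                 # pick best supplier with capacity left
            if suppliers[s]["capacity"] > 0:
                suppliers[s]["capacity"] -= 1
                assignment[buyer] = s
                break
    print(assignment)                    # {'B1': 'S2', 'B2': 'S2', 'B3': 'S1'}
    ```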

  1. Integrative computational approach for genome-based study of microbial lipid-degrading enzymes.

    Science.gov (United States)

    Vorapreeda, Tayvich; Thammarongtham, Chinae; Laoteng, Kobkul

    2016-07-01

    Lipid-degrading or lipolytic enzymes have gained enormous attention in academic and industrial sectors. Several efforts are underway to discover new lipase enzymes from a variety of microorganisms with particular catalytic properties to be used for extensive applications. In addition, various tools and strategies have been implemented to unravel the functional relevance of the versatile lipid-degrading enzymes for special purposes. This review highlights the study of microbial lipid-degrading enzymes through an integrative computational approach. The identification of putative lipase genes from microbial genomes and metagenomic libraries using homology-based mining is discussed, with an emphasis on sequence analysis of conserved motifs and enzyme topology. Molecular modelling of three-dimensional structure on the basis of sequence similarity is shown to be a potential approach for exploring the structural and functional relationships of candidate lipase enzymes. The perspectives on a discriminative framework of cutting-edge tools and technologies, including bioinformatics, computational biology, functional genomics and functional proteomics, intended to facilitate rapid progress in understanding lipolysis mechanism and to discover novel lipid-degrading enzymes of microorganisms are discussed.
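
    The motif-scanning part of homology-based mining can be as simple as a regular expression over candidate sequences; the classic G-X-S-X-G catalytic-elbow motif of many lipases and esterases is a common first filter. The sequences below are toy examples:

    ```python
    import re

    # Scan candidate protein sequences for the G-X-S-X-G lipase/esterase motif.
    MOTIF = re.compile(r"G.S.G")

    candidates = {
        "orf_a": "MKLVVGHSMGGLIAAY",   # toy sequences for illustration
        "orf_b": "MNTTRLLLAAAKKEEE",
    }

    for name, seq in candidates.items():
        hit = MOTIF.search(seq)
        if hit:
            print(f"{name}: putative lipase motif {hit.group()} at {hit.start()}")
    ```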

  2. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Directory of Open Access Journals (Sweden)

    Vostokin Sergei

    2018-03-01

    Full Text Available This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of “on-demand” access; (b) source code deployment management; (c) high-performance computing programs development automation. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  3. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts, however, so far there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet the practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  4. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to substantiate a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research of the…

  5. An innovative and integrated approach based on DNA walking to identify unauthorised GMOs.

    Science.gov (United States)

    Fraiture, Marie-Alice; Herman, Philippe; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H

    2014-03-15

    In the coming years, the frequency of unauthorised genetically modified organisms (GMOs) being present in the European food and feed chain will increase significantly. Therefore, we have developed a strategy to identify unauthorised GMOs containing a pCAMBIA family vector, frequently present in transgenic plants. This integrated approach is performed in two successive steps on Bt rice grains. First, the potential presence of unauthorised GMOs is assessed by the qPCR SYBR®Green technology targeting the terminator 35S pCAMBIA element. Second, its presence is confirmed via the characterisation of the junction between the transgenic cassette and the rice genome. To this end, a DNA walking strategy is applied using a first reverse primer followed by two semi-nested PCR rounds using primers that are each time nested relative to the previous reverse primer. This approach allows rapid identification of the transgene flanking region and can easily be implemented by enforcement laboratories. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Approaching multiphase flows from the perspective of computational fluid dynamics

    International Nuclear Information System (INIS)

    Banas, A.O.

    1992-01-01

    Thermalhydraulic simulation methodologies based on subchannel and porous-medium concepts are briefly reviewed and contrasted with the general approach of Computational Fluid Dynamics (CFD). An outline of the advanced CFD methods for single-phase turbulent flows is followed by a short discussion of the unified formulation of averaged equations for turbulent and multiphase flows. Some of the recent applications of CFD at Chalk River Laboratories are discussed, and the complementary role of CFD with regard to the established thermalhydraulic methods of analysis is indicated. (author). 8 refs

  7. A Multi-step and Multi-level approach for Computer Aided Molecular Design

    DEFF Research Database (Denmark)

    The problem formulation step incorporates a knowledge base for the identification and setup of the design criteria. Candidate compounds are identified using a multi-level generate-and-test CAMD solution algorithm capable of designing molecules having a high level of molecular detail. A post-solution step ... using an Integrated Computer Aided System (ICAS) for result analysis and verification is included in the methodology. Keywords: CAMD, separation processes, knowledge base, molecular design, solvent selection, substitution, group contribution, property prediction, ICAS. Introduction: The use of Computer ... Aided Molecular Design (CAMD) for the identification of compounds having specific physic...

  8. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
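
    In miniature, the list-based index couples a list of storage locations with a checksum for validation. The sketch below is a hypothetical illustration of the idea, not the patented implementation:

    ```python
    import hashlib
    from dataclasses import dataclass, field

    # Index entry holding a list of storage locations (primary + replicas)
    # plus a checksum used to validate any copy that is read back.
    @dataclass
    class FileIndex:
        name: str
        locations: list = field(default_factory=list)  # node/path of each copy
        checksum: str = ""

        def add_copy(self, location, data: bytes):
            if not self.checksum:
                self.checksum = hashlib.sha256(data).hexdigest()
            self.locations.append(location)

        def validate(self, data: bytes) -> bool:
            return hashlib.sha256(data).hexdigest() == self.checksum

    idx = FileIndex("results.dat")
    idx.add_copy("node03:/scratch/results.dat", b"payload")
    idx.add_copy("node11:/scratch/results.dat", b"payload")   # replica
    print(idx.locations, idx.validate(b"payload"))            # [...] True
    ```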

  9. An efficient approach for computing the geometrical optics field reflected from a numerically specified surface

    Science.gov (United States)

    Mittra, R.; Rushdi, A.

    1979-01-01

    An approach for computing the geometrical optics field reflected from a numerically specified surface is presented. The approach includes the step of deriving a specular point and begins with computing the reflected rays off the surface at the points where their coordinates, as well as the partial derivatives (or equivalently, the direction of the normal), are numerically specified. Then, a cluster of three adjacent rays is chosen to define a 'mean ray' and the divergence factor associated with this mean ray. Finally, the amplitude, phase, and vector direction of the reflected field at a given observation point are derived by associating this point with the nearest mean ray and determining its position relative to such a ray.
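
    The per-point reflection step uses the numerically supplied normal directly via the law of reflection, r = d - 2(d · n)n; the divergence factor then comes from how a cluster of three such reflected rays spreads. A minimal sketch of the reflection step only, with a toy incident ray and normal:

    ```python
    import numpy as np

    # Reflected direction from the law of reflection (unit vectors assumed).
    def reflect(d, n):
        d = d / np.linalg.norm(d)
        n = n / np.linalg.norm(n)
        return d - 2.0 * np.dot(d, n) * n

    d = np.array([0.0, 0.0, -1.0])                 # incident ray, straight down
    n = np.array([0.0, np.sin(0.1), np.cos(0.1)])  # normal tilted by 0.1 rad
    print(reflect(d, n))                           # ray tilted by 0.2 rad
    ```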

  10. Computational prediction of neoantigens: do we need more data or new approaches?

    DEFF Research Database (Denmark)

    Eklund, Aron Charles; Szallasi, Zoltan Imre

    2018-01-01

    Personalized cancer immunotherapy may benefit from improved computational algorithms for identifying neoantigens. Recent results demonstrate that machine learning can improve accuracy. Additional improvements may require more genomic data paired with in vitro T cell reactivity measurements, and more sophisticated algorithms that take into account T cell receptor specificity.

  11. A Hybrid Autonomic Computing-Based Approach to Distributed Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Abhishek Bhatia

    2015-03-01

    Full Text Available Distributed constraint satisfaction problems (DisCSPs) are among the widely endeavored problems using agent-based simulation. Fernandez et al. formulated the sensor and mobile tracking problem as a DisCSP, known as SensorDCSP. In this paper, we adopt a customized ERE (environment, reactive rules and entities) algorithm for the SensorDCSP, which is otherwise proven to be a computationally intractable problem. An amalgamation of the autonomy-oriented computing (AOC)-based algorithm (ERE) and a genetic algorithm (GA) provides an early solution of the modeled DisCSP. Incorporation of the GA into ERE facilitates auto-tuning of the simulation parameters, thereby leading to an early solution of constraint satisfaction. This study further contributes a model, built in the NetLogo simulation environment, to infer the efficacy of the proposed approach.
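
    The GA layer that auto-tunes the simulation parameters can be sketched independently of ERE itself; here a stand-in fitness function replaces the DisCSP solver's performance measure, and all values are assumed:

    ```python
    import random

    random.seed(0)

    def fitness(p):                  # stand-in for "how quickly ERE satisfies
        return -sum((x - 0.6) ** 2 for x in p)   # constraints with params p"

    pop = [[random.random() for _ in range(3)] for _ in range(20)]
    for _ in range(30):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                         # selection
        children = []
        for _ in range(10):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            i = random.randrange(3)
            child[i] += random.gauss(0, 0.05)             # mutation
            children.append(child)
        pop = parents + children
    print([round(x, 2) for x in max(pop, key=fitness)])   # near [0.6, 0.6, 0.6]
    ```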

  12. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    Science.gov (United States)

    Plane, Jandelyn

    from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.

  13. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands and dozens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which warrants much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  14. a Radical Collaborative Approach: Developing a Model for Learning Theory, Human-Based Computation and Participant Motivation in a Rock-Art Heritage Application

    Science.gov (United States)

    Haubt, R.

    2016-06-01

    This paper explores a Radical Collaborative Approach in the global and centralized Rock-Art Database project to find new ways to look at rock-art by making information more accessible and more visible through public contributions. It looks at rock-art through the Key Performance Indicator (KPI), identified with the latest Australian State of the Environment Reports to help develop a better understanding of rock-art within a broader Cultural and Indigenous Heritage context. Using a practice-led approach the project develops a conceptual collaborative model that is deployed within the RADB Management System. Exploring learning theory, human-based computation and participant motivation the paper develops a procedure for deploying collaborative functions within the interface design of the RADB Management System. The paper presents the results of the collaborative model implementation and discusses considerations for the next iteration of the RADB Universe within an Agile Development Approach.

  15. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  16. Integrated systems approach identifies risk regulatory pathways and key regulators in coronary artery disease.

    Science.gov (United States)

    Zhang, Yan; Liu, Dianming; Wang, Lihong; Wang, Shuyuan; Yu, Xuexin; Dai, Enyu; Liu, Xinyi; Luo, Shanshun; Jiang, Wei

    2015-12-01

    Coronary artery disease (CAD) is the most common type of heart disease. However, the molecular mechanisms of CAD remain elusive. Regulatory pathways are known to play crucial roles in many pathogenic processes. Thus, inferring risk regulatory pathways is an important step toward elucidating the mechanisms underlying CAD. With advances in high-throughput data, we developed an integrated systems approach to identify CAD risk regulatory pathways and key regulators. Firstly, a CAD-related core subnetwork was identified from a curated transcription factor (TF) and microRNA (miRNA) regulatory network based on a random walk algorithm. Secondly, candidate risk regulatory pathways were extracted from the subnetwork by applying a breadth-first search (BFS) algorithm. Then, risk regulatory pathways were prioritized based on multiple CAD-associated data sources. Finally, we also proposed a new measure to prioritize upstream regulators. We inferred that phosphatase and tensin homolog (PTEN) may be a key regulator in the dysregulation of risk regulatory pathways. This study goes a step beyond the identification of disease subnetworks or modules. From the risk regulatory pathways, we could understand the flow of regulatory information in the initiation and progression of the disease. Our approach helps to uncover its potential etiology. We developed an integrated systems approach to identify risk regulatory pathways. We proposed a new measure to prioritize the key regulators in CAD. PTEN may be a key regulator in dysregulation of the risk regulatory pathways.
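
    The first step, extracting a core subnetwork with a random walk, follows the standard random-walk-with-restart iteration p ← (1 - r)Wp + r·p0. A minimal sketch on a toy network follows; the adjacency matrix and seed are assumptions, whereas the study used a curated TF/miRNA network with CAD-associated seeds:

    ```python
    import numpy as np

    # Random walk with restart: steady-state visit probabilities score how
    # close each node is to the seed set; high scorers form the core subnetwork.
    def rwr(adj, seeds, restart=0.5, tol=1e-10):
        W = adj / adj.sum(axis=0, keepdims=True)     # column-normalize
        p0 = np.zeros(adj.shape[0])
        p0[seeds] = 1.0 / len(seeds)                 # seed genes
        p = p0.copy()
        while True:
            p_next = (1 - restart) * W @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                return p_next
            p = p_next

    adj = np.array([[0, 1, 1, 0],                    # toy regulatory network
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
    print(rwr(adj, seeds=[0]).round(3))
    ```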

  17. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to the introduction to pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to the comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform of constructivist learning process, thus enabling learners’ experimentation with the provided programming models, developing learners’ competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing programming models and the message passing interface (MPI) and OpenMP parallelization tools have been chosen for implementation.

  18. Identifying biomarkers for asthma diagnosis using targeted metabolomics approaches.

    Science.gov (United States)

    Checkley, William; Deza, Maria P; Klawitter, Jost; Romero, Karina M; Klawitter, Jelena; Pollard, Suzanne L; Wise, Robert A; Christians, Uwe; Hansel, Nadia N

    2016-12-01

    The diagnosis of asthma in children is challenging and relies on a combination of clinical factors and biomarkers including methacholine challenge, lung function, bronchodilator responsiveness, and presence of airway inflammation. No single test is diagnostic. We sought to identify a pattern of inflammatory biomarkers that was unique to asthma using a targeted metabolomics approach combined with data science methods. We conducted a nested case-control study of 100 children living in a peri-urban community in Lima, Peru. We defined cases as children with current asthma, and controls as children with no prior history of asthma and normal lung function. We further categorized enrollment following a factorial design to enroll equal numbers of children as either overweight or not. We obtained a fasting venous blood sample to characterize a comprehensive panel of targeted markers using a metabolomics approach based on high-performance liquid chromatography-mass spectrometry. A statistical comparison of targeted metabolites between children with asthma (n = 50) and healthy controls (n = 49) revealed distinct patterns in relative concentrations of several metabolites: children with asthma had approximately 40-50% lower relative concentrations of ascorbic acid, 2-isopropylmalic acid, shikimate-3-phosphate, and 6-phospho-d-gluconate when compared to children without asthma, and 70% lower relative concentrations of reduced glutathione (all p < 0.05; ascorbic acid ≤ 13 077 normalized counts/second and betaine ≤ 1 647 121 normalized counts/second). By using a metabolomics approach applied to serum, we were able to discriminate between children with and without asthma by revealing different metabolic patterns. These results suggest that serum metabolomics may represent a diagnostic tool for asthma and may be helpful for distinguishing asthma phenotypes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. MrTADFinder: A network modularity based approach to identify topologically associating domains in multiple resolutions.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    2017-07-01

    Full Text Available Genome-wide proximity ligation based assays such as Hi-C have revealed that eukaryotic genomes are organized into structural units called topologically associating domains (TADs. From a visual examination of the chromosomal contact map, however, it is clear that the organization of the domains is not simple or obvious. Instead, TADs exhibit various length scales and, in many cases, a nested arrangement. Here, by exploiting the resemblance between TADs in a chromosomal contact map and densely connected modules in a network, we formulate TAD identification as a network optimization problem and propose an algorithm, MrTADFinder, to identify TADs from intra-chromosomal contact maps. MrTADFinder is based on the network-science concept of modularity. A key component of it is deriving an appropriate background model for contacts in a random chain, by numerically solving a set of matrix equations. The background model preserves the observed coverage of each genomic bin as well as the distance dependence of the contact frequency for any pair of bins exhibited by the empirical map. Also, by introducing a tunable resolution parameter, MrTADFinder provides a self-consistent approach for identifying TADs at different length scales, hence the acronym "Mr" standing for Multiple Resolutions. We then apply MrTADFinder to various Hi-C datasets. The identified domain boundaries are marked by characteristic signatures in chromatin marks and transcription factors (TF that are consistent with earlier work. Moreover, by calling TADs at different length scales, we observe that boundary signatures change with resolution, with different chromatin features having different characteristic length scales. Furthermore, we report an enrichment of HOT (high-occupancy target regions near TAD boundaries and investigate the role of different TFs in determining boundaries at various resolutions. To further explore the interplay between TADs and epigenetic marks, as tumor mutational
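
    A much-simplified stand-in for the background model: estimate the distance decay from the observed map and score a candidate domain by observed-minus-expected contacts. MrTADFinder additionally preserves per-bin coverage by solving matrix equations and embeds this in a modularity objective with a resolution parameter; the sketch below only illustrates the core comparison:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 60
    O = rng.poisson(5, (n, n)).astype(float)
    O = (O + O.T) / 2                         # symmetric toy contact map
    O[10:20, 10:20] += 6                      # plant an enriched domain

    decay = np.array([np.diag(O, k).mean() for k in range(n)])  # distance decay
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    E = decay[dist]                           # expected contacts under background

    def domain_score(lo, hi):                 # observed minus expected, in-block
        return (O[lo:hi, lo:hi] - E[lo:hi, lo:hi]).sum()

    print(domain_score(10, 20), domain_score(30, 40))  # planted block scores higher
    ```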

  20. Experimental/Computational Approach to Accommodation Coefficients and its Application to Noble Gases on Aluminum Surface (Preprint)

    Science.gov (United States)

    2009-02-03

    This preprint presents an experimental/computational approach to accommodation coefficients and its application to noble gases on an aluminum surface (Nathaniel Selden, University of Southern California, Los Angeles). [Recovered figure caption: FIG. 5: Experimental and computed radiometric force for argon (left) and xenon.]

  1. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  2. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

    Full Text Available We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code-reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
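
    A minimal illustration of the StC pattern: securely permute the records before handing them to the original, non-oblivious algorithm, so that access patterns observed during the computation are decoupled from the input order. This sketch only shows the pattern; inside an enclave the scramble itself must be implemented obliviously, which is not done here:

    ```python
    import secrets

    def secure_shuffle(items):
        out = list(items)
        for i in range(len(out) - 1, 0, -1):       # Fisher-Yates with a CSPRNG
            j = secrets.randbelow(i + 1)
            out[i], out[j] = out[j], out[i]
        return out

    def scramble_then_compute(data, algorithm):
        # Step 1: scramble; Step 2: run the unmodified original algorithm.
        return algorithm(secure_shuffle(data))

    print(scramble_then_compute([5, 2, 9, 1], sorted))   # [1, 2, 5, 9]
    ```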

  3. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  4. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Full Text Available Existing approaches to the regulation of accounting for software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been identified. The influence of the legal aspects of the use of computer programs under national legislation on their reflection in accounting has been analyzed. Possible options for the transfer of rights from the copyright owners of computer programs, which should be considered when creating a software accounting system at an enterprise, have been analyzed. The characteristics of computer software as an intangible asset under the current law have been identified and analyzed. General economic characteristics of computer programs as one of the types of intangible assets have been substantiated. The main distinguishing features of software compared to other types of intellectual property have been allocated.

  5. An Approach to Identify and Characterize a Subunit Candidate Shigella Vaccine Antigen.

    Science.gov (United States)

    Pore, Debasis; Chakrabarti, Manoj K

    2016-01-01

    Shigellosis remains a serious issue throughout the developing countries, particularly in children under the age of 5. Numerous strategies have been tested to develop vaccines targeting shigellosis; unfortunately despite several years of extensive research, no safe, effective, and inexpensive vaccine against shigellosis is available so far. Here, we illustrate in detail an approach to identify and establish immunogenic outer membrane proteins from Shigella flexneri 2a as subunit vaccine candidates.

  6. Anatomical and computed tomographic analysis of the transcochlear and endoscopic transclival approaches to the petroclival region.

    Science.gov (United States)

    Mason, Eric; Van Rompaey, Jason; Carrau, Ricardo; Panizza, Benedict; Solares, C Arturo

    2014-03-01

    Advances in the field of skull base surgery aim to maximize anatomical exposure while minimizing patient morbidity. The petroclival region of the skull base presents numerous challenges for surgical access due to the complex anatomy. The transcochlear approach to the region provides adequate access; however, the resection involved sacrifices hearing and results in at least a grade 3 facial palsy. An endoscopic endonasal approach could potentially avoid negative patient outcomes while providing a desirable surgical window in a select patient population. Cadaveric study. Endoscopic access to the petroclival region was achieved through an endonasal approach. For comparison, a transcochlear approach to the clivus was performed. Different facets of the dissections, such as bone removal volume and exposed surface area, were computed using computed tomography analysis. The endoscopic endonasal approach provided a sufficient corridor to the petroclival region with significantly less bone removal and nearly equivalent exposure of the surgical target, thus facilitating the identification of the relevant anatomy. The lateral approach allowed for better exposure from a posterolateral direction until the inferior petrosal sinus; however, the endonasal approach avoided labyrinthine/cochlear destruction and facial nerve manipulation while providing an anteromedial viewpoint. The endonasal approach also avoided external incisions and cosmetic deficits. The endonasal approach required significant sinonasal resection. Endoscopic access to the petroclival region is a feasible approach. It potentially avoids hearing loss, facial nerve manipulation, and cosmetic damage. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  7. A comprehensive approach to identify dominant controls of the behavior of a land surface-hydrology model across various hydroclimatic conditions

    Science.gov (United States)

    Haghnegahdar, Amin; Elshamy, Mohamed; Yassin, Fuad; Razavi, Saman; Wheater, Howard; Pietroniro, Al

    2017-04-01

    Complex physically-based environmental models are being increasingly used as the primary tool for watershed planning and management due to advances in computational power and data acquisition. Model sensitivity analysis plays a crucial role in understanding the behavior of these complex models and improving their performance. Due to the non-linearity and interactions within these complex models, global sensitivity analysis (GSA) techniques should be adopted to provide a comprehensive understanding of model behavior and identify its dominant controls. In this study we adopt a multi-basin, multi-criteria GSA approach to systematically assess the behavior of the Modélisation Environnementale-Surface et Hydrologie (MESH) modelling system across various hydroclimatic conditions in Canada, including areas in the Great Lakes Basin, Mackenzie River Basin, and South Saskatchewan River Basin. MESH is a semi-distributed, physically-based coupled land surface-hydrology modelling system developed by Environment and Climate Change Canada (ECCC) for various water resources management purposes in Canada. We use a novel method, called Variogram Analysis of Response Surfaces (VARS), to perform sensitivity analysis. VARS is a variogram-based GSA technique that can efficiently provide a spectrum of sensitivity information across a range of scales within the parameter space. We use multiple metrics to identify dominant controls of model response (e.g. streamflow) to model parameters under various conditions such as high flows, low flows, and flow volume. We also investigate the influence of initial conditions on model behavior as part of this study. Our preliminary results suggest that this type of GSA can significantly help with estimating model parameters, decreasing the computational burden of calibration, and reducing prediction uncertainty.
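
    The variogram at the heart of VARS is straightforward to compute for a single parameter sweep: γ(h) = ½·E[(y(x+h) - y(x))²], read across perturbation scales h, is what gives the multi-scale sensitivity picture. Below is a toy stand-in for a MESH run, with the model and all values assumed; the real method averages many such "stars" across the parameter space:

    ```python
    import numpy as np

    def model(x):                                   # stand-in for a MESH run
        return np.sin(3 * x[0]) + 0.1 * x[1]

    grid = np.linspace(0, 1, 101)                   # sweep one parameter,
    y = np.array([model([v, 0.5]) for v in grid])   # others frozen

    def variogram(y, h_steps):                      # gamma(h) along the sweep
        return 0.5 * np.mean((y[h_steps:] - y[:-h_steps]) ** 2)

    for h in (1, 5, 20):                            # sensitivity across scales
        print(f"gamma(h={h * 0.01:.2f}) = {variogram(y, h):.4f}")
    ```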

  8. Computational approaches in the design of synthetic receptors – A review

    Energy Technology Data Exchange (ETDEWEB)

    Cowen, Todd, E-mail: tc203@le.ac.uk; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as “plastic antibodies” – high affinity robust synthetic receptors which can be optimally designed, and produced at a much reduced cost compared with their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the published relevant literature since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller–Plesset, etc.) or DFT) and the purpose for which they were used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of the MIP polymerisation reaction. The further advances in molecular modelling and computational design of synthetic receptors in particular will have a serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. - Highlights: • A review of computational modelling in the design of molecularly imprinted polymers. • Target analytes and method of analysis for the vast majority of recent articles. • Explanations are given of all the popular and emerging techniques used in design. • Highlighted examples of sophisticated analysis of imprinted polymer systems.
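
    The monomer-screening step described in such reviews usually reduces to ranking candidate monomers by a computed binding energy, ΔE = E(complex) − E(template) − E(monomer), where the energies come from quantum-chemical or MD calculations. A schematic sketch of that ranking with hypothetical, pre-computed energies (all names and numbers invented):

```python
# Hypothetical pre-computed energies (kcal/mol), e.g. from DFT or MD runs.
E_template = -120.0
monomers = {
    "methacrylic acid": {"E_monomer": -45.2, "E_complex": -180.9},
    "4-vinylpyridine":  {"E_monomer": -50.1, "E_complex": -178.3},
    "acrylamide":       {"E_monomer": -40.7, "E_complex": -168.0},
}

# Binding energy of each monomer-template pair:
# dE = E(complex) - E(template) - E(monomer); more negative = stronger binding.
ranked = sorted(
    ((name, d["E_complex"] - E_template - d["E_monomer"]) for name, d in monomers.items()),
    key=lambda pair: pair[1],
)
for name, dE in ranked:
    print(f"{name:20s} dE = {dE:+.1f} kcal/mol")
```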

  10. Computational identification of binding energy hot spots in protein-RNA complexes using an ensemble approach.

    Science.gov (United States)

    Pan, Yuliang; Wang, Zixiang; Zhan, Weihua; Deng, Lei

    2018-05-01

    Identifying RNA-binding residues, especially energetically favored hot spots, can provide valuable clues for understanding the mechanisms and functional importance of protein-RNA interactions. Yet, the limited availability of experimentally recognized energy hot spots in protein-RNA crystal structures leads to difficulties in developing empirical identification approaches. Computational prediction of RNA-binding hot spot residues is still in its infancy. Here, we describe a computational method, PrabHot (Prediction of protein-RNA binding hot spots), that can effectively detect hot spot residues on protein-RNA binding interfaces using an ensemble of conceptually different machine learning classifiers. Residue interaction network features and new solvent exposure characteristics are combined together and selected for classification with the Boruta algorithm. In particular, two new reference datasets (benchmark and independent) have been generated containing 107 hot spots from 47 known protein-RNA complex structures. In 10-fold cross-validation on the training dataset, PrabHot achieves promising performances with an AUC score of 0.86 and a sensitivity of 0.78, which are significantly better than those of the pioneering RNA-binding hot spot prediction method HotSPRing. We also demonstrate the capability of our proposed method on the independent test dataset, where it likewise gains a competitive advantage. The PrabHot webserver is freely available at http://denglab.org/PrabHot/. Contact: leideng@csu.edu.cn. Supplementary data are available at Bioinformatics online.
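
    A much-reduced sketch of that pipeline is shown below, with synthetic stand-in features; it assumes the third-party boruta package (BorutaPy) and scikit-learn are installed, and PrabHot's actual descriptors and classifier mix differ:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from boruta import BorutaPy  # assumes the boruta package is installed

# Hypothetical feature matrix: rows = interface residues, columns = descriptors
# (network features, solvent exposure, ...); y = 1 marks hot spot residues.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Boruta feature selection wrapped around a random forest.
selector = BorutaPy(RandomForestClassifier(n_jobs=-1), n_estimators='auto', random_state=1)
selector.fit(X, y)
X_sel = X[:, selector.support_]

# Ensemble of conceptually different classifiers, evaluated by 10-fold CV AUC.
ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier()),
                ("gb", GradientBoostingClassifier()),
                ("svm", SVC(probability=True))],
    voting="soft",
)
print(cross_val_score(ensemble, X_sel, y, cv=10, scoring="roc_auc").mean())
```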

  11. Medium-term generation programming in competitive environments: a new optimisation approach for market equilibrium computing

    International Nuclear Information System (INIS)

    Barquin, J.; Centeno, E.; Reneses, J.

    2004-01-01

    The paper proposes a model to represent the medium-term hydro-thermal operation of electrical power systems in deregulated frameworks. The model objective is to compute the oligopolistic market equilibrium point in which each utility maximises its profit, based on other firms' behaviour. This problem is not an optimisation one. The main contribution of the paper is to demonstrate that, nevertheless, under some reasonable assumptions, it can be formulated as an equivalent minimisation problem. A computer program has been coded using the proposed approach. It is used to compute the market equilibrium of a real-size system. (author)
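
    The equilibrium-as-minimisation idea can be illustrated on a toy Cournot duopoly, which with linear demand admits a potential function whose stationary point is the Nash equilibrium. This is an analogy to, not a reproduction of, the paper's hydro-thermal model; all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy Cournot duopoly: price p = a - b*(q1+q2), firm costs c_i * q_i.
a, b = 100.0, 1.0
c = np.array([10.0, 20.0])

# Potential function P(q): its gradient w.r.t. q_i equals firm i's marginal
# profit, so maximising P yields the Nash equilibrium - the equilibrium is
# found by a single minimisation instead of a fixed-point computation.
def neg_potential(q):
    Q = q.sum()
    return -(np.dot(a - c, q) - b * np.dot(q, q) - b * (Q**2 - np.dot(q, q)) / 2)

res = minimize(neg_potential, x0=[1.0, 1.0], bounds=[(0, None), (0, None)])
print("equilibrium outputs:", res.x)
# Closed-form check: q_i = (a - 2*c_i + c_j) / (3*b)
print("analytic:", [(a - 2*c[0] + c[1]) / (3*b), (a - 2*c[1] + c[0]) / (3*b)])
```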

  12. A Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  13. Soft computing based on hierarchical evaluation approach and criteria interdependencies for energy decision-making problems: A case study

    International Nuclear Information System (INIS)

    Gitinavard, Hossein; Mousavi, S. Meysam; Vahdani, Behnam

    2017-01-01

    In numerous real-world energy decision problems, decision makers often encounter complex environments in which imprecise data and uncertain information must be handled to reach an appropriate decision. In this paper, a new soft computing group decision-making approach is introduced based on a novel compromise ranking method and interval-valued hesitant fuzzy sets (IVHFSs) for energy decision-making problems under multiple criteria. In the proposed approach, the assessment information is provided by energy experts or decision makers based on interval-valued hesitant fuzzy elements under incomplete criteria weights. In this respect, a new ranking index based on an interval-valued hesitant fuzzy Hamming distance measure is presented to prioritize energy candidates, and criteria weights are computed based on an extended maximizing deviation method by considering the experts' preference judgments about the relative importance of each criterion. Also, a decision making trial and evaluation laboratory (DEMATEL) method is extended under an IVHF-environment to compute the interdependencies between and within the selected criteria in the hierarchical structure. Accordingly, to demonstrate the applicability of the presented approach, a case study and a practical example are provided regarding the hierarchical structure and criteria interdependencies for renewable energy and energy policy selection problems. Hence, the obtained computational results are compared with a fuzzy decision-making method from the recent literature, based on some comparison parameters, to show the advantages and constraints of the proposed approach. Finally, a sensitivity analysis is prepared to indicate the effects of different criteria weights on the ranking results, to present the robustness or sensitivity of the proposed soft computing approach versus the relative importance of criteria. - Highlights: • Introducing a novel interval-valued hesitant fuzzy compromise ranking method. • Presenting
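
    One common form of the interval-valued hesitant fuzzy Hamming distance is easy to state in code; the paper's exact measure and normalisation may differ, so treat this as a hedged sketch of the general definition:

```python
def ivhf_hamming(h1, h2):
    """Hamming distance between two interval-valued hesitant fuzzy elements,
    each given as a list of [lower, upper] membership intervals (one common
    definition; normalisation conventions vary in the literature).
    Shorter elements are padded with their largest interval after sorting."""
    h1, h2 = sorted(h1), sorted(h2)
    L = max(len(h1), len(h2))
    pad = lambda h: h + [h[-1]] * (L - len(h))
    h1, h2 = pad(h1), pad(h2)
    return sum(abs(a[0] - b[0]) + abs(a[1] - b[1]) for a, b in zip(h1, h2)) / (2 * L)

# Evaluations of one energy alternative on one criterion by two expert groups.
print(ivhf_hamming([[0.3, 0.5], [0.4, 0.6]], [[0.5, 0.7]]))  # -> 0.15
```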

  14. Can Computers Be Used for Whole Language Approaches to Reading and Language Arts?

    Science.gov (United States)

    Balajthy, Ernest

    Holistic approaches to the teaching of reading and writing, most notably the Whole Language movement, reject the philosophy that language skills can be taught. Instead, holistic teachers emphasize process, and they structure the students' classroom activities to be rich in language experience. Computers can be used as tools for whole language…

  15. Relationships among Taiwanese Children's Computer Game Use, Academic Achievement and Parental Governing Approach

    Science.gov (United States)

    Yeh, Duen-Yian; Cheng, Ching-Hsue

    2016-01-01

    This study examined the relationships among children's computer game use, academic achievement and parental governing approach to propose probable answers for the doubts of Taiwanese parents. 355 children (ages 11-14) were randomly sampled from 20 elementary schools in a typically urbanised county in Taiwan. Questionnaire survey (five questions)…

  16. Identifying approaches for assessing methodological and reporting quality of systematic reviews

    DEFF Research Database (Denmark)

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle

    2017-01-01

    BACKGROUND: The methodological quality and completeness of reporting of the systematic reviews (SRs) is fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs yet little is known about how they are used in SRs or where… there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time… Quality assessment tools or reporting guidelines used as proxy to assess RQ were used in 80% (61/76) of identified reports. These included two reporting guidelines (PRISMA and QUOROM), five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks) and GRADE criteria. The remaining 24% (18/76) of reports developed their own…

  17. Replicated Computations Results (RCR) report for “A holistic approach for collaborative workload execution in volunteer clouds”

    DEFF Research Database (Denmark)

    Vandin, Andrea

    2018-01-01

    “A Holistic Approach for Collaborative Workload Execution in Volunteer Clouds” [3] proposes a novel approach to task scheduling in volunteer clouds. Volunteer clouds are decentralized cloud systems based on collaborative task execution, where clients voluntarily share their own unused computational resources…

  18. Novel computational methods to predict drug–target interactions using graph mining and machine learning approaches

    KAUST Repository

    Olayan, Rawan S.

    2017-12-01

    Computational drug repurposing aims at finding new medical uses for existing drugs. The identification of novel drug-target interactions (DTIs) can be a useful part of such a task. Computational determination of DTIs is a convenient strategy for systematic screening of a large number of drugs in the attempt to identify new DTIs at low cost and with reasonable accuracy. This necessitates the development of accurate computational methods that can help focus the follow-up experimental validation on a smaller number of highly likely targets for a drug. Although many methods have been proposed for computational DTI prediction, they suffer from high false-positive prediction rates or they do not predict the effect that drugs exert on targets in DTIs. In this report, first, we present a comprehensive review of the recent progress in the field of DTI prediction from data-centric and algorithm-centric perspectives. The aim is to provide a comprehensive review of computational methods for identifying DTIs, which could help in constructing more reliable methods. Then, we present DDR, an efficient method to predict the existence of DTIs. DDR achieves significantly more accurate results compared to the other state-of-the-art methods. As supported by independent evidence, we verified as correct 22 out of the top 25 DDR DTI predictions. This validation proves the practical utility of DDR, suggesting that DDR can be used as an efficient method to identify correct DTIs. Finally, we present the DDR-FE method, which predicts the effect types of a drug on its target. On different representative datasets, under various test setups, and using different performance measures, we show that DDR-FE achieves extremely good performance. Using blind test data, we verified as correct 2,300 out of 3,076 DTI effects predicted by DDR-FE. This suggests that DDR-FE can be used as an efficient method to identify correct effects of a drug on its target.
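
    DDR itself mines path-category features from a heterogeneous drug–target similarity graph and feeds them to a random forest; the sketch below shrinks that pattern to a toy graph with invented node names and is only a schematic of the general approach:

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy heterogeneous graph: drug-drug and target-target similarity edges
# plus known drug-target interaction edges (all names hypothetical).
G = nx.Graph()
G.add_edges_from([("d1", "d2"), ("d2", "d3")])   # drug similarity
G.add_edges_from([("t1", "t2")])                  # target similarity
G.add_edges_from([("d1", "t1"), ("d3", "t2")])   # known DTIs

def pair_features(G, d, t):
    """Simple graph-mining features for a (drug, target) pair:
    counts of simple paths of length 2 and 3 connecting them."""
    paths2 = sum(1 for p in nx.all_simple_paths(G, d, t, cutoff=2) if len(p) == 3)
    paths3 = sum(1 for p in nx.all_simple_paths(G, d, t, cutoff=3) if len(p) == 4)
    return [paths2, paths3]

pairs = [("d1", "t1", 1), ("d3", "t2", 1), ("d2", "t1", 1), ("d1", "t2", 0), ("d2", "t2", 0)]
X = np.array([pair_features(G, d, t) for d, t, _ in pairs])
y = np.array([label for _, _, label in pairs])
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict_proba([pair_features(G, "d3", "t1")]))  # score an unseen pair
```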

  19. Integrative biology approach identifies cytokine targeting strategies for psoriasis.

    Science.gov (United States)

    Perera, Gayathri K; Ainali, Chrysanthi; Semenova, Ekaterina; Hundhausen, Christian; Barinaga, Guillermo; Kassen, Deepika; Williams, Andrew E; Mirza, Muddassar M; Balazs, Mercedesz; Wang, Xiaoting; Rodriguez, Robert Sanchez; Alendar, Andrej; Barker, Jonathan; Tsoka, Sophia; Ouyang, Wenjun; Nestle, Frank O

    2014-02-12

    Cytokines are critical checkpoints of inflammation. The treatment of human autoimmune disease has been revolutionized by targeting inflammatory cytokines as key drivers of disease pathogenesis. Despite this, there exist numerous pitfalls when translating preclinical data into the clinic. We developed an integrative biology approach combining human disease transcriptome data sets with clinically relevant in vivo models in an attempt to bridge this translational gap. We chose interleukin-22 (IL-22) as a model cytokine because of its potentially important proinflammatory role in epithelial tissues. Injection of IL-22 into normal human skin grafts produced marked inflammatory skin changes resembling human psoriasis. Injection of anti-IL-22 monoclonal antibody in a human xenotransplant model of psoriasis, developed specifically to test potential therapeutic candidates, efficiently blocked skin inflammation. Bioinformatic analysis integrating both the IL-22 and anti-IL-22 cytokine transcriptomes and mapping them onto a psoriasis disease gene coexpression network identified key cytokine-dependent hub genes. Using knockout mice and small-molecule blockade, we show that one of these hub genes, the so far unexplored serine/threonine kinase PIM1, is a critical checkpoint for human skin inflammation and potential future therapeutic target in psoriasis. Using in silico integration of human data sets and biological models, we were able to identify a new target in the treatment of psoriasis.

  20. Enzyme (re)design: lessons from natural evolution and computation.

    Science.gov (United States)

    Gerlt, John A; Babbitt, Patricia C

    2009-02-01

    The (re)design of enzymes to catalyze 'new' reactions is a topic of considerable practical and intellectual interest. Directed evolution (random mutagenesis followed by screening/selection) has been used widely to identify novel biocatalysts. However, 'rational' approaches using either natural divergent evolution or computational predictions based on chemical principles have been less successful. This review summarizes recent progress in evolution-based and computation-based (re)design.

  1. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    Science.gov (United States)

    Purba, J. H.

    2018-02-01

    Initiating events of a nuclear power plant being evaluated need to be identified before probabilistic safety assessment can be applied to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis on which they were developed. This study proposed another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, representing the final objective of the safety functions, down to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirmed that the proposed MLD is feasible for finding HTGR initiating events.
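
    Structurally, such an MLD is a tree traversed top-down from the plant safety objective to initiating events at the leaves. The sketch below encodes a hypothetical fragment of such a diagram; the labels and hierarchy are invented for illustration and are not the paper's actual MLD.

```python
# Hypothetical fragment of a five-level MLD, encoded as a nested dict:
# top event -> safety function -> system -> failure mode -> initiating events.
mld = {
    "release of radioactive material": {
        "control heat removal": {
            "power conversion system": {
                "loss of generator load": ["power turbine generator trip"],
            },
            "offsite power supply": {
                "grid disconnection": ["loss of offsite power"],
            },
        },
    },
}

def initiating_events(node):
    """Depth-first walk to the leaves, which represent initiating events."""
    if isinstance(node, list):
        return node
    events = []
    for child in node.values():
        events.extend(initiating_events(child))
    return events

print(initiating_events(mld))  # -> ['power turbine generator trip', 'loss of offsite power']
```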

  2. Interpretative approaches to identifying sources of hydrocarbons in complex contaminated environments

    International Nuclear Information System (INIS)

    Sauer, T.C.; Brown, J.S.; Boehm, P.D.

    1993-01-01

    Recent advances in analytical instrumental hardware and software have permitted the use of more sophisticated approaches in identifying or fingerprinting sources of hydrocarbons in complex matrix environments. In natural resource damage assessments and contaminated site investigations of both terrestrial and aquatic environments, chemical fingerprinting has become an important interpretative tool. The alkyl homologues of the major polycyclic and heterocyclic aromatic hydrocarbons (e.g., phenanthrenes/anthracenes, dibenzothiophenes, chrysenes) have been found to be the most valuable hydrocarbons in differentiating hydrocarbon sources, but there are other hydrocarbon analytes, such as the chemical biomarkers steranes and triterpanes, and alkyl homologues of benzene, and chemical methodologies, such as scanning UV fluorescence, that have been found to be useful in certain environments. This presentation will focus on recent data interpretative approaches for hydrocarbon source identification assessments. Selection of appropriate target analytes and data quality requirements will be discussed and example cases, including the Arabian Gulf War oil spill results, will be presented

  3. Information Technology Service Management with Cloud Computing Approach to Improve Administration System and Online Learning Performance

    Directory of Open Access Journals (Sweden)

    Wilianto Wilianto

    2015-10-01

    This work discusses the development of information technology service management using a cloud computing approach to improve the performance of the administration system and online learning at STMIK IBBI Medan, Indonesia. The network topology is modeled and simulated for system administration and online learning. The same network topology is developed in cloud computing using the Amazon AWS architecture. The model is designed and modeled using Riverbed Academic Edition Modeler to obtain values of the parameters: delay, load, CPU utilization, and throughput. The simulation results are the following. For network topology 1, without cloud computing, the average delay is 54 ms, load 110 000 bits/s, CPU utilization 1.1%, and throughput 440 bits/s. With cloud computing, the average delay is 45 ms, load 2 800 bits/s, CPU utilization 0.03%, and throughput 540 bits/s. For network topology 2, without cloud computing, the average delay is 39 ms, load 3 500 bits/s, CPU utilization 0.02%, and database server throughput 1 400 bits/s. With cloud computing, the average delay is 26 ms, load 5 400 bits/s, CPU utilization of the email server 0.0001%, FTP server 0.001%, HTTP server 0.0002%, throughput of the email server 85 bits/s, FTP server 100 bits/s, and HTTP server 95 bits/s. Thus, the delay, the load, and the CPU utilization decrease, while the throughput increases. Information technology service management with the cloud computing approach has better performance.

  4. A Computer-Based Game That Promotes Mathematics Learning More than a Conventional Approach

    Science.gov (United States)

    McLaren, Bruce M.; Adams, Deanne M.; Mayer, Richard E.; Forlizzi, Jodi

    2017-01-01

    Excitement about learning from computer-based games has been palpable in recent years and has led to the development of many educational games. However, there are relatively few sound empirical studies in the scientific literature that have shown the benefits of learning mathematics from games as opposed to more traditional approaches. The…

  5. Candidate gene linkage approach to identify DNA variants that predispose to preterm birth

    DEFF Research Database (Denmark)

    Bream, Elise N A; Leppellere, Cara R; Cooper, Margaret E

    2013-01-01

    Background: The aim of this study was to identify genetic variants contributing to preterm birth (PTB) using a linkage candidate gene approach. Methods: We studied 99 single-nucleotide polymorphisms (SNPs) for 33 genes in 257 families with PTBs segregating. Nonparametric and parametric analyses were...... through the infant and/or the mother in the etiology of PTB....

  6. A Novel Goal-Oriented Approach for Training Older Adult Computer Novices: Beyond the Effects of Individual-Difference Factors.

    Science.gov (United States)

    Hollis-Sawyer, Lisa A.; Sterns, Harvey L.

    1999-01-01

    Spreadsheet training using either goal-oriented or verbal persuasion approach was given to 106 computer novices aged 50-89. Goal orientation achieved more changes in computer attitudes, efficacy, and proficiency. Intellectual ability and personality dimensions did not affect results. (SK)

  7. Vehicular traffic noise prediction using soft computing approach.

    Science.gov (United States)

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach for the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely, Generalized Linear Model, Decision Trees, Random Forests and Neural Networks, have been used to develop models to predict the hourly equivalent continuous sound pressure level, Leq, at different locations in the Patiala city in India. The input variables include the traffic volume per hour, percentage of heavy vehicles and average speed of vehicles. The performance of the four models is compared on the basis of performance criteria of coefficient of determination, mean square error and accuracy. 10-fold cross validation is done to check the stability of the Random Forest model, which gave the best results. A t-test is performed to check the fit of the model with the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
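
    The best-performing Random Forest variant of such a model can be reproduced in a few lines with scikit-learn; the data below are synthetic stand-ins for the Patiala measurements, and the synthetic target is shaped only loosely like a traffic-noise relation:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical hourly records: [vehicles/hour, % heavy vehicles, avg speed km/h] -> Leq dB(A)
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(100, 3000, 500),
                     rng.uniform(2, 30, 500),
                     rng.uniform(20, 80, 500)])
# Illustrative synthetic target (not a calibrated noise model).
y = 40 + 10 * np.log10(X[:, 0]) + 0.1 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 1, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
# 10-fold cross-validation on R^2, mirroring the stability check in the study.
print(cross_val_score(model, X, y, cv=10, scoring="r2").mean())
```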

  8. Integrative approaches to computational biomedicine

    Science.gov (United States)

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.

  9. ReputationPro: The Efficient Approaches to Contextual Transaction Trust Computation in E-Commerce Environments

    OpenAIRE

    Zhang, Haibin; Wang, Yan; Zhang, Xiuzhen; Lim, Ee-Peng

    2013-01-01

    In e-commerce environments, the trustworthiness of a seller is utterly important to potential buyers, especially when the seller is unknown to them. Most existing trust evaluation models compute a single value to reflect the general trust level of a seller without taking any transaction context information into account. In this paper, we first present a trust vector consisting of three values for Contextual Transaction Trust (CTT). In the computation of three CTT values, the identified three ...

  10. Combining phenotypic and proteomic approaches to identify membrane targets in a ‘triple negative’ breast cancer cell type

    Directory of Open Access Journals (Sweden)

    Rust Steven

    2013-02-01

    Full Text Available Abstract Background The continued discovery of therapeutic antibodies, which address unmet medical needs, requires the continued discovery of tractable antibody targets. Multiple protein-level target discovery approaches are available and these can be used in combination to extensively survey relevant cell membranomes. In this study, the MDA-MB-231 cell line was selected for membranome survey as it is a ‘triple negative’ breast cancer cell line, which represents a cancer subtype that is aggressive and has few treatment options. Methods The MDA-MB-231 breast carcinoma cell line was used to explore three membranome target discovery approaches, which were used in parallel to cross-validate the significance of identified antigens. A proteomic approach, which used membrane protein enrichment followed by protein identification by mass spectrometry, was used alongside two phenotypic antibody screening approaches. The first phenotypic screening approach was based on hybridoma technology and the second was based on phage display technology. Antibodies isolated by the phenotypic approaches were tested for cell specificity as well as internalisation and the targets identified were compared to each other as well as those identified by the proteomic approach. An anti-CD73 antibody derived from the phage display-based phenotypic approach was tested for binding to other ‘triple negative’ breast cancer cell lines and tested for tumour growth inhibitory activity in a MDA-MB-231 xenograft model. Results All of the approaches identified multiple cell surface markers, including integrins, CD44, EGFR, CD71, galectin-3, CD73 and BCAM, some of which had been previously confirmed as being tractable to antibody therapy. In total, 40 cell surface markers were identified for further study. In addition to cell surface marker identification, the phenotypic antibody screening approaches provided reagent antibodies for target validation studies. This is illustrated

  11. Improving accuracy for identifying related PubMed queries by an integrated approach.

    Science.gov (United States)

    Lu, Zhiyong; Wilbur, W John

    2009-10-01

    PubMed is the most widely used tool for searching biomedical literature online. As with many other online search tools, a user often types a series of multiple related queries before retrieving satisfactory results to fulfill a single information need. Meanwhile, it is also a common phenomenon to see a user type queries on unrelated topics in a single session. In order to study PubMed users' search strategies, it is necessary to be able to automatically separate unrelated queries and group together related queries. Here, we report a novel approach combining both lexical and contextual analyses for segmenting PubMed query sessions and identifying related queries and compare its performance with the previous approach based solely on concept mapping. We experimented with our integrated approach on sample data consisting of 1539 pairs of consecutive user queries in 351 user sessions. The prediction results of 1396 pairs agreed with the gold-standard annotations, achieving an overall accuracy of 90.7%. This demonstrates that our approach is significantly better than the previously published method. By applying this approach to a one day query log of PubMed, we found that a significant proportion of information needs involved more than one PubMed query, and that most of the consecutive queries for the same information need are lexically related. Finally, the proposed PubMed distance is shown to be an accurate and meaningful measure for determining the contextual similarity between biological terms. The integrated approach can play a critical role in handling real-world PubMed query log data as is demonstrated in our experiments.
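
    The paper's integrated method and its PubMed distance are not reproduced here; a hedged sketch of the general two-stage pattern (lexical test first, contextual fallback second) might look like the following, with placeholder thresholds and a dummy contextual measure:

```python
def lexical_similarity(q1: str, q2: str) -> float:
    """Jaccard overlap of query terms - a simple lexical signal."""
    t1, t2 = set(q1.lower().split()), set(q2.lower().split())
    return len(t1 & t2) / len(t1 | t2)

def related(q1: str, q2: str, contextual_sim, lex_thresh=0.2, ctx_thresh=0.5) -> bool:
    """Two-stage decision: accept on lexical evidence, otherwise fall back
    on a contextual similarity function (e.g. co-occurrence statistics)."""
    if lexical_similarity(q1, q2) >= lex_thresh:
        return True
    return contextual_sim(q1, q2) >= ctx_thresh

# Placeholder contextual measure, for demonstration only.
toy_ctx = lambda q1, q2: 0.8 if ("cancer" in q1 and "tumor" in q2) else 0.0
print(related("breast cancer genes", "brca1 mutation breast", toy_ctx))  # lexical match
print(related("lung cancer therapy", "tumor suppressor p53", toy_ctx))   # contextual match
```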

  12. Computational methods in metabolic engineering for strain design.

    Science.gov (United States)

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as to suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. Copyright © 2015 Elsevier Ltd. All rights reserved.
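
    As an illustration of the kind of computation these strain-design tools build on, here is a minimal flux balance analysis on a toy three-reaction network, with a naive single-knockout screen; the network, bounds, and objective are invented for the example:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximise biomass flux subject to S v = 0 and bounds.
# Reactions: R1 uptake (-> A), R2 (A -> B), R3 (B -> biomass, the objective).
S = np.array([[1, -1,  0],   # metabolite A balance
              [0,  1, -1]])  # metabolite B balance
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10

# linprog minimises, so negate the objective entry to maximise v3.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x)  # -> [10, 10, 10]

# A crude "gene deletion" screen: force one reaction to zero at a time.
for i in range(3):
    kb = list(bounds); kb[i] = (0, 0)
    ko = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=kb)
    print(f"knockout R{i+1}: biomass = {-ko.fun:.1f}")
```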

  13. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    Science.gov (United States)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, e.g., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach, considering the coupled processes that occur, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Due to the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation will be introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
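
    A schematic of the coupling idea: fit the void-ratio evolution from the single coupled base run as a function of the reference pressure at the fault base, then let the flow simulator update fault porosity and permeability from that function alone, with no iteration against a mechanical solver. The functional form and all constants below are invented for illustration.

```python
import numpy as np

# Pretend output of one fully coupled base run: reference pressure at the
# fault base (MPa) vs. void ratio in a representative fault element.
p_ref  = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
e_void = np.array([0.30, 0.32, 0.35, 0.39, 0.44])

# Approximate e(p_ref) once, e.g. with a quadratic fit.
coeff = np.polyfit(p_ref, e_void, deg=2)

def update_fault_properties(p):
    """Inside each flow time step: porosity/permeability update driven only
    by the reference pressure, so no mechanical solver call is needed."""
    e = np.polyval(coeff, p)
    phi = e / (1.0 + e)                 # porosity from void ratio
    k = 1e-15 * (phi / 0.25) ** 3       # toy cubic permeability-porosity scaling
    return phi, k

print(update_fault_properties(15.0))
```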

  14. CREATIVE APPROACHES TO COMPUTER SCIENCE EDUCATION

    Directory of Open Access Journals (Sweden)

    V. B. Raspopov

    2010-04-01

    Using the example of the PPS «Toolbox of Multimedia Lessons "For Children About Chopin"», we demonstrate the possibility of involving creative students in developing software packages for educational purposes. Similar projects can be assigned to school and college students studying computer science and informatics, and implemented under the teachers' supervision as advanced assignments or thesis projects within a high school IT or Computer Science course, a college course in Applied Scientific Research, or as part of the preparation for students' participation in Computer Science or IT competitions of the Youth Academy of Sciences (MAN in Russian and Ukrainian).

  15. A Systematic Approach to Identify Promising New Items for Small to Medium Enterprises: A Case Study

    Directory of Open Access Journals (Sweden)

    Sukjae Jeong

    2016-11-01

    Despite the growing importance of identifying new business items for small and medium enterprises (SMEs), most previous studies focus on conglomerates. The paucity of empirical studies has also led to limited real-life applications. Hence, this study proposes a systematic approach to find new business items (NBIs) that help prospective SMEs develop, evaluate, and select viable business items to survive the competitive environment. The proposed approach comprises two stages: (1) the classification of diversification of SMEs; and (2) the searching and screening of business items. In the first stage, SMEs are allocated to five groups, based on their internal technological competency and external market conditions. In the second stage, based on the types of SMEs identified in the first stage, a set of alternative business items is derived by combining the results of portfolio analysis and benchmarking analysis. After deriving new business items, a market- and technology-driven matrix analysis is utilized to screen suitable business items, and the Bruce Merrifield-Ohe (BMO) method is used to categorize and identify prospective items based on market attractiveness and internal capability. To illustrate the applicability of the proposed approach, a case study is presented.

  16. Inhibitors of HIV-protease from computational design. A history of theory and synthesis still to be fully appreciated.

    Science.gov (United States)

    Berti, Federico; Frecer, Vladimir; Miertus, Stanislav

    2014-01-01

    Although HIV-protease has been a drug target for over 20 years, computational approaches to the rational design of its inhibitors still have great potential to stimulate the synthesis of new compounds and the discovery of new, potent derivatives capable of overcoming the problem of drug resistance. This review deals with successful examples of inhibitors identified by computational approaches, rather than by knowledge-based design. Such methodologies include the development of energy and scoring functions, docking protocols, statistical models, and virtual combinatorial chemistry. Computations addressing drug resistance, and the development of related models such as the substrate envelope hypothesis, are also reviewed. In some cases, the identified structures required the development of synthetic approaches in order to obtain the desired target molecules; several examples are reported.

  17. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  18. Adiabatic graph-state quantum computation

    International Nuclear Information System (INIS)

    Antonio, B; Anders, J; Markham, D

    2014-01-01

    Measurement-based quantum computation (MBQC) and holonomic quantum computation (HQC) are two very different computational methods. The computation in MBQC is driven by adaptive measurements executed in a particular order on a large entangled state. In contrast in HQC the system starts in the ground subspace of a Hamiltonian which is slowly changed such that a transformation occurs within the subspace. Following the approach of Bacon and Flammia, we show that any MBQC on a graph state with generalized flow (gflow) can be converted into an adiabatically driven holonomic computation, which we call adiabatic graph-state quantum computation (AGQC). We then investigate how properties of AGQC relate to the properties of MBQC, such as computational depth. We identify a trade-off that can be made between the number of adiabatic steps in AGQC and the norm of dH/dt as well as the degree of H, in analogy to the trade-off between the number of measurements and classical post-processing seen in MBQC. Finally the effects of performing AGQC with orderings that differ from standard MBQC are investigated. (paper)

  19. Outbreaks source: A new mathematical approach to identify their possible location

    Science.gov (United States)

    Buscema, Massimo; Grossi, Enzo; Breda, Marco; Jefferson, Tom

    2009-11-01

    Classical epidemiology has generally relied on the description and explanation of the occurrence of infectious diseases in relation to time occurrence of events rather than to place of occurrence. In recent times, computer generated dot maps have facilitated the modeling of the spread of infectious epidemic diseases either with classical statistics approaches or with artificial “intelligent systems”. Few attempts, however, have been made so far to identify the origin of the epidemic spread rather than its evolution by mathematical topology methods. We report on the use of a new artificial intelligence method (the H-PST Algorithm) and we compare this new technique with other well known algorithms to identify the source of three examples of infectious disease outbreaks derived from the literature. The H-PST algorithm is a new system able to project a distance matrix of points (events) into a bi-dimensional space, with the generation of a new point, named the hidden unit. This new hidden unit deforms the original Euclidean space and transforms it into a new space (cognitive space). The cost function of this transformation is the minimization of the differences between the original distance matrix among the assigned points and the distance matrix of the same points projected into the bi-dimensional map (or any different set of constraints). For many reasons we will discuss, the position of the hidden unit turns out to target the outbreak source in many epidemics much better than the other classic algorithms specifically targeted for this task. Compared with the main algorithms known in location theory, the hidden unit was within yards of the outbreak source in the first example (the 2007 epidemic of Chikungunya fever in Italy). The hidden unit was located in the river between the two village epicentres of the spread, exactly where the index case was living. Equally in the second (the 1967 foot and mouth disease epidemic in England), and the third (1854 London Cholera epidemic
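
    The published H-PST cost function is more elaborate than the abstract describes; as a rough, hedged illustration of the "hidden unit" idea, one can hold the observed event positions fixed and search for a single extra point whose star-shaped distances best reproduce the observed distance matrix. All data below are synthetic and the cost is a schematic reading, not the actual algorithm.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

# Toy epidemic: case locations scattered around an unknown source near (0, 0).
rng = np.random.default_rng(1)
cases = rng.normal(0.0, 0.5, size=(12, 2))
D = squareform(pdist(cases))               # distance matrix among the events
iu = np.triu_indices(len(cases), k=1)

def cost(h):
    """Mismatch between the observed distances and distances measured through
    the hidden unit h; minimising it pulls h toward a common 'origin' of the
    events (a schematic reading of the H-PST cost, positions held fixed)."""
    via_h = np.linalg.norm(cases - h, axis=1)
    D_h = via_h[:, None] + via_h[None, :]
    return np.sum((D_h[iu] - D[iu]) ** 2)

res = minimize(cost, x0=cases.mean(axis=0), method="Nelder-Mead")
print("estimated source (hidden unit):", res.x)
```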

  20. An approach to computing discrete adjoints for MPI-parallelized models applied to Ice Sheet System Model 4.11

    Directory of Open Access Journals (Sweden)

    E. Larour

    2016-11-01

    Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar, gravity, and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model (ISSM), written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written, but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of the ISSM. We present a comprehensive approach to (1) carry out type changing through the ISSM, hence facilitating operator overloading, (2) bind to external solvers such as MUMPS and GSL-LU, and (3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the northeastern Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica.
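
    ISSM applies reverse-mode (adjoint) differentiation through operator overloading in C++ together with AdjoinableMPI; purely as an illustration of the type-changing idea itself, here is a minimal forward-mode sketch using dual numbers, which is not the ISSM implementation:

```python
class Dual:
    """Minimal forward-mode AD type: overloaded arithmetic propagates
    derivatives alongside values - the same type-changing trick, in
    miniature, that makes a numerical code differentiable."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(x):
    # Any code written generically against the overloaded type is differentiable.
    return 3 * x * x + 2 * x + 1

y = model(Dual(2.0, 1.0))      # seed derivative dx/dx = 1
print(y.val, y.der)            # value 17.0, derivative 6x + 2 = 14.0
```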

  1. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today implemented in many areas of biomedical sciences. We present a generally applicable method to study dehydration of hydrates based on MD simulations and apply this approach...... to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration.... The structural changes could be followed in real time, and in addition, an intermediate amorphous phase was identified. The computationally identified dehydrated structure (anhydrate) was slightly different from the experimentally known anhydrate structure, suggesting that the simulated computational structure

  2. Computer based virtual reality approach towards its application in an accidental emergency at nuclear power plant

    International Nuclear Information System (INIS)

    Yan Jun; Yao Qingshan

    1999-01-01

    Virtual reality is a computer-based system for creating and experiencing virtual worlds. As an emerging branch of the computer discipline, this approach is expanding rapidly and is widely used in a variety of industries such as national defence, research, engineering, medicine and air navigation. The author intends to present the fundamentals of virtual reality, in an attempt to study some aspects of interest for use in nuclear power emergency planning.

  3. Merging K-means with hierarchical clustering for identifying general-shaped groups.

    Science.gov (United States)

    Peterson, Anna D; Ghosh, Arka P; Maitra, Ranjan

    2018-01-01

    Clustering partitions a dataset such that observations placed together in a group are similar but different from those in other groups. Hierarchical and K-means clustering are two approaches but have different strengths and weaknesses. For instance, hierarchical clustering identifies groups in a tree-like structure but suffers from computational complexity in large datasets, while K-means clustering is efficient but designed to identify homogeneous spherically-shaped clusters. We present a hybrid non-parametric clustering approach that amalgamates the two methods to identify general-shaped clusters and that can be applied to larger datasets. Specifically, we first partition the dataset into spherical groups using K-means. We next merge these groups using hierarchical methods with a data-driven distance measure as a stopping criterion. Our proposal has the potential to reveal groups with general shapes and structure in a dataset. We demonstrate good performance on several simulated and real datasets.
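
    A compact rendition of the two-stage idea with scikit-learn and SciPy follows; it uses a fixed final cluster count as a simplified stand-in for the paper's data-driven stopping criterion:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons

# Two crescent-shaped (non-spherical) groups that plain K-means cannot recover.
X, _ = make_moons(n_samples=500, noise=0.05, random_state=0)

# Stage 1: over-partition into many small spherical clusters with K-means.
km = KMeans(n_clusters=30, n_init=10, random_state=0).fit(X)
centers = km.cluster_centers_

# Stage 2: merge the small clusters hierarchically; single linkage chains
# neighbouring spheres together, recovering the general shapes.
Z = linkage(centers, method="single")
merged = fcluster(Z, t=2, criterion="maxclust")   # simplified stopping rule
labels = merged[km.labels_]                        # map points to merged groups
print(np.unique(labels))
```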

  4. The Educator´s Approach to Media Training and Computer Games within Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer game playing as part of school-children's leisure time and deals with the significance of media training in leisure time. It first specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excessive computer game playing and means of prevention. The paper deals with the educator's personality and the importance of ...

  5. A flexible, extendable, modular and computationally efficient approach to scattering-integral-based seismic full waveform inversion

    Science.gov (United States)

    Schumacher, F.; Friederich, W.; Lamara, S.

    2016-02-01

    We present a new conceptual approach to scattering-integral-based seismic full waveform inversion (FWI) that allows a flexible, extendable, modular and both computationally and storage-efficient numerical implementation. To achieve maximum modularity and extendability, interactions between the three fundamental steps carried out sequentially in each iteration of the inversion procedure, namely, solving the forward problem, computing waveform sensitivity kernels and deriving a model update, are kept at an absolute minimum and are implemented by dedicated interfaces. To realize storage efficiency and maximum flexibility, the spatial discretization of the inverted earth model is allowed to be completely independent of the spatial discretization employed by the forward solver. For computational efficiency reasons, the inversion is done in the frequency domain. The benefits of our approach are as follows: (1) Each of the three stages of an iteration is realized by a stand-alone software program. In this way, we avoid the monolithic, inflexible and hard-to-modify codes that have often been written for solving inverse problems. (2) The solution of the forward problem, required for kernel computation, can be obtained by any wave propagation modelling code, giving users maximum flexibility in choosing the forward modelling method. Both time-domain and frequency-domain approaches can be used. (3) Forward solvers typically demand spatial discretizations that are significantly denser than actually desired for the inverted model. Exploiting this fact by pre-integrating the kernels allows a dramatic reduction of disk space and makes kernel storage feasible. No assumptions are made on the spatial discretization scheme employed by the forward solver. (4) In addition, working in the frequency domain effectively reduces the amount of data, the number of kernels to be computed and the number of equations to be solved. (5) Updating the model by solving a large equation system can be

  6. Fourier-based approach to interpolation in single-slice helical computed tomography

    International Nuclear Information System (INIS)

    La Riviere, Patrick J.; Pan Xiaochuan

    2001-01-01

    It has recently been shown that longitudinal aliasing can be a significant and detrimental presence in reconstructed single-slice helical computed tomography (CT) volumes. This aliasing arises because the directly measured data in helical CT are generally undersampled by a factor of at least 2 in the longitudinal direction and because the exploitation of the redundancy of fanbeam data acquired over 360° to generate additional longitudinal samples does not automatically eliminate the aliasing. In this paper we demonstrate that for pitches near 1 or lower, the redundant fanbeam data, when used properly, can provide sufficient information to satisfy a generalized sampling theorem and thus to eliminate aliasing. We develop and evaluate a Fourier-based algorithm, called 180FT, that accomplishes this. As background we present a second Fourier-based approach, called 360FT, that makes use only of the directly measured data. Both Fourier-based approaches exploit the fast Fourier transform and the Fourier shift theorem to generate from the helical projection data a set of fanbeam sinograms corresponding to equispaced transverse slices. Slice-by-slice reconstruction is then performed by use of two-dimensional fanbeam algorithms. The proposed approaches are compared to their counterparts based on the use of linear interpolation - the 360LI and 180LI approaches. The aliasing suppression property of the 180FT approach is a clear advantage of the approach and represents a step toward the desirable goal of achieving uniform longitudinal resolution properties in reconstructed helical CT volumes
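
    The workhorse of both Fourier-based approaches is the shift theorem: multiplying the FFT of a signal by a phase ramp resamples it at a fractional offset. A minimal demonstration on a band-limited test signal follows; it is schematic, not the 180FT algorithm itself:

```python
import numpy as np

def fourier_shift(signal, delta):
    """Shift a uniformly sampled signal by a (possibly fractional) number of
    samples via the shift theorem: x(t - delta) <-> X(k) * exp(-2i*pi*k*delta/N)."""
    n = len(signal)
    k = np.fft.fftfreq(n, d=1.0)   # frequencies in cycles per sample
    return np.real(np.fft.ifft(np.fft.fft(signal) * np.exp(-2j * np.pi * k * delta)))

# Band-limited test signal sampled at integer positions.
t = np.arange(64)
x = np.sin(2 * np.pi * 3 * t / 64)
x_half = fourier_shift(x, 0.5)     # resampled at half-sample offsets

# Check against the exact half-sample values.
print(np.allclose(x_half, np.sin(2 * np.pi * 3 * (t - 0.5) / 64), atol=1e-10))
```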

  7. A computational approach to climate science education with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for i/o and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.

  8. Identification of novel adhesins of M. tuberculosis H37Rv using integrated approach of multiple computational algorithms and experimental analysis.

    Directory of Open Access Journals (Sweden)

    Sanjiv Kumar

    Pathogenic bacteria interacting with a eukaryotic host express adhesins on their surface. These adhesins aid in bacterial attachment to the host cell receptors during colonization. A few adhesins, such as Heparin binding hemagglutinin adhesin (HBHA), Apa and Malate Synthase of M. tuberculosis, have been identified using specific experimental interaction models based on the biological knowledge of the pathogen. In the present work, we carried out computational screening for adhesins of M. tuberculosis. We used an integrated computational approach using SPAAN for predicting adhesins, PSORTb, SubLoc and LocTree for extracellular localization, and BLAST for verifying non-similarity to human proteins. These steps are among the first steps of reverse vaccinology. Multiple claims and attacks from different algorithms were processed through an argumentative approach. Additional filtration criteria included selection for proteins with low molecular weights and absence of literature reports. We examined the binding potential of the selected proteins using an image-based ELISA. The protein Rv2599 (membrane protein) binds to human fibronectin, laminin and collagen. Rv3717 (N-acetylmuramoyl-L-alanine amidase) and Rv0309 (L,D-transpeptidase) bind to fibronectin and laminin. We report Rv2599 (membrane protein), Rv0309 and Rv3717 as novel adhesins of M. tuberculosis H37Rv. Our results expand the number of known adhesins of M. tuberculosis and suggest their regulated expression in different stages.

  9. A direct approach to fault-tolerance in measurement-based quantum computation via teleportation

    International Nuclear Information System (INIS)

    Silva, Marcus; Danos, Vincent; Kashefi, Elham; Ollivier, Harold

    2007-01-01

    We discuss a simple variant of the one-way quantum computing model (Raussendorf R and Briegel H-J 2001 Phys. Rev. Lett. 86 5188), called the Pauli measurement model, where measurements are restricted to be along the eigenbases of the Pauli X and Y operators, while qubits can be initially prepared both in the |+_π/4⟩ := (1/√2)(|0⟩ + e^{iπ/4}|1⟩) state and the usual |+⟩ := (1/√2)(|0⟩ + |1⟩) state. We prove the universality of this quantum computation model, and establish a standardization procedure which permits all entanglement and state preparation to be performed at the beginning of computation. This leads us to develop a direct approach to fault-tolerance by simple transformations of the entanglement graph and preparation operations, while error correction is performed naturally via syndrome-extracting teleportations

  10. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of the state-of-the-art resource discovery techniques rely on the static resource attributes during resource selection. However, the matching resources based on the static resource attributes may not be the most appropriate resources for the execution of user applications because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable resources. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for discovery of grid resources by using P2P formalism. The proposed approach considers multiple resource attributes for decision making of resource selection and provides the best suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW (simple additive weighting) method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase of our proposed methodology applies an integrated MADM approach (AHP-enriched PROMETHEE-II) on the list of selected resources received from different super-peers. The pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
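
    The first-phase SAW ranking is straightforward to sketch: normalise benefit and cost attribute columns, weight them, and sort by the weighted sum. The attribute values and weights below are invented for the example:

```python
import numpy as np

# Rows = candidate resources; columns = [CPU speed GHz, free RAM GB, job load].
resources = ["res-A", "res-B", "res-C"]
X = np.array([[2.4, 16.0, 0.7],
              [3.0,  8.0, 0.2],
              [2.8, 32.0, 0.9]])
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([True, True, False])   # job load is a cost criterion

# SAW: normalise benefit columns by column max, cost columns by min/value.
N = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
scores = N @ weights
for name, s in sorted(zip(resources, scores), key=lambda p: -p[1]):
    print(f"{name}: {s:.3f}")
```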

  11. Application of Computer Simulation to Identify Erosion Resistance of Materials of Wet-steam Turbine Blades

    Science.gov (United States)

    Korostelyov, D. A.; Dergachyov, K. V.

    2017-10-01

    The use of computer simulation to identify the erosion resistance of materials, coatings, linings and solderings for wet-steam turbine rotor blades is considered. Numerical experiments to determine the erosion resistance of wet-steam turbine blade materials are described. Kinetic curves for the erosion area and the weight of worn rotor blade material have been obtained for the K-300-240 LMP turbine and the turbine of the atomic icebreaker “Lenin”. Conclusions are also drawn about the effectiveness of different erosion-resistant materials and rotor blade protection configurations.

  12. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  13. Selection of personalized patient therapy through the use of knowledge-based computational models that identify tumor-driving signal transduction pathways.

    Science.gov (United States)

    Verhaegh, Wim; van Ooijen, Henk; Inda, Márcia A; Hatzis, Pantelis; Versteeg, Rogier; Smid, Marcel; Martens, John; Foekens, John; van de Wiel, Paul; Clevers, Hans; van de Stolpe, Anja

    2014-06-01

    Increasing knowledge about signal transduction pathways as drivers of cancer growth has elicited the development of "targeted drugs," which inhibit aberrant signaling pathways. They require a companion diagnostic test that identifies the tumor-driving pathway; however, currently available tests like estrogen receptor (ER) protein expression for hormonal treatment of breast cancer do not reliably predict therapy response, at least in part because they do not adequately assess functional pathway activity. We describe a novel approach to predict signaling pathway activity based on knowledge-based Bayesian computational models, which interpret quantitative transcriptome data as the functional output of an active signaling pathway, by using expression levels of transcriptional target genes. Following calibration on only a small number of cell lines or cohorts of patient data, they provide a reliable assessment of signaling pathway activity in tumors of different tissue origin. As proof of principle, models for the canonical Wnt and ER pathways are presented, including initial clinical validation on independent datasets from various cancer types. ©2014 American Association for Cancer Research.

  14. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram) modeling, and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description includes composition ranges typical of coating alloys and hence allows prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.

  15. Identifying bully victims: definitional versus behavioral approaches.

    Science.gov (United States)

    Green, Jennifer Greif; Felix, Erika D; Sharkey, Jill D; Furlong, Michael J; Kras, Jennifer E

    2013-06-01

    Schools frequently assess bullying and the Olweus Bully/Victimization Questionnaire (BVQ; Olweus, 1996) is the most widely adopted tool for this purpose. The BVQ is a self-report survey that uses a definitional measurement method--describing "bullying" as involving repeated, intentional aggression in a relationship where there is an imbalance of power and then asking respondents to indicate how frequently they experienced this type of victimization. Few studies have examined BVQ validity and whether this definitional method truly identifies the repetition and power differential that distinguish bullying from other forms of peer victimization. This study examined the concurrent validity of the BVQ definitional question among 435 students reporting peer victimization. BVQ definitional responses were compared with responses to a behavioral measure that did not use the term "bullying" but, instead, included items that asked about its defining characteristics (repetition, intentionality, power imbalance). Concordance between the two approaches was moderate, with an area under the receiver operating characteristic curve of .72. BVQ responses were more strongly associated with students indicating repeated victimization and multiple forms of victimization, than with power imbalance in their relationship with the bully. Findings indicate that the BVQ is a valid measure of repeated victimization and a broad range of victimization experiences but may not detect the more subtle and complex power imbalances that distinguish bullying from other forms of peer victimization. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  16. Monitoring and Auditing Residual Information on the User’s Computer

    Directory of Open Access Journals (Sweden)

    Vladlena Sergeevna Oladk

    2015-06-01

    Full Text Available This paper considers how residual information on a user's computer can violate information security components such as confidentiality and availability. It analyzes the requirements of regulators and the mechanisms an organization can apply to monitor residual information or destroy it. An approach to monitoring and auditing residual information on the user's computer is proposed, which allows residual information in specified areas to be monitored. The approach identifies the detected information and ranks it according to its degree of criticality, calculates the risk of leakage, and supports the development of recommendations aimed at reducing that risk. The proposed approach is formally described and automated in a software system.

  17. Computational methods for protein identification from mass spectrometry data.

    Directory of Open Access Journals (Sweden)

    Leo McHugh

    2008-02-01

    Full Text Available Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology.

  18. A novel computational method identifies intra- and inter-species recombination events in Staphylococcus aureus and Streptococcus pneumoniae.

    Directory of Open Access Journals (Sweden)

    Lisa Sanguinetti

    Full Text Available Advances in high-throughput DNA sequencing technologies have led to an explosion in the number of sequenced bacterial genomes. Comparative sequence analysis frequently reveals evidence of homologous recombination occurring with different mechanisms and rates in different species, but the large-scale use of computational methods to identify recombination events is hampered by their high computational costs. Here, we propose a new method to identify recombination events in large datasets of whole genome sequences. Using a filtering procedure of the gene conservation profiles of a test genome against a panel of strains, this algorithm identifies sets of contiguous genes acquired by homologous recombination. The locations of the recombination breakpoints are determined using a statistical test that is able to account for the differences in the natural rate of evolution between different genes. The algorithm was tested on a dataset of 75 genomes of Staphylococcus aureus and 50 genomes comprising different streptococcal species, and was able to detect intra-species recombination events in S. aureus and in Streptococcus pneumoniae. Furthermore, we found evidence of an inter-species exchange of genetic material between S. pneumoniae and Streptococcus mitis, a closely related commensal species that colonizes the same ecological niche. The method has been implemented in an R package, Reco, which is freely available from supplementary material, and provides a rapid screening tool to investigate recombination on a genome-wide scale from sequence data.

  19. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    Science.gov (United States)

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  20. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    Science.gov (United States)

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistical analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
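
The second-stage comparison can be illustrated with a small, hedged example: computing the Mahalanobis distance of one individual's activation peak from a control group's peaks using SciPy. The coordinates below are invented for illustration.

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

# Toy control-group peak coordinates (x, y, z in MNI-like space, made up here).
control_peaks = np.array([
    [-44, -58, -12], [-46, -54, -10], [-42, -60, -14],
    [-45, -56, -11], [-43, -59, -13], [-47, -55, -12],
])
patient_peak = np.array([-30, -62, -8])

# Inverse covariance of the control peaks defines the distance metric.
cov_inv = np.linalg.inv(np.cov(control_peaks, rowvar=False))
d = mahalanobis(patient_peak, control_peaks.mean(axis=0), cov_inv)
print(f"Mahalanobis distance of patient peak from control peaks: {d:.2f}")
```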

  1. How can computers support, enrich, and transform collaborative creativity

    DEFF Research Database (Denmark)

    Dalsgaard, Peter; Inie, Nanna; Hansen, Nicolai Brodersen

    2017-01-01

    …of different approaches to providing digital support for collaborative creativity. Participation in the workshop requires participants to actively document and identify salient themes in one or more examples of computer-supported collaborative creativity, and the resulting material will serve as the empirical...

  2. An integrative -omics approach to identify functional sub-networks in human colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Rod K Nibbe

    2010-01-01

    Full Text Available Emerging evidence indicates that gene products implicated in human cancers often cluster together in "hot spots" in protein-protein interaction (PPI networks. Additionally, small sub-networks within PPI networks that demonstrate synergistic differential expression with respect to tumorigenic phenotypes were recently shown to be more accurate classifiers of disease progression when compared to single targets identified by traditional approaches. However, many of these studies rely exclusively on mRNA expression data, a useful but limited measure of cellular activity. Proteomic profiling experiments provide information at the post-translational level, yet they generally screen only a limited fraction of the proteome. Here, we demonstrate that integration of these complementary data sources with a "proteomics-first" approach can enhance the discovery of candidate sub-networks in cancer that are well-suited for mechanistic validation in disease. We propose that small changes in the mRNA expression of multiple genes in the neighborhood of a protein-hub can be synergistically associated with significant changes in the activity of that protein and its network neighbors. Further, we hypothesize that proteomic targets with significant fold change between phenotype and control may be used to "seed" a search for small PPI sub-networks that are functionally associated with these targets. To test this hypothesis, we select proteomic targets having significant expression changes in human colorectal cancer (CRC from two independent 2-D gel-based screens. Then, we use random walk based models of network crosstalk and develop novel reference models to identify sub-networks that are statistically significant in terms of their functional association with these proteomic targets. Subsequently, using an information-theoretic measure, we evaluate synergistic changes in the activity of identified sub-networks based on genome-wide screens of mRNA expression in CRC
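
One common way to model network crosstalk from seed proteins is a random walk with restart; the sketch below shows that generic technique on a toy adjacency matrix (the paper's actual reference models and significance statistics are more elaborate).

```python
import numpy as np

# Minimal random-walk-with-restart sketch over a tiny PPI adjacency matrix.
# Nodes 0-1 are the proteomic "seed" targets; all values are illustrative only.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W = A / A.sum(axis=0)                     # column-normalised transition matrix
restart = np.array([0.5, 0.5, 0, 0, 0])   # probability mass on the seeds
alpha = 0.3                               # restart probability

p = restart.copy()
for _ in range(100):                      # iterate to (approximate) convergence
    p = (1 - alpha) * W @ p + alpha * restart
print(np.round(p, 3))                     # high-scoring nodes ~ candidate sub-network
```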

  3. Numerical methodologies for investigation of moderate-velocity flow using a hybrid computational fluid dynamics - molecular dynamics simulation approach

    International Nuclear Information System (INIS)

    Ko, Soon Heum; Kim, Na Yong; Nikitopoulos, Dimitris E.; Moldovan, Dorel; Jha, Shantenu

    2014-01-01

    Numerical approaches are presented to minimize the statistical errors inherently present due to finite sampling and the presence of thermal fluctuations in the molecular region of a hybrid computational fluid dynamics (CFD) - molecular dynamics (MD) flow solution. Near the fluid-solid interface the hybrid CFD-MD simulation approach provides a more accurate solution, especially in the presence of significant molecular-level phenomena, than the traditional continuum-based simulation techniques. It also involves less computational cost than the pure particle-based MD. Despite these advantages, the hybrid CFD-MD methodology has been applied mostly in flow studies at high velocities, mainly because of the higher statistical errors associated with low velocities. As an alternative to the costly increase of the size of the MD region to decrease statistical errors, we investigate a few numerical approaches that reduce the sampling noise of the solution at moderate velocities. These methods are based on sampling of multiple simulation replicas and linear regression of multiple spatial/temporal samples. We discuss the advantages and disadvantages of each technique from the perspective of solution accuracy and computational cost.

  4. Parallel computations of molecular dynamics trajectories using the stochastic path approach

    Science.gov (United States)

    Zaloj, Veaceslav; Elber, Ron

    2000-06-01

    A novel protocol to parallelize molecular dynamics trajectories is discussed and tested on a cluster of PCs running the NT operating system. The new technique does not propagate the solution in small time steps, but uses instead a global optimization of a functional of the whole trajectory. The new approach is especially attractive for parallel and distributed computing and its advantages (and disadvantages) are presented. Two numerical examples are discussed: (a) A conformational transition in a solvated dipeptide, and (b) The R→T conformational transition in solvated hemoglobin.

  5. An approach to quantum-computational hydrologic inverse analysis.

    Science.gov (United States)

    O'Malley, Daniel

    2018-05-02

    Making predictions about flow and transport in an aquifer requires knowledge of the heterogeneous properties of the aquifer such as permeability. Computational methods for inverse analysis are commonly used to infer these properties from quantities that are more readily observable such as hydraulic head. We present a method for computational inverse analysis that utilizes a type of quantum computer called a quantum annealer. While quantum computing is in an early stage compared to classical computing, we demonstrate that it is sufficiently developed that it can be used to solve certain subsurface flow problems. We utilize a D-Wave 2X quantum annealer to solve 1D and 2D hydrologic inverse problems that, while small by modern standards, are similar in size and sometimes larger than hydrologic inverse problems that were solved with early classical computers. Our results and the rapid progress being made with quantum computing hardware indicate that the era of quantum-computational hydrology may not be too far in the future.
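
As a flavor of how such problems are posed for a quantum annealer, here is a hedged sketch that builds a tiny QUBO and solves it with the open-source dimod package's exact (classical) solver; a hardware run would substitute a D-Wave sampler, and the QUBO coefficients here are arbitrary, not derived from a hydrologic model.

```python
import dimod

# Tiny QUBO over binary variables x0, x1; in an inverse problem the Q matrix
# would encode the squared misfit between model predictions and observations.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

bqm = dimod.BinaryQuadraticModel.from_qubo(Q)
sampleset = dimod.ExactSolver().sample(bqm)   # brute force, fine at this size
print(sampleset.first.sample, sampleset.first.energy)
```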

  6. Coronary plaque quantification and fractional flow reserve by coronary computed tomography angiography identify ischaemia-causing lesions

    DEFF Research Database (Denmark)

    Gaur, Sara; Øvrehus, Kristian Altern; Dey, Damini

    2016-01-01

    AIMS: Coronary plaque characteristics are associated with ischaemia. Differences in plaque volumes and composition may explain the discordance between coronary stenosis severity and ischaemia. We evaluated the association between coronary stenosis severity, plaque characteristics, coronary computed tomography angiography (CTA)-derived fractional flow reserve (FFRCT), and lesion-specific ischaemia identified by FFR in a substudy of the NXT trial (Analysis of Coronary Blood Flow Using CT Angiography: Next Steps). METHODS AND RESULTS: Coronary CTA stenosis, plaque volumes, FFRCT, and FFR were assessed…

  7. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
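
The core idea, maximizing a likelihood function over unknown parameters of a dynamic system, can be sketched in a few lines of modern Python (MXLKID itself is LRLTRAN code for a CDC7600). The one-parameter decay model and noise level below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical one-parameter system x(t) = exp(-k t) observed with Gaussian
# noise; we recover k by maximising the likelihood (minimising its negative).
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 50)
k_true, sigma = 0.8, 0.05
y = np.exp(-k_true * t) + rng.normal(0, sigma, t.size)

def neg_log_likelihood(params):
    k = params[0]
    resid = y - np.exp(-k * t)
    return 0.5 * np.sum(resid**2) / sigma**2   # Gaussian NLL up to a constant

result = minimize(neg_log_likelihood, x0=[0.1], method="Nelder-Mead")
print(f"estimated k = {result.x[0]:.3f} (true value {k_true})")
```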

  8. An Organic Computing Approach to Self-organising Robot Ensembles

    Directory of Open Access Journals (Sweden)

    Sebastian Albrecht von Mammen

    2016-11-01

    Full Text Available Similar to the Autonomic Computing initiative, that has mainly been advancing techniques for self-optimisation focussing on computing systems and infrastructures, Organic Computing (OC) has been driving the development of system design concepts and algorithms for self-adaptive systems at large. Examples of application domains include, for instance, traffic management and control, cloud services, communication protocols, and robotic systems. Such an OC system typically consists of a potentially large set of autonomous and self-managed entities, where each entity acts with a local decision horizon. By means of cooperation of the individual entities, the behaviour of the entire ensemble system is derived. In this article, we present our work on how autonomous, adaptive robot ensembles can benefit from OC technology. Our elaborations are aligned with the different layers of an observer/controller framework which provides the foundation for the individuals' adaptivity at system design-level. Relying on an extended Learning Classifier System (XCS) in combination with adequate simulation techniques, this basic system design empowers robot individuals to improve their individual and collaborative performances, e.g. by means of adapting to changing goals and conditions. Not only for the sake of generalisability, but also because of its enormous transformative potential, we stage our research in the domain of robot ensembles that are typically comprised of several quad-rotors and that organise themselves to fulfil spatial tasks such as maintenance of building facades or the collaborative search for mobile targets. Our elaborations detail the architectural concept, provide examples of individual self-optimisation as well as of the optimisation of collaborative efforts, and we show how the user can control the ensembles at multiple levels of abstraction. We conclude with a summary of our approach and an outlook on possible future steps.

  9. Clustering approaches to identifying gene expression patterns from DNA microarray data.

    Science.gov (United States)

    Do, Jin Hwan; Choi, Dong-Kug

    2008-04-30

    The analysis of microarray data is essential for handling the large amounts of gene expression data these experiments generate. In this review we focus on clustering techniques. The biological rationale for this approach is the fact that many co-expressed genes are co-regulated, and identifying co-expressed genes could aid in the functional annotation of novel genes, de novo identification of transcription factor binding sites, and elucidation of complex biological pathways. Co-expressed genes are usually identified in microarray experiments by clustering techniques. There are many such methods, and the results obtained even for the same datasets may vary considerably depending on the algorithms and metrics for dissimilarity measures used, as well as on user-selectable parameters such as the desired number of clusters and initial values. Therefore, biologists who want to interpret microarray data should be aware of the weaknesses and strengths of the clustering methods used. In this review, we survey the basic principles of clustering of DNA microarray data, from crisp clustering algorithms such as hierarchical clustering, K-means and self-organizing maps, to complex clustering algorithms like fuzzy clustering.
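
As a minimal illustration of the crisp clustering algorithms surveyed, the following sketch applies K-means to a toy gene-by-condition expression matrix with scikit-learn; the data and the choice of two clusters are illustrative of the user-selectable parameters the review cautions about.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy expression matrix: 6 genes x 4 conditions (values are illustrative).
expression = np.array([
    [8.1, 8.3, 2.0, 2.2],   # genes 0-2 share one pattern
    [7.9, 8.0, 2.1, 2.4],
    [8.3, 8.1, 1.9, 2.0],
    [2.2, 2.0, 7.8, 8.2],   # genes 3-5 share the opposite pattern
    [2.1, 2.3, 8.0, 8.1],
    [1.9, 2.2, 8.1, 7.9],
])

# n_clusters and n_init are user-selectable parameters the review warns about.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(expression)
print(km.labels_)   # e.g. [0 0 0 1 1 1]: two co-expressed gene groups
```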

  10. Diagnostic Accuracy of Periapical Radiography and Cone-beam Computed Tomography in Identifying Root Canal Configuration of Human Premolars.

    Science.gov (United States)

    Sousa, Thiago Oliveira; Haiter-Neto, Francisco; Nascimento, Eduarda Helena Leandro; Peroni, Leonardo Vieira; Freitas, Deborah Queiroz; Hassan, Bassam

    2017-07-01

    The aim of this study was to assess the diagnostic accuracy of periapical radiography (PR) and cone-beam computed tomographic (CBCT) imaging in the detection of the root canal configuration (RCC) of human premolars. PR and CBCT imaging of 114 extracted human premolars were evaluated by 2 oral radiologists. RCC was recorded according to Vertucci's classification. Micro-computed tomographic imaging served as the gold standard to determine RCC. Accuracy, sensitivity, specificity, and predictive values were calculated. The Friedman test compared both PR and CBCT imaging with the gold standard. CBCT imaging showed higher values for all diagnostic tests compared with PR. Accuracy was 0.55 and 0.89 for PR and CBCT imaging, respectively. There was no difference between CBCT imaging and the gold standard, whereas PR differed from both CBCT and micro-computed tomographic imaging (P < .0001). CBCT imaging was more accurate than PR for evaluating different types of RCC individually. Canal configuration types III, VII, and "other" were poorly identified on CBCT imaging with a detection accuracy of 50%, 0%, and 43%, respectively. With PR, all canal configurations except type I were poorly visible. PR presented low performance in the detection of RCC in premolars, whereas CBCT imaging showed no difference compared with the gold standard. Canals with complex configurations were less identifiable using both imaging methods, especially PR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  12. Tailor-made Design of Chemical Blends using Decomposition-based Computer-aided Approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Manan, Zainuddin Abd.; Gernaey, Krist

    Computer-aided techniques are an efficient approach to solving chemical product design problems such as the design of blended liquid products (chemical blending). In chemical blending, one tries to find the best candidate, which satisfies the product targets defined in terms of desired product attributes (properties). In this way, the systematic computer-aided technique first establishes the search space, and then narrows it down in subsequent steps until a small number of feasible and promising candidates remain, at which point experimental work may be conducted to verify if any or all of the candidates satisfy the targets. The design methodology is decomposed into two stages. The first stage investigates the mixture stability, where all unstable mixtures are eliminated and the stable blend candidates are retained for further testing. In the second stage, the blend candidates have to satisfy a set of target properties that are ranked according…

  13. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computer that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  14. The Baby TALK Model: An Innovative Approach to Identifying High-Risk Children and Families

    Science.gov (United States)

    Villalpando, Aimee Hilado; Leow, Christine; Hornstein, John

    2012-01-01

    This research report examines the Baby TALK model, an innovative early childhood intervention approach used to identify, recruit, and serve young children who are at-risk for developmental delays, mental health needs, and/or school failure, and their families. The report begins with a description of the model. This description is followed by an…

  15. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    Science.gov (United States)

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  16. Optical computing - an alternate approach to trigger processing

    International Nuclear Information System (INIS)

    Cleland, W.E.

    1981-01-01

    The enormous rate reduction factors required by most ISABELLE experiments suggest that we should examine every conceivable approach to trigger processing. One approach that has not received much attention by high energy physicists is optical data processing. The past few years have seen rapid advances in optoelectronic technology, stimulated mainly by the military and the communications industry. An intriguing question is whether one can utilize this technology together with the optical computing techniques that have been developed over the past two decades to develop a rapid trigger processor for high energy physics experiments. Optical data processing is a method for performing a few very specialized operations on data which is inherently two dimensional. Typical operations are the formation of convolution or correlation integrals between the input data and information stored in the processor in the form of an optical filter. Optical processors are classed as coherent or incoherent, according to the spatial coherence of the input wavefront. Typically, in a coherent processor a laser beam is modulated with a photographic transparency which represents the input data. In an incoherent processor, the input may be an incoherently illuminated transparency, but self-luminous objects, such as an oscilloscope trace, have also been used. We consider here an incoherent processor in which the input data is converted into an optical wavefront through the excitation of an array of point sources - either light emitting diodes or injection lasers
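
The correlation integral at the heart of such a processor has a direct digital analogue: FFT-based matched filtering. The sketch below illustrates the operation on a synthetic "detector image" in Python; it is a conceptual stand-in, not a model of the optical hardware described above.

```python
import numpy as np
from scipy.signal import fftconvolve

# A 2-D "detector image" with one bright track segment, and a small filter
# encoding the pattern we want to trigger on (both arrays are made up).
image = np.zeros((64, 64))
image[20:25, 30] = 1.0                      # vertical track of 5 hits
template = np.ones((5, 1))                  # matched filter for such a track

# Correlation = convolution with the flipped template.
corr = fftconvolve(image, template[::-1, ::-1], mode="same")
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(f"strongest match at pixel {peak}, score {corr.max():.1f}")
```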

  17. Cloud security - An approach with modern cryptographic solutions

    OpenAIRE

    Kostadinovska, Ivana

    2016-01-01

    The term “cloud computing” has been in the spotlight among IT specialists due to its potential for transforming the computer industry. Unfortunately, there are still some challenges to be resolved, and the security aspects of cloud-based computing environments remain at the core of interest. The goal of our work is to identify the main security issues of cloud computing and to present approaches to secure clouds. Our research also focuses on data and storage security layers. As a result, we f...

  18. Computing and information services at the Jet Propulsion Laboratory - A management approach to a diversity of needs

    Science.gov (United States)

    Felberg, F. H.

    1984-01-01

    The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.

  19. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work, thus it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  20. Using a distribution and conservation status weighted hotspot approach to identify areas in need of conservation action to benefit Idaho bird species

    Science.gov (United States)

    Haines, Aaron M.; Leu, Matthias; Svancara, Leona K.; Wilson, Gina; Scott, J. Michael

    2010-01-01

    Identification of biodiversity hotspots (hereafter, hotspots) has become a common strategy to delineate important areas for wildlife conservation. However, the use of hotspots has not often incorporated important habitat types, ecosystem services, anthropogenic activity, or consistency in identifying important conservation areas. The purpose of this study was to identify hotspots to improve avian conservation efforts for Species of Greatest Conservation Need (SGCN) in the state of Idaho, United States. We evaluated multiple approaches to define hotspots and used a unique approach based on weighting species by their distribution size and conservation status to identify hotspot areas. All hotspot approaches identified bodies of water (Bear Lake, Grays Lake, and American Falls Reservoir) as important hotspots for Idaho avian SGCN, but we found that the weighted approach produced more congruent hotspot areas when compared to other hotspot approaches. To incorporate anthropogenic activity into hotspot analysis, we grouped species based on their sensitivity to specific human threats (i.e., urban development, agriculture, fire suppression, grazing, roads, and logging) and identified ecological sections within Idaho that may require specific conservation actions to address these human threats using the weighted approach. The Snake River Basalts and Overthrust Mountains ecological sections were important areas for potential implementation of conservation actions to conserve biodiversity. Our approach to identifying hotspots may be useful as part of a larger conservation strategy to aid land managers or local governments in applying conservation actions on the ground.
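
A minimal sketch of the weighting idea, each species' conservation-status weight divided by its range size and summed per grid cell, might look as follows; the presence grid and weights are invented for illustration.

```python
import numpy as np

# Hypothetical inputs: a per-species presence grid (species x cells) plus a
# conservation-status weight; species with small ranges count for more.
presence = np.array([[1, 1, 0, 0],     # species A: widespread
                     [0, 1, 1, 0],
                     [0, 1, 0, 0]],    # species C: narrow range
                    dtype=float)
status_weight = np.array([1.0, 2.0, 3.0])    # higher = more imperilled

range_size = presence.sum(axis=1)            # cells occupied per species
species_weight = status_weight / range_size  # distribution-weighted status
cell_score = species_weight @ presence       # summed weight per grid cell
hotspots = np.argsort(-cell_score)
print(cell_score, "-> top cell:", hotspots[0])
```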

  1. Computational and Experimental Approaches to Visual Aesthetics

    Science.gov (United States)

    Brachmann, Anselm; Redies, Christoph

    2017-01-01

    Aesthetics has been the subject of long-standing debates by philosophers and psychologists alike. In psychology, it is generally agreed that aesthetic experience results from an interaction between perception, cognition, and emotion. By experimental means, this triad has been studied in the field of experimental aesthetics, which aims to gain a better understanding of how aesthetic experience relates to fundamental principles of human visual perception and brain processes. Recently, researchers in computer vision have also gained interest in the topic, giving rise to the field of computational aesthetics. With computing hardware and methodology developing at a high pace, the modeling of perceptually relevant aspects of aesthetic stimuli has huge potential. In this review, we present an overview of recent developments in computational aesthetics and how they relate to experimental studies. In the first part, we cover topics such as the prediction of ratings, style and artist identification as well as computational methods in art history, such as the detection of influences among artists or forgeries. We also describe currently used computational algorithms, such as classifiers and deep neural networks. In the second part, we summarize results from the field of experimental aesthetics and cover several isolated image properties that are believed to have an effect on the aesthetic appeal of visual stimuli. Their relation to each other and to findings from computational aesthetics are discussed. Moreover, we compare the strategies in the two fields of research and suggest that both fields would greatly profit from a joint research effort. We hope to encourage researchers from both disciplines to work more closely together in order to understand visual aesthetics from an integrated point of view. PMID:29184491

  2. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to the digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include the contextual, human-centred and societal challenges and impacts involved in students’ creative and critical engagement with digital technology. Our research is based on the FabLab@School project, in which a PD approach to computational empowerment provided opportunities as well as further challenges for the complex agenda of digital technology in education. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society.

  3. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1) an intermediary step between any theoretical construct and its targeted empirical space and 2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  4. Immediate Realities: an anthropology of computer visualisation in archaeology

    Directory of Open Access Journals (Sweden)

    Jonathan Bateman

    2000-06-01

    Full Text Available The use of computer visualisation techniques is an increasing part of archaeology's illustrative repertoire - but how do these images relate to the wider visual language that archaeologists use? This article assesses computer visualisations in the light of a range of anthropological, art historical, and cultural critiques to place them and their production squarely within the broader spectrum of the discipline's output. Moving from identifying the shortcomings in the methods and scope of existing critiques of archaeological illustrations, a comprehensive approach to understanding the visual culture of archaeology is outlined. This approach is specifically applied to computer visualisations, and identifies both the sociology of their production and the technological nature of their creation and reproduction as key elements influencing their readings as communicators of archaeological ideas. In order to develop useful understandings of how the visual languages we employ act within the discourse of the discipline, we must be inclusive in our critiques of those languages. Until we consider the cultural products of our discipline with the same sophistication with which we examine the products of other cultures (past and present), we will struggle to use them to their full potential.

  5. A Representational Approach to Knowledge and Multiple Skill Levels for Broad Classes of Computer Generated Forces

    Science.gov (United States)

    1997-12-01

    …As proposed by Van Veldhuizen and Hutson (“A Design Methodology for Domain Independent Computer Generated Forces,” Industry Training Systems Conference, 1988), the methodology extends the general architecture to support both a domain-independent approach to implementing CGFs and…

  6. Stochastic approach for round-off error analysis in computing application to signal processing algorithms

    International Nuclear Information System (INIS)

    Vignes, J.

    1986-01-01

    Any result of an algorithm executed on a computer contains an error resulting from floating-point arithmetic round-off error propagation. Furthermore, signal processing algorithms are also generally performed on data that themselves contain errors. The permutation-perturbation method, also known under the name CESTAC (contrôle et estimation stochastique d'arrondi de calcul), is a very efficient practical method for evaluating these errors and consequently for estimating the exact significant decimal figures of any result of an algorithm performed on a computer. The stochastic approach of this method, its probabilistic proof, and the perfect agreement between the theoretical and practical aspects are described in this paper.
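
The flavor of the permutation-perturbation idea can be sketched by re-running a computation under small random perturbations and estimating how many decimal digits remain stable; the snippet below is a crude stand-in for CESTAC, not the method itself.

```python
import numpy as np

# Run the same computation many times with small random perturbations of the
# operands (a crude stand-in for random rounding) and estimate how many
# decimal digits of the result survive the noise.
rng = np.random.default_rng(1)

def perturb(x, rel=1e-15):
    return x * (1 + rng.uniform(-rel, rel))

def ill_conditioned_sum(n_trials=50):
    results = []
    for _ in range(n_trials):
        a, b = perturb(1e16), perturb(1.0)
        results.append((a + b) - a)          # catastrophic cancellation
    return np.array(results)

r = ill_conditioned_sum()
mean, std = r.mean(), r.std()
digits = np.log10(abs(mean) / std) if std > 0 else np.inf
print(f"mean={mean:.3g}, estimated significant decimal digits ~ {digits:.1f}")
```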

  7. A computer-assisted personalized approach in an undergraduate plant physiology class

    Science.gov (United States)

    Artus; Nadler

    1999-04-01

    We used Computer-Assisted Personalized Approach (CAPA), a networked teaching and learning tool that generates computer individualized homework problem sets, in our large-enrollment introductory plant physiology course. We saw significant improvement in student examination performance with regular homework assignments, with CAPA being an effective and efficient substitute for hand-graded homework. Using CAPA, each student received a printed set of similar but individualized problems of a conceptual (qualitative) and/or quantitative nature with quality graphics. Because each set of problems is unique, students were encouraged to work together to clarify concepts but were required to do their own work for credit. Students could enter answers multiple times without penalty, and they were able to obtain immediate feedback and hints until the due date. These features increased student time on task, allowing higher course standards and student achievement in a diverse student population. CAPA handles routine tasks such as grading, recording, summarizing, and posting grades. In anonymous surveys, students indicated an overwhelming preference for homework in CAPA format, citing several features such as immediate feedback, multiple tries, and on-line accessibility as reasons for their preference. We wrote and used more than 170 problems on 17 topics in introductory plant physiology, cataloging them in a computer library for general access. Representative problems are compared and discussed.

  8. Structural identifiability of cyclic graphical models of biological networks with latent variables.

    Science.gov (United States)

    Wang, Yulin; Lu, Na; Miao, Hongyu

    2016-06-13

    Graphical models have long been used to describe biological networks for a variety of important tasks such as the determination of key biological parameters, and the structure of a graphical model ultimately determines whether such unknown parameters can be unambiguously obtained from experimental observations (i.e., the identifiability problem). Limited by resources or technical capacities, complex biological networks are usually partially observed in experiment, which thus introduces latent variables into the corresponding graphical models. A number of previous studies have tackled the parameter identifiability problem for graphical models such as linear structural equation models (SEMs) with or without latent variables. However, the limited resolution and efficiency of existing approaches necessarily calls for further development of novel structural identifiability analysis algorithms. An efficient structural identifiability analysis algorithm is developed in this study for a broad range of network structures. The proposed method adopts Wright's path coefficient method to generate identifiability equations in the form of symbolic polynomials, and then converts these symbolic equations to binary matrices (called identifiability matrices). Several matrix operations are introduced for identifiability matrix reduction with system equivalency maintained. Based on the reduced identifiability matrices, the structural identifiability of each parameter is determined. A number of benchmark models are used to verify the validity of the proposed approach. Finally, the network module for influenza A virus replication is employed as a real example to illustrate the application of the proposed approach in practice. The proposed approach can deal with cyclic networks with latent variables. The key advantage is that it intentionally avoids symbolic computation and is thus highly efficient. Also, this method is capable of determining the identifiability of each single parameter and

  9. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  10. Determining coding CpG islands by identifying regions significant for pattern statistics on Markov chains.

    Science.gov (United States)

    Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior

    2011-09-23

    Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
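
As a simplified stand-in for the tail-probability step, the sketch below scores sequence windows by a binomial upper-tail probability for their CpG counts (the paper computes exact tail probabilities under a Markov chain, which is more faithful to codon-level sequence bias). The sequence, window length, and background rate are made up.

```python
from scipy.stats import binom

# Score every window of a coding sequence by the probability of seeing at
# least the observed number of CpGs under a background rate (binomial null).
seq = "ATGCGCGACGTTCGACGGATCCGCGTACGCGATG"
win, p_bg = 12, 0.05                       # window length, background CpG rate

def cpg_count(s):
    return sum(1 for i in range(len(s) - 1) if s[i:i+2] == "CG")

scores = []
for start in range(len(seq) - win + 1):
    k = cpg_count(seq[start:start + win])
    pval = binom.sf(k - 1, win - 1, p_bg)  # P(#CpG >= k) over win-1 dinucleotide slots
    scores.append((pval, start, k))

best = min(scores)                          # smallest tail probability
print(f"best window at {best[1]}, {best[2]} CpGs, p = {best[0]:.2e}")
```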

  11. Allele Age Under Non-Classical Assumptions is Clarified by an Exact Computational Markov Chain Approach.

    Science.gov (United States)

    De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason

    2017-09-19

    Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to Ne = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
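
The computational core, solving a sparse linear system involving the transient part of an absorbing Markov chain, can be sketched directly. The toy chain below is arbitrary; for allele age one would build Q from a Wright-Fisher model and condition appropriately, which this snippet does not do.

```python
import numpy as np
from scipy.sparse import identity, csr_matrix
from scipy.sparse.linalg import spsolve

# Expected time to absorption for a toy absorbing Markov chain: states 0-3
# are transient; Q holds transient-to-transient transition probabilities.
Q = csr_matrix(np.array([
    [0.0, 0.5, 0.0, 0.0],
    [0.3, 0.0, 0.5, 0.0],
    [0.0, 0.3, 0.0, 0.5],
    [0.0, 0.0, 0.3, 0.0],   # remaining mass in each row exits to absorption
]))

# Fundamental-matrix identity: t = (I - Q)^{-1} * 1, solved sparsely.
n = Q.shape[0]
t = spsolve(identity(n, format="csr") - Q, np.ones(n))
print(np.round(t, 3))        # expected steps to absorption from each state
```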

  12. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (JAVA code and executable file) was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Its careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
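
The combination/permutation counting that the Tool performs at record level can be sketched with standard Python containers: frozensets collapse order (combinations) while tuples preserve it (permutations). The records below are invented.

```python
from collections import Counter

# Record-level sketch: combinations ignore order (sets of conditions),
# permutations keep the order of diagnosis (tuples). Data are made up.
records = [
    ("diabetes", "hypertension"),
    ("hypertension", "diabetes"),
    ("diabetes", "hypertension", "copd"),
    ("diabetes", "hypertension"),
]

combinations = Counter(frozenset(r) for r in records)
permutations = Counter(records)

print(len(combinations), "unique combinations:", dict(combinations))
print(len(permutations), "unique permutations:", dict(permutations))
```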

  13. Computational Approaches to Chemical Hazard Assessment

    Science.gov (United States)

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration, Evaluation, Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  14. Integrating modelling and phenotyping approaches to identify and screen complex traits - Illustration for transpiration efficiency in cereals.

    Science.gov (United States)

    Chenu, K; van Oosterom, E J; McLean, G; Deifel, K S; Fletcher, A; Geetika, G; Tirfessa, A; Mace, E S; Jordan, D R; Sulman, R; Hammer, G L

    2018-02-21

    Following advances in genetics, genomics, and phenotyping, trait selection in breeding is limited by our ability to understand interactions within the plants and with their environments, and to target traits of most relevance for the target population of environments. We propose an integrated approach that combines insights from crop modelling, physiology, genetics, and breeding to identify traits valuable for yield gain in the target population of environments, develop relevant high-throughput phenotyping platforms, and identify genetic controls and their values in production environments. This paper uses transpiration efficiency (biomass produced per unit of water used) as an example of a complex trait of interest to illustrate how the approach can guide modelling, phenotyping, and selection in a breeding program. We believe that this approach, by integrating insights from diverse disciplines, can increase the resource use efficiency of breeding programs for improving yield gains in target populations of environments.

  15. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  16. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
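    To make the modelling style concrete, here is a deliberately tiny exploratory agent-based sketch in the spirit of the study (the authors' architecture, policy, and parameters are not reproduced; every number below is invented): machine agents draw power, and a policy agent enforces a usage rule that lowers the aggregate footprint.

```python
import random

class Machine:
    """An agent representing one networked machine."""
    def __init__(self):
        self.active = True
        self.idle = False

    def step(self):
        self.idle = random.random() < 0.5  # assumed idleness rate
        # assumed draw: 100 W busy, 60 W idle, 0 W powered down
        return (100 if not self.idle else 60) if self.active else 0

class PolicyAgent:
    """Company usage policy: power down machines observed idle."""
    def enforce(self, machines):
        for m in machines:
            if m.idle:
                m.active = False

machines = [Machine() for _ in range(100)]
policy = PolicyAgent()
for hour in range(8):                       # one working day
    load = sum(m.step() for m in machines)  # aggregate draw this hour
    policy.enforce(machines)
    print(f"hour {hour}: {load} W")
```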

  17. Computational approach in zeolite science

    NARCIS (Netherlands)

    Pidko, E.A.; Santen, van R.A.; Chester, A.W.; Derouane, E.G.

    2009-01-01

    This chapter presents an overview of different computational methods and their application to various fields of zeolite chemistry. We will discuss static lattice methods based on interatomic potentials to predict zeolite structures and topologies, Monte Carlo simulations for the investigation of

  18. A Multiomics Approach to Identify Genes Associated with Childhood Asthma Risk and Morbidity.

    Science.gov (United States)

    Forno, Erick; Wang, Ting; Yan, Qi; Brehm, John; Acosta-Perez, Edna; Colon-Semidey, Angel; Alvarez, Maria; Boutaoui, Nadia; Cloutier, Michelle M; Alcorn, John F; Canino, Glorisa; Chen, Wei; Celedón, Juan C

    2017-10-01

    Childhood asthma is a complex disease. In this study, we aim to identify genes associated with childhood asthma through a multiomics "vertical" approach that integrates multiple analytical steps using linear and logistic regression models. In a case-control study of childhood asthma in Puerto Ricans (n = 1,127), we used adjusted linear or logistic regression models to evaluate associations between several analytical steps of omics data, including genome-wide (GW) genotype data, GW methylation, GW expression profiling, cytokine levels, asthma-intermediate phenotypes, and asthma status. At each point, only the top genes/single-nucleotide polymorphisms/probes/cytokines were carried forward for subsequent analysis. In step 1, asthma modified the gene expression-protein level association for 1,645 genes; pathway analysis showed an enrichment of these genes in the cytokine signaling system (n = 269 genes). In steps 2-3, expression levels of 40 genes were associated with intermediate phenotypes (asthma onset age, forced expiratory volume in 1 second, exacerbations, eosinophil counts, and skin test reactivity); of those, methylation of seven genes was also associated with asthma. Of these seven candidate genes, IL5RA was also significant in analytical steps 4-8. We then measured plasma IL-5 receptor α levels, which were associated with asthma age of onset and moderate-severe exacerbations. In addition, in silico database analysis showed that several of our identified IL5RA single-nucleotide polymorphisms are associated with transcription factors related to asthma and atopy. This approach integrates several analytical steps and is able to identify biologically relevant asthma-related genes, such as IL5RA. It differs from other methods that rely on complex statistical models with various assumptions.

  19. A Systematic Approach to Identify Candidate Transcription Factors that Control Cell Identity

    Directory of Open Access Journals (Sweden)

    Ana C. D’Alessio

    2015-11-01

    Full Text Available Hundreds of transcription factors (TFs are expressed in each cell type, but cell identity can be induced through the activity of just a small number of core TFs. Systematic identification of these core TFs for a wide variety of cell types is currently lacking and would establish a foundation for understanding the transcriptional control of cell identity in development, disease, and cell-based therapy. Here, we describe a computational approach that generates an atlas of candidate core TFs for a broad spectrum of human cells. The potential impact of the atlas was demonstrated via cellular reprogramming efforts where candidate core TFs proved capable of converting human fibroblasts to retinal pigment epithelial-like cells. These results suggest that candidate core TFs from the atlas will prove a useful starting point for studying transcriptional control of cell identity and reprogramming in many human cell types.

  20. Towards Cloud Computing SLA Risk Management: Issues and Challenges

    OpenAIRE

    Morin, Jean-Henry; Aubert, Jocelyn; Gateau, Benjamin

    2012-01-01

    Cloud Computing has become mainstream technology offering a commoditized approach to software, platform and infrastructure as a service over the Internet on a global scale. This raises important new security issues beyond traditional perimeter based approaches. This paper attempts to identify these issues and their corresponding challenges, proposing to use risk and Service Level Agreement (SLA) management as the basis for a service level framework to improve governance, risk and compliance i...

  1. Discovery of a small-molecule inhibitor of Dvl-CXXC5 interaction by computational approaches

    Science.gov (United States)

    Ma, Songling; Choi, Jiwon; Jin, Xuemei; Kim, Hyun-Yi; Yun, Ji-Hye; Lee, Weontae; Choi, Kang-Yell; No, Kyoung Tai

    2018-05-01

    The Wnt/β-catenin signaling pathway plays a significant role in the control of osteoblastogenesis and bone formation. CXXC finger protein 5 (CXXC5) has been recently identified as a negative feedback regulator of osteoblast differentiation through a specific interaction with Dishevelled (Dvl) protein. It was reported that targeting the Dvl-CXXC5 interaction could be a novel anabolic therapeutic target for osteoporosis. In this study, the complex structure of the Dvl PDZ domain and the CXXC5 peptide was simulated with molecular dynamics (MD). Based on the structural analysis of binding modes of the MD-simulated Dvl PDZ domain with the CXXC5 peptide and the crystal Dvl PDZ domain with synthetic peptide-ligands, we generated two different pharmacophore models and applied pharmacophore-based virtual screening to discover potent inhibitors of the Dvl-CXXC5 interaction for the anabolic therapy of osteoporosis. Analysis of 16 compounds selected by means of a virtual screening protocol yielded four compounds that effectively disrupted the Dvl-CXXC5 interaction in the fluorescence polarization assay. Potential compounds were validated by fluorescence spectroscopy and nuclear magnetic resonance. We successfully identified a highly potent inhibitor, BMD4722, which directly binds to the Dvl PDZ domain and disrupts the Dvl-CXXC5 interaction. Overall, the CXXC5-Dvl PDZ domain complex-based pharmacophore, combined with various traditional and simple computational methods, is a promising approach for the development of modulators targeting the Dvl-CXXC5 interaction, and the potent inhibitor BMD4722 could serve as a starting point to discover or design more potent and specific Dvl-CXXC5 interaction disruptors.

  2. Discovery of a small-molecule inhibitor of Dvl-CXXC5 interaction by computational approaches.

    Science.gov (United States)

    Ma, Songling; Choi, Jiwon; Jin, Xuemei; Kim, Hyun-Yi; Yun, Ji-Hye; Lee, Weontae; Choi, Kang-Yell; No, Kyoung Tai

    2018-04-07

    The Wnt/β-catenin signaling pathway plays a significant role in the control of osteoblastogenesis and bone formation. CXXC finger protein 5 (CXXC5) has been recently identified as a negative feedback regulator of osteoblast differentiation through a specific interaction with Dishevelled (Dvl) protein. It was reported that targeting the Dvl-CXXC5 interaction could be a novel anabolic therapeutic target for osteoporosis. In this study, the complex structure of the Dvl PDZ domain and the CXXC5 peptide was simulated with molecular dynamics (MD). Based on the structural analysis of binding modes of the MD-simulated Dvl PDZ domain with the CXXC5 peptide and the crystal Dvl PDZ domain with synthetic peptide-ligands, we generated two different pharmacophore models and applied pharmacophore-based virtual screening to discover potent inhibitors of the Dvl-CXXC5 interaction for the anabolic therapy of osteoporosis. Analysis of 16 compounds selected by means of a virtual screening protocol yielded four compounds that effectively disrupted the Dvl-CXXC5 interaction in the fluorescence polarization assay. Potential compounds were validated by fluorescence spectroscopy and nuclear magnetic resonance. We successfully identified a highly potent inhibitor, BMD4722, which directly binds to the Dvl PDZ domain and disrupts the Dvl-CXXC5 interaction. Overall, the CXXC5-Dvl PDZ domain complex-based pharmacophore, combined with various traditional and simple computational methods, is a promising approach for the development of modulators targeting the Dvl-CXXC5 interaction, and the potent inhibitor BMD4722 could serve as a starting point to discover or design more potent and specific Dvl-CXXC5 interaction disruptors.

  3. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

    A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamic (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses a Monte Carlo simulation to treat random and systematic error but alternate statistical approaches are permitted by the program design
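    The four-module chain can be caricatured with a few lines of Monte Carlo, the basic statistical approach named above (the distributions and fragility numbers below are invented for illustration, not taken from the report):

```python
import math
import random

def run_trial():
    """One sampled earthquake: hazard -> response -> failure -> melt."""
    pga = random.lognormvariate(math.log(0.2), 0.5)     # seismic hazard (g)
    response = pga * random.lognormvariate(0.0, 0.3)    # structural dynamics
    # component fails if its response exceeds a random capacity
    failed = response > random.lognormvariate(math.log(0.6), 0.4)
    # simplistic core-melt sequence: melt requires the component failure
    return failed

trials = 100_000
melts = sum(run_trial() for _ in range(trials))
print("estimated conditional core-melt probability:", melts / trials)
```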

  4. Examining Student Opinions on Computer Use Based on the Learning Styles in Mathematics Education

    Science.gov (United States)

    Ozgen, Kemal; Bindak, Recep

    2012-01-01

    The purpose of this study is to identify the opinions of high school students, who have different learning styles, related to computer use in mathematics education. High school students' opinions on computer use in mathematics education were collected with both qualitative and quantitative approaches in the study conducted with a survey model. For…

  5. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing High Throughput Computing applications, common in High Energy Physics, work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: firstly, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  6. What Computational Approaches Should be Taught for Physics?

    Science.gov (United States)

    Landau, Rubin

    2005-03-01

    The standard Computational Physics courses are designed for upper-level physics majors who already have some computational skills. We believe that it is important for first-year physics students to learn modern computing techniques that will be useful throughout their college careers, even before they have learned the math and science required for Computational Physics. To teach such Introductory Scientific Computing courses requires that some choices be made as to what subjects and computer languages will be taught. Our survey of colleagues active in Computational Physics and Physics Education shows no predominant choice, with strong positions taken for the compiled languages Java, C, C++ and Fortran90, as well as for problem-solving environments like Maple and Mathematica. Over the last seven years we have developed an Introductory course and have written up those courses as textbooks for others to use. We will describe our model of using both a problem-solving environment and a compiled language. The developed materials are available in both Maple and Mathematica, and Java and Fortran90 (Princeton University Press, to be published; www.physics.orst.edu/~rubin/IntroBook/).

  7. COMPUTER MATHEMATICS SYSTEMS IN STUDENTS’ LEARNING OF "INFORMATICS"

    Directory of Open Access Journals (Sweden)

    Taras P. Kobylnyk

    2014-04-01

    Full Text Available The article describes the general characteristics of the most popular computer mathematics systems (CMS), both commercial (Maple, Mathematica, Matlab) and open source (Scilab, Maxima, GRAN, Sage), as well as the conditions for using these systems as a means of fundamentalization of the educational process for bachelors of informatics. The role of CMS in the training of bachelors of informatics is considered. Approaches to the pedagogical use of CMS in learning informatics and physics and mathematics disciplines are identified. Some tasks are presented in which the «responses» received using CMS must be used with care. Promising directions for the development of computer mathematics systems in a high-tech environment are identified.

  8. Developing an instrument to identify MBChB students' approaches ...

    African Journals Online (AJOL)

    The constructs of deep, surface and achieving approaches to learning are well defined in the literature and amply supported by research. Quality learning results from a deep approach to learning, and a deep-achieving approach to learning is regarded as the most adaptive approach institutionally. It is therefore felt that ...

  9. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  10. Promoter-enhancer interactions identified from Hi-C data using probabilistic models and hierarchical topological domains.

    Science.gov (United States)

    Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy

    2017-12-21

    Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As Hi-C data accumulate, computational methods have been developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.

  11. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundancy components and optimal selection of alternative designs to meet system requirement. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic method, Lagrangean multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant system, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce the hybrid approaches for combining GA with fuzzy logic, neural network and other conventional search techniques. Finally, we present some experiments on various reliability optimization problems using the hybrid GA approach
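    As a concrete instance of the GA-based approach the survey covers, here is a bare-bones redundancy-allocation sketch (reliabilities, costs, and GA settings are invented): choose the number of parallel components per subsystem to maximize the reliability of a series system under a cost cap.

```python
import random

R = [0.7, 0.8, 0.9]      # component reliabilities per subsystem (assumed)
C = [2, 3, 1]            # component costs (assumed)
BUDGET, MAX_RED = 20, 5

def reliability(x):
    """Series system of parallel subsystems with x[i] redundant components."""
    r = 1.0
    for ri, xi in zip(R, x):
        r *= 1 - (1 - ri) ** xi
    return r

def fitness(x):
    cost = sum(c * xi for c, xi in zip(C, x))
    return reliability(x) if cost <= BUDGET else 0.0  # penalize infeasibility

pop = [[random.randint(1, MAX_RED) for _ in R] for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                # elitist selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(R))
        child = a[:cut] + b[cut:]                     # one-point crossover
        if random.random() < 0.2:                     # mutation
            child[random.randrange(len(R))] = random.randint(1, MAX_RED)
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(best, reliability(best))
```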

  12. Soft Computing Methods for Disulfide Connectivity Prediction.

    Science.gov (United States)

    Márquez-Chamorro, Alfonso E; Aguilar-Ruiz, Jesús S

    2015-01-01

    The problem of protein structure prediction (PSP) is one of the main challenges in structural bioinformatics. To tackle this problem, PSP can be divided into several subproblems. One of these subproblems is the prediction of disulfide bonds. The disulfide connectivity prediction problem consists in identifying which nonadjacent cysteines would be cross-linked from all possible candidates. Determining the disulfide bond connectivity between the cysteines of a protein is desirable as a previous step of the 3D PSP, as the protein conformational search space is highly reduced. The most representative soft computing approaches for the disulfide bonds connectivity prediction problem of the last decade are summarized in this paper. Certain aspects, such as the different methodologies based on soft computing approaches (artificial neural network or support vector machine) or features of the algorithms, are used for the classification of these methods.

  13. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Full Text Available Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
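    To illustrate approach (iv), here is a hypothetical miniature of an embedded DSL (the class and function names below are invented for illustration and are not the authors' actual interface): the simulation is specified as ordinary Python objects, and a backend stub shows where translation into an OOMMF input file would happen.

```python
class Term:
    """One energy contribution, e.g. exchange or demagnetization."""
    def __init__(self, name, **params):
        self.name, self.params = name, params

    def __add__(self, other):
        return Energy([self, other]) if isinstance(other, Term) else NotImplemented

class Energy:
    def __init__(self, terms):
        self.terms = list(terms)

class System:
    def __init__(self, mesh, energy, Ms):
        self.mesh, self.energy, self.Ms = mesh, energy, Ms

# User-facing specification: declarative, inspectable, scriptable.
mesh = dict(cell=(5e-9, 5e-9, 5e-9), size=(100e-9, 100e-9, 5e-9))
energy = Term("Exchange", A=1.3e-11) + Term("Demag")
system = System(mesh, energy, Ms=8e5)

def to_mif(system):
    """Stand-in for the backend that would emit an OOMMF .mif file."""
    blocks = [f"Specify Oxs_{t.name} {{ {t.params} }}" for t in system.energy.terms]
    return "\n".join(blocks)

print(to_mif(system))
```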

  14. Analog series-based scaffolds: computational design and exploration of a new type of molecular scaffolds for medicinal chemistry

    Science.gov (United States)

    Dimova, Dilyana; Stumpfe, Dagmar; Hu, Ye; Bajorath, Jürgen

    2016-01-01

    Aim: Computational design of and systematic search for a new type of molecular scaffolds termed analog series-based scaffolds. Materials & methods: From currently available bioactive compounds, analog series were systematically extracted, key compounds identified and new scaffolds isolated from them. Results: Using our computational approach, more than 12,000 scaffolds were extracted from bioactive compounds. Conclusion: A new scaffold definition is introduced and a computational methodology developed to systematically identify such scaffolds, yielding a large freely available scaffold knowledge base. PMID:28116132

  15. Identifying the Critical Links in Road Transportation Networks: Centrality-based approach utilizing structural properties

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Supriya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Surface transportation road networks share structural properties with other complex networks (e.g., social networks, information networks, biological networks, and so on). This research investigates the structural properties of road networks for possible correlations with traffic characteristics, such as link flows, that are determined independently. Additionally, we define a criticality index for the links of the road network that identifies their relative importance in the network. We tested our hypotheses with two sample road networks. Results show that correlation exists between the link flows and the centrality measures of road links (a dual graph approach is followed), and the criticality index is found to be effective for one test network in identifying the vulnerable nodes.
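    The dual-graph step is straightforward to reproduce with networkx (a sketch on an invented four-junction network; the report's criticality index itself is not reproduced): each road link becomes a node of the line graph, whose centrality is then computed.

```python
import networkx as nx

# Primal graph: junctions as nodes, road links as edges (invented network).
roads = nx.Graph()
roads.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D")])

dual = nx.line_graph(roads)                  # links become nodes (dual graph)
centrality = nx.betweenness_centrality(dual)
for link, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(link, round(score, 3))
```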

  16. Can children identify and achieve goals for intervention? A randomized trial comparing two goal-setting approaches.

    Science.gov (United States)

    Vroland-Nordstrand, Kristina; Eliasson, Ann-Christin; Jacobsson, Helén; Johansson, Ulla; Krumlinde-Sundholm, Lena

    2016-06-01

    The efficacy of two different goal-setting approaches (children's self-identified goals and goals identified by parents) was compared in a goal-directed, task-oriented intervention. In this assessor-blinded parallel randomized trial, 34 children with disabilities (13 males, 21 females; mean age 9y, SD 1y 4mo) were randomized using concealed allocation to one of two 8-week, goal-directed, task-oriented intervention groups with different goal-setting approaches: (1) children's self-identified goals (n=18) using the Perceived Efficacy and Goal-Setting System, or (2) goals identified by parents (n=16) using the Canadian Occupational Performance Measure (COPM). Participants were recruited through eight paediatric rehabilitation centres and randomized between October 2011 and May 2013. The primary outcome measure was the Goal Attainment Scaling and the secondary measure, the COPM performance scale (COPM-P). Data were collected pre- and post-intervention and at the 5-month follow-up. There was no evidence of a difference in mean characteristics at baseline between groups. There was evidence of an increase in mean goal attainment (mean T score) in both groups after intervention (child-goal group: estimated mean difference [EMD] 27.84, 95% CI 22.93-32.76; parent-goal group: EMD 21.42, 95% CI 16.16-26.67). There was no evidence of a difference in the mean T scores post-intervention between the two groups (EMD 6.42, 95% CI -0.80 to 13.65). These results were sustained at the 5-month follow-up. Children's self-identified goals are achievable to the same extent as parent-identified goals and remain stable over time. Thus children can be trusted to identify their own goals for intervention, thereby influencing their involvement in their intervention programmes. © 2015 Mac Keith Press.

  17. Integrated Pathway-Based Approach Identifies Association between Genomic Regions at CTCF and CACNB2 and Schizophrenia

    NARCIS (Netherlands)

    Juraeva, Dilafruz; Haenisch, Britta; Zapatka, Marc; Frank, Josef; Witt, Stephanie H.; Mühleisen, Thomas W.; Treutlein, Jens; Strohmaier, Jana; Meier, Sandra; Degenhardt, Franziska; Giegling, Ina; Ripke, Stephan; Leber, Markus; Lange, Christoph; Schulze, Thomas G.; Mössner, Rainald; Nenadic, Igor; Sauer, Heinrich; Rujescu, Dan; Maier, Wolfgang; Børglum, Anders; Ophoff, Roel; Cichon, Sven; Nöthen, Markus M.; Rietschel, Marcella; Mattheisen, Manuel; Brors, Benedikt; Kahn, René S.; Cahn, Wiepke; Linszen, Don H.; de Haan, Lieuwe; van Os, Jim; Krabbendam, Lydia; Myin-Germeys, Inez; Wiersma, Durk; Bruggeman, Richard; Mors, O.; Børglum, A. D.; Mortensen, P. B.; Pedersen, C. B.; Demontis, D.; Grove, J.; Mattheisen, M.; Hougaard, D. M.

    2014-01-01

    In the present study, an integrated hierarchical approach was applied to: (1) identify pathways associated with susceptibility to schizophrenia; (2) detect genes that may be potentially affected in these pathways since they contain an associated polymorphism; and (3) annotate the functional

  18. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

    NARCIS (Netherlands)

    Bekker, M.M.; Long, J.B.

    1998-01-01

    This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

  19. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on systems biology of Porphyromonas gingivalis in the context of high throughput computational methods tightly integrated with translational systems medicine. Those approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for prevention and treatment of chronic disorders.

  20. A Representation-Theoretic Approach to Reversible Computation with Applications

    DEFF Research Database (Denmark)

    Maniotis, Andreas Milton

    Reversible computing is a sub-discipline of computer science that helps to understand the foundations of the interplay between physics, algebra, and logic in the context of computation. Its subjects of study are computational devices and abstract models of computation that satisfy the constraint of information conservation. Such machine models, which are known as reversible models of computation, have been examined both from a theoretical perspective and from an engineering perspective. While a bundle of many isolated successful findings and applications concerning reversible computing exists, there is still no uniform and consistent theory that is general in the sense of giving a model-independent account of the field.

  1. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Korfiati, Aigli; Theofilatos, Konstantinos A.; Likothanassis, Spiridon D.; Tsakalidis, Athanasios K.; Mavroudi, Seferina P.

    2013-01-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. © 2013 Elsevier Inc.

  2. Where we stand, where we are moving: Surveying computational techniques for identifying miRNA genes and uncovering their regulatory role

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2013-06-01

    Traditional biology was forced to restate some of its principles when the microRNA (miRNA) genes and their regulatory role were first discovered. Typically, miRNAs are small non-coding RNA molecules which have the ability to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. Existing experimental techniques for their identification and the prediction of the target genes share some important limitations such as low coverage, time-consuming experiments and high-cost reagents. Hence, many computational methods have been proposed for these tasks to overcome these limitations. Recently, many researchers have emphasized the development of computational approaches to predict the participation of miRNA genes in regulatory networks and to analyze their transcription mechanisms. All these approaches have certain advantages and disadvantages which are described in the present survey. Our work is differentiated from existing review papers by updating the list of methodologies and emphasizing the computational issues that arise from miRNA data analysis. Furthermore, in the present survey, the various miRNA data analysis steps are treated as an integrated procedure whose aim is to uncover the regulatory role and mechanisms of the miRNA genes. This integrated view of the miRNA data analysis steps may be extremely useful for all researchers, even if they work on just a single step. © 2013 Elsevier Inc.

  3. A High-Throughput Computational Framework for Identifying Significant Copy Number Aberrations from Array Comparative Genomic Hybridisation Data

    Directory of Open Access Journals (Sweden)

    Ian Roberts

    2012-01-01

    Full Text Available Reliable identification of copy number aberrations (CNA) from comparative genomic hybridization data would be improved by the availability of a generalised method for processing large datasets. To this end, we developed swatCGH, a data analysis framework and region detection heuristic for computational grids. swatCGH analyses sequentially displaced (sliding) windows of neighbouring probes and applies adaptive thresholds of varying stringency to identify the 10% of each chromosome that contains the most frequently occurring CNAs. We used the method to analyse a published dataset, comparing data preprocessed using four different DNA segmentation algorithms, and two methods for prioritising the detected CNAs. The consolidated list of the most commonly detected aberrations confirmed the value of swatCGH as a simplified high-throughput method for identifying biologically significant CNA regions of interest.
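    The sliding-window heuristic can be mimicked in a few lines (a toy, single-threshold version with synthetic data; swatCGH itself applies adaptive thresholds of varying stringency and runs on computational grids):

```python
import numpy as np

# Synthetic log-ratios along one chromosome, with an implanted gain.
log_ratios = np.random.default_rng(0).normal(0, 0.2, 500)
log_ratios[200:240] += 1.0

win = 20
# Mean log-ratio over sequentially displaced (sliding) windows of probes.
means = np.convolve(log_ratios, np.ones(win) / win, mode="valid")
threshold = np.quantile(np.abs(means), 0.90)   # keep the top 10%, as above
hits = np.where(np.abs(means) >= threshold)[0]
print("candidate CNA windows start near probes", hits.min(), "-", hits.max())
```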

  4. Computer assisted collimation gamma camera: A new approach to imaging contaminated tissues

    International Nuclear Information System (INIS)

    Quartuccio, M.; Franck, D.; Pihet, P.; Begot, S.; Jeanguillaume, C.

    2000-01-01

    Measurement systems with the capability of imaging tissues contaminated with radioactive materials would find relevant applications in medical physics research and possibly in health physics. The latter in particular depends critically on the performance achieved for sensitivity and spatial resolution. An original approach of computer assisted collimation gamma camera (French acronym CACAO) which could meet suitable characteristics has been proposed elsewhere. CACAO requires detectors with high spatial resolution. The present work was aimed at investigating the application of the CACAO principle on a laboratory testing bench using silicon detectors made of small pixels. (author)

  5. Computer assisted collimation gamma camera: A new approach to imaging contaminated tissues

    Energy Technology Data Exchange (ETDEWEB)

    Quartuccio, M.; Franck, D.; Pihet, P.; Begot, S.; Jeanguillaume, C

    2000-07-01

    Measurement systems with the capability of imaging tissues contaminated with radioactive materials would find relevant applications in medical physics research and possibly in health physics. The latter in particular depends critically on the performance achieved for sensitivity and spatial resolution. An original approach of computer assisted collimation gamma camera (French acronym CACAO) which could meet suitable characteristics has been proposed elsewhere. CACAO requires detectors with high spatial resolution. The present work was aimed at investigating the application of the CACAO principle on a laboratory testing bench using silicon detectors made of small pixels. (author)

  6. Tensor Voting A Perceptual Organization Approach to Computer Vision and Machine Learning

    CERN Document Server

    Mordohai, Philippos

    2006-01-01

    This lecture presents research on a general framework for perceptual organization that was conducted mainly at the Institute for Robotics and Intelligent Systems of the University of Southern California. It is not written as a historical recount of the work, since the sequence of the presentation is not in chronological order. It aims at presenting an approach to a wide range of problems in computer vision and machine learning that is data-driven, local and requires a minimal number of assumptions. The tensor voting framework combines these properties and provides a unified perceptual organiza

  7. Identifiability of PBPK Models with Applications to ...

    Science.gov (United States)

    Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.

  8. A computational approach to discovering the functions of bacterial phytochromes by analysis of homolog distributions

    Directory of Open Access Journals (Sweden)

    Lamparter Tilman

    2006-03-01

    … bacterial phytochromes in ammonium assimilation and amino acid metabolism. Conclusion: It was possible to identify several proteins that might share common functions with bacterial phytochromes by the co-distribution approach. This computational approach might also be helpful in other cases.

  9. An Integrated Constraint Programming Approach to Scheduling Sports Leagues with Divisional and Round-robin Tournaments

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Mats; Johansson, Mikael; Larson, Jeffrey

    2014-01-01

    Previous approaches for scheduling a league with round-robin and divisional tournaments involved decomposing the problem into easier subproblems. This approach, used to schedule the top Swedish handball league Elitserien, reduces the problem complexity but can result in suboptimal schedules. This paper presents an integrated constraint programming model that allows the scheduling to be performed in a single step. Particular attention is given to identifying implied and symmetry-breaking constraints that significantly reduce the computational complexity. The experimental evaluation of the integrated approach takes considerably less computational effort than the previous approach.
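    A single-step constraint model of the round-robin core is compact, for instance with OR-Tools CP-SAT (a sketch of a plain single round-robin only; Elitserien's divisional and symmetry-breaking constraints are not reproduced):

```python
from itertools import combinations
from ortools.sat.python import cp_model

n = 6                                        # teams, hence n - 1 rounds
rounds = range(n - 1)
model = cp_model.CpModel()
# x[i, j, r] == 1 iff teams i and j meet in round r
x = {(i, j, r): model.NewBoolVar(f"x_{i}_{j}_{r}")
     for i, j in combinations(range(n), 2) for r in rounds}

for i, j in combinations(range(n), 2):       # every pair meets exactly once
    model.Add(sum(x[i, j, r] for r in rounds) == 1)
for t in range(n):                           # each team plays once per round
    for r in rounds:
        model.Add(sum(x[i, j, r] for i, j in combinations(range(n), 2)
                      if t in (i, j)) == 1)

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    for r in rounds:
        print(f"round {r}:", [(i, j) for i, j in combinations(range(n), 2)
                              if solver.Value(x[i, j, r])])
```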

  10. Identifying a set of influential spreaders in complex networks

    Science.gov (United States)

    Zhang, Jian-Xiong; Chen, Duan-Bing; Dong, Qiang; Zhao, Zhi-Dan

    2016-06-01

    Identifying a set of influential spreaders in complex networks plays a crucial role in effective information spreading. A simple strategy is to choose the top-r ranked nodes as spreaders according to an influence ranking method such as PageRank, ClusterRank or k-shell decomposition. Besides, some heuristic methods such as hill-climbing, SPIN, degree discount and independent-set-based methods have also been proposed. However, these approaches suffer either from the possibility that some spreaders are so close together that their spheres of influence overlap, or from being time consuming. In this report, we present a simple yet effective iterative method named VoteRank to identify a set of decentralized spreaders with the best spreading ability. In this approach, all nodes vote for a spreader in each turn, and the voting ability of the neighbors of the elected spreader is decreased in subsequent turns. Experimental results on four real networks show that under Susceptible-Infected-Recovered (SIR) and Susceptible-Infected (SI) models, VoteRank outperforms the traditional benchmark methods on both spreading rate and final affected scale. What's more, VoteRank has superior computational efficiency.
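    The voting procedure described above is simple enough to transcribe directly (a sketch of the idea; consult the paper for the exact update rule and parameters):

```python
import networkx as nx

def voterank(G, r):
    """Elect r decentralized spreaders by iterated neighbour voting."""
    ability = {v: 1.0 for v in G}
    k = sum(d for _, d in G.degree()) / G.number_of_nodes()  # mean degree
    spreaders = []
    for _ in range(r):
        # each node's score is the total voting ability of its neighbours
        scores = {v: sum(ability[u] for u in G[v]) for v in G
                  if v not in spreaders}
        best = max(scores, key=scores.get)
        spreaders.append(best)
        ability[best] = 0.0                  # elected nodes stop voting
        for u in G[best]:                    # weaken voters near the winner
            ability[u] = max(0.0, ability[u] - 1.0 / k)
    return spreaders

G = nx.karate_club_graph()
print(voterank(G, 5))
```

For what it's worth, recent networkx releases also ship a built-in voterank implementation.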

  11. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to get multiple fractions of basis materials without segmentation. In medical application, one is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models accounting for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking negative-log. Referring to Bayesian inferences, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performances of the proposed approach are analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  12. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC default using a statistical approach. It is based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. probability of exceeding a threshold) of an induced current by crosstalk is established by taking into account uncertainties on input parameters influencing levels of interference in the context of transmission lines. The study has allowed us to evaluate the probability of failure of the induced current by using reliability methods having a relatively low computational cost compared to Monte Carlo simulation. (authors)

  13. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
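    The data-parallel kernel style that CUDA popularized can be shown without leaving Python via Numba's CUDA bindings (a sketch, not a benchmark of the APIs compared above; it requires an NVIDIA GPU and the CUDA toolkit):

```python
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)            # global thread index
    if i < out.size:            # guard against overshooting threads
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads = 256
blocks = (n + threads - 1) // threads
saxpy[blocks, threads](np.float32(2.0), x, y, out)
print(np.allclose(out, 2.0 * x + y))
```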

  14. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, estimating the future liver volume alone might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  15. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.
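    Calling the service from Python is short with SPARQLWrapper; the endpoint URL comes from the abstract, while the owl:sameAs pattern and the example UniProt identifier are our assumptions about the service's vocabulary (consult the paper for the exact query forms it supports):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://identifiers.org/services/sparql")
endpoint.setQuery("""
    PREFIX owl: <http://www.w3.org/2002/07/owl#>
    SELECT ?variant WHERE {
        <http://identifiers.org/uniprot/P12345> owl:sameAs ?variant .
    }
""")
endpoint.setReturnFormat(JSON)

# Each binding is one identifier variant usable in federated queries.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["variant"]["value"])
```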

  16. SPARQL-enabled identifier conversion with Identifiers.org

    Science.gov (United States)

    Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-01-01

    Motivation: On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809

  17. Costs of fire suppression forces based on cost-aggregation approach

    Science.gov (United States)

    González-Cabán, Armando; Charles W. McKetta; Thomas J. Mills

    1984-01-01

    A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMIs)-the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI...

  18. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodability, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. L and S factors are usually combined into one LS factor – the topographic factor. The single factors are determined from several sources, such as DTM (Digital Terrain Model), BPEJ – soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable precision degradation. Too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the sources’ precision, the appropriate cell size for the Czech Republic varies from 30m to 50m. In some cases, especially when new surveying was done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. In such cases, we have proposed a new method using a two-step computation. The first step computation uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but performs the computation only for the area identified in the previous step. This decomposition allows a
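    The grid-based computation, including the two-step refinement, is natural to express with array operations (the factor grids below are synthetic; real runs derive them from the DTM, BPEJ maps, and imagery as described above):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (200, 200)                        # e.g. a raster of 50 m cells

# Synthetic USLE factor grids (stand-ins for DTM/BPEJ-derived layers).
R = 40.0
K = rng.uniform(0.2, 0.5, shape)
LS = rng.uniform(0.5, 4.0, shape)
C = rng.uniform(0.01, 0.3, shape)
P = 1.0

A = R * K * LS * C * P                    # annual soil loss per cell

# Two-step idea: screen coarse 4x4 blocks, then recompute only hotspots fine.
coarse = A.reshape(50, 4, 50, 4).mean(axis=(1, 3))
hot = coarse > np.quantile(coarse, 0.95)  # step 1: flag high-erosion blocks
print("blocks needing fine-resolution recomputation:", int(hot.sum()))
```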

  19. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Full Text Available Abstract Background The rate of protein structures being deposited in the Protein Data Bank surpasses the capacity to experimentally characterise them and therefore computational methods to analyse these structures have become increasingly important. Identifying the region of the protein most likely to be involved in function is useful in order to gain information about its potential role. There are many available approaches to predict functional sites, but many are not made available via a publicly-accessible application. Results Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web-server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion SitesIdentify is able to produce comparable accuracy in predicting functional sites to its closest available counterpart, but in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a webserver at http://www.manchester.ac.uk/bioinformatics/sitesidentify/

  20. Computer-aided drug discovery [v1; ref status: indexed, http://f1000r.es/5ij]

    Directory of Open Access Journals (Sweden)

    Jürgen Bajorath

    2015-08-01

    Full Text Available Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.

  1. A Collaborative Approach to Identifying Social Media Markers of Schizophrenia by Employing Machine Learning and Clinical Appraisals.

    Science.gov (United States)

    Birnbaum, Michael L; Ernala, Sindhu Kiranmai; Rizvi, Asra F; De Choudhury, Munmun; Kane, John M

    2017-08-14

    Linguistic analysis of publicly available Twitter feeds has achieved success in differentiating individuals who self-disclose online as having schizophrenia from healthy controls. To date, limited efforts have included expert input to evaluate the authenticity of diagnostic self-disclosures. This study aims to move from noisy self-reports of schizophrenia on social media to more accurate identification of diagnoses by exploring a human-machine partnered approach, wherein computational linguistic analysis of shared content is combined with clinical appraisals. Twitter timeline data, extracted from 671 users with self-disclosed diagnoses of schizophrenia, were appraised for authenticity by expert clinicians. Data from disclosures deemed true were used to build a classifier aiming to distinguish users with schizophrenia from healthy controls. Results from the classifier were compared to expert appraisals on new, unseen Twitter users. Significant linguistic differences were identified in the schizophrenia group, including greater use of interpersonal pronouns (P<.001), decreased emphasis on friendship (P<.001), and greater emphasis on biological processes (P<.001). The resulting classifier distinguished users with disclosures of schizophrenia deemed genuine from control users with a mean accuracy of 88% using linguistic data alone. Compared to clinicians on new, unseen users, the classifier's precision, recall, and accuracy measures were 0.27, 0.77, and 0.59, respectively. These data reinforce the need for ongoing collaborations integrating expertise from multiple fields to strengthen our ability to accurately identify and effectively engage individuals with mental illness online. These collaborations are crucial to overcome some of mental illnesses' biggest challenges by using digital technology. ©Michael L Birnbaum, Sindhu Kiranmai Ernala, Asra F Rizvi, Munmun De Choudhury, John M Kane. Originally published in the Journal of Medical Internet Research.
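
    The pipeline described here — linguistic features extracted from user timelines feeding a binary classifier — can be sketched in a few lines. The abstract does not specify the feature set or the model, so the TF-IDF features and logistic regression below are stand-ins, and the example timelines are fabricated placeholders:

    ```python
    from sklearn.pipeline import Pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # texts: one concatenated timeline per user; labels: 1 = clinician-verified
    # schizophrenia disclosure, 0 = control (placeholder data for illustration).
    texts = ["i feel watched all the time ...", "great game last night ..."]
    labels = [1, 0]

    clf = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
        ("lr", LogisticRegression(max_iter=1000)),
    ])
    clf.fit(texts, labels)
    print(clf.predict(["they are reading my thoughts"]))
    ```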

  2. Genome-wide local ancestry approach identifies genes and variants associated with chemotherapeutic susceptibility in African Americans.

    Directory of Open Access Journals (Sweden)

    Heather E Wheeler

    Full Text Available Chemotherapeutic agents are used in the treatment of many cancers, yet variable resistance and toxicities among individuals limit successful outcomes. Several studies have indicated outcome differences associated with ancestry among patients with various cancer types. Using both traditional SNP-based and newly developed gene-based genome-wide approaches, we investigated the genetics of chemotherapeutic susceptibility in lymphoblastoid cell lines derived from 83 African Americans, a population for which there is a disparity in the number of genome-wide studies performed. To account for population structure in this admixed population, we incorporated local ancestry information into our association model. We tested over 2 million SNPs and identified 325, 176, 240, and 190 SNPs that were suggestively associated with cytarabine-, 5'-deoxyfluorouridine (5'-DFUR)-, carboplatin-, and cisplatin-induced cytotoxicity, respectively (p ≤ 10^-4). Importantly, some of these variants are found only in populations of African descent. We also show that cisplatin-susceptibility SNPs are enriched for carboplatin-susceptibility SNPs. Using a gene-based genome-wide association approach, we identified 26, 11, 20, and 41 suggestive candidate genes for association with cytarabine-, 5'-DFUR-, carboplatin-, and cisplatin-induced cytotoxicity, respectively (p ≤ 10^-3). Fourteen of these genes showed evidence of association with their respective chemotherapeutic phenotypes in the Yoruba from Ibadan, Nigeria (p < 0.05), including TP53I11, COPS5 and GAS8, which are known to be involved in tumorigenesis. Although our results require further study, we have identified variants and genes associated with chemotherapeutic susceptibility in African Americans by using an approach that incorporates local ancestry information.
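
    The key modelling idea — testing each SNP while conditioning on local ancestry at the locus — can be illustrated with an ordinary least-squares sketch. All data below are simulated placeholders, and this simple OLS is an assumed stand-in for the study's actual association model:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 83                                  # cell lines, as in the abstract
    pheno = rng.normal(size=n)              # e.g. drug-induced cytotoxicity (placeholder)
    geno = rng.integers(0, 3, size=n)       # allele dosage at one SNP
    ancestry = rng.uniform(0, 2, size=n)    # local African ancestry dosage at the locus

    # Per-SNP association with local ancestry as a covariate (hypothetical sketch):
    X = sm.add_constant(np.column_stack([geno, ancestry]))
    fit = sm.OLS(pheno, X).fit()
    print(fit.pvalues[1])                   # p-value for the SNP term
    ```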

  3. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  4. Interactive Computer Graphics

    Science.gov (United States)

    Kenwright, David

    2000-01-01

    Aerospace data analysis tools that significantly reduce the time and effort needed to analyze large-scale computational fluid dynamics simulations have emerged this year. The current approach for most postprocessing and visualization work is to explore the 3D flow simulations with one of a dozen or so interactive tools. While effective for analyzing small data sets, this approach becomes extremely time consuming when working with data sets larger than one gigabyte. An active area of research this year has been the development of data mining tools that automatically search through gigabyte data sets and extract the salient features with little or no human intervention. With these so-called feature extraction tools, engineers are spared the tedious task of manually exploring huge amounts of data to find the important flow phenomena. The software tools identify features such as vortex cores, shocks, separation and attachment lines, recirculation bubbles, and boundary layers. Some of these features can be extracted in a few seconds; others take minutes to hours on extremely large data sets. The analysis can be performed off-line in a batch process, either during or following the supercomputer simulations. These computations have to be performed only once, because the feature extraction programs search the entire data set and find every occurrence of the phenomena being sought. Because the important questions about the data are being answered automatically, interactivity is less critical than it is with traditional approaches.

  5. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    Science.gov (United States)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in the geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN), which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments consisted of running WACCM over different VMs on the Google Compute Engine (GCE) and comparing with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput clearly shows that the SC performs better beyond approximately 100 cores (related to network speed and latency differences). From a cost point of view, Cloud Computing moves researchers from a traditional approach, where experiments were limited by the available hardware resources, to one limited by monetary resources (how many resources can be afforded). As there is an increasing movement and recommendation for budgeting HPC projects on this technology (budgets can be calculated in a more realistic way) we could see a shift on

  6. IDENTIFYING DEMENTIA IN ELDERLY POPULATION : A CAMP APPROACH

    Directory of Open Access Journals (Sweden)

    Anand P

    2015-06-01

    Full Text Available BACKGROUND: Dementia is an emerging medico-social problem affecting the elderly, and poses a challenge to clinicians and caregivers. It is usually identified at a late stage, when management becomes difficult. AIM: The aim of the camp was to identify dementia in the elderly population participating in a screening camp. MATERIALS AND METHODS: The geriatric clinic and the department of psychiatry jointly organised a five-day screening camp in September 2014 to detect dementia in the elderly, to commemorate World Alzheimer's Day. The invitation regarding the camp was sent to all senior citizen forums and also published in a leading Kannada daily newspaper. The Mini Mental Status Examination and the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) criteria were used to identify dementia. RESULTS: More elderly males than females participated in the camp, and dementia was identified in 36% of the elderly with education below the 9th standard. Dementia was found in 18% of our study population. CONCLUSION: The camp helped identify elderly people suffering from dementia and also created awareness about the condition. Hypertension and diabetes mellitus were common comorbidities in the study population. Our study suggests that organising screening camps will help identify elderly people living with dementia.

  7. Soft and hard computing approaches for real-time prediction of currents in a tide-dominated coastal area

    Digital Repository Service at National Institute of Oceanography (India)

    Charhate, S.B.; Deo, M.C.; SanilKumar, V.

    . Owing to the complex real sea conditions, such methods may not always yield satisfactory results. This paper discusses a few alternative approaches based on the soft computing tools of artificial neural networks (ANNs) and genetic programming (GP...

  8. Tiered High-Throughput Screening Approach to Identify ...

    Science.gov (United States)

    High-throughput screening (HTS) for potential thyroid-disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single concentration screen were retested in concentration-response. Due to high false positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using

  9. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to only pay a cloud computing provider for what is used. Moreover, as well as financial efficiency, cloud computing is an ecologically-friendly technology, it enables efficient data-sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
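
    Locating G-spots in probe sequences reduces to finding runs of four or more consecutive guanines, since four stacked G-tetrads can form a stable quadruplex. A minimal sketch (the threshold of four is the canonical tetrad requirement; the probe sequences are illustrative):

    ```python
    import re

    # Flag probes whose sequence contains a run of four or more guanines,
    # the motif associated with quadruplex formation on the array surface.
    G_RUN = re.compile(r"G{4,}")

    def has_g_spot(probe_seq: str) -> bool:
        return bool(G_RUN.search(probe_seq.upper()))

    print(has_g_spot("ACGTGGGGTACGT"))  # True
    print(has_g_spot("ACGTGGGTACGT"))   # False
    ```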

  10. Intelligent computational systems for space applications

    Science.gov (United States)

    Lum, Henry; Lau, Sonie

    Intelligent computational systems can be described as adaptive computational systems integrating both traditional computational approaches and artificial intelligence (AI) methodologies to meet the science and engineering data processing requirements imposed by specific mission objectives. These systems will be capable of integrating, interpreting, and understanding sensor input information; correlating that information to the "world model" stored within their database and understanding the differences, if any; defining, verifying, and validating a command sequence to merge the "external world" with the "internal world model"; and controlling the vehicle and/or platform to meet the scientific and engineering mission objectives. Performance and simulation data obtained to date indicate that the current flight processors baselined for many missions such as Space Station Freedom do not have the computational power to meet the challenges of advanced automation and robotics systems envisioned for the year 2000 era. Research issues which must be addressed to achieve greater than giga-flop performance for on-board intelligent computational systems have been identified, and a technology development program has been initiated to achieve the desired long-term system performance objectives.

  11. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated to prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set by using a high confidence human …, and procymidone exerted their effects mainly via interference with steroidogenesis and nuclear receptors. Prochloraz was associated to a large number of human diseases, and together with tebuconazole showed several significant associations to Testicular Dysgenesis Syndrome. Mancozeb showed a differential mode

  12. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches.

    Science.gov (United States)

    Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva

    2018-07-01

    To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    Science.gov (United States)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of the physical properties of biomaterials is important for understanding and designing agri-food and wood processing operations. In the study presented in this paper, computational methods were developed and combined with experiments to enhance the identification of agri-food and forest product properties and to predict heat and water transport in such products. They were based on the finite element model of heat and water transport and supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as input for simulation of the examined processes, enabled a reduction in the uncertainty associated with predictions.

  14. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships between ROI-pairs that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  15. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2017-02-03

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.

  16. Progress and challenges in bioinformatics approaches for enhancer identification

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2017-01-01

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration.
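
    A common instance of the supervised framework the review describes is a classifier trained on epigenomic signals over candidate windows. A hedged sketch with synthetic features — the marks, labels, and choice of random forest are illustrative, not taken from any specific method covered in the review:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Predict enhancer vs. non-enhancer windows from epigenomic features
    # (e.g. H3K4me1, H3K27ac, DNase signal). All values are placeholders.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                     # columns: H3K4me1, H3K27ac, DNase
    y = (X[:, 1] + 0.5 * X[:, 2] > 0.8).astype(int)   # synthetic "enhancer" label

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(model.feature_importances_)                 # which marks drive the prediction
    ```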

  17. On the computation of the demagnetization tensor field for an arbitrary particle shape using a Fourier space approach

    International Nuclear Information System (INIS)

    Beleggia, M.; Graef, M. de

    2003-01-01

    A method is presented to compute the demagnetization tensor field for uniformly magnetized particles of arbitrary shape. By means of a Fourier space approach it is possible to compute analytically the Fourier representation of the demagnetization tensor field for a given shape. Then, specifying the direction of the uniform magnetization, the demagnetizing field and the magnetostatic energy associated with the particle can be evaluated. In some particular cases, the real space representation is computable analytically. In general, a numerical inverse fast Fourier transform is required to perform the inversion. As an example, the demagnetization tensor field for the tetrahedron will be given
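
    The essence of the Fourier approach is that the demagnetization tensor is analytic in k-space, N(k) = k kᵀ/|k|², so the field of any shape follows from FFTs of its magnetization distribution. A minimal NumPy sketch for a uniformly magnetized sphere (discretization and box size are illustrative; periodic images introduce a small error relative to the analytic interior factor of -1/3):

    ```python
    import numpy as np

    # H_d(k) = -N(k) M(k) with N(k) = k k^T / |k|^2; M is nonzero inside the mask.
    N = 64
    x = np.arange(N) - N // 2
    X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
    mask = (X**2 + Y**2 + Z**2) < (N // 4) ** 2       # spherical particle
    Mz = mask.astype(float)                           # uniform magnetization along z

    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(N)] * 3, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                                 # avoid division by zero at k = 0

    Mz_k = np.fft.fftn(Mz)
    kdotM = kz * Mz_k                                 # k . M for M = (0, 0, Mz)
    Hz = np.real(np.fft.ifftn(-kz * kdotM / k2))      # z-component of H_d

    # For a sphere the interior field should approach -Mz / 3.
    print(Hz[N // 2, N // 2, N // 2])
    ```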

  18. A Computer-Assisted Personalized Approach in an Undergraduate Plant Physiology Class

    Science.gov (United States)

    Artus, Nancy N.; Nadler, Kenneth D.

    1999-01-01

    We used Computer-Assisted Personalized Approach (CAPA), a networked teaching and learning tool that generates computer individualized homework problem sets, in our large-enrollment introductory plant physiology course. We saw significant improvement in student examination performance with regular homework assignments, with CAPA being an effective and efficient substitute for hand-graded homework. Using CAPA, each student received a printed set of similar but individualized problems of a conceptual (qualitative) and/or quantitative nature with quality graphics. Because each set of problems is unique, students were encouraged to work together to clarify concepts but were required to do their own work for credit. Students could enter answers multiple times without penalty, and they were able to obtain immediate feedback and hints until the due date. These features increased student time on task, allowing higher course standards and student achievement in a diverse student population. CAPA handles routine tasks such as grading, recording, summarizing, and posting grades. In anonymous surveys, students indicated an overwhelming preference for homework in CAPA format, citing several features such as immediate feedback, multiple tries, and on-line accessibility as reasons for their preference. We wrote and used more than 170 problems on 17 topics in introductory plant physiology, cataloging them in a computer library for general access. Representative problems are compared and discussed. PMID:10198076

  19. An image processing approach to computing distances between RNA secondary structures dot plots

    Directory of Open Access Journals (Sweden)

    Sapiro Guillermo

    2009-02-01

    Full Text Available Abstract Background Computing the distance between two RNA secondary structures can contribute to understanding the functional relationship between them. When used repeatedly, such a procedure may lead to finding a query RNA structure of interest in a database of structures. Several methods are available for computing distances between RNAs represented as strings or graphs, but none utilize the RNA representation with dot plots. Since dot plots are essentially digital images, there is a clear motivation to devise an algorithm for computing the distance between dot plots based on image processing methods. Results We have developed a new metric dubbed 'DoPloCompare', which compares two RNA structures. The method is based on comparing dot plot diagrams that represent the secondary structures. When analyzing two diagrams, and motivated by image processing, the distance is based on a combination of histogram correlations and a geometrical distance measure. We introduce, describe, and illustrate the procedure by two applications that utilize this metric on RNA sequences. The first application is the RNA design problem, where the goal is to find the nucleotide sequence for a given secondary structure. Examples where our proposed distance measure outperforms others are given. The second application locates peculiar point mutations that induce significant structural alterations relative to the wild-type predicted secondary structure. The approach reported in the past to solve this problem was tested on several RNA sequences with known secondary structures to affirm their prediction, as well as on a data set of ribosomal pieces. These pieces were computationally cut from a ribosome for which an experimentally derived secondary structure is available, and on each piece the prediction conveys similarity to the experimental result. Our newly proposed distance measure shows benefit in this problem as well when compared to standard methods used for assessing
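
    The two ingredients named above — a histogram correlation between the dot-plot intensity distributions and a geometric term — can be combined into a simple distance. This sketch uses a centroid shift as the geometric component, which is only a stand-in for the paper's actual geometrical measure:

    ```python
    import numpy as np

    # Hedged DoPloCompare-style distance between two dot plots (base-pair
    # probability matrices P1, P2 of equal shape, entries in [0, 1]).
    def dotplot_distance(P1, P2, bins=32, alpha=0.5):
        h1, _ = np.histogram(P1, bins=bins, range=(0, 1), density=True)
        h2, _ = np.histogram(P2, bins=bins, range=(0, 1), density=True)
        hist_corr = np.corrcoef(h1, h2)[0, 1]           # 1 = identical distributions
        idx = np.indices(P1.shape)
        c1 = (idx * P1).sum(axis=(1, 2)) / P1.sum()     # intensity centroid of plot 1
        c2 = (idx * P2).sum(axis=(1, 2)) / P2.sum()
        geom = np.linalg.norm(c1 - c2) / max(P1.shape)  # normalized centroid shift
        return alpha * (1 - hist_corr) + (1 - alpha) * geom
    ```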

  20. An Immunohistochemical Approach to Identify the Sex of Young Marine Turtles.

    Science.gov (United States)

    Tezak, Boris M; Guthrie, Kathleen; Wyneken, Jeanette

    2017-08-01

    Marine turtles exhibit temperature-dependent sex determination (TSD). During critical periods of embryonic development, the nest's thermal environment directs whether an embryo will develop as a male or female. At warmer sand temperatures, nests tend to produce female-biased sex ratios. The rapid increase of global temperature highlights the need for a clear assessment of its effects on sea turtle sex ratios. However, estimating hatchling sex ratios at rookeries remains imprecise due to the lack of sexual dimorphism in young marine turtles. We rely mainly upon laparoscopic procedures to verify hatchling sex; however, in some species, morphological sex can be ambiguous even at the histological level. Recent studies using immunohistochemical (IHC) techniques identified that embryonic snapping turtle (Chelydra serpentina) ovaries overexpressed a particular cold-induced RNA-binding protein in comparison to testes. This feature allows the identification of females vs. males. We modified this technique to successfully identify the sexes of loggerhead sea turtle (Caretta caretta) hatchlings, and independently confirmed the results by standard histological and laparoscopic methods that reliably identify sex in this species. We next tested the CIRBP IHC method on gonad samples from leatherback turtles (Dermochelys coriacea). Leatherbacks display delayed gonad differentiation, when compared to other sea turtles, making hatchling gonads difficult to sex using standard H&E stain histology. The IHC approach was successful in both C. caretta and D. coriacea samples, offering a much-needed tool to establish baseline hatchling sex ratios, particularly for assessing impacts of climate change effects on leatherback turtle hatchlings and sea turtle demographics. Anat Rec, 300:1512-1518, 2017. © 2017 Wiley Periodicals, Inc.

  1. Environmental computing compendium - background and motivation

    Science.gov (United States)

    Heikkurinen, Matti; Kranzlmüller, Dieter

    2017-04-01

    The emerging discipline of environmental computing brings together experts in applied, advanced environmental modelling. The application domains address several fundamental societal challenges, ranging from disaster risk reduction to sustainability issues (such as food security on the global scale). The community has used an intuitive, pragmatic approach when determining which initiatives are considered to "belong to the discipline". The community's growth is based on the sharing of experiences and tools, which provides opportunities for reusing solutions or applying knowledge in new settings. Thus, limiting possible synergies by applying an arbitrary, formal definition to exclude some of the sources of solutions and knowledge would be counterproductive. However, the number of individuals and initiatives involved has grown to the level where a survey of initiatives and the sub-themes they focus on is of interest. By surveying the project landscape, identifying common themes and building a shared vocabulary to describe them, we can both communicate the relevance of the new discipline to the general public more easily and make it easier for new members of the community to find the most promising collaboration partners. This talk presents the methodology and initial findings of a survey of environmental computing initiatives and organisations, as well as approaches that could lead to an environmental computing compendium: a collaboratively maintained shared resource of the environmental computing community.

  2. Delamination detection using methods of computational intelligence

    Science.gov (United States)

    Ihesiulor, Obinna K.; Shankar, Krishna; Zhang, Zhifang; Ray, Tapabrata

    2012-11-01

    Abstract A reliable delamination prediction scheme is indispensable in order to prevent potential risks of catastrophic failures in composite structures. The existence of delaminations changes the vibration characteristics of composite laminates, and hence such indicators can be used to quantify the health characteristics of laminates. An approach for online health monitoring of in-service composite laminates is presented in this paper that relies on methods based on computational intelligence. Typical changes in the observed vibration characteristics (i.e., changes in natural frequencies) are considered as inputs to identify the existence, location and magnitude of delaminations. The performance of the proposed approach is demonstrated using numerical models of composite laminates. Since this identification problem essentially involves the solution of an optimization problem, the use of finite element (FE) methods as the underlying tool for analysis turns out to be computationally expensive. A surrogate-assisted optimization approach is hence introduced to contain the computational time within affordable limits. An artificial neural network (ANN) model with Bayesian regularization is used as the underlying approximation scheme, while an improved rate of convergence is achieved using a memetic algorithm. However, building ANN surrogate models usually requires large training datasets. K-means clustering is effectively employed to reduce the size of the datasets. ANN is also used via inverse modeling to determine the position, size and location of delaminations using changes in measured natural frequencies. The results clearly highlight the efficiency and robustness of the approach.
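
    The surrogate-assisted loop — train an ANN mapping delamination parameters to natural frequencies, then invert it by optimization against measured frequencies — can be sketched as follows. The toy frequency model, network size, and the use of differential evolution in place of the paper's memetic algorithm are all illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from scipy.optimize import differential_evolution

    # Forward surrogate: (location, size) -> first three natural frequencies.
    rng = np.random.default_rng(0)
    params = rng.uniform(0, 1, size=(300, 2))                # scaled (location, size)
    freqs = np.column_stack([10 - 3 * params[:, 1],          # toy frequency model
                             25 - 5 * params[:, 0] * params[:, 1],
                             42 - 2 * params[:, 0]])
    ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                       random_state=0).fit(params, freqs)

    # Inverse step: search the parameter space for the best frequency match.
    measured = np.array([9.1, 23.8, 41.5])                   # observed frequencies
    loss = lambda p: np.sum((ann.predict(p.reshape(1, -1))[0] - measured) ** 2)
    res = differential_evolution(loss, bounds=[(0, 1), (0, 1)], seed=0)
    print(res.x)                                             # estimated location, size
    ```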

  3. Vector and parallel computing on the IBM ES/3090, a powerful approach to solving problems in the utility industry

    International Nuclear Information System (INIS)

    Bellucci, V.J.

    1990-01-01

    This paper describes IBM's approach to parallel computing using the IBM ES/3090 computer. Parallel processing concepts are discussed, including advantages, potential performance improvements, and limitations. Particular applications and capabilities of the IBM ES/3090 are presented, along with preliminary results from some utilities applying parallel processing to the simulation of system reliability, air pollution models, and power network dynamics.

  4. A computational approach to measuring the correlation between expertise and social media influence for celebrities on microblogs

    OpenAIRE

    Zhao, Wayne Xin; Liu, Jing; He, Yulan; Lin, Chin Yew; Wen, Ji-Rong

    2016-01-01

    Social media influence analysis, sometimes also called authority detection, aims to rank users based on their influence scores in social media. Existing approaches of social influence analysis usually focus on how to develop effective algorithms to quantize users’ influence scores. They rarely consider a person’s expertise levels which are arguably important to influence measures. In this paper, we propose a computational approach to measuring the correlation between expertise and social medi...

  5. An affinity pull-down approach to identify the plant cyclic nucleotide interactome

    KAUST Repository

    Donaldson, Lara Elizabeth; Meier, Stuart Kurt

    2013-01-01

    Cyclic nucleotides (CNs) are intracellular second messengers that play an important role in mediating physiological responses to environmental and developmental signals, in species ranging from bacteria to humans. In response to these signals, CNs are synthesized by nucleotidyl cyclases and then act by binding to and altering the activity of downstream target proteins known as cyclic nucleotide-binding proteins (CNBPs). A number of CNBPs have been identified across kingdoms including transcription factors, protein kinases, phosphodiesterases, and channels, all of which harbor conserved CN-binding domains. In plants however, few CNBPs have been identified as homology searches fail to return plant sequences with significant matches to known CNBPs. Recently, affinity pull-down techniques have been successfully used to identify CNBPs in animals and have provided new insights into CN signaling. The application of these techniques to plants has not yet been extensively explored and offers an alternative approach toward the unbiased discovery of novel CNBP candidates in plants. Here, an affinity pull-down technique for the identification of the plant CN interactome is presented. In summary, the method involves an extraction of plant proteins which is incubated with a CN-bait, followed by a series of increasingly stringent elutions that eliminates proteins in a sequential manner according to their affinity to the bait. The eluted and bait-bound proteins are separated by one-dimensional gel electrophoresis, excised, and digested with trypsin after which the resultant peptides are identified by mass spectrometry - techniques that are commonplace in proteomics experiments. The discovery of plant CNBPs promises to provide valuable insight into the mechanism of CN signal transduction in plants. © Springer Science+Business Media New York 2013.

  6. An affinity pull-down approach to identify the plant cyclic nucleotide interactome

    KAUST Repository

    Donaldson, Lara Elizabeth

    2013-09-03

    Cyclic nucleotides (CNs) are intracellular second messengers that play an important role in mediating physiological responses to environmental and developmental signals, in species ranging from bacteria to humans. In response to these signals, CNs are synthesized by nucleotidyl cyclases and then act by binding to and altering the activity of downstream target proteins known as cyclic nucleotide-binding proteins (CNBPs). A number of CNBPs have been identified across kingdoms including transcription factors, protein kinases, phosphodiesterases, and channels, all of which harbor conserved CN-binding domains. In plants however, few CNBPs have been identified as homology searches fail to return plant sequences with significant matches to known CNBPs. Recently, affinity pull-down techniques have been successfully used to identify CNBPs in animals and have provided new insights into CN signaling. The application of these techniques to plants has not yet been extensively explored and offers an alternative approach toward the unbiased discovery of novel CNBP candidates in plants. Here, an affinity pull-down technique for the identification of the plant CN interactome is presented. In summary, the method involves an extraction of plant proteins which is incubated with a CN-bait, followed by a series of increasingly stringent elutions that eliminates proteins in a sequential manner according to their affinity to the bait. The eluted and bait-bound proteins are separated by one-dimensional gel electrophoresis, excised, and digested with trypsin after which the resultant peptides are identified by mass spectrometry - techniques that are commonplace in proteomics experiments. The discovery of plant CNBPs promises to provide valuable insight into the mechanism of CN signal transduction in plants. © Springer Science+Business Media New York 2013.

  7. Systems approach to modeling the Token Bucket algorithm in computer networks

    Directory of Open Access Journals (Sweden)

    Ahmed N. U.

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also, we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
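
    The policing function itself is compact enough to state directly. A minimal token-bucket sketch (the rate and capacity values are illustrative; the paper's dynamic-systems formulation and feedback-control layer are not reproduced here):

    ```python
    import time

    # Tokens accrue at `rate` per second up to `capacity`; a packet of `size`
    # tokens is conformant only if enough tokens are available on arrival.
    class TokenBucket:
        def __init__(self, rate, capacity):
            self.rate, self.capacity = rate, capacity
            self.tokens = capacity
            self.stamp = time.monotonic()

        def conforms(self, size):
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.stamp) * self.rate)
            self.stamp = now
            if size <= self.tokens:
                self.tokens -= size
                return True        # forward the packet
            return False           # police (drop or mark) the packet

    tb = TokenBucket(rate=1000.0, capacity=1500.0)
    print(tb.conforms(1200), tb.conforms(1200))  # True False (burst exceeds bucket)
    ```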

  8. Identifying regions of interest in medical images using self-organizing maps.

    Science.gov (United States)

    Teng, Wei-Guang; Chang, Ping-Lin

    2012-10-01

    Advances in data acquisition, processing and visualization techniques have had a tremendous impact on medical imaging in recent years. However, the interpretation of medical images is still almost always performed by radiologists. Developments in artificial intelligence and image processing have shown the increasingly great potential of computer-aided diagnosis (CAD). Nevertheless, it has remained challenging to develop a general approach to process various commonly used types of medical images (e.g., X-ray, MRI, and ultrasound images). To facilitate diagnosis, we recommend the use of image segmentation to discover regions of interest (ROI) using self-organizing maps (SOM). We devise a two-stage SOM approach that can be used to precisely identify the dominant colors of a medical image and then segment it into several small regions. In addition, by appropriately conducting the recursive merging steps to merge smaller regions into larger ones, radiologists can usually identify one or more ROIs within a medical image.
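
    The first stage — learning an image's dominant colors with a SOM — can be sketched with a small one-dimensional lattice; the best-matching node of each pixel then yields the initial segmentation that the later merging steps refine. The node count, learning schedule, and random pixels below are illustrative:

    ```python
    import numpy as np

    # Minimal first-stage SOM: map RGB pixels onto a 1-D lattice of nodes;
    # trained node weights approximate the image's dominant colors.
    def train_som(pixels, n_nodes=8, iters=5000, lr0=0.5, sigma0=2.0):
        rng = np.random.default_rng(1)
        W = pixels[rng.integers(0, len(pixels), n_nodes)].astype(float)
        grid = np.arange(n_nodes)
        for t in range(iters):
            x = pixels[rng.integers(len(pixels))]
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))        # best-matching unit
            lr = lr0 * np.exp(-t / iters)                      # decaying learning rate
            sigma = sigma0 * np.exp(-t / iters)                # shrinking neighborhood
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma**2))  # neighborhood kernel
            W += lr * h[:, None] * (x - W)
        return W

    pixels = np.random.default_rng(0).integers(0, 256, (10000, 3))
    dominant = train_som(pixels)
    labels = np.argmin(((pixels[:, None, :] - dominant) ** 2).sum(-1), axis=1)
    ```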

  9. Identifying the Return on Investment for Army Migration to a Modular Open Systems Approach for Future and Legacy Systems

    Science.gov (United States)

    2017-04-05

    Identifying the Return on Investment for Army Migration to a Modular Open Systems Approach for Future and Legacy Systems Phillip Minor...Authorization Act (NDAA) of 2015, cites the modular open systems approach (MOSA) as both a business and technical strategy to reduce the cost of system ...access the service over the network. Combine the advances cited above with the emergence of systems developed using the modular open systems approach

  10. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on the 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was done in the frame of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, and nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches for identifying component ageing from operational data. Engineering considerations are outside the scope of the present study.
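
    One standard parametric technique for this task is the Laplace trend test on recorded failure times; the abstract does not list the exact tests used, so this is an illustrative choice:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Laplace trend test: under a homogeneous Poisson process, failure times
    # t_1..t_n in (0, T] are uniform, and U below is ~N(0,1). A significantly
    # positive U indicates an increasing failure rate, i.e. possible ageing.
    def laplace_test(times, T):
        t = np.asarray(times, dtype=float)
        n = len(t)
        U = (t.mean() - T / 2.0) / (T / np.sqrt(12.0 * n))
        return U, 1.0 - norm.cdf(U)      # one-sided p-value for ageing

    U, p = laplace_test([2.1, 5.4, 7.8, 8.9, 9.6], T=10.0)  # placeholder data
    print(f"U = {U:.2f}, p = {p:.3f}")
    ```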

  11. A Bicriteria Approach Identifying Nondominated Portfolios

    Directory of Open Access Journals (Sweden)

    Javier Pereira

    2014-01-01

    Full Text Available We explore a constructive portfolio model, formulated in terms of satisfying a given set of technical requirements with the minimum number of projects and minimum redundancy. An algorithm issued from robust portfolio modeling is adapted to a vector model, modifying the dominance condition as convenient, in order to find the set of nondominated portfolios as solutions of a bicriteria integer linear programming problem. In order to improve the former algorithm, a process finding an optimal solution of a monocriteria version of this problem is proposed; this solution is then used as a first feasible solution that helps find nondominated solutions more rapidly. Next, a sorting process is applied to the input data or information matrix, which is intended to prune infeasible solutions early in the constructive algorithm. Numerical examples show that the optimization and sorting processes both improve the computational efficiency of the original algorithm. Their limits are also shown on certain complex instances.
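
    For small instances the nondominated set can be found by direct enumeration, which makes the bicriteria structure (portfolio size versus redundancy) concrete. The projects and requirements below are toy data, and enumeration stands in for the paper's integer-programming algorithm:

    ```python
    from itertools import combinations

    # Toy bicriteria search: portfolios must cover all requirements while
    # minimizing (number of projects, redundancy = coverings beyond the first).
    covers = {"p1": {"r1", "r2"}, "p2": {"r2", "r3"},
              "p3": {"r1", "r3"}, "p4": {"r3"}}
    requirements = {"r1", "r2", "r3"}

    candidates = []
    for k in range(1, len(covers) + 1):
        for combo in combinations(covers, k):
            covered = [r for p in combo for r in covers[p]]
            if requirements <= set(covered):
                redundancy = len(covered) - len(requirements)
                candidates.append((k, redundancy, combo))

    # Keep portfolios not dominated in both criteria by any other candidate.
    nondominated = [c for c in candidates
                    if not any(d[0] <= c[0] and d[1] <= c[1] and d[:2] != c[:2]
                               for d in candidates)]
    print(nondominated)
    ```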

  12. Selecting Personal Computers.

    Science.gov (United States)

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of Analytic Hierarchy Process and Integer Goal Programming. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…
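
    The AHP half of such a hybrid can be sketched directly: criterion weights are the normalized principal eigenvector of the pairwise-comparison matrix, and these weights would then feed the integer goal program. The comparison values below are illustrative:

    ```python
    import numpy as np

    # Pairwise-comparison matrix A: A[i, j] = how much criterion i is
    # preferred over j (e.g. price vs. performance vs. support).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])  # principal eigenvector
    w = w / w.sum()                                       # normalize to sum to 1
    print(np.round(w, 3))             # criterion weights for the goal program
    ```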

  13. Reverse Vaccinology: An Approach for Identifying Leptospiral Vaccine Candidates

    Directory of Open Access Journals (Sweden)

    Odir A. Dellagostin

    2017-01-01

    Full Text Available Leptospirosis is a major public health problem with an incidence of over one million human cases each year. It is a globally distributed, zoonotic disease and is associated with significant economic losses in farm animals. Leptospirosis is caused by pathogenic Leptospira spp. that can infect a wide range of domestic and wild animals. Given the inability to control the cycle of transmission among animals and humans, there is an urgent demand for a new vaccine. Inactivated whole-cell vaccines (bacterins) are routinely used in livestock and domestic animals, however, protection is serovar-restricted and short-term only. To overcome these limitations, efforts have focused on the development of recombinant vaccines, with partial success. Reverse vaccinology (RV) has been successfully applied to many infectious diseases. A growing number of leptospiral genome sequences are now available in public databases, providing an opportunity to search for prospective vaccine antigens using RV. Several promising leptospiral antigens were identified using this approach, although only a few have been characterized and evaluated in animal models. In this review, we summarize the use of RV for leptospirosis and discuss the need for potential improvements for the successful development of a new vaccine towards reducing the burden of human and animal leptospirosis.

  14. An ordinal approach to computing with words and the preference-aversion model

    DEFF Research Database (Denmark)

    Franco de los Rios, Camilo Andres; Rodríguez, J. Tinguaro; Montero, Javier

    2014-01-01

    Computing with words (CWW) explores the brain’s ability to handle and evaluate perceptions through language, i.e., by means of the linguistic representation of information and knowledge. On the other hand, standard preference structures examine decision problems through the decomposition of the preference predicate into the simpler situations of strict preference, indifference and incomparability. Hence, following the distinctive cognitive/neurological features for perceiving positive and negative stimuli in separate regions of the brain, we consider two separate and opposite poles of preference and aversion, and obtain an extended preference structure named the Preference-aversion (P-A) structure. In this way, examining the meaning of words under an ordinal scale and using CWW’s methodology, we are able to formulate the P-A model under a simple and purely linguistic approach to decision making...

  15. A non-linear programming approach to the computer-aided design of regulators using a linear-quadratic formulation

    Science.gov (United States)

    Fleming, P.

    1985-01-01

    A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a non-linear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer-aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer.

  16. Solubility of magnetite in high temperature water and an approach to generalized solubility computations

    International Nuclear Information System (INIS)

    Dinov, K.; Ishigure, K.; Matsuura, C.; Hiroishi, D.

    1993-01-01

    Magnetite solubility in pure water was measured at 423 K in a fully teflon-covered autoclave system. A fairly good agreement was found to exist between the experimental data and calculation results obtained from the thermodynamical model, based on the assumption of Fe3O4 dissolution and Fe2O3 deposition reactions. A generalized thermodynamical approach to solubility computations under complex conditions, on the basis of minimization of the total system Gibbs free energy, was proposed. The forms of the chemical equilibria were obtained for various systems initially defined and successfully justified by the subsequent computations. A [Fe3+]T-[Fe2+]T phase diagram was introduced as a tool for systematic understanding of the magnetite dissolution phenomena in pure water and under oxidizing and reducing conditions. (orig.)

  17. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    Science.gov (United States)

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  18. An artificial intelligence approach to well log correlation

    International Nuclear Information System (INIS)

    Startzman, R.A.; Kuo, T.B.

    1986-01-01

    This paper shows how an expert computer system was developed to correlate two well logs in at least moderately difficult situations. A four step process was devised to process log trace information and apply a set of rules to identify zonal correlations. Some of the advantages and problems with the artificial intelligence approach are shown using field logs. The approach is useful and, if properly and systematically applied, it can result in good correlations

  19. Efficient approach to compute melting properties fully from ab initio with application to Cu

    Science.gov (United States)

    Zhu, Li-Fang; Grabowski, Blazej; Neugebauer, Jörg

    2017-12-01

    Applying thermodynamic integration within an ab initio-based free-energy approach is a state-of-the-art method to calculate melting points of materials. However, the high computational cost and the reliance on a good reference system for calculating the liquid free energy have so far hindered a general application. To overcome these challenges, we propose the two-optimized references thermodynamic integration using Langevin dynamics (TOR-TILD) method in this work by extending the two-stage upsampled thermodynamic integration using Langevin dynamics (TU-TILD) method, which has been originally developed to obtain anharmonic free energies of solids, to the calculation of liquid free energies. The core idea of TOR-TILD is to fit two empirical potentials to the energies from density functional theory based molecular dynamics runs for the solid and the liquid phase and to use these potentials as reference systems for thermodynamic integration. Because the empirical potentials closely reproduce the ab initio system in the relevant part of the phase space the convergence of the thermodynamic integration is very rapid. Therefore, the proposed approach improves significantly the computational efficiency while preserving the required accuracy. As a test case, we apply TOR-TILD to fcc Cu computing not only the melting point but various other melting properties, such as the entropy and enthalpy of fusion and the volume change upon melting. The generalized gradient approximation (GGA) with the Perdew-Burke-Ernzerhof (PBE) exchange-correlation functional and the local-density approximation (LDA) are used. Using both functionals gives a reliable ab initio confidence interval for the melting point, the enthalpy of fusion, and entropy of fusion.
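
    The final quadrature step of any such thermodynamic-integration scheme is simple once the coupling averages are in hand. A sketch with placeholder numbers (the lambda grid and the <U1 - U0> values are not data from the paper):

    ```python
    import numpy as np

    # Thermodynamic integration between a reference potential U0 (e.g. a fitted
    # empirical potential) and the target system U1:
    #   F1 - F0 = integral_0^1 <U1 - U0>_lambda d(lambda),
    # where each average comes from an MD run at fixed coupling lambda.
    lams = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    dU = np.array([12.4, 10.1, 8.3, 7.0, 6.2])   # <U1 - U0>_lambda, placeholder meV/atom

    delta_F = np.trapz(dU, lams)                 # quadrature over lambda
    print(f"Delta F = {delta_F:.2f} meV/atom")
    ```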

  20. Domain-restricted mutation analysis to identify novel driver events in human cancer

    Directory of Open Access Journals (Sweden)

    Sanket Desai

    2017-10-01

    Full Text Available Analysis of mutational spectra across various cancer types has given valuable insights into tumorigenesis. Different approaches have been used to identify novel drivers from the set of somatic mutations, including methods that use sequence conservation, geometric localization and pathway information. Recent computational methods suggest the use of protein domain information for analysis and understanding of the functional consequences of non-synonymous mutations. Similarly, evidence suggests that recurrence at specific positions in proteins is a robust indicator of functional impact. Building on this, we performed a systematic analysis of TCGA exome-derived somatic mutations across 6089 PFAM domains, and significantly mutated domains were identified using a randomization approach. Multiple alignment of individual domains allowed us to prioritize conserved residues mutated at analogous positions across different proteins in a statistically disciplined manner. In addition to the known frequently mutated genes, this analysis independently identifies the low-frequency Meprin and TRAF-Homology (MATH) domain in the Speckle Type BTB/POZ (SPOP) protein in prostate adenocarcinoma. Results from this analysis will help generate hypotheses about the downstream molecular mechanisms resulting in cancer phenotypes.
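
    The randomization step — asking whether a domain family collects more mutations than a uniform scattering would produce — can be sketched as a simple binomial permutation test. All counts below are placeholders, not values from the TCGA analysis:

    ```python
    import numpy as np

    # Is the observed mutation count in one domain family higher than expected
    # if the same mutations were scattered uniformly over all domain positions?
    rng = np.random.default_rng(0)
    domain_len, total_len = 130, 50000     # e.g. one domain vs. all domain positions
    observed, total_mutations = 19, 2000   # placeholder counts

    null = rng.binomial(total_mutations, domain_len / total_len, size=100000)
    p_value = (null >= observed).mean()    # empirical tail probability
    print(f"empirical p = {p_value:.4f}")
    ```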

  1. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    Science.gov (United States)

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  2. Computer-Assisted, Counselor-Delivered Smoking Cessation Counseling for Community College Students: Intervention Approach and Sample Characteristics

    Science.gov (United States)

    Prokhorov, Alexander V.; Fouladi, Rachel T.; de Moor, Carl; Warneke, Carla L.; Luca, Mario; Jones, Mary Mullin; Rosenblum, Carol; Emmons, Karen M.; Hudmon, Karen Suchanek; Yost, Tracey E.; Gritz, Ellen R.

    2007-01-01

    This report presents the experimental approach and baseline findings from "Look at Your Health," an ongoing study to develop and evaluate a computer-assisted, counselor-delivered smoking cessation program for community college students. It describes the expert system software program used for data collection and for provision of tailored feedback,…

  3. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    In this paper, a new computationally efficient image registration method is … the proposed method requires less computational time as compared to traditional methods. … Zitová B and Flusser J 2003 Image registration methods: A survey.

  4. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale…

  5. Structural Identifiability of Dynamic Systems Biology Models.

    Science.gov (United States)

    Villaverde, Alejandro F; Barreiro, Antonio; Papachristodoulou, Antonis

    2016-10-01

    A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
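
    The underlying observability idea can be demonstrated on a toy one-state model: augment the state vector with the unknown parameter, stack successive Lie derivatives of the output, and check the rank of the resulting observability-identifiability matrix. This is a minimal Python/SymPy sketch of that idea, not the STRIKE-GOLDD MATLAB toolbox itself:

```python
import sympy as sp

# Toy model: x' = -p*x, output y = x, unknown parameter p.
x, p = sp.symbols('x p')
states = sp.Matrix([x, p])   # augment the state with the parameter
f = sp.Matrix([-p * x, 0])   # dynamics (the parameter is constant)
h = sp.Matrix([x])           # measured output

# Stack Lie derivatives of the output along the vector field f.
lies = [h]
for _ in range(len(states) - 1):
    lies.append(lies[-1].jacobian(states) * f)

# Observability-identifiability matrix: Jacobians of the Lie derivatives.
O = sp.Matrix.vstack(*[L.jacobian(states) for L in lies])
print(O.rank() == len(states))  # full rank -> locally identifiable
```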

  6. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of these areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude…

  7. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads, ranging from simulation, data transformation and analysis, and complex workflows, to name just a few. These systems may process data at various security levels, but in doing so are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level, and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state of the art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  8. Identifying determinants of medication adherence following myocardial infarction using the Theoretical Domains Framework and the Health Action Process Approach.

    Science.gov (United States)

    Presseau, Justin; Schwalm, J D; Grimshaw, Jeremy M; Witteman, Holly O; Natarajan, Madhu K; Linklater, Stefanie; Sullivan, Katrina; Ivers, Noah M

    2017-10-01

    Despite evidence-based recommendations, adherence with secondary prevention medications post-myocardial infarction (MI) remains low. Taking medication requires behaviour change, and using behavioural theories to identify what factors determine adherence could help in developing novel adherence interventions. We compared the utility of different behaviour theory-based approaches for identifying modifiable determinants of medication adherence post-MI that could be targeted by interventions. Two studies were conducted with patients 0-2, 3-12, 13-24 or 25-36 weeks post-MI. Study 1: 24 patients were interviewed about barriers and facilitators to medication adherence. Interviews were conducted and coded using the Theoretical Domains Framework. Study 2: 201 patients answered a telephone questionnaire assessing Health Action Process Approach constructs to predict intention and medication adherence (MMAS-8). Study 1: the domains identified were Beliefs about Consequences, Memory/Attention/Decision Processes, Behavioural Regulation, Social Influences and Social Identity. Study 2: 64, 59, 42 and 58% reported high adherence at 0-2, 3-12, 13-24 and 25-36 weeks, respectively. Social Support and Action Planning predicted adherence at all time points, though the relationship between Action Planning and adherence decreased over time. Using two behaviour theory-based approaches provided complementary findings and identified modifiable factors that could be targeted to help translate intention into action and improve medication adherence post-MI.

  9. Sentinel nodes identified by computed tomography-lymphography accurately stage the axilla in patients with breast cancer

    International Nuclear Information System (INIS)

    Motomura, Kazuyoshi; Sumino, Hiroshi; Noguchi, Atsushi; Horinouchi, Takashi; Nakanishi, Katsuyuki

    2013-01-01

    Sentinel node biopsy often results in the identification and removal of multiple nodes as sentinel nodes, although most of these nodes could be non-sentinel nodes. This study investigated whether computed tomography-lymphography (CT-LG) can distinguish sentinel nodes from non-sentinel nodes and whether sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. This study included 184 patients with breast cancer and clinically negative nodes. Contrast agent was injected interstitially. The location of sentinel nodes was marked on the skin surface using a CT laser light navigator system. Lymph nodes located just under the marks were first removed as sentinel nodes. Then, all dyed nodes or all hot nodes were removed. The mean number of sentinel nodes identified by CT-LG was significantly lower than the number of dyed and/or hot nodes removed (1.1 vs 1.8, p < 0.0001). Twenty-three (12.5%) patients had ≥2 sentinel nodes identified by CT-LG removed, whereas 94 (51.1%) patients had ≥2 dyed and/or hot nodes removed (p < 0.0001). Pathological evaluation demonstrated that 47 (25.5%) of 184 patients had metastasis to at least one node. All 47 patients demonstrated metastases to at least one of the sentinel nodes identified by CT-LG. CT-LG can distinguish sentinel nodes from non-sentinel nodes, and sentinel nodes identified by CT-LG can accurately stage the axilla in patients with breast cancer. Successful identification of sentinel nodes using CT-LG may facilitate image-based diagnosis of metastasis, possibly leading to the omission of sentinel node biopsy.

  10. Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction

    Directory of Open Access Journals (Sweden)

    Peter E Larsen

    2016-01-01

    Full Text Available In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root-mycorrhizal fungus sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungus) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120,000 experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network composed of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. This multi-omic computational modeling approach for predicting complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.
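
    As a toy illustration of such a sensor network (all node names hypothetical), the receptor-to-gene links can be represented as a directed graph and queried for the genes reachable from a given extracellular signal:

```python
import networkx as nx

# Hypothetical sensor chain: signal -> receptor -> kinase -> transcription
# factor -> regulated genes. Names are illustrative, not the paper's data.
G = nx.DiGraph()
G.add_edges_from([
    ("jasmonic_acid", "receptor_1"),
    ("receptor_1", "kinase_A"),
    ("kinase_A", "TF_X"),
    ("TF_X", "gene_001"), ("TF_X", "gene_002"),
])

# Which genes could respond to the extracellular signal?
targets = sorted(n for n in nx.descendants(G, "jasmonic_acid")
                 if n.startswith("gene"))
print(targets)  # ['gene_001', 'gene_002']
```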

  11. A physarum-inspired prize-collecting steiner tree approach to identify subnetworks for drug repositioning.

    Science.gov (United States)

    Sun, Yahui; Hameed, Pathima Nusrath; Verspoor, Karin; Halgamuge, Saman

    2016-12-05

    Drug repositioning can reduce the time, costs and risks of drug development by identifying new therapeutic effects for known drugs. It is challenging to reposition drugs, as pharmacological data is large and complex. Subnetwork identification has already been used to simplify the visualization and interpretation of biological data, but it has not been applied to drug repositioning so far. In this paper, we fill this gap by proposing a new Physarum-inspired Prize-Collecting Steiner Tree algorithm to identify subnetworks for drug repositioning. Drug Similarity Networks (DSN) are generated using the chemical, therapeutic, protein, and phenotype features of drugs. In DSNs, vertex prizes and edge costs represent the similarities and dissimilarities between drugs respectively, and terminals represent drugs in the cardiovascular class, as defined in the Anatomical Therapeutic Chemical classification system. We apply both the proposed algorithm and the widely used GW algorithm to identify subnetworks in our 18 generated DSNs. In these DSNs, our proposed algorithm identifies subnetworks with an average Rand Index of 81.1%, while the GW algorithm can only identify subnetworks with an average Rand Index of 64.1%. We select 9 subnetworks with high Rand Index to find drug repositioning opportunities. Ten frequently occurring drugs in these subnetworks are identified as candidates to be repositioned for cardiovascular diseases. We find evidence to support previous discoveries that nitroglycerin, theophylline and acarbose may be able to be repositioned for cardiovascular diseases. Moreover, we identify seven previously unknown drug candidates that also may interact with the biological cardiovascular system. These discoveries show that our proposed Prize-Collecting Steiner Tree approach is a promising strategy for drug repositioning.
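
    The sketch below illustrates the general subnetwork-identification idea on a toy drug similarity network. It is not the paper's Physarum-inspired algorithm: it folds vertex prizes into edge weights with a simple heuristic and then applies the generic Steiner-tree approximation shipped with networkx:

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# Toy drug similarity network (names and numbers hypothetical). Vertex
# prizes reward similar drugs; edge costs penalize dissimilar ones.
prizes = {"drugA": 1.0, "drugB": 0.4, "drugC": 0.8, "drugD": 0.1}
edges = [("drugA", "drugB", 0.2), ("drugB", "drugC", 0.5),
         ("drugA", "drugD", 0.9), ("drugC", "drugD", 0.3)]

G = nx.Graph()
for u, v, cost in edges:
    # Heuristic: reduce each edge cost by half the endpoint prizes,
    # floored at zero, so prize-rich regions become cheap to connect.
    G.add_edge(u, v, weight=max(cost - 0.5 * (prizes[u] + prizes[v]), 0.0))

terminals = ["drugA", "drugC"]  # e.g. known cardiovascular-class drugs
T = steiner_tree(G, terminals, weight="weight")
print(sorted(T.edges()))
```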

  12. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
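
    At its core, ABC replaces likelihood evaluation with simulation: draw parameters from the prior, simulate summary statistics, and keep the draws whose statistics fall close to the observed ones. A minimal rejection-sampling sketch (the simulator and tolerance here are synthetic stand-ins, not PopSizeABC's coalescent machinery):

```python
import numpy as np

# Minimal ABC rejection sketch. The "simulator" below is a synthetic
# stand-in; PopSizeABC uses coalescent simulations summarized by the
# folded allele frequency spectrum and binned linkage disequilibrium.
rng = np.random.default_rng(42)

def simulate_summaries(pop_size):
    return rng.normal(loc=np.log(pop_size), scale=0.1, size=4)

observed = simulate_summaries(10_000)  # pretend these are the data
accepted = []
for _ in range(5_000):
    n = 10 ** rng.uniform(2, 6)        # log-uniform prior on N
    if np.linalg.norm(simulate_summaries(n) - observed) < 0.2:
        accepted.append(n)             # keep draws close to the data

if accepted:
    print(f"posterior median N ~ {np.median(accepted):.0f}")
```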

  13. Computer Assisted REhabilitation (CARE) Lab: A novel approach towards Pediatric Rehabilitation 2.0.

    Science.gov (United States)

    Olivieri, Ivana; Meriggi, Paolo; Fedeli, Cristina; Brazzoli, Elena; Castagna, Anna; Roidi, Marina Luisa Rodocanachi; Angelini, Lucia

    2018-01-01

    Pediatric rehabilitation therapists have always worked with a variety of off-the-shelf or custom-made objects and devices, more recently including computer-based systems. These Information and Communication Technology (ICT) solutions vary widely in complexity, from easy-to-use interactive videogame consoles originally intended for entertainment to sophisticated systems specifically developed for rehabilitation. This paper describes the principles underlying an innovative "Pediatric Rehabilitation 2.0" approach, based on the combination of suitable ICT solutions and traditional rehabilitation, which has been progressively refined while building up and using a computer-assisted rehabilitation laboratory. These principles are summarized in the acronym EPIQ, for Ecological, Personalized, Interactive and Quantitative. The paper also presents the laboratory, which has been designed to meet children's rehabilitation needs and to empower therapists in their work. The laboratory is equipped with commercial hardware and specially developed software called VITAMIN: a virtual reality platform for motor and cognitive rehabilitation.

  14. A neural algorithm for a fundamental computing problem.

    Science.gov (United States)

    Dasgupta, Sanjoy; Stevens, Charles F; Navlakha, Saket

    2017-11-10

    Similarity search, such as identifying similar images in a database or similar documents on the web, is a fundamental computing problem faced by large-scale information retrieval systems. We discovered that the fruit fly olfactory circuit solves this problem with a variant of a computer science algorithm (called locality-sensitive hashing). The fly circuit assigns similar neural activity patterns to similar odors, so that behaviors learned from one odor can be applied when a similar odor is experienced. The fly algorithm, however, uses three computational strategies that depart from traditional approaches. These strategies can be translated to improve the performance of computational similarity searches. This perspective helps illuminate the logic supporting an important sensory function and provides a conceptually new algorithm for solving a fundamental computational problem.
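
    A minimal sketch of the fly-inspired hashing scheme as commonly described: project the input through a sparse binary random matrix into a much higher-dimensional space, then keep only the indices of the top-k activations (winner-take-all) as the hash. Dimensions, sparsity and hash length below are illustrative choices, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 50, 2000, 32        # input dim, expanded dim, hash length

# Each expansion "neuron" samples roughly 10% of the input dimensions.
M = (rng.random((m, d)) < 0.1).astype(float)

def fly_hash(x):
    activity = M @ x
    return np.sort(np.argpartition(activity, -k)[-k:])  # top-k indices

a = rng.random(d)
b = a + 0.01 * rng.random(d)  # a near-duplicate of a
overlap = len(np.intersect1d(fly_hash(a), fly_hash(b))) / k
print(f"hash overlap for similar inputs: {overlap:.2f}")  # close to 1.0
```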

  15. Computer-aided testing and operational aids for PARR-1 nuclear reactor

    International Nuclear Information System (INIS)

    Ansari, S.A.

    1990-01-01

    The utilization of the plant computer of the Pakistan Research Reactor (PARR-1) for automatic periodic testing of the reactor's nuclear instrumentation is described. Computer algorithms have been developed for on-line acquisition and real-time processing of nuclear channel signals. The mean value, standard deviation, and probability distribution of each nuclear channel signal are obtained in real time, and the computer generates a warning message if the signal error exceeds the maximum permissible error. In this way a faulty channel is automatically identified. Other real-time algorithms are also described that assist the operator in safe reactor operation by automatically computing the approach to criticality during reactor start-up and determining control rod worth.
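
    A simplified sketch of the periodic channel-check logic described above (the threshold and channel data are hypothetical; the actual system ran against live nuclear channel signals):

```python
import numpy as np

# Flag channels whose relative signal error exceeds a permissible band.
MAX_REL_ERROR = 0.05  # hypothetical limit

def check_channels(samples):
    """samples: dict mapping channel name -> array of readings."""
    for name, x in samples.items():
        mean, std = np.mean(x), np.std(x)
        if std / abs(mean) > MAX_REL_ERROR:
            print(f"WARNING: channel {name} error exceeds limit "
                  f"(mean={mean:.2f}, std={std:.2f})")

rng = np.random.default_rng(1)
check_channels({
    "CH1": rng.normal(100, 1, 1000),   # healthy channel
    "CH2": rng.normal(100, 20, 1000),  # noisy (faulty) channel
})
```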

  16. Probing the structure of complex solids using a distributed computing approach-Applications in zeolite science

    International Nuclear Information System (INIS)

    French, Samuel A.; Coates, Rosie; Lewis, Dewi W.; Catlow, C. Richard A.

    2011-01-01

    We demonstrate the viability of distributed computing techniques employing idle desktop computers in investigating complex structural problems in solids. Through the use of a combined Monte Carlo and energy minimisation method, we show how a large parameter space can be effectively scanned. By controlling the generation and running of different configurations through a database engine, we are able to not only analyse the data 'on the fly' but also direct the running of jobs and the algorithms for generating further structures. As an exemplar case, we probe the distribution of Al and extra-framework cations in the structure of the zeolite Mordenite. We compare our computed unit cells with experiment and find that whilst there is excellent correlation between computed and experimentally derived unit cell volumes, cation positioning and short-range Al ordering (i.e. near neighbour environment), there remains some discrepancy in the distribution of Al throughout the framework. We also show that stability-structure correlations only become apparent once a sufficiently large sample is used.

  17. The Trapping Index: How to integrate the Eulerian and the Lagrangian approach for the computation of the transport time scales of semi-enclosed basins.

    Science.gov (United States)

    Cucco, Andrea; Umgiesser, Georg

    2015-09-15

    In this work, we investigated whether the Eulerian and the Lagrangian approaches for the computation of the Transport Time Scales (TTS) of semi-enclosed water bodies can be used univocally to define the spatial variability of basin flushing features. The Eulerian and Lagrangian TTS were computed for both simplified test cases and a realistic domain: the Venice Lagoon. The results confirmed that the two approaches cannot be adopted univocally and that the spatial variability of the water renewal capacity can be investigated only through the computation of both TTS. A specific analysis, based on the computation of a so-called Trapping Index, was then proposed to integrate the information provided by the two approaches. The results proved the Trapping Index useful for avoiding misleading interpretations that arise when a basin's renewal features are evaluated from an Eulerian-only or a Lagrangian-only perspective.

  18. Computational analysis of the SRS Phase III salt disposition alternatives

    International Nuclear Information System (INIS)

    Dimenna, R.A.

    2000-01-01

    In late 1997, the In-Tank Precipitation (ITP) facility was shut down, and an evaluation of alternative methods to process the liquid high-level waste stored in the Savannah River Site High-Level Waste storage tanks was begun. The objective was to determine whether another process might avoid the operational difficulties encountered with ITP at a lower cost than modifying the existing process. A structured approach was used to evaluate the proposed alternatives on a common basis and identify the best one. Results from the computational analysis were a key part of the input used to select a primary and a secondary salt disposition alternative. This paper describes the process by which the computational needs were identified, addressed, and accomplished by a limited staff under stringent schedule constraints.

  19. Who’s Who at the Border? A rights-based approach to identifying human trafficking at international borders

    Directory of Open Access Journals (Sweden)

    Marika McAdam

    2013-09-01

    Full Text Available International borders are widely touted as bastions in the fight against trafficking in persons. This article acknowledges the important role border officials play in preventing human trafficking, but calls for expectations to be tempered by deference to the conceptual complexity of cross-border trafficking and the migration processes involved. The fact that many trafficked victims begin their journeys as irregular or smuggled migrants highlights the challenge border officials face in identifying trafficked persons among the people they encounter. Indicators of trafficking generally relate to the exploitation phase, leaving border officials with little guidance as to how persons vulnerable to trafficking can be accurately identified before any exploitation has occurred. Ultimately, this paper advocates a pragmatic rights-based approach to assigning anti-trafficking functions to border officials. A rights-based approach to border control acknowledges the core work of border officials as being to uphold border integrity, while ensuring that their performance of this role neither jeopardises the rights of those they intercept nor results in missed opportunities for specialists to identify trafficked persons and other vulnerable people among them.

  20. Computational Quantum Mechanics for Materials Engineers The EMTO Method and Applications

    CERN Document Server

    Vitos, L

    2007-01-01

    Traditionally, new materials have been developed by empirically correlating their chemical composition, and the manufacturing processes used to form them, with their properties. Until recently, metallurgists have not used quantum theory for practical purposes. However, the development of modern density functional methods means that today, computational quantum mechanics can help engineers to identify and develop novel materials. Computational Quantum Mechanics for Materials Engineers describes new approaches to the modelling of disordered alloys that combine the most efficient quantum-level th…