WorldWideScience

Sample records for computational approach identifies

  1. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspection of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, achieving high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating its versatility.
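
    As a purely illustrative sketch of the workhorse step, the snippet below runs scikit-image's circle Hough transform on a synthetic ring image; the image, radius range, and peak count are placeholder choices, not the survey pipeline's settings.

        # Sketch: detect ring-like patterns with the circle Hough transform (scikit-image).
        import numpy as np
        from skimage.draw import circle_perimeter
        from skimage.feature import canny
        from skimage.transform import hough_circle, hough_circle_peaks

        # Synthetic "cutout" with a faint ring, standing in for a survey image.
        image = np.zeros((120, 120))
        rr, cc = circle_perimeter(60, 60, 25)
        image[rr, cc] = 1.0
        image += 0.1 * np.random.default_rng(0).random(image.shape)

        edges = canny(image, sigma=2.0)            # edge map feeds the transform
        radii = np.arange(15, 40, 2)               # candidate ring radii (pixels)
        hspaces = hough_circle(edges, radii)       # one accumulator per trial radius
        accums, cx, cy, r = hough_circle_peaks(hspaces, radii, total_num_peaks=1)
        print(f"ring candidate at ({cx[0]}, {cy[0]}), radius {r[0]} px, score {accums[0]:.2f}")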

  2. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria, and these regions are also known as pathogenicity islands (PAIs). PAIs have detectable properties, such as genomic signatures that differ from the rest of the host genome, and mobility genes that allow them to integrate into the host genome. In this review, we will discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources will also be discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
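
    As a toy illustration of one such detectable property, the sketch below flags windows whose GC content deviates from the genome-wide signature; the window size, step, and cutoff are invented, and real PAI predictors combine many more features.

        # Sketch: flag candidate horizontally transferred regions by GC-content deviation.
        import numpy as np

        def gc_anomalies(genome: str, window: int = 5000, step: int = 1000, z_cut: float = 2.5):
            gc = np.array([1 if b in "GC" else 0 for b in genome.upper()])
            starts = range(0, len(genome) - window + 1, step)
            win_gc = np.array([gc[s:s + window].mean() for s in starts])
            z = (win_gc - win_gc.mean()) / win_gc.std()   # deviation from genome-wide signature
            return [(s, s + window) for s, zi in zip(starts, z) if abs(zi) > z_cut]

        rng = np.random.default_rng(1)
        host = "".join(rng.choice(list("ACGT"), p=[0.3, 0.2, 0.2, 0.3], size=200_000))
        island = "".join(rng.choice(list("ACGT"), p=[0.15, 0.35, 0.35, 0.15], size=10_000))
        genome = host[:100_000] + island + host[100_000:]
        print(gc_anomalies(genome))   # windows overlapping ~100-110 kb should be flagged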

  3. Computational Approaches for Mining GRO-Seq Data to Identify and Characterize Active Enhancers.

    Science.gov (United States)

    Nagari, Anusha; Murakami, Shino; Malladi, Venkat S; Kraus, W Lee

    2017-01-01

    Transcriptional enhancers are DNA regulatory elements that are bound by transcription factors and act to positively regulate the expression of nearby or distally located target genes. Enhancers have many features that have been discovered using genomic analyses. Recent studies have shown that active enhancers recruit RNA polymerase II (Pol II) and are transcribed, producing enhancer RNAs (eRNAs). GRO-seq, a method for identifying the location and orientation of all actively transcribing RNA polymerases across the genome, is a powerful approach for monitoring nascent enhancer transcription. Furthermore, the unique pattern of enhancer transcription can be used to identify enhancers in the absence of any information about the underlying transcription factors. Here, we describe the computational approaches required to identify and analyze active enhancers using GRO-seq data, including data pre-processing, alignment, and transcript calling. In addition, we describe protocols and computational pipelines for mining GRO-seq data to identify active enhancers, as well as known transcription factor binding sites that are transcribed. Furthermore, we discuss approaches for integrating GRO-seq-based enhancer data with other genomic data, including target gene expression and function. Finally, we describe molecular biology assays that can be used to confirm and explore further the function of enhancers that have been identified using genomic assays. Together, these approaches should allow the user to identify and explore the features and biological functions of new cell type-specific enhancers.
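
    A toy sketch of the transcription pattern used for enhancer calling: divergent (bidirectional) nascent transcription around a midpoint, detected from strand-specific coverage. The coverage data, window, and read thresholds below are invented.

        # Sketch: call candidate enhancers where GRO-seq coverage is divergent
        # (plus-strand reads downstream, minus-strand reads upstream of a midpoint).
        import numpy as np

        def divergent_sites(plus_cov, minus_cov, half: int = 500, min_reads: int = 20):
            hits = []
            for mid in range(half, len(plus_cov) - half):
                down = plus_cov[mid:mid + half].sum()    # sense transcription rightward
                up = minus_cov[mid - half:mid].sum()     # antisense transcription leftward
                if down >= min_reads and up >= min_reads:
                    hits.append((mid, int(down), int(up)))
            return hits

        rng = np.random.default_rng(2)
        plus = rng.poisson(0.01, 10_000); minus = rng.poisson(0.01, 10_000)
        plus[5_000:5_400] += rng.poisson(0.2, 400)       # simulated eRNA pair around 5 kb
        minus[4_600:5_000] += rng.poisson(0.2, 400)
        print(divergent_sites(plus, minus)[:3])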

  4. Computational Approach to Identify Enzymes That Are Potential Therapeutic Candidates for Psoriasis

    Directory of Open Access Journals (Sweden)

    Daeui Park

    2011-01-01

    Psoriasis is well known as a chronic inflammatory dermatosis. The disease affects persons of all ages and is a burden worldwide. Psoriasis is associated with various diseases such as arthritis. The disease is characterized by well-demarcated lesions on the skin of the elbows and knees. Various genetic and environmental factors are related to the pathogenesis of psoriasis. In order to identify enzymes that are potential therapeutic targets for psoriasis, we utilized a computational approach, combining microarray analysis and protein interaction prediction. We found 6,437 genes (3,264 upregulated and 3,173 downregulated) that have significant differences in expression between regions with and without lesions in psoriasis patients. We identified potential candidates through protein-protein interaction predictions made using various protein interaction resources. By analyzing the hub proteins of the networks with metrics such as degree and centrality, we detected 32 potential therapeutic candidates. After filtering these candidates through the ENZYME nomenclature database, we selected 5 enzymes: DNA helicase (RUVBL2), proteasome endopeptidase complex (PSMA2), nonspecific protein-tyrosine kinase (ZAP70), I-kappa-B kinase (IKBKE), and receptor protein-tyrosine kinase (EGFR). We adopted a computational approach to detect potential therapeutic targets; this approach may become an effective strategy for the discovery of new drug targets for psoriasis.
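
    A minimal sketch of the hub-analysis step using networkx degree and betweenness centrality; the edge list among the five named enzymes is invented for illustration and is not the predicted network from the study.

        # Sketch: rank hub proteins in a protein-protein interaction network by
        # degree and betweenness centrality (networkx); edges here are toy data.
        import networkx as nx

        edges = [("EGFR", "ZAP70"), ("EGFR", "IKBKE"), ("EGFR", "PSMA2"),
                 ("EGFR", "RUVBL2"), ("ZAP70", "IKBKE"), ("PSMA2", "RUVBL2")]
        G = nx.Graph(edges)

        degree = nx.degree_centrality(G)
        between = nx.betweenness_centrality(G)
        score = {n: degree[n] + between[n] for n in G}   # simple combined hub score
        for n in sorted(score, key=score.get, reverse=True):
            print(f"{n:7s} degree={degree[n]:.2f} betweenness={between[n]:.2f}")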

  5. Identifying human disease genes: advances in molecular genetics and computational approaches.

    Science.gov (United States)

    Bakhtiar, S M; Ali, A; Baig, S M; Barh, D; Miyoshi, A; Azevedo, V

    2014-07-04

    The human genome project is one of the significant achievements that have provided detailed insight into our genetic legacy. During the last two decades, biomedical investigations have gathered a considerable body of evidence by detecting more than 2000 disease genes. Despite these important advances in the genetic understanding of various diseases, the pathogenesis of many others remains obscure. With recent advances, the laborious methodologies used to identify DNA variations have been replaced by direct sequencing of genomic DNA to detect genetic changes. The ability to perform such studies depends equally on the development of high-throughput and economical genotyping methods. Currently, for essentially every disease whose origin is still unknown, genetic approaches are available, either pedigree-dependent or pedigree-independent, with the capacity to elucidate fundamental disease mechanisms. Computer algorithms and programs for linkage analysis have formed the foundation for many disease gene detection projects; similarly, databases of clinical findings have been widely used to support diagnostic decisions in dysmorphology and general human disease. For every disease type, genome sequence variations, particularly single nucleotide polymorphisms, are mapped by comparing the genetic makeup of case and control groups. Methods that predict the effects of polymorphisms on protein stability are useful for the identification of possible disease associations, and structural effects can be assessed using methods that predict stability changes in proteins from sequence and/or structural information.
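
    In its simplest form, the case/control mapping described above reduces to an association test on allele counts at a polymorphism. A minimal SciPy sketch with invented counts:

        # Sketch: test a single SNP for case/control association with a chi-square
        # test on allele counts (the counts below are made up for illustration).
        from scipy.stats import chi2_contingency

        #                 allele A   allele a
        allele_counts = [[420, 180],            # cases
                         [350, 250]]            # controls
        chi2, p, dof, expected = chi2_contingency(allele_counts)
        print(f"chi2={chi2:.2f}, p={p:.2e}, dof={dof}")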

  6. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
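
    A sketch of the generalized-motif scan: the regular expression below is an illustrative BH3-like pattern around the conserved leucine and aspartate, not the authors' exact motif, and the sequences are short examples.

        # Sketch: motif scan with a generalized BH3-like pattern. The regex
        # (hydrophobic-x3-L-x2-[ILV]-x-D) is illustrative only, not the exact
        # generalized motif used by the authors.
        import re

        BH3_LIKE = re.compile(r"[AILMVF].{3}L.{2}[ILV].D")

        proteins = {
            "BID_fragment": "EDIIRNIARHLAQVGDSMDR",   # a BID-derived BH3 region
            "random_seq":   "MKTAYIAKQRQISFVKSHFSRQ",
        }
        for name, seq in proteins.items():
            for m in BH3_LIKE.finditer(seq):
                print(f"{name}: BH3-like match '{m.group()}' at {m.start()}")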

  7. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  8. A computational approach for identifying the chemical factors involved in the glycosaminoglycans-mediated acceleration of amyloid fibril formation.

    Directory of Open Access Journals (Sweden)

    Elodie Monsellier

    BACKGROUND: Amyloid fibril formation is the hallmark of many human diseases, including Alzheimer's disease, type II diabetes and amyloidosis. Amyloid fibrils deposit in the extracellular space and generally co-localize with the glycosaminoglycans (GAGs) of the basement membrane. GAGs have been shown to accelerate the formation of amyloid fibrils in vitro for a number of protein systems. The large body of data accumulated so far has created the grounds for the construction of a database on the effects of a number of GAGs on different proteins. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we have constructed such a database and have used a computational approach that uses a combination of single parameter and multivariate analyses to identify the main chemical factors that determine the GAG-induced acceleration of amyloid formation. We show that the GAG accelerating effect is mainly governed by three parameters that account for three-fourths of the observed experimental variability: the GAG sulfation state, the solute molarity, and the ratio of protein and GAG molar concentrations. We then combined these three parameters into a single equation that predicts, with reasonable accuracy, the acceleration provided by a given GAG in a given condition. CONCLUSIONS/SIGNIFICANCE: In addition to shedding light on the chemical determinants of the protein:GAG interaction and to providing a novel mathematical predictive tool, our findings highlight the possibility that GAGs may not have such an accelerating effect on protein aggregation under the conditions existing in the basement membrane, given the values of salt molarity and protein:GAG molar ratio existing under such conditions.
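
    A sketch of fitting a three-parameter predictive equation of the kind described, by ordinary least squares on invented data; the functional form and coefficients are illustrative, not the paper's equation.

        # Sketch: fit acceleration ~ b0 + b1*sulfation + b2*molarity + b3*log(ratio)
        # by ordinary least squares; data and functional form are made up.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 40
        sulfation = rng.integers(0, 4, n)               # sulfate groups per disaccharide
        molarity = rng.uniform(0.05, 0.5, n)            # solute molarity (M)
        log_ratio = np.log10(rng.uniform(0.1, 100, n))  # protein:GAG molar ratio
        accel = (1.0 + 0.8 * sulfation - 2.0 * molarity + 0.5 * log_ratio
                 + rng.normal(0, 0.2, n))               # synthetic observations

        X = np.column_stack([np.ones(n), sulfation, molarity, log_ratio])
        beta, *_ = np.linalg.lstsq(X, accel, rcond=None)
        print("fitted coefficients:", np.round(beta, 2))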

  9. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Science.gov (United States)

    Durstewitz, Daniel

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover relevant aspects
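
    A minimal forward simulation of the model class being estimated, assuming the standard PLRNN form z_t = A z_(t-1) + W max(z_(t-1), 0) + h + noise with linear observations of max(z_t, 0); the parameters are random placeholders and the EM/Laplace inference itself is not reproduced.

        # Sketch: forward-simulate a piecewise-linear RNN (PLRNN) state space model.
        import numpy as np

        rng = np.random.default_rng(4)
        dz, dx, T = 4, 10, 200
        A = np.diag(rng.uniform(0.6, 0.9, dz))      # diagonal linear dynamics
        W = 0.2 * rng.standard_normal((dz, dz))     # piecewise-linear coupling
        np.fill_diagonal(W, 0.0)
        h = 0.1 * rng.standard_normal(dz)
        B = rng.standard_normal((dx, dz))           # observation model (e.g., firing rates)

        z = np.zeros((T, dz)); x = np.zeros((T, dx))
        for t in range(1, T):
            phi = np.maximum(z[t - 1], 0.0)         # ReLU nonlinearity
            z[t] = A @ z[t - 1] + W @ phi + h + 0.05 * rng.standard_normal(dz)
            x[t] = B @ np.maximum(z[t], 0.0) + 0.1 * rng.standard_normal(dx)
        print("latent trajectory shape:", z.shape, "observations shape:", x.shape)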

  10. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    Directory of Open Access Journals (Sweden)

    Daniel Durstewitz

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable to recover

  11. Identifying potential selective fluorescent probes for cancer-associated protein carbonic anhydrase IX using a computational approach.

    Science.gov (United States)

    Kamstra, Rhiannon L; Floriano, Wely B

    2014-11-01

    Carbonic anhydrase IX (CAIX) is a biomarker for tumor hypoxia. Fluorescent inhibitors of CAIX have been used to study hypoxic tumor cell lines. However, these inhibitor-based fluorescent probes may have a therapeutic effect that is not appropriate for monitoring treatment efficacy. In the search for novel fluorescent probes that are not based on known inhibitors, a database of 20,860 fluorescent compounds was virtually screened against CAIX using hierarchical virtual ligand screening (HierVLS). The screening database contained 14,862 compounds tagged with the ATTO680 fluorophore plus an additional 5998 intrinsically fluorescent compounds. Overall ranking of compounds to identify hit molecular probe candidates utilized a principal component analysis (PCA) approach. Four potential binding sites, including the catalytic site, were identified within the structure of the protein and targeted for virtual screening. Available sequence information for 23 carbonic anhydrase isoforms was used to prioritize the four sites based on the estimated "uniqueness" of each site in CAIX relative to the other isoforms. A database of 32 known inhibitors and 478 decoy compounds was used to validate the methodology. A receiver-operating characteristic (ROC) analysis using the first principal component (PC1) as predictive score for the validation database yielded an area under the curve (AUC) of 0.92. AUC is interpreted as the probability that a binder will have a better score than a non-binder. The use of first component analysis of binding energies for multiple sites is a novel approach for hit selection. The very high prediction power for this approach increases confidence in the outcome from the fluorescent library screening. Ten of the top scoring candidates for isoform-selective putative binding sites are suggested for future testing as fluorescent molecular probe candidates.
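
    A sketch of the ranking-and-validation idea: per-site scores are combined by PCA and the first principal component is evaluated as a classifier score via ROC AUC. The binding energies below are synthetic stand-ins for the 32 known inhibitors and 478 decoys.

        # Sketch: PCA over per-site docking scores, then ROC AUC of PC1.
        import numpy as np

        rng = np.random.default_rng(5)
        binders = rng.normal(-8, 1, (32, 4))    # 32 binders x 4 sites (made-up kcal/mol)
        decoys = rng.normal(-6, 1, (478, 4))    # 478 decoys x 4 sites
        E = np.vstack([binders, decoys])
        labels = np.r_[np.ones(32), np.zeros(478)]

        Ec = E - E.mean(axis=0)                 # centre, then PCA via SVD
        _, _, Vt = np.linalg.svd(Ec, full_matrices=False)
        pc1 = Ec @ Vt[0]
        if pc1[labels == 1].mean() < pc1[labels == 0].mean():
            pc1 = -pc1                          # orient so higher = more binder-like

        ranks = pc1.argsort().argsort() + 1     # ascending ranks, 1..N
        n1, n0 = 32, 478                        # AUC via the rank-sum identity
        auc = (ranks[labels == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
        print(f"ROC AUC of the PC1 score: {auc:.2f}")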

  12. MiR-RACE, a new efficient approach to determine the precise sequences of computationally identified trifoliate orange (Poncirus trifoliata) microRNAs.

    Directory of Open Access Journals (Sweden)

    Changnian Song

    BACKGROUND: Among the hundreds of genes encoding miRNAs reported in plants, many more have been predicted by numerous computational methods. However, unlike protein-coding genes defined by start and stop codons, the ends of miRNA molecules do not have characteristics that can be used to define the mature miRNAs exactly, so computational miRNA prediction methods often cannot predict the location of the mature miRNA in a precursor with nucleotide-level precision. To our knowledge, there have been no reports of comprehensive strategies for determining the precise sequences, especially the two termini, of these miRNAs. METHODS: In this study, we report an efficient method to determine the precise sequences of computationally predicted microRNAs (miRNAs) that combines miRNA-enriched library preparation, two specific 5' and 3' miRNA RACE (miR-RACE) PCR reactions, and sequence-directed cloning, in which the most challenging step is the design of the two gene-specific primers for the two RACE reactions. miRNA-mediated mRNA cleavage by RLM-5' RACE and sequencing were carried out to validate the miRNAs detected. Real-time PCR was used to analyze the expression of each miRNA. RESULTS: The efficiency of this newly developed method was validated using nine trifoliate orange (Poncirus trifoliata) miRNAs predicted computationally. The miRNAs computationally identified were validated by miR-RACE and sequencing. Quantitative analysis showed that they have variable expression. Eight target genes have been experimentally verified by detection of the miRNA-mediated mRNA cleavage in Poncirus trifoliata. CONCLUSION: The efficient and powerful approach developed herein can be successfully used to validate the sequences of miRNAs, especially the termini, which depict the complete miRNA sequence in the computationally predicted precursor.

  13. Identifying Causal Effects with Computer Algebra

    CERN Document Server

    García-Puente, Luis David; Sullivant, Seth

    2010-01-01

    The long-standing identification problem for causal effects in graphical models has many partial results but lacks a systematic study. We show how computer algebra can be used either to prove that a causal effect is identified or generically identified, or to show that the effect is not generically identifiable. We report on the results of our computations for linear structural equation models, where we determine precisely which causal effects are generically identifiable for all graphs on three and four vertices.
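
    The flavor of the computation can be seen on the smallest interesting case, the instrumental-variable graph Z -> X -> Y with latent confounding between X and Y: solving the model-implied covariance equations symbolically recovers the causal effect as a rational function of observed covariances. A toy sympy sketch, not the paper's three- and four-vertex enumeration:

        # Sketch: computer algebra (sympy) identifies a causal effect in a tiny
        # linear structural equation model (the instrumental-variable graph).
        import sympy as sp

        a, b, c = sp.symbols("a b c")   # a: Z->X, b: X->Y, c: latent X~Y covariance
        # Model-implied covariances (unit-variance exogenous terms):
        #   X = a*Z + ex,  Y = b*X + ey,  cov(ex, ey) = c,  var(Z) = 1.
        cov_zx = a
        cov_zy = a * b
        cov_xy = b * (a**2 + 1) + c

        s_zx, s_zy, s_xy = sp.symbols("s_zx s_zy s_xy")   # observed covariances
        sol = sp.solve([sp.Eq(cov_zx, s_zx), sp.Eq(cov_zy, s_zy), sp.Eq(cov_xy, s_xy)],
                       [a, b, c], dict=True)
        print(sol[0][b])   # -> s_zy/s_zx: the effect b is generically identifiable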

  14. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing CTL vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open source Protein Data File (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.

  15. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    Science.gov (United States)

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius2 and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  16. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  17. Identifying Structure-Property Relationships Through DREAM.3D Representative Volume Elements and DAMASK Crystal Plasticity Simulations: An Integrated Computational Materials Engineering Approach

    Science.gov (United States)

    Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk

    2017-03-01

    Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems—in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship—give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experimentally-based alloy development workflows inappropriate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys by simulation studies to replace time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted here for a high-manganese steel.

  18. A computational approach identifies two regions of Hepatitis C Virus E1 protein as interacting domains involved in viral fusion process.

    Science.gov (United States)

    Bruni, Roberto; Costantino, Angela; Tritarelli, Elena; Marcantonio, Cinzia; Ciccozzi, Massimo; Rapicetta, Maria; El Sawaf, Gamal; Giuliani, Alessandro; Ciccaglione, Anna Rita

    2009-07-29

    The E1 protein of Hepatitis C Virus (HCV) can be dissected into two distinct hydrophobic regions: a central domain containing a hypothetical fusion peptide (FP), and a C-terminal domain (CT) comprising two segments, a pre-anchor and a trans-membrane (TM) region. In the currently accepted model of the viral fusion process, the FP and the TM regions are considered to be closely juxtaposed in the post-fusion structure and their physical interaction cannot be excluded. In the present study, we took advantage of the natural sequence variability present among HCV strains to test, by purely sequence-based computational tools, the hypothesis that in this virus the fusion process involves the physical interaction of the FP and CT regions of E1. Two computational approaches were applied. The first is based on the co-evolution paradigm of interacting peptides and consequently on the correlation between the distance matrices generated by the sequence alignment method applied to FP and CT primary structures, respectively. In spite of the relatively low random genetic drift between genotypes, co-evolution analysis of sequences from five HCV genotypes revealed a greater correlation between the FP and CT domains than with a control HCV sequence from the Core protein, giving clear, albeit still inconclusive, support to the physical interaction hypothesis. The second approach relies upon a non-linear signal analysis method widely used in protein science called Recurrence Quantification Analysis (RQA). This method allows for a direct comparison of domains for the presence of common hydrophobicity patterns, on which the physical interaction is based. RQA greatly strengthened the reliability of the hypothesis, scoring a number of cross-recurrences between the FP and CT peptide hydrophobicity patterns that largely outnumbered chance expectations and pointed to putative interaction sites. Intriguingly, mutations in the CT region of E1, reducing the fusion process in
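
    A toy sketch of the first approach: build pairwise distance matrices for the two regions across strains and correlate them, Mantel-style. The alignments below are invented and deliberately co-varying.

        # Sketch: co-evolution signal as the correlation between pairwise distance
        # matrices of two aligned regions across strains (toy sequences, Hamming
        # distance, Pearson r of the upper triangles - a Mantel-style statistic).
        import numpy as np

        fp = ["GHRMAWDMM", "GHRLAWDMM", "GHKLAWDIM", "GHKLSWDIM"]   # made-up alignments
        ct = ["ILAGLVVLL", "ILAGLVVML", "ITAGLIVML", "ITAGIIVML"]

        def dist_matrix(seqs):
            n = len(seqs)
            D = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    D[i, j] = D[j, i] = sum(x != y for x, y in zip(seqs[i], seqs[j]))
            return D

        iu = np.triu_indices(len(fp), k=1)
        r = np.corrcoef(dist_matrix(fp)[iu], dist_matrix(ct)[iu])[0, 1]
        print(f"Mantel-style correlation between FP and CT distances: r = {r:.2f}")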

  1. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  2. A computational psychiatry approach identifies how alpha-2A noradrenergic agonist Guanfacine affects feature-based reinforcement learning in the macaque.

    Science.gov (United States)

    Hassani, S A; Oemisch, M; Balcarras, M; Westendorff, S; Ardid, S; van der Meer, M A; Tiesinga, P; Womelsdorf, T

    2017-01-16

    Noradrenaline is believed to support cognitive flexibility through the alpha 2A noradrenergic receptor (a2A-NAR) acting in prefrontal cortex. Enhanced flexibility has been inferred from improved working memory with the a2A-NA agonist Guanfacine. But it has been unclear whether Guanfacine improves specific attention and learning mechanisms beyond working memory, and whether the drug effects can be formalized computationally to allow single subject predictions. We tested and confirmed these suggestions in a case study with a healthy nonhuman primate performing a feature-based reversal learning task, evaluating performance using Bayesian and reinforcement learning models. In an initial dose-testing phase we found a Guanfacine dose that increased performance accuracy, decreased distractibility and improved learning. In a second experimental phase using only that dose, we examined the faster feature-based reversal learning with Guanfacine using single-subject computational modeling. Parameter estimation suggested that improved learning is not accounted for by varying a single reinforcement learning mechanism, but by changing the set of parameter values to higher learning rates and stronger suppression of non-chosen over chosen feature information. These findings provide an important starting point for developing nonhuman primate models to discern the synaptic mechanisms of attention and learning functions within the context of a computational neuropsychiatry framework.
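
    A minimal sketch of a model with exactly those two ingredients, a learning rate for the chosen feature and a decay that suppresses non-chosen feature values; the task structure and parameter values are invented.

        # Sketch: a toy feature-based reinforcement-learning model with a
        # prediction-error update and decay of non-chosen feature values.
        import numpy as np

        rng = np.random.default_rng(6)
        alpha, decay, beta = 0.4, 0.3, 5.0   # learning rate, non-chosen decay, inverse temp.
        values = np.zeros(2)                 # value of each colour feature
        rewarded = 0                         # colour 0 is currently rewarded

        for trial in range(120):
            if trial == 60:
                rewarded = 1                 # un-cued reversal
            p = np.exp(beta * values) / np.exp(beta * values).sum()
            choice = rng.choice(2, p=p)
            reward = float(choice == rewarded)
            values[choice] += alpha * (reward - values[choice])   # prediction-error update
            values[1 - choice] *= (1.0 - decay)                   # suppress non-chosen
            if trial in (59, 119):
                print(f"trial {trial + 1}: values = {np.round(values, 2)}")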

  3. Computational Approaches to Interface Design

    Science.gov (United States)

    Corker; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    Tools which make use of computational processes - mathematical, algorithmic and/or knowledge-based - to perform portions of the design, evaluation and/or construction of interfaces have become increasingly available and powerful. Nevertheless, there is little agreement as to the appropriate role for a computational tool to play in the interface design process. Current tools fall into broad classes depending on which portions, and how much, of the design process they automate. The purpose of this panel is to review and generalize about computational approaches developed to date, discuss the tasks for which they are suited, and suggest methods to enhance their utility and acceptance. Panel participants represent a wide diversity of application domains and methodologies. This should provide for lively discussion about implementation approaches, accuracy of design decisions, acceptability of representational tradeoffs and the optimal role for a computational tool to play in the interface design process.

  4. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process.   Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the

  5. Identifying Geographic Clusters: A Network Analytic Approach

    CERN Document Server

    Catini, Roberto; Penner, Orion; Riccaboni, Massimo

    2015-01-01

    In recent years there has been a growing interest in the role of networks and clusters in the global economy. Despite being a popular research topic in economics, sociology and urban studies, geographical clustering of human activity has often been studied by means of predetermined geographical units such as administrative divisions and metropolitan areas. This approach is intrinsically time invariant and it does not allow one to differentiate between different activities. Our goal in this paper is to present a new methodology for identifying clusters that can be applied to different empirical settings. We use a graph approach based on k-shell decomposition to analyze world biomedical research clusters based on PubMed scientific publications. We identify research institutions and locate their activities in geographical clusters. Leading areas of scientific production and their top performing research institutions are consistently identified at different geographic scales.
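
    The decomposition itself is a one-liner with networkx; the toy graph below stands in for a collaboration network, with the innermost shell playing the role of a cluster core.

        # Sketch: k-shell (k-core) decomposition with networkx.
        import networkx as nx

        G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"),
                      ("d", "e"), ("e", "f")])
        core = nx.core_number(G)               # k-shell index of every node
        k_max = max(core.values())
        print("shell index:", core)
        print("innermost shell:", sorted(n for n, k in core.items() if k == k_max))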

  6. Distributed design approach in persistent identifiers systems

    Science.gov (United States)

    Golodoniuc, Pavel; Car, Nicholas; Klump, Jens

    2017-04-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. To achieve large-scale decentralisation

  7. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ...). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins; hence the need to incorporate data from a large number of conservative species and to account for uncertainty in source concentrations and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and reevaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are the water supply network losses (22%), the sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), runoff infiltration (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not significantly affect the proportion estimations.
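
    A sketch of step (iv): mixing ratios estimated from conservative-tracer mass balances by non-negative least squares, with the sum-to-one constraint added as a heavily weighted extra equation; the end-member concentrations below are invented.

        # Sketch: estimate mixing ratios of recharge end-members from conservative
        # tracers by non-negative least squares (made-up concentrations).
        import numpy as np
        from scipy.optimize import nnls

        # Rows: tracers (e.g., Cl-, SO4--, d18O); columns: end-members
        # (supply network, sewage, rainfall/runoff, river); units arbitrary.
        A = np.array([[0.5, 2.0, 0.1, 0.9],
                      [0.3, 1.5, 0.2, 1.1],
                      [-6.0, -5.0, -7.5, -6.8]])
        sample = np.array([1.1, 0.9, -6.3])

        w = 100.0                               # weight enforcing sum(ratios) = 1
        A_aug = np.vstack([A, w * np.ones(4)])
        b_aug = np.append(sample, w)
        ratios, residual = nnls(A_aug, b_aug)
        print("mixing ratios:", np.round(ratios, 2), " sum =", round(ratios.sum(), 3))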

  8. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and

  9. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture".

  10. Computational approaches for drug discovery.

    Science.gov (United States)

    Hung, Che-Lun; Chen, Chi-Chun

    2014-09-01

    Cellular proteins are the mediators of multiple organism functions, being involved in physiological mechanisms and disease. By discovering lead compounds that affect the function of target proteins, the target diseases or physiological mechanisms can be modulated. Based on knowledge of the ligand-receptor interaction, the chemical structures of leads can be modified to improve efficacy and selectivity and to reduce side effects. One rational drug design technology, which enables drug discovery based on knowledge of target structures, functional properties and mechanisms, is computer-aided drug design (CADD). The application of CADD can be cost-effective, using experiments to compare predicted and actual drug activity, the results from which can be used iteratively to improve compound properties. The two major CADD-based approaches are structure-based drug design, where protein structures are required, and ligand-based drug design, where ligands and ligand activities can be used to design compounds interacting with the protein structure. Approaches in structure-based drug design include docking, de novo design, fragment-based drug discovery and structure-based pharmacophore modeling. Approaches in ligand-based drug design include quantitative structure-affinity relationships and pharmacophore modeling based on ligand properties. Based on whether the structure of the receptor and its interaction with the ligand are known, different design strategies can be used. After lead compounds are generated, the rule of five can be used to assess whether these have drug-like properties. Several quality validation methods, such as cost function analysis, Fisher's cross-validation analysis and the goodness-of-hit test, can be used to estimate the merits of different drug design strategies. To further improve CADD performance, multi-computer systems and graphics processing units may be applied to reduce costs.
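
    As a sketch of the rule-of-five check mentioned above, assuming the open-source RDKit toolkit is available:

        # Sketch: a Lipinski rule-of-five filter for drug-likeness (requires RDKit).
        from rdkit import Chem
        from rdkit.Chem import Crippen, Descriptors, Lipinski

        def passes_rule_of_five(smiles: str) -> bool:
            mol = Chem.MolFromSmiles(smiles)
            violations = sum([
                Descriptors.MolWt(mol) > 500,
                Crippen.MolLogP(mol) > 5,
                Lipinski.NumHDonors(mol) > 5,
                Lipinski.NumHAcceptors(mol) > 10,
            ])
            return violations <= 1             # commonly, one violation is tolerated

        print(passes_rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))   # aspirin -> True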

  11. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
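
    The abstract does not spell out the formulation, so the sketch below uses the classic Tanaka-style possibilistic regression, one standard computational treatment in which total spread is minimized by linear programming subject to inclusion constraints; it is not necessarily the authors' scheme, and the data are invented.

        # Sketch: Tanaka-style fuzzy linear regression as a linear program -
        # minimize total coefficient spread so every observation lies inside
        # the fuzzy prediction band at level h.
        import numpy as np
        from scipy.optimize import linprog

        X = np.column_stack([np.ones(6), [1, 2, 3, 4, 5, 6]])   # design: intercept + x
        y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
        h = 0.5                                                  # inclusion level
        k = X.shape[1]
        w = (1 - h) * np.abs(X)

        cost = np.concatenate([np.zeros(k), np.abs(X).sum(axis=0)])  # minimise spread
        A_ub = np.vstack([np.hstack([-X, -w]),    # X c + w s >= y
                          np.hstack([X, -w])])    # X c - w s <= y
        b_ub = np.concatenate([-y, y])
        res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * k + [(0, None)] * k, method="highs")
        centres, spreads = res.x[:k], res.x[k:]
        print("centres:", np.round(centres, 2), " spreads:", np.round(spreads, 2))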

  12. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existent computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publically available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition examples for solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  13. A comparison of computational methods for identifying virulence factors.

    Directory of Open Access Journals (Sweden)

    Lu-Lu Zheng

    Bacterial pathogens continue to threaten public health worldwide today. Identification of bacterial virulence factors can help to find novel drug/vaccine targets against pathogenicity. It can also help to reveal the mechanisms of the related diseases at the molecular level. With the explosive growth in protein sequences generated in the postgenomic age, it is highly desirable to develop computational methods for rapidly and effectively identifying virulence factors according to their sequence information alone. In this study, based on the protein-protein interaction networks from the STRING database, a novel network-based method was proposed for identifying the virulence factors in the proteomes of UPEC 536, UPEC CFT073, P. aeruginosa PAO1, L. pneumophila Philadelphia 1, C. jejuni NCTC 11168 and M. tuberculosis H37Rv. Evaluated on the same benchmark datasets derived from the aforementioned species, the identification accuracies achieved by the network-based method were around 0.9, significantly higher than those of sequence-based methods such as BLAST, feature selection and VirulentPred. Further analysis showed that functional associations such as gene neighborhood and co-occurrence were the primary associations between these virulence factors in the STRING database. The high success rates indicate that the network-based method is quite promising. The novel approach holds high potential for identifying virulence factors in many other organisms as well, because it can be easily extended to other bacterial species, as long as the relevant statistical data are available for them.
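
    A toy sketch of the guilt-by-association idea underlying such network-based prediction: score each unannotated protein by the fraction of its interaction partners that are known virulence factors. The graph and seed set are invented, not the STRING networks used in the study.

        # Sketch: neighbour-based virulence scoring on a toy interaction network.
        import networkx as nx

        G = nx.Graph([("vfA", "p1"), ("vfA", "p2"), ("vfB", "p1"),
                      ("vfB", "p3"), ("p2", "p3"), ("p3", "p4")])
        known_vf = {"vfA", "vfB"}

        scores = {}
        for node in G:
            if node in known_vf:
                continue
            nbrs = list(G[node])
            scores[node] = sum(n in known_vf for n in nbrs) / len(nbrs)
        for node, s in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{node}: neighbour-based virulence score = {s:.2f}")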

  17. Identifying failure in a tree network of a parallel computer

    Science.gov (United States)

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
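
    A minimal sketch of the loop described above. All measurement and selection functions are hypothetical parameters, and the test-value formula is an assumption; the description fixes only which quantities enter it.

```python
def identify_tree_failure(io_node, compute_nodes, pick_subset,
                          measure_io, measure_nodes, test_individually,
                          expected_io_perf, threshold):
    """Isolate failing nodes in a processing set, per the procedure above."""
    remaining = list(compute_nodes)
    while remaining:
        test_nodes = pick_subset(remaining)
        io_perf = measure_io(io_node)              # I/O node performance
        node_perf = measure_nodes(test_nodes)      # test-node performance
        # current test value depends on both measurements and the
        # predetermined I/O performance value (exact formula assumed here)
        test_value = node_perf * (io_perf / expected_io_perf)
        if test_value < threshold:
            # below threshold: select another set of test compute nodes
            remaining = [n for n in remaining if n not in test_nodes]
        else:
            # not below threshold: single out potential problem nodes and
            # test them (and their links) individually
            return [n for n in test_nodes if not test_individually(n)]
    return []  # no failure isolated
```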

  18. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im...

  19. A Bicriteria Approach Identifying Nondominated Portfolios

    Directory of Open Access Journals (Sweden)

    Javier Pereira

    2014-01-01

    Full Text Available We explore a constructive portfolio model, formulated in terms of satisfaction of a given set of technical requirements with the minimum number of projects and minimum redundancy. An algorithm issued from robust portfolio modeling is adapted to a vector model, modifying the dominance condition as convenient, in order to find the set of nondominated portfolios as solutions of a bicriteria integer linear programming problem. To improve the former algorithm, a process finding an optimal solution of a monocriteria version of this problem is proposed; this solution is then used as a first feasible solution, helping to find nondominated solutions more rapidly. Next, a sorting process is applied to the input data or information matrix, intended to prune infeasible solutions early in the constructive algorithm. Numerical examples show that the optimization and sorting processes both improve the computational efficiency of the original algorithm. Their limits are also shown on certain complex instances.

  20. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to get exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method, and only about 10 minutes are needed to get exact chatter stability lobes. Since the proposed method is based on the discretization method, it can be used for different immersion cutting conditions, including low immersion cutting, and can be directly implemented in the workshop to improve the efficiency of machining parameter selection.
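
    A hedged sketch of the coarse-then-fine idea described above: a coarse grid of (spindle speed, cutting depth) points is classified first, and only cells straddling the stability boundary are refined by bisection. The `is_stable` predicate stands in for the full discretization-method stability test, which is not reproduced here; depth is assumed destabilizing within a cell.

```python
import numpy as np

def chatter_boundary(speeds, depths, is_stable, tol=1e-3):
    """Locate the stability boundary, refining only boundary-crossing cells."""
    coarse = np.array([[is_stable(s, d) for s in speeds] for d in depths])
    boundary = []
    for i in range(len(depths) - 1):
        for j in range(len(speeds) - 1):
            cell = coarse[i:i+2, j:j+2]
            if cell.min() == cell.max():
                continue                    # fully stable or fully unstable cell
            for s in np.linspace(speeds[j], speeds[j+1], 8):
                lo, hi = depths[i], depths[i+1]
                while hi - lo > tol:        # bisect the critical depth
                    mid = 0.5 * (lo + hi)
                    lo, hi = (mid, hi) if is_stable(s, mid) else (lo, mid)
                boundary.append((s, 0.5 * (lo + hi)))
    return boundary
```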

  1. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
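
    An illustrative sketch of the first traffic feature above: bots release client commands at highly regular intervals, so the coefficient of variation of inter-command times tends to be low for bots and high for humans. The feature extraction and the 0.2 threshold are assumptions for illustration; the paper combines several such features into ensemble schemes.

```python
import numpy as np

def looks_like_bot(command_timestamps, cv_threshold=0.2):
    """Flag near-periodic command timing via the coefficient of variation."""
    gaps = np.diff(np.sort(np.asarray(command_timestamps)))
    cv = gaps.std() / gaps.mean()
    return cv < cv_threshold

human = np.cumsum(np.random.exponential(0.5, size=200))   # irregular timing
bot = np.cumsum(np.random.normal(0.5, 0.02, size=200))    # near-periodic timing
print(looks_like_bot(human), looks_like_bot(bot))         # expect False, True
```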

  2. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Full Text Available Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  3. Computer Algebra, Instrumentation and the Anthropological Approach

    Science.gov (United States)

    Monaghan, John

    2007-01-01

    This article considers research and scholarship on the use of computer algebra in mathematics education following the instrumentation and the anthropological approaches. It outlines what these approaches are, positions them with regard to other approaches, examines tensions between the two approaches and makes suggestions for how work in this…

  4. Computational approaches for urban environments

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J; Leitner, M

    2015-01-01

    This book aims to promote the synergistic usage of advanced computational methodologies in close relationship to geospatial information across cities of different scales. A rich collection of chapters subsumes current research frontiers originating from disciplines such as geography, urban planning,

  5. What is computation : An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  7. Computer Competency: A 7-Year Study to Identify Gaps in Student Computer Skills

    Science.gov (United States)

    Shuster, George F.; Pearl, Mona

    2011-01-01

    Computer competency is crucial to student success in higher education. Assessment of student knowledge related to specific computer competencies can provide faculty with important information about the strengths and weaknesses of their students' computer competency skills. The purpose of this study was to identify the competency level of two…

  8. Antenna arrays a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.

  9. GRID COMPUTING AND CHECKPOINT APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj gupta

    2011-05-01

    Full Text Available Grid computing is a means of allocating the computational power of a large number of computers to complex or difficult computations or problems. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real time distributed systems. The main focus is on the types of fault occurring in the system, fault detection techniques and the recovery techniques used. A fault that occurs due to link failure, resource failure or any other reason must be tolerated for the system to work smoothly and accurately. These faults can be detected and recovered by many techniques used accordingly. An appropriate fault detector can avoid loss due to system crash, and a reliable fault tolerance technique can save the system from failure. This paper describes how these methods are applied to detect and tolerate faults in various real time distributed systems. The advantages of utilizing the checkpointing functionality are obvious; however, so far the Grid community has not developed a widely accepted standard that would allow the Grid environment to consciously utilize low level checkpointing packages. Therefore, such a standard, named the Grid Checkpointing Architecture, is being designed. The fault tolerance mechanism used here sets the job checkpoints based on the resource failure rate. If resource failure occurs, the job is restarted from its last successful state using a checkpoint file from another grid resource. A critical aspect for an automatic recovery is the availability of checkpoint files. A strategy to increase the availability of checkpoints is replication. Grid is a form of distributed computing used mainly to virtualize and utilize geographically distributed idle resources. A grid is a distributed computational and storage environment often composed of
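
    A minimal sketch of setting the checkpoint interval from the resource failure rate. Young's first-order approximation is used here as one concrete instance of the idea; the paper does not specify this formula, so treat it as an assumption.

```python
import math

def checkpoint_interval(checkpoint_cost_s: float, failure_rate_per_s: float) -> float:
    """Young's approximation: T_opt = sqrt(2 * C * MTBF)."""
    mtbf = 1.0 / failure_rate_per_s
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf)

# e.g. a 30 s checkpoint cost on a resource failing ~once per week
print(checkpoint_interval(30.0, 1.0 / (7 * 24 * 3600)))  # ~6000 s, every ~1.7 h
```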

  10. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses causes signature based computer virus detection techniques to be ineffective. Many novel computer virus detection approaches have been proposed in the past to cope with this ineffectiveness, mainly classified into three categories: static, dynamic and heuristic techniques. Given the natural similarities between the biological immune system (BIS) and the computer security system (CSS), the artificial immune system (AIS) was developed as a new prototype in the community of anti-virus research. The immune mechanisms in the BIS provide the opportunity to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune based computer virus detection approaches are discussed in detail. Promising experimental results suggest that immune based computer virus detection approaches are able to detect new variants and unseen viruses at lower false positive rates, which has paved a new way for anti-virus research.

  11. Identifying and relating nurses' attitudes toward computer use.

    Science.gov (United States)

    Burkes, M

    1991-01-01

    The purpose of this study was to measure nurses' attitudes toward computer use based on an adaptation of Vroom's expectancy theory, and to identify variables that may correlate with these attitudes. Content validity and reliability for internal consistency were determined for the developed attitude questionnaire. Nurses' individual characteristics and computer-use satisfaction, beliefs, and motivation were correlated. Data analysis revealed that nurses' attitudes were significantly related (satisfaction to beliefs, r = 0.783, p < 0.001; satisfaction to motivation, r = 0.598, p < 0.001; and beliefs to motivation, r = 0.651, p < 0.001), supporting the model based on Vroom's expectancy theory. Computer knowledge was significantly related to computer-use beliefs (r = 0.229, p < 0.05). Length of computer experience (r = -0.265, p < 0.05) and nursing experience (r = -0.239, p < 0.05) related negatively to nurses' computer-use satisfaction.

  12. Identifying barriers for implementation of computer based nursing documentation.

    Science.gov (United States)

    Vollmer, Anna-Maria; Prokosch, Hans-Ulrich; Bürkle, Thomas

    2014-01-01

    This study was undertaken in the planning phase for the introduction of a comprehensive computer based nursing documentation system at Erlangen University Hospital. There, we expect a wide range of difficult organizational changes, because the nurses currently neither use computer based nursing documentation nor do they strongly follow the nursing process model within paper based documentation. Thus we were eager to recognize potential pitfalls early and to identify potential barriers to digital nursing documentation. In a questionnaire study we surveyed all German university hospitals about their experience with the implementation of computer based nursing documentation. We received answers from 11 of the 23 hospitals. Furthermore, we performed a questionnaire study about expectations and fears among the nurses of four pilot wards of our hospital. Most respondents stated a positive attitude towards nursing process documentation, but many noted technical barriers (e.g. bad performance of the software) and organizational barriers (e.g. lack of time).

  13. Computational approaches for systems metabolomics.

    Science.gov (United States)

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J

    2016-06-01

    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field.

  14. Learning and geometry computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  15. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  16. An Efficient Approach for Computing Silhouette Coefficients

    Directory of Open Access Journals (Sweden)

    Moh'd B. Al- Zoubi

    2008-01-01

    Full Text Available One popular approach for finding the best number of clusters (K) in a data set is through computing the silhouette coefficients. The silhouette coefficients for different values of K are first found, and then the maximum value of these coefficients is chosen. However, computing the silhouette coefficient for different values of K is a very time consuming process, due to the amount of CPU time spent on distance calculations. An approach to compute the silhouette coefficient quickly is presented, based on decreasing the number of addition operations when computing distances. The approach proved efficient: savings of more than 50% of the CPU time were achieved when applied to different data sets.
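
    A baseline sketch of the selection procedure described above: compute the silhouette coefficient for each candidate K and keep the maximum. This is the standard (slow) computation; the paper's distance-calculation speed-up is not reproduced here.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Toy data with a known number of clusters
X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

best_k, best_score = None, -1.0
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)   # the expensive step the paper speeds up
    if score > best_score:
        best_k, best_score = k, score
print(best_k, round(best_score, 3))       # expect K = 4 on this toy data
```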

  17. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  18. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard, and their computational complexity makes this research approach difficult using standard brute force on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored, consisting of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly satisfiability modulo theories (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
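
    A hedged sketch of the SMT route: one way to cast minimum identifying codes as an optimization problem for the z3 solver, not necessarily the authors' encoding. A vertex subset C is an identifying code if every vertex v has a non-empty identifying set N[v] ∩ C that differs from every other vertex's.

```python
from z3 import Bool, If, Optimize, Or, Sum, is_true

def min_identifying_code(vertices, edges):
    nbhd = {v: {v} for v in vertices}            # closed neighborhoods N[v]
    for u, v in edges:
        nbhd[u].add(v)
        nbhd[v].add(u)

    x = {v: Bool(f"x_{v}") for v in vertices}    # x[v]: v belongs to the code
    opt = Optimize()
    vs = list(vertices)
    for v in vs:                                 # domination: N[v] ∩ C nonempty
        opt.add(Or([x[w] for w in nbhd[v]]))
    for i in range(len(vs)):                     # separation: identifying sets differ
        for j in range(i + 1, len(vs)):
            opt.add(Or([x[w] for w in nbhd[vs[i]] ^ nbhd[vs[j]]]))
    opt.minimize(Sum([If(x[v], 1, 0) for v in vs]))
    opt.check()
    model = opt.model()
    return {v for v in vs if is_true(model.evaluate(x[v], model_completion=True))}

# 5-cycle: a minimum identifying code has size 3 (e.g. three consecutive vertices)
print(min_identifying_code(range(5), [(i, (i + 1) % 5) for i in range(5)]))
```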

  19. A Big Data Approach to Computational Creativity

    CERN Document Server

    Varshney, Lav R; Varshney, Kush R; Bhattacharjya, Debarun; Schoergendorfer, Angela; Chee, Yi-Min

    2013-01-01

    Computational creativity is an emerging branch of artificial intelligence that places computers in the center of the creative process. Broadly, creativity involves a generative step to produce many ideas and a selective step to determine the ones that are the best. Many previous attempts at computational creativity, however, have not been able to achieve a valid selective step. This work shows how bringing data sources from the creative domain and from hedonic psychophysics together with big data analytics techniques can overcome this shortcoming to yield a system that can produce novel and high-quality creative artifacts. Our data-driven approach is demonstrated through a computational creativity system for culinary recipes and menus we developed and deployed, which can operate either autonomously or semi-autonomously with human interaction. We also comment on the volume, velocity, variety, and veracity of data in computational creativity.

  20. Towards Lagrangian approach to quantum computations

    CERN Document Server

    Vlasov, A Yu

    2003-01-01

    This work discusses the possibility and actuality of a Lagrangian approach to quantum computations. The finite-dimensional Hilbert spaces used in this area pose some challenges for such a consideration. The model discussed here can be considered an analogue of the Weyl quantization of field theory via path integrals in L. D. Faddeev's approach. Weyl quantization can also be used in the finite-dimensional case, and some formulas may simply be rewritten by changing integrals to finite sums. On the other hand, there are specific difficulties relevant to the finite case. This work has some connections with phase space models of quantum computations developed recently by different authors.

  1. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel

    2016-05-03

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  2. Identifying logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-03-01

    In a parallel computer, a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: for each compute node of the subcommunicator and for a number of dimensions beginning with a first dimension: establishing, by a plane building node, in a positive direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in a positive direction of a second dimension, where the second dimension is orthogonal to the first dimension; and establishing, by the plane building node, in a negative direction of the first dimension, all logical planes that include the plane building node and compute nodes of the subcommunicator in the positive direction of the second dimension.

  3. Covert Flow Graph Approach to Identifying Covert Channels

    OpenAIRE

    XiangMei Song; ShiGuang Ju

    2011-01-01

    In this paper, the approach for identifying covert channels using a graph structure called Covert Flow Graph is introduced. Firstly, the construction of Covert Flow Graph which can offer information flows of the system for covert channel detection is proposed, and the search and judge algorithm used to identify covert channels in Covert Flow Graph is given. Secondly, an example file system analysis using Covert Flow Graph approach is provided, and the analysis result is compared with that of ...

  4. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  5. Identifying Nursing Computer Training Requirements using Web-based Assessment

    Directory of Open Access Journals (Sweden)

    Naser Ghazi

    2011-12-01

    Full Text Available Our work addresses issues of inefficiency and ineffectiveness in the training of nurses in computer literacy by developing an adaptive questionnaire system, which identifies the most effective training modules by evaluating applicants before and after training. Our system, the Systems Knowledge Assessment Tool (SKAT), aims to increase training proficiency, decrease training time and reduce costs associated with training by identifying the areas of training that are required, and those that are not, targeted to each individual. Based on the project’s requirements, a number of HTML documents were designed to be used as templates in the implementation stage. During this stage, the milestone principle was used, in which a series of coding and testing was performed to generate an error-free product. The decision-making process and its components, as well as the priority of each attribute in the application, are responsible for determining the required training for each applicant. Thus, the decision-making process is an essential aspect of system design and greatly affects the training results of the applicant. The SKAT system has been evaluated to ensure that it meets the project’s requirements. The evaluation stage was an important part of the project and required a number of nurses with different roles to evaluate the system. Based on their feedback, changes were made.

  6. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  7. Handbook of computational approaches to counterterrorism

    CERN Document Server

    Subrahmanian, VS

    2012-01-01

    Terrorist groups throughout the world have been studied primarily through the use of social science methods. However, major advances in IT during the past decade have led to significant new ways of studying terrorist groups, making forecasts, learning models of their behaviour, and shaping policies about their behaviour. Handbook of Computational Approaches to Counterterrorism provides the first in-depth look at how advanced mathematics and modern computing technology is shaping the study of terrorist groups. This book includes contributions from world experts in the field, and presents extens

  8. Computer-Based Procedures for Field Workers - Identified Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The Idaho National Laboratory (INL) computer-based procedure (CBP) research team is exploring how best to design a CBP system that will deliver the intended benefits of increased efficiency and improved human performance. It is important to note that no “off-the-shelf” technology exists for the type of CBP system that is investigated and developed by the INL researchers. As more technology is integrated into the procedure process, the importance of an appropriate and methodological approach to the design of the procedure system increases. Technological advancements offer great opportunities for efficiency and safety gains; however, if the system is not designed correctly there is a large risk of unintentionally introducing new opportunities for human errors. The INL research team is breaking new ground in the area of CBPs with the prototype they have developed. Current electronic procedure systems are most commonly electronic versions of the paper-based procedures with hyperlinks to other procedures, limited user input functionality, and the ability to mark steps completed. These systems do not fully exploit the advantages of digital technology. It is part of the INL researchers’ role to develop and validate new CBP technologies that greatly increase the benefits of a CBP system to the nuclear industry.

  9. A predictive approach to identify genes differentially expressed

    Science.gov (United States)

    Saraiva, Erlandson F.; Louzada, Francisco; Milan, Luís A.; Meira, Silvana; Cobre, Juliana

    2012-10-01

    The main objective of gene expression data analysis is to identify genes that present significant changes in expression levels between a treatment and a control biological condition. In this paper, we propose a Bayesian approach that identifies differentially expressed genes by calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect from all genes in the study, excluding the treatment effects of genes previously identified with statistical evidence for difference. We compare our Bayesian approach with the standard ones based on the t-test and modified t-tests via a simulation study, using the small sample sizes that are common in gene expression data analysis. The results indicate that the proposed approach performs better than the standard ones, especially for cases with mean differences and increases in treatment variance relative to control variance. We also apply the methodologies to a publicly available data set on the bacterium Escherichia coli.
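
    A loose sketch of the flagging idea described above: genes are flagged iteratively by checking whether each gene's mean treatment effect falls outside a predictive interval built from the genes not yet flagged. The normal predictive density and the 99% level are simplifying assumptions; the paper's Bayesian construction is richer than this.

```python
import numpy as np
from scipy import stats

def flag_genes(effects, level=0.99, max_iter=50):
    """Iteratively flag genes whose effects lie outside the background interval."""
    effects = np.asarray(effects)
    flagged = np.zeros(len(effects), dtype=bool)
    for _ in range(max_iter):
        background = effects[~flagged]   # exclude previously identified genes
        lo, hi = stats.norm.interval(level, loc=background.mean(),
                                     scale=background.std(ddof=1))
        new_flags = (effects < lo) | (effects > hi)
        if (new_flags == flagged).all():
            break
        flagged = new_flags
    return np.where(flagged)[0]

effects = np.concatenate([np.random.normal(0, 1, 1000),    # null genes
                          np.random.normal(5, 1, 20)])     # shifted genes
print(len(flag_genes(effects)))  # roughly the 20 shifted genes (plus noise)
```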

  10. Covert Flow Graph Approach to Identifying Covert Channels

    Directory of Open Access Journals (Sweden)

    XiangMei Song

    2011-12-01

    Full Text Available In this paper, an approach for identifying covert channels using a graph structure called the Covert Flow Graph is introduced. Firstly, the construction of the Covert Flow Graph, which captures the information flows of the system for covert channel detection, is proposed, and the search and judge algorithm used to identify covert channels in the Covert Flow Graph is given. Secondly, an example file system analysis using the Covert Flow Graph approach is provided, and the analysis result is compared with those of the Shared Resource Matrix and Covert Flow Tree methods. Finally, the comparison between the Covert Flow Graph approach and the other two methods is discussed. Different from previous methods, the Covert Flow Graph approach provides deep insight into the system’s information flows, and gives an effective algorithm for covert channel identification.

  11. Novel computational approaches characterizing knee physiotherapy

    OpenAIRE

    Wangdo Kim; Veloso, Antonio P; Duarte Araujo; Kohles, Sean S.

    2014-01-01

    A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physi...

  12. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  13. Identifying Key Challenges in Performance Issues in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ashraf Zia

    2012-10-01

    Full Text Available Cloud computing is a harbinger of a newer era in the field of computing where distributed and centralized services are used in a unique way. In cloud computing, the computational resources of different vendors and IT service providers are managed to provide an enormous, scalable computing services platform that offers efficient data processing coupled with better QoS at a lower cost. On-demand dynamic and scalable resource allocation is the main motif behind the development and deployment of cloud computing. The potential growth in this area and the presence of some dominant organizations with abundant resources (like Google, Amazon, Salesforce, Rackspace, Azure and GoGrid) make the field of cloud computing more fascinating. All cloud computing processes need to work in unison to deliver better QoS, i.e., to provide better software functionality, meet the tenants’ requirements for their desired processing power and exploit elevated bandwidth. However, several technical and functional issues, e.g., pervasive access to resources, dynamic discovery, and on the fly access and composition of resources, pose serious challenges for cloud computing. In this study, the performance issues in cloud computing are discussed. A number of schemes pertaining to QoS issues are critically analyzed to point out their strengths and weaknesses. Some of the performance parameters at the three basic layers of the cloud — Infrastructure as a Service, Platform as a Service and Software as a Service — are also discussed in this paper.

  14. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite all these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  15. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, the usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  16. Evolutionary Computational Methods for Identifying Emergent Behavior in Autonomous Systems

    Science.gov (United States)

    Terrile, Richard J.; Guillaume, Alexandre

    2011-01-01

    A technique based on Evolutionary Computational Methods (ECMs) was developed that allows for the automated optimization of complex computationally modeled systems, such as autonomous systems. The primary technology, which enables the ECM to find optimal solutions in complex search spaces, derives from evolutionary algorithms such as the genetic algorithm and differential evolution. These methods are based on biological processes, particularly genetics, and define an iterative process that evolves parameter sets into an optimum. Evolutionary computation is a method that operates on a population of existing computational-based engineering models (or simulators) and competes them using biologically inspired genetic operators on large parallel cluster computers. The result is the ability to automatically find design optimizations and trades, and thereby greatly amplify the role of the system engineer.
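
    A minimal differential evolution sketch, one of the evolutionary algorithms named above. It evolves a population of parameter sets against a fitness function; the toy sphere function stands in for a full simulator evaluation, which in the described setting would be farmed out to a parallel cluster.

```python
import numpy as np

def differential_evolution(fitness, bounds, pop_size=40, gens=200,
                           f_weight=0.8, crossover=0.9, rng=None):
    """DE/rand/1/bin: mutate with scaled difference vectors, keep improvements."""
    rng = rng or np.random.default_rng(0)
    lo, hi = np.asarray(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    cost = np.array([fitness(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + f_weight * (b - c), lo, hi)
            mask = rng.random(len(lo)) < crossover    # binomial crossover
            trial = np.where(mask, mutant, pop[i])
            if (tc := fitness(trial)) < cost[i]:      # greedy selection
                pop[i], cost[i] = trial, tc
    return pop[cost.argmin()], cost.min()

best, val = differential_evolution(lambda p: np.sum(p**2), [(-5, 5)] * 4)
print(best, val)  # converges toward the optimum at the origin
```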

  17. Computer Forensics Education - the Open Source Approach

    Science.gov (United States)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  18. Similarity transformation approach to identifiability analysis of nonlinear compartmental models.

    Science.gov (United States)

    Vajda, S; Godfrey, K R; Rabitz, H

    1989-04-01

    Through use of the local state isomorphism theorem instead of the algebraic equivalence theorem of linear systems theory, the similarity transformation approach is extended to nonlinear models, resulting in finitely verifiable sufficient and necessary conditions for global and local identifiability. The approach requires testing of certain controllability and observability conditions, but in many practical examples these conditions prove very easy to verify. In principle the method also involves nonlinear state variable transformations, but in all of the examples presented in the paper the transformations turn out to be linear. The method is applied to an unidentifiable nonlinear model and a locally identifiable nonlinear model, and these are the first nonlinear models other than bilinear models where the reason for lack of global identifiability is nontrivial. The method is also applied to two models with Michaelis-Menten elimination kinetics, both of considerable importance in pharmacokinetics, and for both of which the complicated nature of the algebraic equations arising from the Taylor series approach has hitherto defeated attempts to establish identifiability results for specific input functions.
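
    For contrast with the similarity transformation method, here is a short symbolic sketch of the Taylor series approach mentioned above, applied to one-compartment Michaelis-Menten elimination (dx/dt = -Vmax·x/(Km + x), observed output y = x): successive output derivatives at t = 0 are generated, and identifiability reduces to whether (Vmax, Km) can be recovered from them. This is an illustration, not the paper's method.

```python
import sympy as sp

Vmax, Km, x0 = sp.symbols('Vmax Km x0', positive=True)
xs = sp.Symbol('xs', positive=True)
f = -Vmax * xs / (Km + xs)         # vector field dx/dt

expr, coeffs = xs, []
for _ in range(3):                 # y(0), y'(0), y''(0)
    coeffs.append(sp.simplify(expr.subs(xs, x0)))
    expr = sp.diff(expr, xs) * f   # next time-derivative via the chain rule

for k, c in enumerate(coeffs):
    print(f"y^({k})(0) =", c)
# y'(0) and y''(0) are two independent functions of (Vmax, Km) for known x0,
# so the pair can in principle be solved for, consistent with identifiability;
# the growing complexity of higher coefficients is the drawback noted above.
```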

  19. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Full Text Available Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants to use forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using perceptions of the students about the success of the methodology and their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  20. Efficient Approach for Load Balancing in Virtual Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Harvinder singh

    2014-10-01

    Full Text Available Cloud computing technology is changing the focus of the IT world and is becoming famous because of its great characteristics. Load balancing is one of the main challenges in cloud computing for distributing workloads across multiple computers or a computer cluster, network links, central processing units, disk drives, or other resources. Successful load balancing optimizes resource use, maximizes throughput, minimizes response time, and avoids overload. The objective of this paper is to propose an approach to scheduling algorithms that can maintain load balancing and provide improved strategies through efficient job scheduling and modified resource allocation techniques. The results discussed in this paper are based on the existing round robin, least connection, throttled load balancing and fastest response time scheduling algorithms, and on a newly proposed fastest-with-least-connection scheduling algorithm. With this new algorithm, the overall response time and data centre processing time are improved and cost is reduced in comparison to the existing scheduling approaches.
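
    A hedged sketch of a 'fastest with least connection' policy as the abstract describes it: prefer the server with the fewest active connections, breaking ties by the fastest recent response time. The exact combination rule used in the paper is not specified, so this ordering is an assumption.

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    active_connections: int = 0
    avg_response_ms: float = 100.0

def pick_server(servers):
    """Least connections first, fastest average response time as tie-breaker."""
    return min(servers, key=lambda s: (s.active_connections, s.avg_response_ms))

pool = [Server("vm-a", 3, 80.0), Server("vm-b", 1, 120.0), Server("vm-c", 1, 60.0)]
chosen = pick_server(pool)
chosen.active_connections += 1      # dispatch the job to the chosen VM
print(chosen.name)                  # vm-c: least connections, then fastest
```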

  2. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  3. An Approach to Ad hoc Cloud Computing

    CERN Document Server

    Kirby, Graham; Macdonald, Angus; Fernandes, Alvaro

    2010-01-01

    We consider how underused computing resources within an enterprise may be harnessed to improve utilization and create an elastic computing infrastructure. Most current cloud provision involves a data center model, in which clusters of machines are dedicated to running cloud infrastructure software. We propose an additional model, the ad hoc cloud, in which infrastructure software is distributed over resources harvested from machines already in existence within an enterprise. In contrast to the data center cloud model, resource levels are not established a priori, nor are resources dedicated exclusively to the cloud while in use. A participating machine is not dedicated to the cloud, but has some other primary purpose such as running interactive processes for a particular user. We outline the major implementation challenges and one approach to tackling them.

  4. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  5. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Access to Confidential Business Information by Computer Sciences Corporation and Its Identified... contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access...

  6. A multidisciplinary approach to solving computer related vision problems.

    Science.gov (United States)

    Long, Jennifer; Helland, Magne

    2012-09-01

    This paper proposes a multidisciplinary approach to solving computer related vision issues by including optometry as a part of the problem-solving team. Computer workstation design is increasing in complexity. There are at least ten different professions who contribute to workstation design or who provide advice to improve worker comfort, safety and efficiency. Optometrists have a role identifying and solving computer-related vision issues and in prescribing appropriate optical devices. However, it is possible that advice given by optometrists to improve visual comfort may conflict with other requirements and demands within the workplace. A multidisciplinary approach has been advocated for solving computer related vision issues. There are opportunities for optometrists to collaborate with ergonomists, who coordinate information from physical, cognitive and organisational disciplines to enact holistic solutions to problems. This paper proposes a model of collaboration and examples of successful partnerships at a number of professional levels including individual relationships between optometrists and ergonomists when they have mutual clients/patients, in undergraduate and postgraduate education and in research. There is also scope for dialogue between optometry and ergonomics professional associations. A multidisciplinary approach offers the opportunity to solve vision related computer issues in a cohesive, rather than fragmented way. Further exploration is required to understand the barriers to these professional relationships. © 2012 The College of Optometrists.

  7. Combining risk-management and computational approaches for trustworthiness evaluation of socio-technical systems

    OpenAIRE

    Gol Mohammadi, N.; Bandyszak, T.; Goldsteen, A.; Kalogiros, C.; Weyer, T.; Moffie, M.; Nasser, B.; Surridge, M

    2015-01-01

    The analysis of existing software evaluation techniques reveals the need for evidence-based evaluation of systems’ trustworthiness. This paper aims at evaluating trustworthiness of socio-technical systems during design-time. Our approach combines two existing evaluation techniques: a computational approach and a risk management approach. The risk-based approach identifies threats to trustworthiness on an abstract level. Computational approaches are applied to evaluate the expected end-to-en...

  8. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have tool' for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  9. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, and other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  10. Identifying acute coronary syndrome patients approaching end-of-life.

    Directory of Open Access Journals (Sweden)

    Stephen Fenning

    Full Text Available BACKGROUND: Acute coronary syndrome (ACS) is common in patients approaching the end-of-life (EoL), but these patients rarely receive palliative care. We compared the utility of a palliative care prognostic tool, the Gold Standards Framework (GSF), and the Global Registry of Acute Coronary Events (GRACE) score to help identify patients approaching EoL. METHODS AND FINDINGS: 172 unselected consecutive patients with confirmed ACS admitted over an eight-week period were assessed using both prognostic tools and followed up for 12 months. GSF criteria identified 40 (23%) patients suitable for EoL care while GRACE identified 32 (19%) patients with ≥ 10% risk of death within 6 months. Patients meeting GSF criteria were older (p = 0.006), had more comorbidities (1.6 ± 0.7 vs. 1.2 ± 0.9, p = 0.007), more frequent hospitalisations before (p = 0.001) and after (p = 0.0001) their index admission, and were more likely to die during follow-up (GSF+ 20% vs GSF− 7%, p = 0.03). The GRACE score was predictive of 12-month mortality (C-statistic 0.75), and this was improved by the addition of previous hospital admissions and previous history of stroke (C-statistic 0.88). CONCLUSIONS: This study has highlighted a potentially large number of ACS patients eligible for EoL care. GSF or GRACE could be used in the hospital setting to help identify these patients. GSF identifies ACS patients with more comorbidity and at increased risk of hospital readmission.
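
    The C-statistic reported above is the area under the ROC curve: the probability that a randomly chosen patient who died was assigned a higher risk score than a randomly chosen survivor. A minimal Python sketch of that computation, using hypothetical scores and outcomes rather than the study data:

    ```python
    def c_statistic(scores, outcomes):
        """C-statistic (ROC AUC): P(event score > non-event score), ties = 0.5."""
        events = [s for s, y in zip(scores, outcomes) if y == 1]
        survivors = [s for s, y in zip(scores, outcomes) if y == 0]
        wins = sum(1.0 if e > s else 0.5 if e == s else 0.0
                   for e in events for s in survivors)
        return wins / (len(events) * len(survivors))

    # Hypothetical risk scores and 12-month mortality outcomes
    scores = [0.05, 0.12, 0.30, 0.08, 0.45, 0.22]
    died = [0, 0, 1, 0, 1, 0]
    print(c_statistic(scores, died))  # 1.0 = perfect ranking, 0.5 = chance
    ```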

  11. Reverse Pathway Genetic Approach Identifies Epistasis in Autism Spectrum Disorders

    Science.gov (United States)

    Traglia, Michela; Tsang, Kathryn; Bearden, Carrie E.; Rauen, Katherine A.

    2017-01-01

    Although gene-gene interaction, or epistasis, plays a large role in complex traits in model organisms, genome-wide by genome-wide searches for two-way interaction have limited power in human studies. We thus used knowledge of a biological pathway in order to identify a contribution of epistasis to autism spectrum disorders (ASDs) in humans, a reverse-pathway genetic approach. Based on the previous observation of increased ASD symptoms in Mendelian disorders of the Ras/MAPK pathway (RASopathies), we showed that common SNPs in RASopathy genes show enrichment for association signal in GWAS (P = 0.02). We then screened genome-wide for interactors with RASopathy gene SNPs and showed strong enrichment in ASD-affected individuals (P < 2.2 × 10⁻¹⁶), with a number of pairwise interactions meeting genome-wide criteria for significance. Finally, we utilized quantitative measures of ASD symptoms in RASopathy-affected individuals to perform modifier mapping via GWAS. One top region overlapped between these independent approaches, and we showed dysregulation of a gene in this region, GPR141, in a RASopathy neural cell line. We thus used orthogonal approaches to provide strong evidence for a contribution of epistasis to ASDs, to confirm a role for the Ras/MAPK pathway in idiopathic ASDs, and to identify a convergent candidate gene that may interact with the Ras/MAPK pathway. PMID:28076348

  12. Kinomic profiling approach identifies Trk as a novel radiation modulator.

    Science.gov (United States)

    Jarboe, John S; Jaboin, Jerry J; Anderson, Joshua C; Nowsheen, Somaira; Stanley, Jennifer A; Naji, Faris; Ruijtenbeek, Rob; Tu, Tianxiang; Hallahan, Dennis E; Yang, Eddy S; Bonner, James A; Willey, Christopher D

    2012-06-01

    Ionizing radiation treatment is used in over half of all cancer patients, thus determining the mechanisms of response or resistance is critical for the development of novel treatment approaches. In this report, we utilize a high-content peptide array platform that performs multiplex kinase assays with real-time kinetic readout to investigate the mechanism of radiation response in vascular endothelial cells. We applied this technology to irradiated human umbilical vein endothelial cells (HUVEC). We identified 49 specific tyrosine phosphopeptides that were differentially affected by irradiation over a 1 h time course. In one example, the Tropomyosin receptor kinase (Trk) family members TrkA and TrkB showed transient activation between 2 and 15 min following irradiation. When we targeted TrkA and TrkB using small molecule inhibitors, HUVEC were protected from radiation damage. Conversely, stimulation of TrkA using gambogic amide promoted radiation enhancement. Thus, we show that our approach can not only identify rapid changes in kinase activity but also identify novel targets such as TrkA. TrkA inhibition resulted in radioprotection that correlated with enhanced repair of radiation-induced damage, while TrkA stimulation by gambogic amide produced radiation sensitization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product’s environmental impacts through the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. To improve the quality of ecodesign, there is a growing need for an approach that can relate changes in design parameters to a product’s environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors that significantly influence a product’s environmental impacts can be identified by analyzing the relationship between environmental impacts and design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen for analysis by our approach. The results show that the carbon dioxide emissions during PCB manufacture are highly sensitive to the area of the PCB panel.
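
    The paper's factors are identified from full LCA models; as a rough illustration of the underlying idea, the sketch below ranks design parameters by a one-at-a-time perturbation of a much simplified, hypothetical CO2 impact function. All parameter names and coefficients are invented:

    ```python
    def one_at_a_time_sensitivity(impact, baseline, delta=0.10):
        """Rank design parameters by the normalised change in impact
        caused by a +10% perturbation of each parameter in turn."""
        base = impact(baseline)
        sensitivity = {}
        for name, value in baseline.items():
            perturbed = dict(baseline, **{name: value * (1 + delta)})
            sensitivity[name] = (impact(perturbed) - base) / (base * delta)
        return sorted(sensitivity.items(), key=lambda kv: -abs(kv[1]))

    # Hypothetical CO2 model for a PCB: emissions scale with panel area and layers
    co2 = lambda p: 0.8 * p["panel_area_m2"] * p["layers"] + 0.1 * p["cu_mass_kg"]
    baseline = {"panel_area_m2": 0.5, "layers": 4, "cu_mass_kg": 2.0}
    for name, s in one_at_a_time_sensitivity(co2, baseline):
        print(f"{name}: normalised sensitivity {s:+.2f}")
    ```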

  14. Contemporary Approaches for Identifying Rare Bone Disease Causing Genes

    Institute of Scientific and Technical Information of China (English)

    Charles R.Farber; Thomas L.Clemens

    2013-01-01

    Recent improvements in the speed and accuracy of DNA sequencing, together with increasingly sophisticated mathematical approaches for annotating gene networks, have revolutionized the field of human genetics and made these once time-consuming approaches accessible to most investigators. In the field of bone research, a particularly active area of gene discovery has occurred in patients with rare bone disorders such as osteogenesis imperfecta (OI) that are caused by mutations in single genes. In this perspective, we highlight some of these technological advances and describe how they have been used to identify the genetic determinants underlying two previously unexplained cases of OI. The widespread availability of advanced methods for DNA sequencing and bioinformatics analysis can be expected to greatly facilitate identification of novel gene networks that normally function to control bone formation and maintenance.

  15. An Integrative data mining approach to identifying Adverse ...

    Science.gov (United States)

    The Adverse Outcome Pathway (AOP) framework is a tool for making biological connections and summarizing key information across different levels of biological organization to connect biological perturbations at the molecular level to adverse outcomes for an individual or population. Computational approaches to explore and determine these connections can accelerate the assembly of AOPs. By leveraging the wealth of publicly available data covering chemical effects on biological systems, computationally-predicted AOPs (cpAOPs) were assembled via data mining of high-throughput screening (HTS) in vitro data, in vivo data and other disease phenotype information. Frequent Itemset Mining (FIM) was used to find associations between the gene targets of ToxCast HTS assays and disease data from Comparative Toxicogenomics Database (CTD) by using the chemicals as the common aggregators between datasets. The method was also used to map gene expression data to disease data from CTD. A cpAOP network was defined by considering genes and diseases as nodes and FIM associations as edges. This network contained 18,283 gene to disease associations for the ToxCast data and 110,253 for CTD gene expression. Two case studies show the value of the cpAOP network by extracting subnetworks focused either on fatty liver disease or the Aryl Hydrocarbon Receptor (AHR). The subnetwork surrounding fatty liver disease included many genes known to play a role in this disease. When querying the cpAOP
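
    As a toy illustration of the Frequent Itemset Mining step described above, the sketch below treats each chemical as a transaction aggregating its gene targets and disease terms, and keeps the pairs that co-occur across enough chemicals to become candidate cpAOP network edges. The records are invented placeholders, not ToxCast or CTD data:

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical records: each chemical aggregates gene targets and diseases
    chemical_annotations = {
        "chemA": {"PPARG", "AHR", "fatty_liver"},
        "chemB": {"PPARG", "fatty_liver"},
        "chemC": {"AHR", "dermatitis"},
        "chemD": {"PPARG", "AHR", "fatty_liver"},
    }

    def frequent_pairs(transactions, min_support=2):
        """Count item pairs co-occurring across transactions; keep frequent ones."""
        counts = Counter()
        for items in transactions.values():
            counts.update(combinations(sorted(items), 2))
        return {pair: n for pair, n in counts.items() if n >= min_support}

    # Each frequent pair becomes an edge in the computationally predicted network
    for (a, b), support in frequent_pairs(chemical_annotations).items():
        print(f"{a} -- {b} (support={support})")
    ```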

  16. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches that describe knee physiotherapy by introducing a new dimension, the alignment of foot loading with the knee axis, to produce an improved functional status for the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  17. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...
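
    A minimal sketch of the kind of raw-audio pipeline the dissertation studies, assuming librosa and scikit-learn are available: summarize each clip by MFCC statistics, then classify with a nearest-neighbour model. File names and labels are hypothetical, and real systems use far richer features and classifiers:

    ```python
    import numpy as np
    import librosa
    from sklearn.neighbors import KNeighborsClassifier

    def clip_features(path):
        """Mean and standard deviation of MFCCs: a crude timbre summary."""
        y, sr = librosa.load(path, sr=22050, mono=True)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    # Hypothetical labelled training clips
    train_files = ["jazz1.wav", "jazz2.wav", "rock1.wav", "rock2.wav"]
    train_labels = ["jazz", "jazz", "rock", "rock"]

    X = np.vstack([clip_features(f) for f in train_files])
    clf = KNeighborsClassifier(n_neighbors=1).fit(X, train_labels)
    print(clf.predict(clip_features("unknown.wav").reshape(1, -1)))
    ```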

  18. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming, the opposite effect, is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). Its dependence on subtle parameter changes (such as the response-stimulus interval) varies from study to study. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
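
    The CISAM itself is not reproduced here, but the flavour of a single-mechanism account can be sketched with a leaky accumulator whose starting activation carries over from the preceding trial: residual activation of a repeated target speeds the response (positive priming), while residual inhibition of a previously ignored item slows it (negative priming). All parameters are invented for illustration:

    ```python
    def trial_rt(start_activation, threshold=1.0, gain=0.2):
        """Steps until target-driven activation crosses the response threshold."""
        a, steps = start_activation, 0
        while a < threshold:
            a += gain * (1.2 - a)  # input drives activation toward 1.2
            steps += 1
        return steps

    baseline = trial_rt(0.0)
    positive = trial_rt(+0.3)  # residual activation: repeated target
    negative = trial_rt(-0.3)  # residual inhibition: previously ignored item
    print(baseline, positive, negative)  # positive < baseline < negative
    ```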

  19. Non-lexical approaches to identifying associative relations in the gene ontology.

    Science.gov (United States)

    Bodenreider, Olivier; Aubry, Marc; Burgun, Anita

    2005-01-01

    The Gene Ontology (GO) is a controlled vocabulary widely used for the annotation of gene products. GO is organized in three hierarchies for molecular functions, cellular components, and biological processes, but no relations are provided among terms across hierarchies. The objective of this study is to investigate three non-lexical approaches to identifying such associative relations in GO and to compare them with one another and with lexical approaches. The three approaches are: computing similarity in a vector space model, statistical analysis of co-occurrence of GO terms in annotation databases, and association rule mining. Five annotation databases (FlyBase, the Human subset of GOA, MGI, SGD, and WormBase) are used in this study. A total of 7,665 associations were identified by at least one of the three non-lexical approaches. Of these, 12% were identified by more than one approach. While there are almost 6,000 lexical relations among GO terms, only 203 associations were identified by both non-lexical and lexical approaches. The associations identified in this study could serve as the starting point for adding associative relations across hierarchies to GO, but would require manual curation. The application to quality assurance of annotation databases is also discussed.
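
    The first of the three approaches can be sketched compactly: represent each GO term by the set of gene products annotated to it, and score term pairs across hierarchies by cosine similarity in that vector space. The annotations below are invented placeholders:

    ```python
    import math

    # Hypothetical annotations: GO term -> gene products annotated to it
    annotations = {
        "GO:F:kinase_activity": {"g1", "g2", "g3"},
        "GO:P:phosphorylation": {"g1", "g2", "g4"},
        "GO:C:nucleus": {"g5", "g6"},
    }

    def cosine(a, b):
        """Cosine similarity of two binary gene-annotation vectors."""
        return len(a & b) / math.sqrt(len(a) * len(b))

    terms = list(annotations)
    for i, t1 in enumerate(terms):
        for t2 in terms[i + 1:]:
            print(t1, t2, round(cosine(annotations[t1], annotations[t2]), 2))
    ```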

  20. 75 FR 70672 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2010-11-18

    ... AGENCY Access to Confidential Business Information by Computer Sciences Corporation and Its Identified... contractor, Computer Sciences Corporation (CSC) of Chantilly, VA and Its Identified Subcontractors, to access... required to support OPPT computer applications; OPPT staff; and their development staff. Specific types of...

  1. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Full Text Available Current cloud service offerings, i.e., Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with the SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC FP7 project is reported and an associated blueprint prototype implementation is presented.

  2. New approach for identifying boundary characteristics using transmissibility

    Science.gov (United States)

    Joo, Kyung-Hoon; Min, Dongwoo; Kim, Jun-Gu; Kang, Yeon June

    2017-04-01

    A novel approach is proposed for identifying boundary properties as a response model using transmissibility. This approach differs from those proposed in previous studies dealing with frequency response functions (FRFs) for joint identification. Transmissibility includes only response data, unlike FRFs, which include force measurements. The boundary properties can be estimated by comparing the characteristics of the components under the free condition and when connected to boundary conditions. For components assembled compactly in a system, where setting up a shaker or correctly measuring the impact force exerted on the component is difficult, the proposed method can reduce the errors caused by an incorrectly measured force. The derived equation is verified using a discrete multiple-degrees-of-freedom system with single and multiple boundary conditions and by application to a beam, the simplest continuous structural form, to validate the feasibility of the theory. The transmissibility defined by the apparent mass matrix is used for verifying the derived equation for identifying the boundary properties in the discrete system. However, when applying the equation to practical cases, as is the purpose of this research, the transmissibility matrix should be defined using only the response data. For this purpose, the accelerance matrix is modified slightly to a response matrix using the input as a unit force. This transmissibility matrix composed of response data is used for validating the equation in a continuous system. Furthermore, the effects of measurement noise are also investigated to assess the robustness of the method under practical conditions. Consequently, the proposed method produced reliable results by properly extracting the boundary properties in both cases. In many practical cases, this research is expected to contribute toward identifying the boundary properties in a complex system more conveniently compared to the method
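
    As a numerical illustration of response-only transmissibility (a sketch, not the paper's derivation), the code below builds the receptance matrix H(w) = (K - w^2 M)^(-1) for a hypothetical undamped two-degree-of-freedom chain and forms a transmissibility as a ratio of responses, into which no measured force enters:

    ```python
    import numpy as np

    # Hypothetical 2-DOF chain: masses (kg) and spring stiffnesses (N/m)
    M = np.diag([1.0, 0.5])
    K = np.array([[3000.0, -1000.0],
                  [-1000.0, 1000.0]])

    def receptance(omega):
        """Receptance (displacement per unit force) H(w) = (K - w^2 M)^-1."""
        return np.linalg.inv(K - omega**2 * M)

    for omega in (10.0, 30.0, 60.0):  # rad/s, away from the natural frequencies
        H = receptance(omega)
        # Transmissibility between the two responses for a force at DOF 1:
        # the force magnitude cancels, so only response data are needed.
        T21 = H[1, 0] / H[0, 0]
        print(f"omega={omega:5.1f} rad/s  T21={T21:+.3f}")
    ```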

  3. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result...

  4. FRIGA, A New Approach To Identify Isotopes and Hypernuclei In N-Body Transport Models

    CERN Document Server

    Fèvre, A Le; Aichelin, J; Hartnack, Ch; Kireyev, V; Bratkovskaya, E

    2015-01-01

    We present a new algorithm to identify fragments in computer simulations of relativistic heavy ion collisions. It is based on the simulated annealing technique and can be applied to n-body transport models like Quantum Molecular Dynamics. This new approach is able to predict isotope yields as well as hyper-nucleus production. In order to illustrate its predictive power, we confront this new method with experimental data and show its sensitivity to the parameters that govern cluster formation.
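
    The sketch below illustrates the simulated annealing technique the algorithm builds on, applied to a toy one-dimensional clustering problem: a Metropolis acceptance rule with geometric cooling assigns particles to fragments so that compact clusters are favoured. The cost function is a crude stand-in for the fragment binding energy that FRIGA actually minimises:

    ```python
    import math
    import random

    random.seed(1)
    # Hypothetical nucleon positions (fm) from a transport-model snapshot
    positions = [0.1, 0.4, 0.7, 5.0, 5.3, 9.9]
    n_fragments = 3

    def energy(assign):
        """Toy cost: summed intra-fragment pair distances (compact clusters win)."""
        return sum(abs(positions[i] - positions[j])
                   for i in range(len(positions))
                   for j in range(i + 1, len(positions))
                   if assign[i] == assign[j])

    assign = [random.randrange(n_fragments) for _ in positions]
    T = 5.0
    while T > 0.01:
        trial = assign.copy()
        trial[random.randrange(len(positions))] = random.randrange(n_fragments)
        dE = energy(trial) - energy(assign)
        if dE < 0 or random.random() < math.exp(-dE / T):  # Metropolis step
            assign = trial
        T *= 0.99  # geometric cooling schedule
    print(assign)  # particles sharing a label form one fragment
    ```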

  5. Novel approaches to identify protective malaria vaccine candidates

    Directory of Open Access Journals (Sweden)

    Wan Ni Chia

    2014-11-01

    Full Text Available Efforts to develop vaccines against malaria have been the focus of substantial research activities for decades. Several categories of candidate vaccines are currently being developed for protection against malaria, based on antigens corresponding to the pre-erythrocytic, blood-stage or sexual stages of the parasite. Long-lasting sterile protection from Plasmodium falciparum sporozoite challenge has been observed in humans following vaccination with whole-parasite formulations, clearly demonstrating that a protective immune response targeting predominantly the pre-erythrocytic stages can develop against malaria. However, most of the vaccine candidates currently being investigated, which are mostly subunit vaccines, have not been able to induce substantial (>50%) protection thus far. This is because the antigens responsible for protection against the different parasite stages are still unknown and relevant correlates of protection have remained elusive. For a vaccine to be developed in a timely manner, novel approaches are required. In this article, we review the novel approaches that have been developed to identify the antigens for the development of an effective malaria vaccine.

  6. Identifying useful project management practices: A mixed methodology approach

    Directory of Open Access Journals (Sweden)

    Gabriela Fernandes

    2013-01-01

    Full Text Available This paper describes a mixed-methodology research approach for identifying practitioner perceptions of the most useful project management (PM) practices for improving project management performance. By identifying the tools and techniques perceived as most useful, i.e., as having the most potential for increased contribution to project management performance, practitioners and organizations can set their priorities when improving PM practices. The research involved a programme of thirty interviews with project management professionals in Portugal, followed by a global survey. Completed questionnaires were received from 793 practitioners worldwide, covering 75 different countries. The results showed that the top twenty most useful tools and techniques are very well known and widely used, such as: progress reports; requirements analysis; progress meetings; risk identification; and the project scope statement. The PM practices at the top of the list cover the overall PM life cycle from initiation to project closing, but particular relevance is given to tools and techniques from planning. The knowledge areas of scope, time, risk, communication and integration assume high relevance, each with at least three PM practices at the top of the list.

  7. Computational systems biology approaches to anti-angiogenic cancer therapeutics.

    Science.gov (United States)

    Finley, Stacey D; Chu, Liang-Hui; Popel, Aleksander S

    2015-02-01

    Angiogenesis is an exquisitely regulated process that is required for physiological processes and is also important in numerous diseases. Tumors utilize angiogenesis to generate the vascular network needed to supply the cancer cells with nutrients and oxygen, and many cancer drugs aim to inhibit tumor angiogenesis. Anti-angiogenic therapy involves inhibiting multiple cell types, molecular targets, and intracellular signaling pathways. Computational tools are useful in guiding treatment strategies, predicting the response to treatment, and identifying new targets of interest. Here, we describe progress that has been made in applying mathematical modeling and bioinformatics approaches to study anti-angiogenic therapeutics in cancer.

  8. Bioinformatics approaches for identifying new therapeutic bioactive peptides in food

    Directory of Open Access Journals (Sweden)

    Nora Khaldi

    2012-10-01

    Full Text Available ABSTRACT: The traditional methods for mining foods for bioactive peptides are tedious and long. Similar to the drug industry, the length of time to identify and deliver a commercial health ingredient that reduces disease symptoms can be anything between 5 and 10 years. Reducing this time and effort is crucial in order to create new commercially viable products with clear and important health benefits. In the past few years, bioinformatics, the science that brings together fast computational biology and efficient genome mining, has appeared as the long-awaited solution to this problem. By quickly mining food genomes for characteristics of certain food therapeutic ingredients, researchers can potentially find new ones in a matter of a few weeks. Yet, surprisingly, very little success has been achieved so far using bioinformatics in mining for food bioactives. The absence of food-specific bioinformatic mining tools, the slow integration of experimental mining and bioinformatics, and the important differences between experimental platforms are some of the reasons for the slow progress of bioinformatics in the field of functional food and, more specifically, in bioactive peptide discovery. In this paper I discuss some methods that could be easily translated, using a rational peptide bioinformatics design, to food bioactive peptide mining. I highlight the need for an integrated food peptide database. I also discuss how to better integrate experimental work with bioinformatics in order to improve the mining of food for bioactive peptides, therefore achieving higher success rates.
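
    One simple form of such in silico mining can be sketched as scanning food protein sequences against a library of peptides with known activity. The sequence and the peptide library below are invented placeholders; real pipelines add digestion simulation, homology search and activity prediction:

    ```python
    import re

    # Hypothetical food-protein fragment and a toy library of known bioactives
    protein = "RELEELNVPGEIVESLSSSEESITRINKKIEKFQSEEQQQTEDELQDKIHPF"
    known_bioactives = {
        "VPP": "ACE inhibitor",
        "IPP": "ACE inhibitor",
        "IHPF": "opioid-like",
    }

    for peptide, activity in known_bioactives.items():
        for match in re.finditer(peptide, protein):
            print(f"{peptide} ({activity}) found at position {match.start() + 1}")
    ```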

  9. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  10. Identifying perinatal risk factors for infant maltreatment: an ecological approach

    Directory of Open Access Journals (Sweden)

    Hallisey Elaine J

    2006-12-01

    Full Text Available Abstract Background Child maltreatment and its consequences are a persistent problem throughout the world. Public health workers, human services officials, and others are interested in new and efficient ways to determine which geographic areas to target for intervention programs and resources. To improve assessment efforts, selected perinatal factors were examined, both individually and in various combinations, to determine if they are associated with increased risk of infant maltreatment. State of Georgia birth records and abuse and neglect data were analyzed using an area-based, ecological approach with the census tract as a surrogate for the community. Cartographic visualization suggested some correlation exists between risk factors and child maltreatment, so bivariate and multivariate regression were performed. The presence of spatial autocorrelation precluded the use of traditional ordinary least squares regression, therefore a spatial regression model coupled with maximum likelihood estimation was employed. Results Results indicate that all individual factors or their combinations are significantly associated with increased risk of infant maltreatment. The set of perinatal risk factors that best predicts infant maltreatment rates are: mother smoked during pregnancy, families with three or more siblings, maternal age less than 20 years, births to unmarried mothers, Medicaid beneficiaries, and inadequate prenatal care. Conclusion This model enables public health to take a proactive stance, to reasonably predict areas where poor outcomes are likely to occur, and to therefore more efficiently allocate resources. U.S. states that routinely collect the variables the National Center for Health Statistics (NCHS) defines for birth certificates can easily identify areas that are at high risk for infant maltreatment. The authors recommend that agencies charged with reducing child maltreatment target communities that demonstrate the perinatal risks
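
    The spatial autocorrelation that ruled out ordinary least squares is commonly quantified with Moran's I; a minimal numpy sketch with invented tract rates and a toy contiguity matrix (not the Georgia data) is shown below:

    ```python
    import numpy as np

    # Hypothetical maltreatment rates for 5 census tracts
    x = np.array([2.0, 2.5, 2.2, 7.0, 6.5])
    # Binary contiguity: tracts 0-1-2 are mutual neighbours, as are 3-4
    W = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 0, 0],
                  [0, 0, 0, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)

    z = x - x.mean()
    I = (len(x) / W.sum()) * (z @ W @ z) / (z @ z)
    print(f"Moran's I = {I:.2f}")  # > 0: neighbouring tracts have similar rates
    ```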

  11. Reverse Vaccinology: An Approach for Identifying Leptospiral Vaccine Candidates

    Directory of Open Access Journals (Sweden)

    Odir A. Dellagostin

    2017-01-01

    Full Text Available Leptospirosis is a major public health problem with an incidence of over one million human cases each year. It is a globally distributed, zoonotic disease and is associated with significant economic losses in farm animals. Leptospirosis is caused by pathogenic Leptospira spp. that can infect a wide range of domestic and wild animals. Given the inability to control the cycle of transmission among animals and humans, there is an urgent demand for a new vaccine. Inactivated whole-cell vaccines (bacterins) are routinely used in livestock and domestic animals, however, protection is serovar-restricted and short-term only. To overcome these limitations, efforts have focused on the development of recombinant vaccines, with partial success. Reverse vaccinology (RV) has been successfully applied to many infectious diseases. A growing number of leptospiral genome sequences are now available in public databases, providing an opportunity to search for prospective vaccine antigens using RV. Several promising leptospiral antigens were identified using this approach, although only a few have been characterized and evaluated in animal models. In this review, we summarize the use of RV for leptospirosis and discuss the need for potential improvements for the successful development of a new vaccine towards reducing the burden of human and animal leptospirosis.

  12. Reverse Vaccinology: An Approach for Identifying Leptospiral Vaccine Candidates

    Science.gov (United States)

    Dellagostin, Odir A.; Grassmann, André A.; Rizzi, Caroline; Schuch, Rodrigo A.; Jorge, Sérgio; Oliveira, Thais L.; McBride, Alan J. A.; Hartwig, Daiane D.

    2017-01-01

    Leptospirosis is a major public health problem with an incidence of over one million human cases each year. It is a globally distributed, zoonotic disease and is associated with significant economic losses in farm animals. Leptospirosis is caused by pathogenic Leptospira spp. that can infect a wide range of domestic and wild animals. Given the inability to control the cycle of transmission among animals and humans, there is an urgent demand for a new vaccine. Inactivated whole-cell vaccines (bacterins) are routinely used in livestock and domestic animals, however, protection is serovar-restricted and short-term only. To overcome these limitations, efforts have focused on the development of recombinant vaccines, with partial success. Reverse vaccinology (RV) has been successfully applied to many infectious diseases. A growing number of leptospiral genome sequences are now available in public databases, providing an opportunity to search for prospective vaccine antigens using RV. Several promising leptospiral antigens were identified using this approach, although only a few have been characterized and evaluated in animal models. In this review, we summarize the use of RV for leptospirosis and discuss the need for potential improvements for the successful development of a new vaccine towards reducing the burden of human and animal leptospirosis. PMID:28098813

  13. Aluminium in Biological Environments: A Computational Approach

    Science.gov (United States)

    Mujika, Jon I; Rezabal, Elixabete; Mercero, Jose M; Ruipérez, Fernando; Costa, Dominique; Ugalde, Jesus M; Lopez, Xabier

    2014-01-01

    The increased availability of aluminium in biological environments, due to human intervention in the last century, raises concerns on the effects that this so far “excluded from biology” metal might have on living organisms. Consequently, the bioinorganic chemistry of aluminium has emerged as a very active field of research. This review will focus on our contributions to this field, based on computational studies that can yield an understanding of aluminium biochemistry at a molecular level. Aluminium can interact and be stabilized in biological environments by complexing with both low molecular mass chelants and high molecular mass peptides. The speciation of the metal is, nonetheless, dictated by the hydrolytic species dominant in each case, which vary according to the pH condition of the medium. In blood, citrate and serum transferrin are identified as the main low molecular mass and high molecular mass molecules interacting with aluminium. The complexation of aluminium to citrate and the subsequent changes exerted on the deprotonation pathways of its titratable groups will be discussed along with the mechanisms for the intake and release of aluminium in serum transferrin at two pH conditions, physiological neutral and endosomatic acidic. Aluminium can substitute other metals, in particular magnesium, in protein buried sites and trigger conformational disorder and alteration of the protonation states of the protein's side chains. A detailed account of the interaction of aluminium with protein side chains will be given. Finally, it will be described how aluminium can exert oxidative stress by stabilizing superoxide radicals either as mononuclear aluminium or clustered in boehmite. The possibility of promotion of the Fenton reaction and production of hydroxyl radicals will also be discussed. PMID:24757505

  14. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  15. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    Science.gov (United States)

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical
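
    The data-driven case-control classification described above can be sketched with standard tools: a cross-validated classifier on synthetic stand-in features. The ~90% accuracies in the abstract come from real neuroimaging data, not from a toy like this:

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic stand-in for connectivity features: 40 subjects x 50 features
    controls = rng.normal(0.0, 1.0, size=(20, 50))
    patients = rng.normal(0.4, 1.0, size=(20, 50))  # shifted "disorder" features
    X = np.vstack([controls, patients])
    y = np.array([0] * 20 + [1] * 20)

    scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```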

  16. Computational modeling identifies key gene regulatory interactions underlying phenobarbital-mediated tumor promotion

    Science.gov (United States)

    Luisier, Raphaëlle; Unterberger, Elif B.; Goodman, Jay I.; Schwarz, Michael; Moggs, Jonathan; Terranova, Rémi; van Nimwegen, Erik

    2014-01-01

    Gene regulatory interactions underlying the early stages of non-genotoxic carcinogenesis are poorly understood. Here, we have identified key candidate regulators of phenobarbital (PB)-mediated mouse liver tumorigenesis, a well-characterized model of non-genotoxic carcinogenesis, by applying a new computational modeling approach to a comprehensive collection of in vivo gene expression studies. We have combined our previously developed motif activity response analysis (MARA), which models gene expression patterns in terms of computationally predicted transcription factor binding sites, with singular value decomposition (SVD) of the inferred motif activities, to disentangle the roles that different transcriptional regulators play in specific biological pathways of tumor promotion. Furthermore, transgenic mouse models enabled us to identify which of these regulatory activities was downstream of constitutive androstane receptor and β-catenin signaling, both crucial components of PB-mediated liver tumorigenesis. We propose novel roles for E2F and ZFP161 in PB-mediated hepatocyte proliferation and suggest that PB-mediated suppression of ESR1 activity contributes to the development of a tumor-prone environment. Our study shows that combining MARA with SVD allows for automated identification of independent transcription regulatory programs within a complex in vivo tissue environment and provides novel mechanistic insights into PB-mediated hepatocarcinogenesis. PMID:24464994
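
    The SVD step of the MARA-plus-SVD strategy can be illustrated on a synthetic motif-activity matrix: each singular component pairs a motif-loading pattern with an activity profile across samples, which is how independent regulatory programs are separated. The matrix below is random stand-in data, not inferred motif activities:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Synthetic stand-in for MARA output: 30 motifs x 12 samples
    activities = rng.normal(size=(30, 12))

    # Centre each motif, then decompose: columns of U hold motif loadings of
    # each program, rows of Vt its activity profile across samples
    A = activities - activities.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    variance_explained = s**2 / np.sum(s**2)
    print(f"program 1 explains {variance_explained[0]:.0%} of activity variance")
    print("motifs driving program 1:", np.argsort(-np.abs(U[:, 0]))[:5])
    ```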

  17. Identifying predictors of physics item difficulty: A linear regression approach

    Directory of Open Access Journals (Sweden)

    Vanes Mesic

    2011-06-01

    Full Text Available Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal

  18. Identifying predictors of physics item difficulty: A linear regression approach

    Science.gov (United States)

    Mesic, Vanes; Muratovic, Hasnija

    2011-06-01

    Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal physics knowledge
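
    The modeling step can be sketched with ordinary least squares on coded item features; the codings and difficulties below are invented placeholders, not the study's item database:

    ```python
    import numpy as np

    # Hypothetical item codings: [automaticity, complexity, modality, interference]
    features = np.array([
        [1, 2, 0, 1],
        [0, 3, 1, 1],
        [1, 1, 0, 0],
        [0, 2, 1, 0],
        [1, 3, 1, 1],
        [0, 1, 0, 1],
    ])
    rasch_difficulty = np.array([-0.5, 1.2, -1.1, 0.3, 0.9, -0.2])

    X = np.column_stack([np.ones(len(features)), features])  # add intercept
    coef, *_ = np.linalg.lstsq(X, rasch_difficulty, rcond=None)
    residuals = rasch_difficulty - X @ coef
    r2 = 1 - residuals @ residuals / np.sum(
        (rasch_difficulty - rasch_difficulty.mean()) ** 2)
    print("coefficients:", np.round(coef, 2), " R^2:", round(float(r2), 2))
    ```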

  19. An optimized Leave One Out approach to efficiently identify outliers

    Science.gov (United States)

    Biagi, L.; Caldera, S.; Perego, D.

    2012-04-01

    contribution of each subvector is subtracted from the batch result by algebraic decompositions, with minimal computational effort: this holds for the parameters, the a posteriori residuals and the variance. Therefore all n subvectors of residuals can be checked. The algorithm provides exactly the same results as the usual LOO but is significantly faster, because it does not require any iteration of the adjustment. In some sense, this is an inverse application of the well-known sequential LS, where the parameters are estimated sequentially by adding the contribution of new observations as they become available. In the presentation, the optimized LOO is discussed. Its application to a very simple example of a levelling network is compared to the usual approaches for outlier identification, in view of a further study on its application to the real-time quality check of positioning services.
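
    For the simple case of uncorrelated single observations in a linear model, the classical algebraic shortcut behind this kind of optimized LOO is that every leave-one-out residual follows from a single batch adjustment via the hat matrix, with no re-adjustment. A numpy sketch on synthetic data (the paper's method generalizes this idea to subvectors of observations):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Synthetic linear model with one gross error injected at observation 12
    X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
    y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(0, 0.1, 20)
    y[12] += 1.5  # the outlier

    # One batch adjustment yields all leave-one-out residuals algebraically:
    # e_loo_i = e_i / (1 - h_ii), with H = X (X^T X)^-1 X^T the hat matrix
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    e = y - H @ y
    e_loo = e / (1 - np.diag(H))
    print("suspect observation:", np.argmax(np.abs(e_loo)))  # -> 12
    ```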

  20. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing the computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  1. A novel pattern mining approach for identifying cognitive activity in EEG based functional brain networks.

    Science.gov (United States)

    Thilaga, M; Vijayalakshmi, R; Nadarajan, R; Nandagopal, D

    2016-06-01

    The complex nature of neuronal interactions of the human brain has posed many challenges to the research community. To explore the underlying mechanisms of neuronal activity of cohesive brain regions during different cognitive activities, many innovative mathematical and computational models are required. This paper presents a novel Common Functional Pattern Mining approach to demonstrate the similar patterns of interactions due to common behavior of certain brain regions. The electrode sites of EEG-based functional brain network are modeled as a set of transactions and node-based complex network measures as itemsets. These itemsets are transformed into a graph data structure called Functional Pattern Graph. By mining this Functional Pattern Graph, the common functional patterns due to specific brain functioning can be identified. The empirical analyses show the efficiency of the proposed approach in identifying the extent to which the electrode sites (transactions) are similar during various cognitive load states.

  2. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer’s actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  3. Computer science approach to quantum control

    Energy Technology Data Exchange (ETDEWEB)

    Janzing, D.

    2006-07-01

    Whereas it is obvious that every computation process is a physical process, it has hardly been recognized that many complex physical processes bear similarities to computation processes. This is in particular true for the control of physical systems on the nanoscopic level: usually the system can only be accessed via a rather limited set of elementary control operations, and for many purposes only a concatenation of a large number of these basic operations will implement the desired process. This concatenation is in many cases quite similar to building complex programs from elementary steps, and principles for designing algorithms may thus be a paradigm for designing control processes. For instance, one can decrease the temperature of one part of a molecule by transferring its heat to the remaining part, where it is then dissipated to the environment. But the implementation of such a process involves a complex sequence of electromagnetic pulses. This work considers several hypothetical control processes on the nanoscopic level and shows their analogy to computation processes. We show that measuring certain types of quantum observables is such a complex task that every instrument able to perform it would necessarily be an extremely powerful computer. Likewise, the implementation of a heat engine on the nanoscale requires processing the heat in a way that is similar to information processing, and it can be shown that heat engines with maximal efficiency would be powerful computers, too. In the same way as problems in computer science can be classified by complexity classes, we can also classify control problems according to their complexity. Moreover, we directly relate these complexity classes for control problems to the classes in computer science. Unifying notions of complexity in computer science and physics therefore has two aspects: on the one hand, computer science methods help to analyze the complexity of physical processes. On the other hand, reasonable

  4. Computational dynamics for robotics systems using a non-strict computational approach

    Science.gov (United States)

    Orin, David E.; Wong, Ho-Cheung; Sadayappan, P.

    1989-01-01

    A Non-Strict computational approach for real-time robotics control computations is proposed. In contrast to the traditional approach to scheduling such computations, based strictly on task dependence relations, the proposed approach relaxes precedence constraints and scheduling is guided instead by the relative sensitivity of the outputs with respect to the various paths in the task graph. An example of the computation of the Inverse Dynamics of a simple inverted pendulum is used to demonstrate the reduction in effective computational latency through use of the Non-Strict approach. A speedup of 5 has been obtained when the processes of the task graph are scheduled to reduce the latency along the crucial path of the computation. While error is introduced by the relaxation of precedence constraints, the Non-Strict approach has a smaller error than the conventional Strict approach for a wide range of input conditions.
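
    For concreteness, the inverse dynamics used as the example task can be written down for a point-mass inverted pendulum (hypothetical parameters; the paper's contribution is the scheduling of such computations, not the formula):

    ```python
    import math

    def inverse_dynamics(theta, theta_ddot, m=1.0, l=0.5, g=9.81):
        """Torque realizing theta_ddot for a point mass on a massless rod,
        with theta measured from the upright (inverted) position."""
        inertia_torque = m * l**2 * theta_ddot
        gravity_torque = -m * g * l * math.sin(theta)  # gravity pulls away from upright
        return inertia_torque + gravity_torque

    # Torque needed to hold the pendulum still at 10 degrees from upright
    print(inverse_dynamics(math.radians(10.0), 0.0))
    ```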

  5. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  6. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  8. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation of how different spreadsheet systems handle certain computational issues implied by moving cells, copy-paste operations, or recursion.
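
    A toy computational model of cell interaction, in the spirit of what the didactical model makes explicit: cells hold either values or formulas over other cells, and evaluation recurses through the dependency structure, catching circular references. This is an illustration, not the authors' model:

    ```python
    def evaluate(sheet):
        """Resolve formula cells by recursion over their dependencies."""
        cache = {}
        def value(ref, seen=()):
            if ref in cache:
                return cache[ref]
            if ref in seen:  # circular reference, as in spreadsheet #REF! loops
                raise ValueError(f"circular reference at {ref}")
            cell = sheet[ref]
            result = cell(lambda r: value(r, seen + (ref,))) if callable(cell) else cell
            cache[ref] = result
            return result
        return {ref: value(ref) for ref in sheet}

    # A1, A2 hold data; A3 = A1 + A2; B1 = 2 * A3
    sheet = {
        "A1": 10,
        "A2": 32,
        "A3": lambda get: get("A1") + get("A2"),
        "B1": lambda get: 2 * get("A3"),
    }
    print(evaluate(sheet))  # {'A1': 10, 'A2': 32, 'A3': 42, 'B1': 84}
    ```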

  9. Identifying Our Approaches to Language Learning Technologies: Improving Professional Development

    Science.gov (United States)

    Petrie, Gina Mikel; Avery, Lisa

    2011-01-01

    The mid- to late 1990s was an exciting time for those concerned with incorporating new technology into their teaching of English as a second or foreign language (ESL/EFL). Commonly referred to as Computer-Assisted Language Learning (CALL), or sometimes with the broader term Technology-Enhanced Language Learning (TELL), the field took huge leaps…

  10. A multi-criteria decision making approach to identify a vaccine formulation.

    Science.gov (United States)

    Dewé, Walthère; Durand, Christelle; Marion, Sandie; Oostvogels, Lidia; Devaster, Jeanne-Marie; Fourneau, Marc

    2016-01-01

    This article illustrates the use of a multi-criteria decision making approach, based on desirability functions, to identify an appropriate adjuvant composition for an influenza vaccine to be used in the elderly. The proposed adjuvant system contained two main elements: monophosphoryl lipid and α-tocopherol with squalene in an oil/water emulsion. The objective was to elicit a stronger immune response while maintaining an acceptable reactogenicity and safety profile. The study design, the statistical models, the choice of the desirability functions, the computation of the overall desirability index, and the assessment of the robustness of the ranking are all detailed in this manuscript.
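
    The generic construction behind such desirability-based decisions maps each response onto a [0, 1] desirability and combines them with a geometric mean, so that a single unacceptable response vetoes the formulation. The transforms, bounds and response values below are invented, not those of the study:

    ```python
    import math

    def larger_is_better(y, low, high):
        """Desirability rising linearly from 0 at `low` to 1 at `high`."""
        return min(1.0, max(0.0, (y - low) / (high - low)))

    def smaller_is_better(y, low, high):
        """Desirability falling linearly from 1 at `low` to 0 at `high`."""
        return 1.0 - larger_is_better(y, low, high)

    def overall_desirability(ds):
        """Geometric mean: any d = 0 makes the whole formulation unacceptable."""
        return math.prod(ds) ** (1.0 / len(ds))

    # Hypothetical responses for one candidate adjuvant formulation
    d_immuno = larger_is_better(y=2.8, low=1.0, high=3.0)      # immune response
    d_reacto = smaller_is_better(y=0.15, low=0.05, high=0.40)  # reactogenicity
    print(round(overall_desirability([d_immuno, d_reacto]), 2))
    ```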

  11. 'Omics' approaches in tomato aimed at identifying candidate genes ...

    African Journals Online (AJOL)

    adriana

    2013-12-04

    identifying all the components of a single biological system is within our means; however, assigning ... discovery of new candidate genes/QTLs and/or to assign ... identify putative genes involved in their genetic control ... for adaptation to different environments ... provides insights into fleshy fruit evolution.

  12. Computational Approach to Dendritic Spine Taxonomy and Shape Transition Analysis

    Science.gov (United States)

    Bokota, Grzegorz; Magnowska, Marta; Kuśmierczyk, Tomasz; Łukasik, Michał; Roszkowska, Matylda; Plewczynski, Dariusz

    2016-01-01

    The common approach in morphological analysis of dendritic spines of mammalian neuronal cells is to categorize spines into subpopulations based on whether they are stubby, mushroom, thin, or filopodia shaped. The corresponding cellular models of synaptic plasticity, long-term potentiation, and long-term depression associate the synaptic strength with either spine enlargement or spine shrinkage. Although a variety of automatic spine segmentation and feature extraction methods were developed recently, no approaches allowing for an automatic and unbiased distinction between dendritic spine subpopulations and detailed computational models of spine behavior exist. We propose an automatic and statistically based method for the unsupervised construction of spine shape taxonomy based on arbitrary features. The taxonomy is then utilized in the newly introduced computational model of behavior, which relies on transitions between shapes. Models of different populations are compared using supplied bootstrap-based statistical tests. We compared two populations of spines at two time points. The first population was stimulated with long-term potentiation, and the other, in the resting state, was used as a control. The comparison of shape transition characteristics allowed us to identify the differences between population behaviors. Although some extreme changes were observed in the stimulated population, statistically significant differences were found only when whole models were compared. The source code of our software is freely available for non-commercial use. Contact: d.plewczynski@cent.uw.edu.pl. PMID:28066226
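    To make the transition-based behaviour model concrete, the sketch below (ours, with invented observations) estimates a shape-transition probability matrix per population and compares whole models by a simple distance; a bootstrap over the underlying spine pairs would supply the significance test mentioned above:

        import numpy as np

        SHAPES = ["stubby", "mushroom", "thin", "filopodia"]
        IDX = {s: i for i, s in enumerate(SHAPES)}

        def transition_matrix(pairs):
            # pairs: list of (shape_t0, shape_t1) for individual spines.
            counts = np.zeros((len(SHAPES), len(SHAPES)))
            for a, b in pairs:
                counts[IDX[a], IDX[b]] += 1
            # Row-normalise to transition probabilities, guarding empty rows.
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts),
                             where=rows > 0)

        rng = np.random.default_rng(0)
        def fake_population(bias):
            # Invented data; `bias` nudges time-point-2 shapes to "mushroom".
            t0 = rng.choice(SHAPES, size=200)
            t1 = rng.choice(SHAPES, size=200,
                            p=[0.25 - bias, 0.25 + 3 * bias,
                               0.25 - bias, 0.25 - bias])
            return list(zip(t0, t1))

        stimulated = transition_matrix(fake_population(bias=0.05))
        control = transition_matrix(fake_population(bias=0.0))
        print("whole-model distance:", np.abs(stimulated - control).sum())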

  13. DNA enrichment approaches to identify unauthorized genetically modified organisms (GMOs).

    Science.gov (United States)

    Arulandhu, Alfred J; van Dijk, Jeroen P; Dobnik, David; Holst-Jensen, Arne; Shi, Jianxin; Zel, Jana; Kok, Esther J

    2016-07-01

    With the increased global production of different genetically modified (GM) plant varieties, chances increase that unauthorized GM organisms (UGMOs) may enter the food chain. At the same time, the detection of UGMOs is a challenging task because of the limited sequence information that will generally be available. PCR-based methods are available to detect and quantify known UGMOs in specific cases. If this approach is not feasible, DNA enrichment of the unknown adjacent sequences of known GMO elements is one way to detect the presence of UGMOs in a food or feed product. These enrichment approaches are also known as chromosome walking or gene walking (GW). In recent years, enrichment approaches have been coupled with next generation sequencing (NGS) analysis and implemented in, amongst others, the medical and microbiological fields. The present review will provide an overview of these approaches and an evaluation of their applicability in the identification of UGMOs in complex food or feed samples.

  14. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of C++ Accelerated Massive Parallelism (C++ AMP), recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011), we document a ...

  15. Molecular electromagnetism a computational chemistry approach

    CERN Document Server

    Sauer, Stephan P A

    2011-01-01

    A textbook for a one-semester course for students in chemistry, physics, and nanotechnology, this book examines the interaction of molecules with electric and magnetic fields, as occur, for example, in light. The book provides the necessary background knowledge for simulating these interactions on computers with modern quantum chemical software.

  16. A Method for Identifying Contours in Processing Digital Images from Computer Tomograph

    Science.gov (United States)

    Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela

    2011-09-01

    The first step in digital processing of two-dimensional computed tomography images is to identify the contour of component elements. This paper presents the collective work of specialists in medicine, applied mathematics, and computer science on elaborating new algorithms and methods in medical 2D and 3D imagery.

  17. Computational Approach To Understanding Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Włodzisław Duch

    2012-01-01

    Full Text Available Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper is focused on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess system long-term neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualization of attractors in the semantic layer of the neural model of reading. Large-scale simulations of brain structures characterized by a high order of complexity require enormous computational power, especially if biologically motivated neuron models are used to investigate the influence of cellular structure dysfunctions on the network dynamics. Such simulations have to be implemented on computer clusters in a grid-based architecture.

  18. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  19. Computer Aided Interpretation Approach for Optical Tomographic Images

    CERN Document Server

    Klose, Christian D; Netz, Uwe; Beuthan, Juergen; Hielscher, Andreas H

    2010-01-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) of human finger joints in optical tomographic images. The image interpretation method employs a multi-variate signal detection analysis aided by a machine learning classification algorithm called Self-Organizing Mapping (SOM). Unlike previous studies, this allows for combining multiple physical image parameters, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging and inspection of optical tomographic images) were used as "ground truth" benchmarks to determine the performance of image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities while...

  20. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    the performance of computational strategies in order to identify a sturdy methodology, which should be applicable for handling various issues related to atmospheric cluster formation. Density functional theory (DFT) is applied to study individual cluster formation steps. Utilizing large test sets of numerous atmospheric clusters, I evaluate the performance of different DFT functionals, with a specific focus on how to control potential errors associated with the calculation of single point energies and evaluation of the thermal contribution to the Gibbs free energy. Using DFT I study two candidate systems (glycine ... acid could thereby enhance the further growth of an existing cluster by condensing on the surface. Conclusively, I find that the performance of a single DFT functional can lead to an inadequate description of investigated atmospheric systems and thereby recommend a joint DFT (J-DFT) approach...

  1. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  2. Leaching from Heterogeneous Heck Catalysts: A Computational Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The possibility of carrying out a purely heterogeneous Heck reaction in practice without Pd leaching has been previously considered by a number of research groups, but no general consensus has yet been reached. Here, the reaction was, for the first time, evaluated by a simple computational approach. Modelling experiments were performed on one of the initial catalytic steps: phenyl halide attachment on Pd (111) to (100) and (111) to (111) ridges of a Pd crystal. Three surface structures of the resulting [PhPdX] were identified as possible reactive intermediates. Following potential energy minimisation calculations based on a universal force field, the relative stabilities of these surface species were then determined. Results showed the most stable species to be one in which a Pd ridge atom is removed from the Pd crystal structure, suggesting Pd leaching induced by phenyl halides is energetically favourable.

  3. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thi...

  4. An Approach for Identifying Benefit Segments among Prospective College Students.

    Science.gov (United States)

    Miller, Patrick; And Others

    1990-01-01

    A study investigated the importance to 578 applicants of various benefits offered by a moderately selective private university. Applicants rated the institution on 43 academic, social, financial, religious, and curricular attributes. The objective was to test the efficacy of one approach to college market segmentation. Results support the utility…

  5. A Community-Based Approach to Identifying Influential Spreaders

    Directory of Open Access Journals (Sweden)

    Zhiying Zhao

    2015-04-01

    Full Text Available Identifying influential spreaders in complex networks has a significant impact on understanding and controlling spreading processes in networks. In this paper, we introduce a new centrality index to identify influential spreaders in a network based on the community structure of the network. The community-based centrality (CbC) considers both the number and sizes of communities that are directly linked by a node. We discuss correlations between CbC and other classical centrality indices. Based on simulations with a single source of infection under the Susceptible-Infected-Recovered (SIR) model, we find that CbC can help to identify some critical influential nodes that other indices cannot find. We also investigate the stability of CbC.
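    One plausible reading of such a community-based index can be sketched with networkx (an illustration of ours; the paper's exact weighting of community number and size may differ):

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.karate_club_graph()
        communities = list(greedy_modularity_communities(G))
        member_of = {v: i for i, c in enumerate(communities) for v in c}

        def community_based_centrality(G, node):
            # Communities directly linked by the node, including its own.
            linked = {member_of[nbr] for nbr in G[node]} | {member_of[node]}
            # Combine the count of linked communities with their total size.
            return len(linked), sum(len(communities[i]) for i in linked)

        ranking = sorted(G.nodes,
                         key=lambda n: community_based_centrality(G, n),
                         reverse=True)
        print("top spreader candidates:", ranking[:5])

    Running the SIR model seeded at the top-ranked nodes, versus nodes ranked highly by degree or betweenness, is how such an index would be validated in practice.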

  6. Acoustic gravity waves: A computational approach

    Science.gov (United States)

    Hariharan, S. I.; Dutt, P. K.

    1987-01-01

    This paper discusses numerical solutions of a hyperbolic initial boundary value problem that arises from acoustic wave propagation in the atmosphere. Field equations are derived from the atmospheric fluid flow governed by the Euler equations. The resulting original problem is nonlinear; a first-order linearized version of the problem is used for computational purposes. The main difficulty in the problem, as with any open boundary problem, is in obtaining stable boundary conditions. Approximate boundary conditions are derived and shown to be stable. Numerical results are presented to verify the effectiveness of these boundary conditions.

  7. Identifying the "Truly Disadvantaged": A Comprehensive Biosocial Approach

    Science.gov (United States)

    Barnes, J. C.; Beaver, Kevin M.; Connolly, Eric J.; Schwartz, Joseph A.

    2016-01-01

    There has been significant interest in examining the developmental factors that predispose individuals to chronic criminal offending. This body of research has identified some social-environmental risk factors as potentially important. At the same time, the research producing these results has generally failed to employ genetically sensitive…

  8. Identifying Subgroups among Hardcore Smokers: a Latent Profile Approach

    NARCIS (Netherlands)

    Bommelé, J.; Kleinjan, M.; Schoenmakers, T.M.; Eijnden, R. van den; Mheen, D. van de

    2015-01-01

    Introduction: Hardcore smokers are smokers who have little to no intention to quit. Previous research suggests that there are distinct subgroups among hardcore smokers and that these subgroups vary in the perceived pros and cons of smoking and quitting. Identifying these subgroups could help to deve

  9. Computational approaches for microalgal biofuel optimization: a review.

    Science.gov (United States)

    Koussa, Joseph; Chaiboonchoe, Amphun; Salehi-Ashtiani, Kourosh

    2014-01-01

    The increased demand and consumption of fossil fuels have raised interest in finding renewable energy sources throughout the globe. Much focus has been placed on optimizing microorganisms and primarily microalgae, to efficiently produce compounds that can substitute for fossil fuels. However, the path to achieving economic feasibility is likely to require strain optimization through using available tools and technologies in the fields of systems and synthetic biology. Such approaches invoke a deep understanding of the metabolic networks of the organisms and their genomic and proteomic profiles. The advent of next generation sequencing and other high throughput methods has led to a major increase in availability of biological data. Integration of such disparate data can help define the emergent metabolic system properties, which is of crucial importance in addressing biofuel production optimization. Herein, we review major computational tools and approaches developed and used in order to potentially identify target genes, pathways, and reactions of particular interest to biofuel production in algae. As the use of these tools and approaches has not been fully implemented in algal biofuel research, the aim of this review is to highlight the potential utility of these resources toward their future implementation in algal research.

  10. Computational Approaches for Microalgal Biofuel Optimization: A Review

    Directory of Open Access Journals (Sweden)

    Joseph Koussa

    2014-01-01

    Full Text Available The increased demand and consumption of fossil fuels have raised interest in finding renewable energy sources throughout the globe. Much focus has been placed on optimizing microorganisms and primarily microalgae, to efficiently produce compounds that can substitute for fossil fuels. However, the path to achieving economic feasibility is likely to require strain optimization through using available tools and technologies in the fields of systems and synthetic biology. Such approaches invoke a deep understanding of the metabolic networks of the organisms and their genomic and proteomic profiles. The advent of next generation sequencing and other high throughput methods has led to a major increase in availability of biological data. Integration of such disparate data can help define the emergent metabolic system properties, which is of crucial importance in addressing biofuel production optimization. Herein, we review major computational tools and approaches developed and used in order to potentially identify target genes, pathways, and reactions of particular interest to biofuel production in algae. As the use of these tools and approaches has not been fully implemented in algal biofuel research, the aim of this review is to highlight the potential utility of these resources toward their future implementation in algal research.

  11. Bioinformatic approaches to identifying and classifying Rab proteins.

    Science.gov (United States)

    Diekmann, Yoan; Pereira-Leal, José B

    2015-01-01

    The bioinformatic annotation of Rab GTPases is important, for example, to understand the evolution of the endomembrane system. However, Rabs are particularly challenging for standard annotation pipelines because they are similar to other small GTPases and form a large family with many paralogous subfamilies. Here, we describe a bioinformatic annotation pipeline specifically tailored to Rab GTPases. It proceeds in two steps: first, Rabs are distinguished from other proteins based on GTPase-specific motifs, overall sequence similarity to other Rabs, and the occurrence of Rab-specific motifs. Second, Rabs are classified taking either a more accurate but slower phylogenetic approach or a slightly less accurate but much faster bioinformatic approach. All necessary steps can either be performed locally or using the referenced online tools. An implementation of a slightly more involved version of the pipeline presented here is available at RabDB.org.

  12. Global computational algebraic topology approach for diffusion

    Science.gov (United States)

    Auclair-Fortier, Marie-Flavie; Ziou, Djemel; Allili, Madjid

    2004-05-01

    One physical process involved in many computer vision problems is heat diffusion. Such partial differential equations (PDEs) are continuous and have to be discretized by some technique, mostly mathematical processes like finite differences or finite elements, in which the continuous domain is subdivided into sub-domains holding a single value each. The diffusion equation derives from energy conservation, so it is valid over a whole domain. We work with this global equation directly instead of discretizing the PDE obtained from it by a limit process. To encode these global physical values over pixels of different dimensions, we use a computational algebraic topology (CAT)-based image model. This model was proposed by Ziou and Allili and used for the deformation of curves and optical flow. It represents the image support as a decomposition in terms of points, edges, surfaces, volumes, etc., so images of any dimension can be handled. After decomposing the physical principles of heat transfer into basic laws, we recall the CAT-based image model and use it to encode the basic laws. We then present experimental results for nonlinear graylevel diffusion for denoising, ensuring preservation of thin features.

  13. A complex network approach to cloud computing

    CERN Document Server

    Travieso, Gonzalo; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2015-01-01

    Cloud computing has become an important means to speed up computing. One problem heavily influencing the performance of such systems is the choice of nodes as servers responsible for executing the users' tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the performance of the processing in cloud systems underlain by Erdos-Renyi (ER) and Barabasi-Albert (BA) topologies containing two servers. Cloud networks involving two communities not necessarily of the same size are also considered in our analysis. The performance of each configuration is quantified in terms of two indices: the cost of communication between the user and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter index, the ER topology provides better performance than the BA case for smaller average degrees and opposite behavior for larger average degrees. With respect to the cost, smaller values are found in the BA ...
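    The two indices are easy to reproduce on small random topologies; the sketch below (ours, with invented sizes and server placement) measures the average user-to-nearest-server distance and the task balance for ER and BA graphs:

        import networkx as nx

        def evaluate(G, servers):
            # Hop distance from each server to every node it can reach.
            dist = {s: nx.single_source_shortest_path_length(G, s)
                    for s in servers}
            cost, load = 0.0, {s: 0 for s in servers}
            users = [u for u in G.nodes if all(u in dist[s] for s in servers)]
            for u in users:
                # Each user is served by its nearest server.
                d, s = min((dist[s][u], s) for s in servers)
                cost += d
                load[s] += 1
            # Average communication cost and task-balance index.
            return cost / len(users), min(load.values()) / max(load.values())

        n, m = 200, 3
        topologies = {
            "ER": nx.gnm_random_graph(n, n * m, seed=1),
            "BA": nx.barabasi_albert_graph(n, m, seed=1),
        }
        for name, G in topologies.items():
            avg_cost, balance = evaluate(G, servers=[0, 1])
            print(f"{name}: avg cost={avg_cost:.2f} balance={balance:.2f}")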

  14. Computational approaches to homogeneous gold catalysis.

    Science.gov (United States)

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has been expanding at an outstanding pace for the last decade. The best described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to develop all these new possibilities fully, the use of computational tools to understand at an atomistic level of detail the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options for studying homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  15. An approach to identify the optimal cloud in cloud federation

    Directory of Open Access Journals (Sweden)

    Saumitra Baleshwar Govil

    2012-01-01

    Full Text Available Enterprises are migrating towards cloud computing for its ability to provide agility, robustness and feasibility in operations. To increase the reliability and availability of services, clouds have grown into federated clouds, i.e., unions of clouds. There are still major issues in federated clouds which, when solved, could lead to increased satisfaction for service providers and clients alike. One such issue is selecting the optimal foreign cloud in the federation that provides services according to the client requirements. In this paper, we propose a model to select the optimal cloud service provider based on the capability and performance of the available clouds in the federation. We use two matrix models to obtain the capability and performance parametric values; these are matched with the client requirements and the optimal foreign cloud service provider is selected.

  16. cgaTOH: extended approach for identifying tracts of homozygosity.

    Directory of Open Access Journals (Sweden)

    Li Zhang

    Full Text Available Identification of disease variants via homozygosity mapping and investigation of the effects of genome-wide homozygosity regions on traits of biomedical importance have been widely applied recently. Nonetheless, the existing methods and algorithms to identify long tracts of homozygosity (TOH) are not able to efficiently provide rigorous regions for further downstream association investigation. We expanded current methods to identify TOHs by defining "surrogate-TOH", a region covering a cluster of TOHs with specific characteristics. Our defined surrogate-TOH includes cTOH, viz. a common TOH region where at least ten TOHs are present; gTOH, whereby a group of highly overlapping TOHs share proximal boundaries; and aTOH, which are allelically-matched TOHs. Searching for gTOH and aTOH was based on a repeated binary spectral clustering algorithm, where a hierarchy of clusters is created and represented by a TOH cluster tree. Based on the proposed method of identifying different species of surrogate-TOH, our cgaTOH software was developed. The software provides an intuitive and interactive visualization tool for better investigation of the high-throughput output with special interactive navigation rings, which will find applicability in both conventional association studies and more sophisticated downstream analyses. The NCBI genome map viewer is incorporated into the system. Moreover, we discuss the choice of appropriate empirical ranges for critical parameters by applying the method to disease models. The method identifies various patterned clusters of SNPs demonstrating extended homozygosity, so one can observe different aspects of the multi-faceted characteristics of TOHs.

  17. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to compute anisotropic traveltime based on solving successively elliptical isotropic traveltimes. The method shows good accuracy and is very simple to implement.

  18. The fundamentals of computational intelligence system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to important novel CI technologies: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, the cascade neo-fuzzy neural network, are considered, and their training algorithms are described and analyzed. Applications of FNNs to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty and a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of corporate bankruptcy risk forecasting under incomplete and fuzzy information, as well as new methods based on fuzzy set theory and fuzzy neural networks and results of their application for bankruptcy ris...

  19. A polyhedral approach to computing border bases

    CERN Document Server

    Braun, Gábor

    2009-01-01

    Border bases can be considered to be the natural extension of Gröbner bases that have several advantages. Unfortunately, to date the classical border basis algorithm relies on (degree-compatible) term orderings and implicitly on reduced Gröbner bases. We adapt the classical border basis algorithm to allow for calculating border bases for arbitrary degree-compatible order ideals, which is independent of term orderings. Moreover, the algorithm also supports calculating degree-compatible order ideals with preference on contained elements, even though finding a preferred order ideal is NP-hard. Effectively we retain degree-compatibility only to successively extend our computation degree-by-degree. The adaptation is based on our polyhedral characterization: order ideals that support a border basis correspond one-to-one to integral points of the order ideal polytope. This establishes a crucial connection between the ideal and the combinatorial structure of the associated factor spaces.

  20. Biologically motivated computationally intensive approaches to image pattern recognition

    NARCIS (Netherlands)

    Petkov, Nikolay

    1995-01-01

    This paper presents some of the research activities of the research group, treating vision as a grand challenge problem whose solution is estimated to need the power of Tflop/s computers and for which computational methods have yet to be developed. The approaches concerned are biologically motivated, in th

  1. An Approach to Dynamic Provisioning of Social and Computational Services

    NARCIS (Netherlands)

    Bonino da Silva Santos, Luiz Olavo; Sorathia, Vikram; Ferreira Pires, Luis; Sinderen, van Marten

    2010-01-01

    Service-Oriented Computing (SOC) builds upon the intuitive notion of service already known and used in our society for a long time. SOC-related approaches are based on computer-executable functional units that often represent automation of services that exist at the social level, i.e., services at t

  2. A Maximum Entropy Approach to Identifying Sentence Boundaries

    CERN Document Server

    Reynar, J C; Reynar, Jeffrey C.; Ratnaparkhi, Adwait

    1997-01-01

    We present a trainable model for identifying sentence boundaries in raw text. Given a corpus annotated with sentence boundaries, our model learns to classify each occurrence of ., ?, and ! as either a valid or invalid sentence boundary. The training procedure requires no hand-crafted rules, lexica, part-of-speech tags, or domain-specific information. The model can therefore be trained easily on any genre of English, and should be trainable on any other Roman-alphabet language. Performance is comparable to or better than the performance of similar systems, but we emphasize the simplicity of retraining for new domains.
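    Since the model is a binary maximum-entropy classifier over simple contextual features, a toy version is easy to sketch; the features, training sentences and labels below are invented, and scikit-learn's LogisticRegression (equivalent to a binary maxent model) stands in for the paper's trainer:

        import re
        from sklearn.feature_extraction import DictVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        def features(text, i):
            # Contextual features of the candidate punctuation at index i.
            prev = text[:i].split()[-1] if text[:i].split() else ""
            nxt = text[i + 1:].split()[0] if text[i + 1:].split() else ""
            return {"prev": prev.lower(),
                    "prev_is_short": len(prev.rstrip(".")) <= 2,
                    "next_capitalised": nxt[:1].isupper(),
                    "char": text[i]}

        # Tiny invented corpus: (text, index of candidate, is_boundary).
        train = [
            ("Dr. Smith arrived. He sat down.", 2, False),
            ("Dr. Smith arrived. He sat down.", 17, True),
            ("Dr. Smith arrived. He sat down.", 30, True),
            ("It cost 3.5 mln. Nobody minded.", 15, True),
        ]
        X = [features(t, i) for t, i, _ in train]
        y = [label for _, _, label in train]

        clf = make_pipeline(DictVectorizer(), LogisticRegression())
        clf.fit(X, y)

        text = "Mr. Jones left. She stayed."
        for m in re.finditer(r"[.?!]", text):
            print(m.start(), clf.predict([features(text, m.start())])[0])

    A real system would train on a sentence-annotated corpus; the point here is only that no hand-crafted rules or part-of-speech tags are required, exactly as the abstract claims.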

  3. An Automatic Approach to Detect Software Anomalies in Cloud Computing Using Pragmatic Bayes Approach

    Directory of Open Access Journals (Sweden)

    Nethaji V

    2014-06-01

    Full Text Available Detection of software anomalies is a vital element of operations in data centers and service clouds. Statistical Process Control (SPC) cloud charts sense routine anomalies, and their root causes are identified based on a differential profiling strategy. By automating these tasks, most of the manual overhead incurred in detecting software anomalies, and the analysis time, are reduced to a large extent, but a detailed analysis of profiling data is not performed in most cases. On the other hand, the cloud scheduler weighs both the requirements of the user and the available infrastructure in order to match them. The OpenStack prototype works on cloud trust management, which provides the scheduler, but complexity arises when hosting the cloud system. At the same time, the Trusted Computing Base (TCB) of a computing node does not achieve the scalability measure. This unique paradigm brings about many software anomalies, which have not been well studied. This work, a Pragmatic Bayes (PB) approach, studies the problem of detecting software anomalies and ensures scalability by comparing information at the current time to historical data. In particular, the PB approach uses a two-component Gaussian mixture to model deviations at the current time in a cloud environment. The introduction of the Gaussian mixture in the PB approach achieves a higher scalability measure, which involves supervising a massive number of cells, and is fast enough to be potentially useful in many streaming scenarios. Whereas previous work on scheduling often lacks scalability, this paper shows the superiority of the method using a Bayes per-section error rate procedure through simulation, and provides a detailed analysis of profiling data in the marginal distributions using the Amazon EC2 dataset. Extensive performance analysis shows that the PB approach is highly efficient in terms of runtime, scalability, software anomaly detection ratio, CPU utilization, density rate, and computational
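    As a rough illustration of the two-component Gaussian-mixture idea (our sketch, not the paper's implementation), historical metric values are fitted with scikit-learn's GaussianMixture and current observations are flagged when their likelihood falls below a historical percentile; the data and threshold are invented:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(42)
        # Historical per-cell metric (e.g. latency): two normal regimes.
        history = np.concatenate([rng.normal(10, 1, 800),
                                  rng.normal(25, 2, 200)])
        gmm = GaussianMixture(n_components=2, random_state=0)
        gmm.fit(history.reshape(-1, 1))

        # Flag current observations whose log-likelihood falls below the
        # 1st percentile of the historical scores.
        threshold = np.percentile(gmm.score_samples(history.reshape(-1, 1)), 1)
        current = np.array([9.8, 26.0, 55.0]).reshape(-1, 1)
        for x, ll in zip(current.ravel(), gmm.score_samples(current)):
            print(f"value={x:5.1f} log-likelihood={ll:7.2f} "
                  f"anomaly={ll < threshold}")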

  4. PROCESS OF IDENTIFYING COMPETENCIES BASED ON A FUNCTIONAL APPROACH

    Directory of Open Access Journals (Sweden)

    NAOUFAL SEFIANI

    2012-01-01

    Full Text Available To cope with fast change in the technological and organizational context, managers need tools to help them improve competence management. Our contribution aims at supporting the task of competence identification, which is considered the first step of the management process. The proposed identification involves three stages. The first stage concerns the search for competences based on a functional approach. The second stage is to define a typology of the components of competence (characterization), and the third stage to define the core competencies (prioritization). The application of the method to an industrial case in the logistics field confirms the possibility of using the "principle of solution" to provide a dynamic process for the identification of requisite competencies.

  5. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    V Vimalan; N Chandrakumar

    2008-01-01

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate our methodology experimentally for both homonuclear and heteronuclear spin ensembles. On model two-spin systems, the initialization time of one of our sequences is three-fourths (in the heteronuclear case) or one-fourth (in the homonuclear case) of that of the typical pulsed free precession sequences, attaining the same initialization efficiency. We have implemented the logical SWAP operation in homonuclear AMX spin systems using selective isotropic mixing, reducing the duration taken to a third compared to the standard re-focused INEPT-type sequence. We introduce the 1D version for readout of the rotating frame SWAP operation, in an attempt to reduce readout time. We further demonstrate the Hadamard mode of 1D SWAP, which offers a 2N-fold reduction in experiment time for a system with N working bits, attaining the same sensitivity as the standard 1D version.

  6. Delay Computation Using Fuzzy Logic Approach

    Directory of Open Access Journals (Sweden)

    Ramasesh G. R.

    2012-10-01

    Full Text Available The paper presents a practical application of fuzzy sets and systems theory in predicting, with reasonable accuracy, delays arising from a wide range of factors pertaining to construction projects. In this paper we use fuzzy logic to predict delays on account of delayed supplies and labor shortage. It is observed that project scheduling software uses either deterministic or probabilistic methods for computation of schedule durations, delays, lags and other parameters. In other words, these methods use only quantitative inputs, leaving out the qualitative aspects associated with individual activities of work. A qualitative aspect, viz. the expertise of the mason or the lack of experience, can have a significant impact on the assessed duration. Such qualitative aspects do not find adequate representation in project scheduling software. A realistic project is considered, for which a PERT chart has been prepared showing all the major activities in reasonable detail. This project has been periodically updated until its completion. It is observed that some of the activities are delayed due to extraneous factors, resulting in the overall delay of the project. The software has the capability to calculate the overall delay through the Critical Path Method (CPM) when each of the activity delays is reported. We demonstrate that by using fuzzy logic, these delays could have been predicted well in advance.
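    As an illustration of the idea (a toy Mamdani-style inference of ours, not the paper's model), the sketch below fuzzifies two qualitative inputs, applies two invented rules, and defuzzifies by centroid to obtain a crisp delay estimate:

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership function on points a < b < c.
            return np.maximum(np.minimum((x - a) / (b - a),
                                         (c - x) / (c - b)), 0)

        supply_delay, labour_shortage = 6.0, 0.7  # days late, crew fraction

        # Fuzzify the crisp inputs (triangles are a simplification of the
        # shouldered "high" sets a real controller would use).
        supply_high = tri(supply_delay, 3, 8, 13)
        labour_high = tri(labour_shortage, 0.2, 0.6, 1.0)

        delay = np.linspace(0, 20, 201)  # candidate activity delay, days
        # Rule 1: IF supply high AND labour high THEN delay large.
        r1 = np.minimum(min(supply_high, labour_high), tri(delay, 8, 14, 20))
        # Rule 2: IF supply high THEN delay medium.
        r2 = np.minimum(supply_high, tri(delay, 2, 6, 10))

        aggregated = np.maximum(r1, r2)
        # Centroid defuzzification yields the predicted delay in days.
        predicted = (delay * aggregated).sum() / aggregated.sum()
        print(f"predicted activity delay: {predicted:.1f} days")

    Per-activity estimates of this kind can then be fed back into the CPM computation to anticipate the overall project delay.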

  7. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval methods and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.

  8. A Novel Approach for Identify Small and Capital Handwritten Letter

    Directory of Open Access Journals (Sweden)

    Ekta Tiwari

    2012-06-01

    Full Text Available A handwritten character is represented as a sequence of strokes whose features are extracted and classified. Although off-line and on-line character recognition techniques take different approaches, they share a lot of common problems and solutions. Printed documents available in the form of books, papers, magazines, etc. are scanned using standard scanners, which produce an image of the scanned document. The preprocessed image is segmented using an algorithm which decomposes the scanned text into paragraphs using a special space detection technique, then the paragraphs into lines using vertical histograms, lines into words using horizontal histograms, and words into character image glyphs using horizontal histograms. Each image glyph comprises 24x24 pixels. Thus a database of character image glyphs is created in the segmentation phase. The features considered for classification are the character height, character width, the number of horizontal lines (long and short), image centroid and special dots. The extracted features are passed to a Support Vector Machine (SVM), where the characters are classified by a supervised learning algorithm. These classes are mapped onto characters for recognition. The text is then reconstructed using fonts.
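    A compact stand-in for this pipeline can be sketched with scikit-learn, substituting its built-in 8x8 digit glyphs for the paper's 24x24 character glyphs and raw pixels for the hand-crafted features:

        from sklearn import datasets, svm
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        digits = datasets.load_digits()
        # Flatten each glyph image into a feature vector (raw pixels stand
        # in for height/width/centroid-style features here).
        X = digits.images.reshape(len(digits.images), -1)
        X_train, X_test, y_train, y_test = train_test_split(
            X, digits.target, test_size=0.25, random_state=0)

        # Supervised SVM classification of the glyph feature vectors.
        clf = svm.SVC(kernel="rbf", gamma=0.001)
        clf.fit(X_train, y_train)
        print("glyph accuracy:", accuracy_score(y_test, clf.predict(X_test)))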

  9. Newer Approaches to Identify Potential Untoward Effects in Functional Foods.

    Science.gov (United States)

    Marone, Palma Ann; Birkenbach, Victoria L; Hayes, A Wallace

    2016-01-01

    Globalization has greatly accelerated the number and variety of food and beverage products available worldwide. The exchange among greater numbers of countries, manufacturers, and products in the United States and worldwide has necessitated enhanced quality measures for nutritional products for larger populations increasingly reliant on functionality. These functional foods, those that provide benefits beyond basic nutrition, are increasingly being used for their potential to alleviate food insufficiency while enhancing quality and longevity of life. In the United States alone, a steady increase in imports of greater than 15% per year, or 24 million shipments, of which over 70% are food related, is regulated under the Food and Drug Administration (FDA). This unparalleled growth has resulted in the need for faster, cheaper, and better safety and efficacy screening methods in the form of harmonized guidelines and recommendations for product standardization. In an effort to meet this need, the in vitro toxicology testing market has similarly grown, with an anticipated increase from US$1.3 to US$2.7 billion between 2010 and 2015. Although traditionally occupying a small fraction of the market behind pharmaceuticals and cosmetic/household products, the scope of functional food testing, including additives/supplements, ingredients, residues, contact/processing, and contaminants, is potentially expansive. Similarly, as functional food testing has progressed, so has the need to identify potential adverse factors that threaten the safety and quality of these products.

  10. Identifying problematic concepts in SNOMED CT using a lexical approach.

    Science.gov (United States)

    Agrawal, Ankur; Perl, Yehoshua; Elhanan, Gai

    2013-01-01

    SNOMED CT (SCT) has been endorsed as a premier clinical terminology by many organizations, with a perceived use within electronic health records and clinical information systems. However, there are indications that, at the moment, SCT is not optimally structured for its intended use by healthcare practitioners. A study is conducted to investigate the extent of inconsistencies among the concepts in SCT. A group auditing technique to improve the quality of SCT is introduced that can help identify problematic concepts with high probability. Positional similarity sets are defined: groups of concepts whose fully specified names are lexically similar and differ in a single, corresponding word position. A manual audit of a sample of such sets found 38% of the sets exhibiting one or more inconsistent concepts. Group auditing techniques such as this can thus be very helpful in assuring the quality of SCT, which will help expedite its adoption as a reference terminology for clinical purposes.
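    The positional similarity sets themselves are straightforward to construct; the sketch below groups invented term names that share length and differ in exactly one word position (real input would be SNOMED CT fully specified names):

        from collections import defaultdict

        terms = [
            "fracture of left femur",
            "fracture of right femur",
            "fracture of left tibia",
            "neoplasm of right femur",
        ]

        sets = defaultdict(list)
        for t in terms:
            words = t.split()
            for i in range(len(words)):
                # Key = the name with position i wildcarded; terms sharing
                # a key differ only in the word aligned at position i.
                key = (i, tuple(words[:i] + ["*"] + words[i + 1:]))
                sets[key].append(t)

        for (i, pattern), group in sets.items():
            if len(group) > 1:
                print(f"differing position {i}: {group}")

    Each printed group is a candidate set for manual audit, since concepts that differ in one word are expected to be modeled consistently.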

  11. A landscape ecology approach identifies important drivers of urban biodiversity.

    Science.gov (United States)

    Turrini, Tabea; Knop, Eva

    2015-04-01

    Cities are growing rapidly worldwide, yet a mechanistic understanding of the impact of urbanization on biodiversity is lacking. We assessed the impact of urbanization on arthropod diversity (species richness and evenness) and abundance in a study of six cities and nearby intensively managed agricultural areas. Within the urban ecosystem, we disentangled the relative importance of two key landscape factors affecting biodiversity, namely the amount of vegetated area and patch isolation. To do so, we a priori selected sites that independently varied in the amount of vegetated area in the surrounding landscape at the 500-m scale and in patch isolation at the 100-m scale, and we held local patch characteristics constant. As indicator groups, we used bugs, beetles, leafhoppers, and spiders. Compared to intensively managed agricultural ecosystems, urban ecosystems supported a higher abundance of most indicator groups, a higher number of bug species, and a lower evenness of bug and beetle species. Within cities, a high amount of vegetated area increased species richness and abundance of most arthropod groups, whereas evenness showed no clear pattern. Patch isolation played only a limited role in urban ecosystems, which contrasts with findings from agro-ecological studies. Our results show that urban areas can harbor an arthropod diversity and abundance similar to intensively managed agricultural ecosystems. Further, negative consequences of urbanization on arthropod diversity can be mitigated by providing sufficient vegetated space in the urban area, while patch connectivity is less important in an urban context. This highlights the need for applying a landscape ecological approach to understand the mechanisms shaping urban biodiversity and underlines the potential of appropriate urban planning for mitigating biodiversity loss.

  12. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes measurement noise into account and is not restricted to any particular noise structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems, ranging from systems governed by ordinary differential equations to partial differential equations, and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
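    The flavour of the approach can be sketched in a few lines: sample a parameter prior, push the samples through a toy model with measurement noise, and estimate the parameter-observable mutual information non-parametrically with a k-nearest-neighbour estimator (here scikit-learn's, standing in for the estimator the paper combines with Monte Carlo); the model and noise level are invented:

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        rng = np.random.default_rng(0)
        n = 5000
        theta = rng.uniform(0.5, 2.0, n)          # parameter prior samples
        t = 1.0
        noise = rng.normal(0, 0.05, n)
        y = np.exp(-theta * t) + noise            # toy decay model + noise

        # Expected information gain about theta from observing y (nats);
        # a value near zero would flag a practically unidentifiable
        # parameter under this measurement setup.
        mi = mutual_info_regression(y.reshape(-1, 1), theta,
                                    random_state=0)[0]
        print(f"estimated information gain: {mi:.2f} nats")

    Repeating the estimate for different observation times or noise levels shows how experimental design changes what can be learned about the parameter.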

  13. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    CERN Document Server

    Abolfazli, Saeid; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward rich computing experience. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified in two main groups, namely hardware and software. Generating high-end hardware is a subset of hardware augmentation approaches, whereas approaches that conserve local resources and reduce resource requirements are grouped under software augmentation methods. Our study advocates that conserving smartphones' native resources, which is mainly done via task offloading, is more appropriate for already-developed applications than for new ones, due to the costly re-development process. Cloud computing has recently obtained momentous ground as one of the major co...

  14. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  15. What is intrinsic motivation? A typology of computational approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Oudeyer

    2009-11-01

    Full Text Available Intrinsic motivation, the causal mechanism for spontaneous exploration and curiosity, is a central concept in developmental psychology. It has been argued to be a crucial mechanism for open-ended cognitive development in humans, and as such has gathered growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches in a computational reinforcement learning framework, we argue that they are not operational and are even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and developmental robotics.

  16. Systematic Approach to Computational Design of Gene Regulatory Networks with Information Processing Capabilities.

    Science.gov (United States)

    Moskon, Miha; Mraz, Miha

    2014-01-01

    We present several measures that can be used in de novo computational design of biological systems with information processing capabilities. Their main purpose is to objectively evaluate behavior and identify the biological information processing structures with the best dynamical properties. They can be used to define constraints that allow one to simplify the design of more complex biological systems, and they can be applied to existing computational design approaches in synthetic biology, i.e., rational and automatic design approaches. We demonstrate their use on (a) computational models of several basic information processing structures implemented with gene regulatory networks and (b) a modular design of a synchronous toggle switch.

  17. High Volume Throughput Computing: Identifying and Characterizing Throughput Oriented Workloads in Data Centers

    CERN Document Server

    Zhan, Jianfeng; Sun, Ninghui; Wang, Lei; Jia, Zhen; Luo, Chunjie

    2012-01-01

    For the first time, this paper systematically identifies three categories of throughput-oriented workloads in data centers: services, data processing applications, and interactive real-time applications, whose targets are to increase the volume of throughput in terms of processed requests, processed data, or the maximum number of simultaneous subscribers supported, respectively; we coin a new term, high volume throughput computing (HVC for short), to describe these workloads and the data center systems designed for them. We characterize and compare HVC with other computing paradigms, e.g., high throughput computing, warehouse-scale computing, and cloud computing, in terms of levels, workloads, metrics, coupling degree, data scales, and number of jobs or service instances. We also preliminarily report our ongoing work on the metrics and benchmarks for HVC systems, which is the foundation of designing innovative data center systems for HVC workloads.

  18. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
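    The degree-susceptibility finding can be reproduced qualitatively with a toy simulation; the SIS-style infection and recovery rates below are invented, and the generic scale-free graph stands in for the paper's exact propagation model:

        import networkx as nx
        import numpy as np

        rng = np.random.default_rng(1)
        G = nx.barabasi_albert_graph(1000, 3, seed=1)
        beta, gamma, steps = 0.05, 0.1, 300
        infected = set(rng.choice(G.number_of_nodes(), 10, replace=False))
        exposure = np.zeros(G.number_of_nodes())

        for _ in range(steps):
            # Each infected node infects susceptible neighbours w.p. beta;
            # infected nodes recover (become susceptible again) w.p. gamma.
            new_inf = {v for u in infected for v in G[u]
                       if v not in infected and rng.random() < beta}
            recovered = {u for u in infected if rng.random() < gamma}
            infected = (infected | new_inf) - recovered
            for u in infected:
                exposure[u] += 1

        deg = np.array([d for _, d in G.degree()])
        hi, lo = deg >= np.median(deg), deg < np.median(deg)
        print("mean time infected, high-degree nodes:", exposure[hi].mean())
        print("mean time infected, low-degree nodes: ", exposure[lo].mean())

    High-degree nodes accumulate substantially more infected time, which is the behaviour the abstract reports and the motivation for degree-targeted protective measures.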

  19. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia;

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated.

  20. New drug candidates for liposomal delivery identified by computer modeling of liposomes' remote loading and leakage.

    Science.gov (United States)

    Cern, Ahuva; Marcus, David; Tropsha, Alexander; Barenholz, Yechezkel; Goldblum, Amiram

    2017-02-16

    Remote drug loading into nano-liposomes is in most cases the best method for achieving high concentrations of active pharmaceutical ingredients (API) per nano-liposome that enable therapeutically viable API-loaded nano-liposomes, referred to as nano-drugs. This approach also enables controlled drug release. Recently, we constructed computational models to identify APIs that can achieve the desired high concentrations in nano-liposomes by remote loading. While those previous models included a broad spectrum of experimental conditions and dealt only with loading, here we reduced the scope to the molecular characteristics alone. We model and predict API suitability for nano-liposomal delivery by fixing the main experimental conditions: liposome lipid composition and size to be similar to those of Doxil® liposomes. On that basis, we add a prediction of drug leakage from the nano-liposomes during storage. The latter is critical for having pharmaceutically viable nano-drugs. The "load and leak" models were used to screen two large molecular databases in search of candidate APIs for delivery by nano-liposomes. The distribution of positive instances in both loading and leakage models was similar in the two databases screened. The screening process identified 667 molecules that were positives by both loading and leakage models (i.e., both high-loading and stable). Among them, 318 molecules received a high score in both properties and of these, 67 are FDA-approved drugs. This group of molecules, having diverse pharmacological activities, may be the basis for future liposomal drug development.

  1. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally we summarize the status and future plans for the subject and related development projects.

  2. Human Computer Interaction Approach in Developing Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Mohd H.N.M. Nasir

    2008-01-01

    Full Text Available Problem statement: Many published studies have found that more than 50% of Customer Relationship Management (CRM) system implementations have failed due to poor system usability and failure to fulfill user expectations. This study presents the issues that contributed to the failures of CRM systems and proposes a prototype CRM system developed using Human Computer Interaction approaches in order to resolve the identified issues. Approach: To capture the users' requirements, a single in-depth case study of a multinational company was chosen, in which the background, current conditions and environmental interactions were observed, recorded and analyzed for patterns in relation to internal and external influences. Blended data-gathering techniques, namely interviews, naturalistic observation and the study of user documentation, were employed, and the prototype CRM system was then developed, incorporating a User-Centered Design (UCD) approach, Hierarchical Task Analysis (HTA), metaphor, and identification of users' behaviors and characteristics. The implementation of these techniques was then measured in terms of usability. Results: Based on the usability testing conducted, the results showed that most of the users agreed that the system is comfortable to work with, taking as measurement parameters the quality attributes of learnability, memorizability, utility, sortability, font, visualization, user metaphor, ease of information view, and color. Conclusions/Recommendations: By combining all these techniques, a comfort level that leads to user satisfaction and a higher degree of usability can be achieved in the proposed CRM system. It is therefore important that companies take usability quality attributes into consideration before developing or procuring a CRM system, to ensure its successful implementation.

  3. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science explores the fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive readout of the electronic spin. Using this technique, we demonstrate a more than two-fold improvement in signal-to-noise ratio. Apart from its direct application in enhancing the sensitivity of NV-based nano-magnetometers, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long-distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new quantum repeater protocol with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as...

  4. Identifying niche-mediated regulatory factors of stem cell phenotypic state: a systems biology approach.

    Science.gov (United States)

    Ravichandran, Srikanth; Del Sol, Antonio

    2017-02-01

    Understanding how the cellular niche controls the stem cell phenotype is often hampered due to the complexity of variegated niche composition, its dynamics, and nonlinear stem cell-niche interactions. Here, we propose a systems biology view that considers stem cell-niche interactions as a many-body problem amenable to simplification by the concept of mean field approximation. This enables approximation of the niche effect on stem cells as a constant field that induces sustained activation/inhibition of specific stem cell signaling pathways in all stem cells within heterogeneous populations exhibiting the same phenotype (niche determinants). This view offers a new basis for the development of single cell-based computational approaches for identifying niche determinants, which has potential applications in regenerative medicine and tissue engineering. © 2017 The Authors. FEBS Letters published by John Wiley & Sons Ltd on behalf of Federation of European Biochemical Societies.
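
    A schematic of the mean-field simplification invoked above, in notation of our own choosing (not the authors'): the sum of fluctuating niche inputs acting on stem cell i is replaced by its population average, a constant field.

```latex
% Niche input on stem cell i from niche components j, approximated by a
% constant field \bar{h}, so every cell in a phenotypically homogeneous
% population sees the same sustained activation/inhibition of its pathways.
\[
  h_i(t) \;=\; \sum_j J_{ij}\, x_j(t)
  \;\;\approx\;\;
  \bar{h} \;=\; \Big\langle \sum_j J_{ij}\, x_j \Big\rangle
\]
```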

  5. A Grounded Theory Approach to Identifying and Measuring Forensic Data Acquisition Tasks

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2007-03-01

    Full Text Available As a relatively new field of study, little empirical research has been conducted pertaining to computer forensics.  This lack of empirical research contributes to problems for practitioners and academics alike.For the community of practitioners, problems arise from the dilemma of applying scientific methods to legal matters based on anecdotal training methods, and the academic community is hampered by a lack of theory in this evolving field.  A research study utilizing a multi-method approach to identify and measure tasks practitioners perform during forensic data acquisitions and lay a foundation for academic theory development was conducted in 2006 in conjunction with a doctoral dissertation.An overview of the study’s findings is presented within this article.

  6. A new computational strategy for identifying essential proteins based on network topological properties and biological information.

    Science.gov (United States)

    Qin, Chao; Sun, Yongqi; Dong, Yadong

    2017-01-01

    Essential proteins are the proteins that are indispensable to the survival and development of an organism. Deleting a single essential protein will cause lethality or infertility. Identifying and analysing essential proteins are key to understanding the molecular mechanisms of living cells. There are two types of methods for predicting essential proteins: experimental methods, which require considerable time and resources, and computational methods, which overcome the shortcomings of experimental methods. However, the prediction accuracy of computational methods for essential proteins requires further improvement. In this paper, we propose a new computational strategy named CoTB for identifying essential proteins based on a combination of topological properties, subcellular localization information and orthologous protein information. First, we introduce several topological properties of the protein-protein interaction (PPI) network. Second, we propose new methods for measuring orthologous information and subcellular localization and a new computational strategy that uses a random forest prediction model to obtain a probability score for the proteins being essential. Finally, we conduct experiments on four different Saccharomyces cerevisiae datasets. The experimental results demonstrate that our strategy for identifying essential proteins outperforms traditional computational methods and the most recently developed method, SON. In particular, our strategy improves the prediction accuracy to 89, 78, 79, and 85 percent on the YDIP, YMIPS, YMBD and YHQ datasets at the top 100 level, respectively.
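
    A hedged sketch of the scoring step described above: a random forest combining topological properties of a PPI network with orthology and localization scores. The feature set, the toy graph and the labels are illustrative stand-ins, not the exact CoTB features or datasets.

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy PPI network; a CoTB-like pipeline would load YDIP/YMIPS/YMBD/YHQ.
G = nx.gnm_random_graph(300, 900, seed=1)

def features(g, ortho, loc):
    """Per-protein feature vector: topological properties plus
    (hypothetical) orthology and subcellular-localization scores."""
    deg = dict(g.degree())
    clus = nx.clustering(g)
    btw = nx.betweenness_centrality(g)
    return np.array([[deg[n], clus[n], btw[n], ortho[n], loc[n]]
                     for n in g.nodes()])

rng = np.random.default_rng(1)
ortho = rng.random(G.number_of_nodes())        # stand-in orthology scores
loc = rng.integers(0, 2, G.number_of_nodes())  # stand-in localization flags
X = features(G, ortho, loc)
y = rng.integers(0, 2, G.number_of_nodes())    # stand-in essentiality labels

clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
prob = clf.predict_proba(X)[:, 1]          # probability of being essential
top100 = np.argsort(prob)[::-1][:100]      # ranked candidates ("top 100")
```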

  7. Energy-optimised pharmacophore approach to identify potential hotspots during inhibition of Class II HDAC isoforms.

    Science.gov (United States)

    Ganai, Shabir Ahmad; Shanmugam, Karthi; Mahadevan, Vijayalakshmi

    2015-01-01

    Histone deacetylases (HDACs) are conjugated enzymes that modulate chromatin architecture by deacetylating lysine residues on histone tails, leading to transcriptional repression. Pharmacological intervention in these enzymes with small-molecule inhibitors, called histone deacetylase inhibitors (HDACi), enhances acetylation of the genome, and such inhibitors are hence emerging as potential therapeutics in the clinic. Type-specific inhibition of Class II HDACs has shown enhanced therapeutic benefits against developmental and neurodegenerative disorders. However, the structural similarity of class-specific isoforms limits the potential of their inhibitors for precisely targeting these enzymes. Diverse strategies have been implemented to recognise the features in HDAC enzymes which may help in identifying isoform specificity factors. This work takes a computational approach that combines in silico docking and energy-optimised pharmacophore (E-pharmacophore) mapping of 18 known HDAC inhibitors, and it has identified structural variations that regulate their interactions with the six Class II HDAC enzymes considered in the study. This combined approach establishes that inhibitors possessing a higher number of aromatic rings in different structural regions might function as potent inhibitors, while inhibitors with scarce ring structures might suffer compromised potency. This would aid the rational chemical optimisation and design of isoform-selective HDAC inhibitors with enhanced affinity and therapeutic efficiency.

  8. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  9. An approach to computing direction relations between separated object groups

    Science.gov (United States)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition. A rough sketch of this idea appears below.
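
    The following sketch captures only the flavor of the pipeline (Voronoi boundary between the two groups, ridge directions averaged, then a qualitative bucketing); the bucketing scheme and the scipy-based construction are our choices, not the paper's model.

```python
import numpy as np
from scipy.spatial import Voronoi

def direction_between_groups(A, B):
    """Approximate direction from group A to group B: build the Voronoi
    diagram of both point sets, keep ridges whose two generators lie in
    different groups (the boundary between the groups), and average the
    unit vectors across those ridges."""
    pts = np.vstack([A, B])
    labels = np.array([0] * len(A) + [1] * len(B))
    vor = Voronoi(pts)
    vecs = []
    for i, j in vor.ridge_points:          # generator pairs sharing a ridge
        if labels[i] != labels[j]:
            a, b = (i, j) if labels[i] == 0 else (j, i)
            v = pts[b] - pts[a]            # direction across the ridge, A -> B
            vecs.append(v / np.linalg.norm(v))
    mean = np.mean(vecs, axis=0)
    angle = np.degrees(np.arctan2(mean[1], mean[0])) % 360
    # crude qualitative bucketing (our choice, not the paper's model)
    names = ["east", "northeast", "north", "northwest",
             "west", "southwest", "south", "southeast"]
    return names[int(((angle + 22.5) % 360) // 45)]

A = np.random.default_rng(2).normal([0, 0], 0.5, (20, 2))
B = np.random.default_rng(3).normal([5, 1], 0.5, (20, 2))
print(direction_between_groups(A, B))      # expected: roughly "east"
```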

  10. A tale of three bio-inspired computational approaches

    Science.gov (United States)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, and some lessons I have gleaned from three decades of working with EC will be covered. Next come neural networks, computational approaches that have long been studied as a possible way to make "thinking machines", an old dream of humanity, based upon the only known existing example of intelligence. Then I will give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages, the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.

  11. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with the automation of rules development for turn-based strategy computer games. Rules are the basic elements of these games. This paper proposes a new approach to automation comprising visual formal rules model creation, model verification and model-based code generation.

  12. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  13. Global identifiability of linear compartmental models--a computer algebra algorithm.

    Science.gov (United States)

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation in biological dynamic models. Global identifiability is, however, difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in degree of nonlinearity and in number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability), is presented, which combines the topological transfer function method with the Buchberger algorithm to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general-structure compartmental models from general multi-input multi-output experiments. Examples of the use of GLOBI to analyze the a priori global identifiability of some complex biological compartmental models are provided.
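
    GLOBI itself is a dedicated computer algebra tool; the following toy reproduces only the underlying idea on a two-compartment model with a single outflow (our example, not one from the paper): express the observable transfer-function coefficients in the unknown rate constants and ask the algebra system whether the solution is unique.

```python
import sympy as sp

k01, k21, k12 = sp.symbols('k01 k21 k12', positive=True)
a1, a0, b0 = sp.symbols('a1 a0 b0', positive=True)  # observable coefficients

# Two-compartment model, input and output in compartment 1, k02 = 0:
#   x1' = -(k01 + k21) x1 + k12 x2,   x2' = k21 x1 - k12 x2,   y = x1
# Transfer function y(s)/u(s) = (s + k12) / (s^2 + a1 s + a0), where:
eqs = [
    sp.Eq(k01 + k21 + k12, a1),   # negated trace of the system matrix
    sp.Eq(k01 * k12, a0),         # determinant of the system matrix
    sp.Eq(k12, b0),               # numerator constant term
]
sols = sp.solve(eqs, [k01, k21, k12], dict=True)
print(sols)   # a single solution => a priori globally identifiable
# [{k01: a0/b0, k12: b0, k21: a1 - b0 - a0/b0}]
```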

  14. A recursive network approach can identify constitutive regulatory circuits in gene expression data

    Science.gov (United States)

    Blasi, Monica Francesca; Casorelli, Ida; Colosimo, Alfredo; Blasi, Francesco Simone; Bignami, Margherita; Giuliani, Alessandro

    2005-03-01

    The activity of the cell is often coordinated by the organisation of proteins into regulatory circuits that share a common function. Genome-wide expression profiles might contain important information on these circuits. Current approaches for the analysis of gene expression data include clustering the individual expression measurements and relating them to biological functions as well as modelling and simulation of gene regulation processes by additional computer tools. The identification of the regulative programmes from microarray experiments is limited, however, by the intrinsic difficulty of linear methods to detect low-variance signals and by the sensitivity of the different approaches. Here we face the problem of recognising invariant patterns of correlations among gene expression reminiscent of regulation circuits. We demonstrate that a recursive neural network approach can identify genetic regulation circuits from expression data for ribosomal and genome stability genes. The proposed method, by greatly enhancing the sensitivity of microarray studies, allows the identification of important aspects of genetic regulation networks and might be useful for the discrimination of the different players involved in regulation circuits. Our results suggest that the constitutive regulatory networks involved in the generic organisation of the cell display a high degree of clustering depending on a modular architecture.

  15. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  16. A distributed computing approach to mission operations support. [for spacecraft

    Science.gov (United States)

    Larsen, R. L.

    1975-01-01

    Computing support for mission operations includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale, third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  17. Computer-aided interpretation approach for optical tomographic images

    Science.gov (United States)

    Klose, Christian D.; Klose, Alexander D.; Netz, Uwe J.; Scheel, Alexander K.; Beuthan, Jürgen; Hielscher, Andreas H.

    2010-11-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) in human finger joints using optical tomographic images. The image interpretation method employs a classification algorithm that makes use of a so-called self-organizing mapping scheme to classify fingers as either affected or unaffected by RA. Unlike in previous studies, this allows for combining multiple image features, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging, and inspection of optical tomographic images) were used to produce ground-truth benchmarks for determining the performance of the image interpretation. Using data from 100 finger joints, the findings suggest that some parameter combinations lead to higher sensitivities, while others lead to higher specificities, when compared to the single-parameter classifications employed in previous studies. Maximum performance is reached when combining the minimum/maximum ratio of the absorption coefficient and image variance. In this case, sensitivities and specificities over 0.9 can be achieved. These values are much higher than those obtained when only single-parameter classifications were used, where sensitivities and specificities remained well below 0.8.
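
    The four performance measures named above follow directly from a confusion table; a minimal sketch with invented ground-truth and classifier labels (the data, not the formulas, are the assumption here):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, mutual_info_score

# Illustrative ground truth (e.g., clinical diagnosis) and classifier
# output for a set of finger joints; 1 = affected by RA, 0 = unaffected.
truth = np.array([1, 1, 0, 0, 1, 0, 1, 0, 0, 1])
pred  = np.array([1, 0, 0, 0, 1, 0, 1, 1, 0, 1])

tn, fp, fn, tp = confusion_matrix(truth, pred).ravel()
sensitivity = tp / (tp + fn)            # true-positive rate
specificity = tn / (tn + fp)            # true-negative rate
youden = sensitivity + specificity - 1  # Youden index J
mi = mutual_info_score(truth, pred)     # mutual information (nats)
print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"J={youden:.2f} MI={mi:.3f}")
```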

  18. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science gives substantial weight to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on the map of all sciences, or of a particular domain, based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming at providing a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...

  19. Assessing Trustworthiness in Social Media: A Social Computing Approach

    Science.gov (United States)

    2015-11-17

    Final Report (31 May 2015; approved for public release, distribution unlimited): Assessing Trustworthiness in Social Media: A Social Computing Approach. We propose to investigate research issues related to social media trustworthiness and its assessment by leveraging social research methods... attributes of interest associated with a particular social media user related to the received information. This tool provides a way to combine different...

  20. A Unified Computational Approach to Oxide Aging Processes

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, D.J.; Fleetwood, D.M.; Hjalmarson, H.P.; Schultz, P.A.

    1999-01-27

    In this paper we describe a unified, hierarchical computational approach to aging and reliability problems caused by materials changes in the oxide layers of Si-based microelectronic devices. We apply this method to a particular low-dose-rate radiation effects problem.

  1. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  2. The concept of computer software designed to identify and analyse logistics costs in agricultural enterprises

    Directory of Open Access Journals (Sweden)

    Karol Wajszczyk

    2009-01-01

    Full Text Available The study comprised research, development and computer programming work on a concept for an IT tool to be used in the identification and analysis of logistics costs in agricultural enterprises in terms of the process-based approach. As a result of the research and programming work, an overall functional and IT concept of software for the identification and analysis of logistics costs in agricultural enterprises was developed.

  3. A Computationally Based Approach to Homogenizing Advanced Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (DIffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We discuss this approach as applied both to Ni-based superalloys and to the (computationally) more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and its subsequent verification on real castings are presented.

  4. A GPU-Computing Approach to Solar Stokes Profile Inversion

    CERN Document Server

    Harker, Brian J

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specifically for use with a GPU, we produce full-disc maps of the photospheric vector magnetic field from polarized spectral line observations recorded by the Synoptic Optical Long-term Investigations of the Sun (SOLIS) Vector Spectromagnetograph (VSM) instrument. We show the advantages of pairing a population-parallel genetic algorithm with data-parallel GPU-computing techniques, and present an overview of the Stokes inversion problem, including a description of our adaptation to the GPU-computing paradigm. Full-disc vector ma...

  5. Cloud Computing – A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location, through networks. Cloud computing is gradually replacing traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in an organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  6. New approach for identifying the zero-order fringe in variable wavelength interferometry

    Science.gov (United States)

    Galas, Jacek; Litwin, Dariusz; Daszkiewicz, Marek

    2016-12-01

    The family of VAWI techniques (for transmitted and reflected light) is especially efficient for characterizing objects when, in the interference system, the optical path difference exceeds a few wavelengths. The classical approach, which consists in measuring the deflection of interference fringes, fails because of strong edge effects. Broken continuity of interference fringes prevents correct identification of the zero-order fringe, which leads to significant errors. The family of these methods was proposed originally by Professor Pluta in the 1980s, but at that time image processing facilities and computers were hardly available. Automated devices unfold a completely new approach to the classical measurement procedures. The Institute team has taken that new opportunity and transformed the technique into fully automated measurement devices of commercial, industry-grade quality. The method itself has been modified, and new solutions and algorithms have simultaneously extended the field of application. This has concerned both construction aspects of the systems and software development in the context of creating computerized instruments. The VAWI collection of instruments now constitutes the core of the Institute's commercial offer. It is practically applicable in industrial environments for measuring textile and optical fibers and strips of thin films, and for testing wave plates and nonlinear effects in different materials. This paper describes new algorithms for identifying the zero-order fringe, which increase the performance of the system as a whole, and presents some examples of measurements of optical elements.

  7. Computational intelligence approaches for pattern discovery in biological systems.

    Science.gov (United States)

    Fogel, Gary B

    2008-07-01

    Biology, chemistry and medicine are faced by tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.

  8. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.

    Science.gov (United States)

    Cheng, Ching-An; Huang, Han-Pang

    2016-12-01

    We study the modeling of Lagrangian systems with multiple degrees of freedom. Based on system dynamics, canonical parametric models require ad hoc derivations and sometimes simplification for a computable solution; on the other hand, due to the lack of prior knowledge of the system's structure, modern nonparametric models in machine learning face the curse of dimensionality, especially in learning large systems. In this paper, we bridge this gap by unifying the theories of Lagrangian systems and vector-valued reproducing kernel Hilbert spaces. We reformulate Lagrangian systems with kernels that embed the governing Euler-Lagrange equation (the Lagrangian kernels) and show that these kernels span a subspace capturing the Lagrangian's projection as inverse dynamics. By this property, our model uses only inputs and outputs, as in machine learning, and inherits the structured form of system dynamics, thereby removing the need for mundane derivations for new systems as well as the generalization problem of learning from scratch. In effect, it learns the system's Lagrangian, a simpler task than directly learning the dynamics. To demonstrate, we applied the proposed kernel to identify robot inverse dynamics in simulations and experiments. Our results present a competitive novel approach to identifying Lagrangian systems, despite using only inputs and outputs.
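
    The Lagrangian kernels require the construction in the paper itself; as a rough flavor of the inputs-and-outputs setting, here is a generic kernel ridge regression learning inverse dynamics (torque from state and acceleration) for a toy one-link pendulum, with an off-the-shelf RBF kernel standing in for the structured kernel.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Toy 1-DOF pendulum inverse dynamics: tau = m l^2 qdd + m g l sin(q).
# Parameters and data are invented for the example.
m, l, g = 1.0, 0.5, 9.81
rng = np.random.default_rng(4)
q   = rng.uniform(-np.pi, np.pi, 500)
qd  = rng.uniform(-2, 2, 500)
qdd = rng.uniform(-5, 5, 500)
tau = m * l**2 * qdd + m * g * l * np.sin(q)

X = np.column_stack([q, qd, qdd])      # inputs: state and acceleration
model = KernelRidge(kernel='rbf', alpha=1e-3, gamma=0.5).fit(X, tau)

X_test = np.array([[0.3, 0.0, 1.0]])
true_tau = m * l**2 * 1.0 + m * g * l * np.sin(0.3)
print(model.predict(X_test)[0], "vs analytic", true_tau)  # approximately equal
```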

  9. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Science.gov (United States)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-07-12

    In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.
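
    The patented procedure targets subcommunicators in a specific parallel machine; a generic MPI rendition of the gather-and-compare step might look like the following (mpi4py, with the plane enumeration stubbed out as a hypothetical function).

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD          # stand-in for the subcommunicator
rank = comm.Get_rank()

def areas_of_planes_including(node_rank):
    """Stub: each node would enumerate the logical planes it belongs to
    and compute their areas; here we fake one area per node."""
    return np.array([float((node_rank * 7) % 11 + 1)])

local_areas = areas_of_planes_including(rank)

# Root gathers every node's calculated areas as contribution data.
all_areas = comm.gather(local_areas, root=0)

if rank == 0:
    flat = np.concatenate(all_areas)
    print("largest logical plane area:", flat.max())
```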

  10. Identifying a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator in a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Kristan D.; Faraj, Daniel A.

    2016-07-12

    In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.

  11. Gene-based Association Approach Identifies Genes Across Stress Traits in Fruit Flies

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Edwards, Stefan McKinnon; Sarup, Pernille Merete;

    approach grouping variants according to gene position, thus lowering the number of statistical tests performed and increasing the probability of identifying genes with small to moderate effects. Using this approach we identify numerous genes associated with different types of stresses in Drosophila...

  12. A systems biology approach to identify the signalling network regulated by Rho-GDI-γ during neural stem cell differentiation.

    Science.gov (United States)

    Wang, Jiao; Hu, Fuyan; Cheng, Hua; Zhao, Xing-Ming; Wen, Tieqiao

    2012-11-01

    Understanding the molecular mechanisms that underlie the differentiation of neural stem cells (NSCs) is vital for developing regenerative medicines for neurological disorders. In our previous work, Rho-GDI-γ was found to prompt neuronal differentiation when downregulated. However, it is unclear how Rho-GDI-γ regulates this differentiation process. Therefore, a novel systems biology approach is presented here to identify putative signalling pathways regulated by Rho-GDI-γ during NSC differentiation, and these pathways can provide insights into the mechanisms of NSC differentiation. In particular, our proposed approach combines the predictive power of computational biology with molecular experiments. With different biological experiments, the genes in the computationally identified signalling network were validated to be indeed regulated by Rho-GDI-γ during the differentiation of NSCs. In particular, one randomly selected pathway, involving Vcp, Mapk8, Ywhae and Ywhah, was experimentally verified to be regulated by Rho-GDI-γ. These promising results demonstrate the effectiveness of our proposed systems biology approach, indicating the potential predictive power of integrating computational and experimental approaches.

  13. Neuromolecular computing: a new approach to human brain evolution.

    Science.gov (United States)

    Wallace, R; Price, H

    1999-09-01

    Evolutionary approaches in human cognitive neurobiology traditionally emphasize macroscopic structures. It may soon be possible to supplement these studies with models of human information-processing at the molecular level. Thin-film, simulation, fluorescence microscopy, and high-resolution X-ray crystallographic studies provide evidence for transiently organized neural membrane molecular systems with possible computational properties. This review article examines evidence for hydrophobic-mismatch molecular interactions within phospholipid microdomains of a neural membrane bilayer. It is proposed that these interactions constitute a massively parallel algorithm which can rapidly compute near-optimal solutions to complex cognitive and physiological problems. Coupling of microdomain activity to permeant ion movements at ligand-gated and voltage-gated channels permits the conversion of molecular computations into neuron frequency codes. Evidence for microdomain transport of proteins to specific locations within the bilayer suggests that neuromolecular computation may be under some genetic control and thus modifiable by natural selection. A possible experimental approach for examining evolutionary changes in neuromolecular computation is briefly discussed.

  14. A Multi-step and Multi-level approach for Computer Aided Molecular Design

    DEFF Research Database (Denmark)

    A general multi-step approach for setting up, solving and analysing the solutions of computer aided molecular design (CAMD) problems is presented. The approach differs from previous work within the field of CAMD since it also addresses the need for computer aided problem formulation and result analysis. The problem formulation step incorporates a knowledge base for the identification and setup of the design criteria. Candidate compounds are identified using a multi-level generate-and-test CAMD solution algorithm capable of designing molecules having a high level of molecular detail. A post-solution step using an Integrated Computer Aided System (ICAS) for result analysis and verification is included in the methodology. Keywords: CAMD, separation processes, knowledge base, molecular design, solvent selection, substitution, group contribution, property prediction, ICAS...

  15. One approach for evaluating the Distributed Computing Design System (DCDS)

    Science.gov (United States)

    Ellis, J. T.

    1985-01-01

    The Distributed Computing Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.

  16. An evolutionary computational approach for the dynamic Stackelberg competition problems

    Directory of Open Access Journals (Sweden)

    Lorena Arboleda-Castro

    2016-06-01

    Full Text Available Stackelberg competition models are an important family of economic decision problems from game theory, in which the main goal is to find optimal strategies between two competitors while taking into account their hierarchical relationship. Although these models have been widely studied in the past, very few works deal with uncertainty scenarios, especially those that vary over time. In this regard, the present research studies this topic and proposes a computational method for efficiently solving dynamic Stackelberg competition models. The computational experiments suggest that the proposed approach is effective for problems of this nature.
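
    As a worked baseline for the hierarchy involved, here is the classic static Stackelberg duopoly (our textbook example, not the paper's dynamic, uncertain setting): the leader commits to a quantity, the follower best-responds, and the leader optimizes against that response. An evolutionary algorithm would replace the grid search when the problem becomes dynamic and uncertain.

```python
import numpy as np

# Static Stackelberg duopoly with inverse demand P = a - b(q1 + q2)
# and common unit cost c; parameter values are illustrative.
a, b, c = 10.0, 1.0, 2.0

def follower_best_response(q1):
    """Follower maximizes (P - c) q2 given the leader's quantity q1."""
    return max((a - c - b * q1) / (2 * b), 0.0)

def leader_profit(q1):
    q2 = follower_best_response(q1)
    return (a - b * (q1 + q2) - c) * q1

# Crude search over the leader's strategy; a GA would take this role
# in the dynamic/uncertain setting the paper addresses.
grid = np.linspace(0, (a - c) / b, 10001)
q1_star = grid[np.argmax([leader_profit(q) for q in grid])]
print(q1_star, "vs analytic optimum", (a - c) / (2 * b))  # both 4.0
```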

  17. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    Science.gov (United States)

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  18. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases, and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which...

  19. A computational toy model for shallow landslides: Molecular dynamics approach

    Science.gov (United States)

    Martelloni, Gianluca; Bagnoli, Franco; Massaro, Emanuele

    2013-09-01

    The aim of this paper is to propose a 2D computational algorithm for modeling the triggering and propagation of shallow landslides caused by rainfall. We use a molecular dynamics (MD) approach, similar to the discrete element method (DEM), which is suitable for modeling granular material and for observing the trajectory of a single particle, so as to identify its dynamical properties. We consider that the triggering of shallow landslides is caused by the decrease of static friction along the sliding surface due to water infiltration by rainfall. The triggering is thus governed by the two following conditions: (a) a threshold speed of the particles, and (b) a condition on the static friction between the particles and the slope surface, based on the Mohr-Coulomb failure criterion. The latter static condition is used in the geotechnical model to estimate the possibility of landslide triggering. The interaction force between particles is modeled, in the absence of experimental data, by means of a potential similar to the Lennard-Jones one. Viscosity is also introduced in the model, and for a large range of values of the model's parameters we observe a characteristic velocity pattern, with acceleration increments, typical of real landslides. The results of the simulations are quite promising: the energy and triggering-time distributions of local avalanches follow a power law, analogous to the observed Gutenberg-Richter and Omori power law distributions for earthquakes. Finally, it is possible to apply the method of the inverse surface displacement velocity [4] for predicting the failure time.
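
    A hedged sketch of the two triggering conditions and the Lennard-Jones-style pair force named above; all parameter values and the exact functional forms are illustrative, not calibrated to the paper.

```python
import numpy as np

# Illustrative parameters (not calibrated to the paper's simulations)
epsilon, sigma = 1.0, 1.0          # Lennard-Jones-like potential scales
theta = np.radians(35)             # slope angle
cohesion = 0.5                     # Mohr-Coulomb cohesion term
v_threshold = 0.1                  # condition (a): speed threshold

def lj_force(r):
    """Magnitude of a Lennard-Jones-style pair force at separation r
    (negative = attractive)."""
    return 24 * epsilon * (2 * (sigma / r)**12 - (sigma / r)**6) / r

def mohr_coulomb_fails(normal_stress, phi):
    """Condition (b): shear stress on the slope exceeds the
    Mohr-Coulomb strength c + sigma_n * tan(phi); rainfall infiltration
    is represented as a reduced friction angle phi."""
    shear = normal_stress * np.tan(theta)
    strength = cohesion + normal_stress * np.tan(phi)
    return shear > strength

speed = 0.15                       # a particle's current speed
phi_wet = np.radians(20)           # friction angle lowered by infiltration
triggered = speed > v_threshold and mohr_coulomb_fails(1.0, phi_wet)
print("pair force at r=1.2:", lj_force(1.2))
print("particle triggered:", triggered)
```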

  20. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words

    Directory of Open Access Journals (Sweden)

    Bingkun Wang

    2015-01-01

    Full Text Available With the rapid growth of online user-generated content on the web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicators of sentiment, sentiment words which convey positive and negative polarity are instrumental for sentiment analysis. However, most existing methods for identifying the polarity of sentiment words only consider positive and negative polarity in terms of crisp (Cantor) set membership, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment word datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.

  1. A Fuzzy Computing Model for Identifying Polarity of Chinese Sentiment Words.

    Science.gov (United States)

    Wang, Bingkun; Huang, Yongfeng; Wu, Xian; Li, Xing

    2015-01-01

    With the rapid growth of online user-generated content on the web, sentiment analysis has become a very active research issue in data mining and natural language processing. As the most important indicators of sentiment, sentiment words which convey positive and negative polarity are instrumental for sentiment analysis. However, most existing methods for identifying the polarity of sentiment words only consider positive and negative polarity in terms of crisp (Cantor) set membership, and no attention is paid to the fuzziness of the polarity intensity of sentiment words. In order to improve performance, we propose a fuzzy computing model to identify the polarity of Chinese sentiment words in this paper. There are three major contributions in this paper. Firstly, we propose a method to compute the polarity intensity of sentiment morphemes and sentiment words. Secondly, we construct a fuzzy sentiment classifier and propose two different methods to compute the parameter of the fuzzy classifier. Thirdly, we conduct extensive experiments on four sentiment word datasets and three review datasets, and the experimental results indicate that our model performs better than the state-of-the-art methods.
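
    The record does not reproduce the paper's formulas, so the following is only a generic illustration of the two ingredients it names: a morpheme-level polarity intensity and a fuzzy (here sigmoid) membership classifier. The lexicon, the intensity values and the membership function are all invented for the example.

```python
import numpy as np

# Hypothetical morpheme polarity intensities in [-1, 1]
morpheme_polarity = {"好": 0.8, "坏": -0.7, "不": -0.9, "美": 0.6}

def word_intensity(word):
    """Polarity intensity of a word as the mean intensity of its
    morphemes (a simple stand-in for the paper's formula)."""
    scores = [morpheme_polarity.get(ch, 0.0) for ch in word]
    return float(np.mean(scores)) if scores else 0.0

def fuzzy_membership(x, k=5.0):
    """Sigmoid membership in the 'positive' fuzzy set; k plays the
    role of the classifier parameter the paper learns."""
    return 1.0 / (1.0 + np.exp(-k * x))

for w in ["美好", "不好"]:
    x = word_intensity(w)
    mu = fuzzy_membership(x)
    label = "positive" if mu > 0.5 else "negative"
    print(w, f"intensity={x:+.2f} membership={mu:.2f} -> {label}")
```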

  2. Computational Approach for Multi Performances Optimization of EDM

    Directory of Open Access Journals (Sweden)

    Yusoff Yusliza

    2016-01-01

    Full Text Available This paper proposes a new computational approach for obtaining optimal parameters of multi-performance EDM. Regression and artificial neural networks (ANN) are used as the modeling techniques, while a multi-objective genetic algorithm (multiGA) is used as the optimization technique. The orthogonal array L256 is employed in the procedure for network function and network architecture selection. Experimental studies are carried out to verify the machining performances suggested by this approach. The highest MRR value obtained from OrthoANN – MPR – MultiGA is 205.619 mg/min and the lowest Ra value is 0.0223 μm.

  3. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method to three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination in 2005.

  4. COMPTEL skymapping: a new approach using parallel computing

    OpenAIRE

    Strong, A.W.; Bloemen, H.; Diehl, R.; Hermsen, W.; Schoenfelder, V.

    1998-01-01

    Large-scale skymapping with COMPTEL using the full survey database presents challenging problems on account of the complex response and time-variable background. A new approach which attempts to address some of these problems is described, in which the information about each observation is preserved throughout the analysis. In this method, a maximum-entropy algorithm is used to determine image and background simultaneously. Because of the extreme computing requirements, the method has been im...

  5. Review: the physiological and computational approaches for atherosclerosis treatment.

    Science.gov (United States)

    Wang, Wuchen; Lee, Yugyung; Lee, Chi H

    2013-09-01

    Cardiovascular disease has long caused severe loss of life, especially through conditions associated with arterial malfunction attributable to atherosclerosis and subsequent thrombus formation. This article reviews the physiological mechanisms that underlie the transition from plaque formation in the atherosclerotic process to platelet aggregation and, eventually, thrombosis. Physiological and computational approaches, such as percutaneous coronary intervention and stent design modeling, to detect, evaluate and mitigate this malicious progression are also discussed.

  6. A spline-based approach for computing spatial impulse responses.

    Science.gov (United States)

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.

  7. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges: software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future-forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  8. A Computational Procedure for Identifying Bilinear Representations of Nonlinear Systems Using Volterra Kernels

    Science.gov (United States)

    Kvaternik, Raymond G.; Silva, Walter A.

    2008-01-01

    A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.

  9. Identifiability and self-presentation: computer-mediated communication and intergroup interaction.

    Science.gov (United States)

    Douglas, K M; McGarty, C

    2001-09-01

    This research investigated the intergroup properties of hostile 'flaming' behaviour in computer-mediated communication and how flaming language is affected by Internet identifiability, or identifiability by name and e-mail address/geographical location as is common to Internet communication. According to the Social Identity Model of Deindividuation Effects (SIDE; e.g. Reicher, Spears, & Postmes, 1995) there may be strategic reasons for identifiable groups members to act in a more group-normative manner in the presence of an audience, to gain acceptance from the in-group, to avoid punishment from the out-group, or to assert their identity to the out-group. For these reasons, it was predicted that communicators would produce more stereotype-consistent (group-normative) descriptions of out-group members' behaviours when their descriptions were identifiable to an audience. In one archival and three experimental studies, it was found that identifiability to an in-group audience was associated with higher levels of stereotype-consistent language when communicators described anonymous out-group targets. These results extend SIDE and suggest the importance of an in-group audience for the expression of stereotypical views.

  10. A systems biological approach to identify key transcription factors and their genomic neighborhoods in human sarcomas

    Institute of Scientific and Technical Information of China (English)

    Antti Ylipää; Olli Yli-Harja; Wei Zhang; Matti Nykter

    2011-01-01

    Identification of genetic signatures is the main objective for many computational oncology studies. The signature usually consists of numerous genes that are differentially expressed between two clinically distinct groups of samples, such as tumor subtypes. Prospectively, many signatures have been found to generalize poorly to other datasets and, thus, have rarely been accepted into clinical use. Recognizing the limited success of traditionally generated signatures, we developed a systems biology-based framework for robust identification of key transcription factors and their genomic regulatory neighborhoods. Application of the framework to study the differences between gastrointestinal stromal tumor (GIST) and leiomyosarcoma (LMS) resulted in the identification of nine transcription factors (SRF, NKX2-5, CCDC6, LEF1, VDR, ZNF250, TRIM63, MAF, and MYC). Functional annotations of the obtained neighborhoods identified the biological processes which the key transcription factors regulate differently between the tumor types. Analyzing the differences in the expression patterns using our approach resulted in a more robust genetic signature and more biological insight into the diseases compared to a traditional genetic signature.

  11. Analysis of diabetic retinopathy biomarker VEGF gene by computational approaches

    OpenAIRE

    Jayashree Sadasivam; Ramesh, N.; K. Vijayalakshmi; Vinni Viridi; Shiva prasad

    2012-01-01

    Diabetic retinopathy, the most common diabetic eye disease, is caused by changes in the blood vessels of the retina and remains a major cause of blindness. It is characterized by vascular permeability and increased tissue ischemia and angiogenesis. One of the biomarkers for diabetic retinopathy has been identified by computational analysis as the Vascular Endothelial Growth Factor (VEGF) gene. VEGF is a sub-family of growth factors, belonging to the platelet-derived growth factor family of cystine-knot growth factors...

  12. A computational language approach to modeling prose recall in schizophrenia.

    Science.gov (United States)

    Rosenstein, Mark; Diaz-Asper, Catherine; Foltz, Peter W; Elvevåg, Brita

    2014-06-01

    Many cortical disorders are associated with memory problems. In schizophrenia, verbal memory deficits are a hallmark feature. However, the exact nature of this deficit remains elusive. Modeling aspects of language features used in memory recall have the potential to provide means for measuring these verbal processes. We employ computational language approaches to assess time-varying semantic and sequential properties of prose recall at various retrieval intervals (immediate, 30 min and 24 h later) in patients with schizophrenia, unaffected siblings and healthy unrelated control participants. First, we model the recall data to quantify the degradation of performance with increasing retrieval interval and the effect of diagnosis (i.e., group membership) on performance. Next we model the human scoring of recall performance using an n-gram language sequence technique, and then with a semantic feature based on Latent Semantic Analysis. These models show that automated analyses of the recalls can produce scores that accurately mimic human scoring. The final analysis addresses the validity of this approach by ascertaining the ability to predict group membership from models built on the two classes of language features. Taken individually, the semantic feature is most predictive, while a model combining the features improves accuracy of group membership prediction slightly above the semantic feature alone as well as over the human rating approach. We discuss the implications for cognitive neuroscience of such a computational approach in exploring the mechanisms of prose recall.
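
    A rough illustration of the semantic-feature idea (a sketch, not the authors' implementation): the snippet below scores a recall against its source passage with a bag-of-words cosine similarity. A full LSA model would first project both vectors onto the top singular vectors of a large term-document matrix; the passage, recall, and helper names here are invented for the example.

```python
import numpy as np
from collections import Counter

def bow_vector(text, vocab):
    """Bag-of-words count vector of `text` over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    return np.array([counts[w] for w in vocab], dtype=float)

passage = "the old sailor told a long story about the storm at sea"
recall = "a sailor told a story about a storm"

vocab = sorted(set(passage.lower().split()) | set(recall.lower().split()))
v_src = bow_vector(passage, vocab)
v_rec = bow_vector(recall, vocab)

# Cosine similarity as a crude stand-in for the LSA semantic feature; a real
# LSA model compares the vectors after projection onto the top singular
# vectors of a large term-document matrix.
score = v_src @ v_rec / (np.linalg.norm(v_src) * np.linalg.norm(v_rec))
print(f"semantic overlap score: {score:.3f}")
```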

  13. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
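
    A minimal sketch of the sampling step, assuming a one-dimensional damage parameter, Gaussian measurement noise, and a cheap polynomial stand-in for the sparse-grid/finite-element surrogate; DRAM adds delayed rejection and covariance adaptation on top of this plain Metropolis loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_strain(size):
    """Cheap stand-in for the sparse-grid surrogate of the FE strain model."""
    return 1.0 + 0.8 * size + 0.1 * size ** 2

true_size, sigma = 2.0, 0.05
data = surrogate_strain(true_size) + rng.normal(0.0, sigma, size=20)

def log_posterior(size):
    if not 0.0 < size < 10.0:                  # uniform prior bounds
        return -np.inf
    resid = data - surrogate_strain(size)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

samples, x = [], 1.0                           # random-walk Metropolis
logp = log_posterior(x)
for _ in range(5000):
    prop = x + rng.normal(0.0, 0.2)
    logp_prop = log_posterior(prop)
    if np.log(rng.random()) < logp_prop - logp:
        x, logp = prop, logp_prop
    samples.append(x)

burned = np.array(samples[1000:])
print(f"damage size: {burned.mean():.2f} +/- {burned.std():.2f} (true {true_size})")
```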

  14. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    Background: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. Results: We studied the com

  15. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Y. Zha; J.A. Westerhuis; B. Muilwijk; K.M. Overkamp; B.M. Nijmeijer; L. Coulier; A.K. Smilde; P.J. Punt

    2014-01-01

    BACKGROUND: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. RESULTS: We studied the co

  16. Real time method and computer system for identifying radioactive materials from HPGe gamma-ray spectroscopy

    Science.gov (United States)

    Rowland, Mark S.; Howard, Douglas E.; Wong, James L.; Jessup, James L.; Bianchini, Greg M.; Miller, Wayne O.

    2007-10-23

    A real-time method and computer system for identifying radioactive materials which collects gamma count rates from a HPGe gamma-radiation detector to produce a high-resolution gamma-ray energy spectrum. A library of nuclear material definitions ("library definitions") is provided, with each uniquely associated with a nuclide or isotope material and each comprising at least one logic condition associated with a spectral parameter of a gamma-ray energy spectrum. The method determines whether the spectral parameters of said high-resolution gamma-ray energy spectrum satisfy all the logic conditions of any one of the library definitions, and subsequently uniquely identifies the material type as that nuclide or isotope material associated with the satisfied library definition. The method is iteratively repeated to update the spectrum and identification in real time.
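
    The logic-condition matching can be pictured as follows; the data structures and thresholds are invented for illustration and are not taken from the patent, although the peak energies shown are the characteristic gamma lines of Cs-137 and Co-60.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

SpectralParams = Dict[float, float]   # peak energy (keV) -> net peak area

@dataclass
class LibraryDefinition:
    material: str
    conditions: List[Callable[[SpectralParams], bool]]

    def matches(self, params: SpectralParams) -> bool:
        # A definition is satisfied only if ALL of its logic conditions hold.
        return all(cond(params) for cond in self.conditions)

library = [
    LibraryDefinition(
        material="Cs-137",
        conditions=[lambda p: p.get(661.7, 0.0) > 100.0],   # 661.7 keV line
    ),
    LibraryDefinition(
        material="Co-60",
        conditions=[
            lambda p: p.get(1173.2, 0.0) > 100.0,           # both Co-60 lines
            lambda p: p.get(1332.5, 0.0) > 100.0,
        ],
    ),
]

def identify(params: SpectralParams) -> List[str]:
    """Return all materials whose library definition is fully satisfied."""
    return [d.material for d in library if d.matches(params)]

print(identify({661.7: 540.0}))                  # ['Cs-137']
print(identify({1173.2: 220.0, 1332.5: 200.0}))  # ['Co-60']
```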

  17. A computational approach to chemical etiologies of diabetes

    DEFF Research Database (Denmark)

    Audouze, Karine Marie Laure; Brunak, Søren; Grandjean, Philippe

    2013-01-01

    , and perfluorooctanoic acid. For these substances we integrated disease and pathway annotations on top of protein interactions to reveal possible pathogenetic pathways that deserve empirical testing. The approach is general and can address other public health concerns in addition to identifying diabetogenic chemicals...

  18. Subclinical coronary atherosclerosis identified by coronary computed tomographic angiography in asymptomatic morbidly obese patients

    Directory of Open Access Journals (Sweden)

    Peter A. McCullough

    2010-09-01

    Obesity is a common public health problem, and obese individuals in particular have a disproportionate incidence of acute coronary events. This study was undertaken to identify coronary artery lesions as well as associated clinical features, risk factors and demographics in patients with a body mass index (BMI) >40 kg/m2 without known coronary artery disease (CAD). Morbidly obese subjects were prospectively recruited to undergo coronary computed tomographic angiography (CCTA) using a dual-source computed tomography (CT) system. CAD was defined as the presence of any atherosclerotic lesion in any one coronary artery segment. The presence, location, and severity of atherosclerosis were related to patient characteristics. Forty-one patients (28 women; mean age, 50.4±10.0 years; mean BMI, 43.8±4.8 kg/m2) served as the study population. Of these, 25 patients (61%) had at least one coronary stenosis. All but 2 patients within the CAD cohort had coronary artery calcium (CAC) scores >0, and most plaques identified (75.4%) were non-calcified. There was a predilection for calcified and non-calcified atherosclerosis involving the left anterior descending (LAD) coronary artery compared with other coronary segments. Univariate predictors of CAD included older age, dyslipidemia, and diabetes. In this preliminary study of young morbidly obese patients, CCTA detected a high prevalence of calcified and non-calcified CAD, although the latter predominated.

  19. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10's activity in multiple rodent models that is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to those of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  20. An Experimentally Based Computer Search Identifies Unstructured Membrane-binding Sites in Proteins

    Science.gov (United States)

    Brzeska, Hanna; Guag, Jake; Remmert, Kirsten; Chacko, Susan; Korn, Edward D.

    2010-01-01

    Programs exist for searching protein sequences for potential membrane-penetrating segments (hydrophobic regions) and for lipid-binding sites with highly defined tertiary structures, such as PH, FERM, C2, ENTH, and other domains. However, a rapidly growing number of membrane-associated proteins (including cytoskeletal proteins, kinases, GTP-binding proteins, and their effectors) bind lipids through less structured regions. Here, we describe the development and testing of a simple computer search program that identifies unstructured potential membrane-binding sites. Initially, we found that both basic and hydrophobic amino acids, irrespective of sequence, contribute to the binding to acidic phospholipid vesicles of synthetic peptides that correspond to the putative membrane-binding domains of Acanthamoeba class I myosins. Based on these results, we modified a hydrophobicity scale by giving Arg and Lys positive, rather than negative, values. Using this basic-and-hydrophobic scale with a standard search algorithm, we successfully identified previously determined unstructured membrane-binding sites in all 16 proteins tested. Importantly, basic and hydrophobic searches identified previously unknown potential membrane-binding sites in class I myosins, PAKs and CARMIL (capping protein, Arp2/3, myosin I linker; a membrane-associated cytoskeletal scaffold protein), and synthetic peptides and protein domains containing these newly identified sites bound to acidic phospholipids in vitro. PMID:20018884
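
    A sketch of such a basic-plus-hydrophobic window search; the scale values below are illustrative stand-ins (not the published scale), with the one property the abstract specifies: hydrophobic residues score positive and Arg/Lys are deliberately given positive rather than negative values.

```python
# Illustrative basic-and-hydrophobic scale; numbers are assumptions, not the
# authors' published values. Arg (R) and Lys (K) are made positive on purpose.
BH_SCALE = {
    "A": 0.3, "C": 0.5, "F": 1.2, "I": 1.4, "L": 1.2, "M": 0.8, "V": 1.1,
    "W": 0.9, "Y": 0.3, "G": 0.0, "P": -0.5, "S": -0.3, "T": -0.2,
    "N": -0.6, "Q": -0.6, "H": -0.4, "D": -1.0, "E": -1.0,
    "R": 1.0, "K": 1.0,
}

def find_membrane_binding_sites(seq, window=19, threshold=6.0):
    """Return (start, score) for windows whose summed score passes threshold."""
    hits = []
    for i in range(len(seq) - window + 1):
        score = sum(BH_SCALE[aa] for aa in seq[i:i + window])
        if score >= threshold:
            hits.append((i, round(score, 1)))
    return hits

# Hypothetical sequence with a basic/hydrophobic stretch in the middle
seq = "MGDSEKKLLKRIIKFLVTGESDDEAGKKARKLLKKVFGIDNAQEEA"
print(find_membrane_binding_sites(seq))
```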

  1. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Research in scientific programming enables us to realize more and more complex applications while, at the same time, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
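
    For reference, a textbook version of the conjugate gradient iteration used for the sparse linear system K u = f; dense numpy is used here for brevity, whereas the parallel FE code would distribute sparse matrix-vector products across the MUSIC machine.

```python
import numpy as np

def conjugate_gradient(K, f, tol=1e-10, max_iter=1000):
    """Solve K u = f for symmetric positive definite K."""
    u = np.zeros_like(f)
    r = f - K @ u          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Kp = K @ p
        alpha = rs / (p @ Kp)
        u += alpha * p
        r -= alpha * Kp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return u

# Small symmetric positive definite test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approx [0.0909, 0.6364]
```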

  2. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE) in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.

  3. Solubility of nonelectrolytes: a first-principles computational approach.

    Science.gov (United States)

    Jackson, Nicholas E; Chen, Lin X; Ratner, Mark A

    2014-05-15

    Using a combination of classical molecular dynamics and symmetry adapted intermolecular perturbation theory, we develop a high-accuracy computational method for examining the solubility energetics of nonelectrolytes. This approach is used to accurately compute the cohesive energy density and Hildebrand solubility parameters of 26 molecular liquids. The energy decomposition of symmetry adapted perturbation theory is then utilized to develop multicomponent Hansen-like solubility parameters. These parameters are shown to reproduce the solvent categorizations (nonpolar, polar aprotic, or polar protic) of all molecular liquids studied while lending quantitative rigor to these qualitative categorizations via the introduction of simple, easily computable parameters. Notably, we find that by monitoring the first-order exchange energy contribution to the total interaction energy, one can rigorously determine the hydrogen bonding character of a molecular liquid. Finally, this method is applied to compute explicitly the Flory interaction parameter and the free energy of mixing for two different small molecule mixtures, reproducing the known miscibilities. This methodology represents an important step toward the prediction of molecular solubility from first principles.
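
    As a small worked example of the quantities involved, the Hildebrand parameter follows from the cohesive energy density, delta = sqrt((dH_vap - RT)/V_m); the water values below are textbook numbers, while the paper derives the cohesive energy from simulation rather than from the enthalpy of vaporization.

```python
import math

R = 8.314          # J/(mol K)
T = 298.15         # K
dH_vap = 40.65e3   # J/mol, water enthalpy of vaporization
V_m = 18.0e-6      # m^3/mol, water molar volume

ced = (dH_vap - R * T) / V_m        # cohesive energy density, J/m^3 = Pa
delta = math.sqrt(ced)              # Pa^0.5
print(f"delta(water) = {delta / 1e3:.1f} MPa^0.5")
# Prints ~46 MPa^0.5; handbook values for water are near 48 MPa^0.5.
```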

  4. [Computer work and De Quervain's tenosynovitis: an evidence based approach].

    Science.gov (United States)

    Gigante, M R; Martinotti, I; Cirla, P E

    2012-01-01

    The debate around the role of personal computer work as a cause of De Quervain's tenosynovitis has developed only partially, without considering the available multidisciplinary data. A systematic review of the literature, using an evidence-based approach, was performed. Among the disorders associated with the use of video display units (VDUs), those affecting the upper limbs must be distinguished, and among them those related to overload. Experimental studies on the occurrence of De Quervain's tenosynovitis are quite limited, and it is clinically quite difficult to prove an occupational etiology, considering the interference of other activities of daily living and of biological susceptibility (i.e. anatomical variability, sex, age, exercise). At present there is no evidence of any connection between De Quervain syndrome and time of use of the personal computer or keyboard; limited evidence of correlation is found with time using a mouse. No data are available regarding exclusive or predominant use of laptops or mobile smartphones.

  5. Benchmarking of computer codes and approaches for modeling exposure scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Rittmann, P.D.; Wood, M.I. [Westinghouse Hanford Co., Richland, WA (United States); Cook, J.R. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.
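
    A spreadsheet-level calculation of the kind used as the most fundamental comparison might look like this for the ingestion pathway at unit concentration; the Cs-137 dose coefficient is the standard ICRP ingestion value and the intake rate is a conventional default, both shown as illustrative inputs.

```python
# Committed dose from ingestion of water at unit radionuclide concentration:
# dose (Sv/yr) = C (Bq/L) x U (L/yr) x DCF (Sv/Bq)
C_water = 1.0        # Bq/L, unit concentration as in the code comparison
U_water = 730.0      # L/yr, conventional adult drinking-water intake
DCF_cs137 = 1.3e-8   # Sv/Bq, ICRP ingestion dose coefficient for Cs-137

dose = C_water * U_water * DCF_cs137
print(f"ingestion dose: {dose:.2e} Sv/yr")   # ~9.5e-06 Sv/yr
```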

  6. Computational approaches for rational design of proteins with novel functionalities

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  7. Computational approaches for rational design of proteins with novel functionalities.

    Science.gov (United States)

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  8. A Novel Synthesis of Computational Approaches Enables Optimization of Grasp Quality of Tendon-Driven Hands

    Science.gov (United States)

    Inouye, Joshua M.; Kutch, Jason J.; Valero-Cuevas, Francisco J.

    2013-01-01

    We propose a complete methodology to find the full set of feasible grasp wrenches and the corresponding wrench-direction-independent grasp quality for a tendon-driven hand with arbitrary design parameters. Monte Carlo simulations on two representative designs combined with multiple linear regression identified the parameters with the greatest potential to increase this grasp metric. This synthesis of computational approaches now enables the systematic design, evaluation, and optimization of tendon-driven hands. PMID:23335864
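
    A sketch of the Monte Carlo-plus-regression idea under stated assumptions: the design parameters and the grasp-quality function below are toy stand-ins for the feasible-wrench-set computation, but the workflow (sample designs, evaluate the metric, rank standardized regression coefficients) is the same.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical design parameters: tendon moment arms, max tensions, friction
moment_arm = rng.uniform(0.5, 1.5, n)
tension = rng.uniform(10.0, 50.0, n)
friction = rng.uniform(0.2, 1.0, n)

# Toy grasp-quality metric standing in for the wrench-set computation
quality = (0.8 * moment_arm + 0.02 * tension + 0.1 * friction
           + rng.normal(0.0, 0.05, n))

# Standardize predictors so regression coefficients are comparable
X = np.column_stack([moment_arm, tension, friction])
X = (X - X.mean(axis=0)) / X.std(axis=0)
X = np.column_stack([np.ones(n), X])               # intercept column

beta, *_ = np.linalg.lstsq(X, quality, rcond=None)
for name, b in zip(["moment_arm", "tension", "friction"], beta[1:]):
    print(f"{name:>10s}: standardized effect {b:+.3f}")
```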

  9. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    This thesis presents the investigation of atmospheric molecular clusters using computational methods. Previous investigations have focused on solving problems related to atmospheric nucleation, and have not been targeted at the performance of the applied methods. This thesis focuses on assessing the performance of computational strategies in order to identify a sturdy methodology, which should be applicable for handling various issues related to atmospheric cluster formation. Density functional theory (DFT) is applied to study individual cluster formation steps. Utilizing large test sets of numerous ... and pinic acid) for atmospheric cluster formation. Glycine is found to have a similar potential as ammonia in enhancing atmospheric nucleation. Pinic acid molecules form favourable clusters with sulfuric acid, but with formation free energies which are too low to explain observed nucleation rates. Pinic...

  10. Approaches to Computer Modeling of Phosphate Hide-Out.

    Science.gov (United States)

    1984-06-28

    Phosphate acts as a buffer to keep pH at a value above which acid corrosion occurs and below which caustic corrosion becomes significant. Difficulties are ... ionization of dihydrogen phosphate:

        H2PO4-  <=>  H+  +  HPO4(2-)               K       (B-7)
        H+  +  OH-  <=>  H2O                       1/Kw    (B-8)
        H2PO4-  +  OH-  <=>  HPO4(2-)  +  H2O      K/Kw    (B-9)

    Such zero heat...

  11. Exploiting Self-organization in Bioengineered Systems: A Computational Approach.

    Science.gov (United States)

    Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S

    2017-01-01

    The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.

  12. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Junaid Ali Khan; Muhammad Asif Zahoor Raja; Ijaz Mansoor Qureshi

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike other numerical techniques with comparable accuracy. With the advent of neuroprocessors and digital signal processors the method becomes particularly interesting due to the expected essential gains in execution speed.
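
    A minimal sketch of the underlying idea, with a (1+1) evolutionary loop standing in for the paper's hybrid genetic/pattern-search training: a tiny feed-forward network defines a trial solution whose ODE residual is the unsupervised error, here for y' = -y with y(0) = 1.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 21)

def trial(theta, t):
    """Trial solution y(t) = 1 + t * N(t); the form enforces y(0) = 1 exactly."""
    w, b, v = theta[:5], theta[5:10], theta[10:15]
    net = np.tanh(np.outer(t, w) + b) @ v      # tiny 1-5-1 network
    return 1.0 + t * net

def residual(theta):
    """Unsupervised error: mean squared residual of y' + y = 0."""
    y = trial(theta, t)
    dy = np.gradient(y, t)                     # numerical derivative
    return np.mean((dy + y) ** 2)

best = rng.normal(0.0, 0.5, 15)
best_err = residual(best)
for _ in range(2000):                          # simple evolutionary loop
    child = best + rng.normal(0.0, 0.1, 15)    # mutate all weights
    err = residual(child)
    if err < best_err:                         # greedy selection
        best, best_err = child, err

print(f"residual {best_err:.2e}; y(2) = {trial(best, t)[-1]:.3f} "
      f"(exact {np.exp(-2.0):.3f})")
```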

  13. Integration of experimental and computational methods for identifying geometric, thermal and diffusive properties of biomaterials

    Science.gov (United States)

    Weres, Jerzy; Kujawa, Sebastian; Olek, Wiesław; Czajkowski, Łukasz

    2016-04-01

    Knowledge of physical properties of biomaterials is important in understanding and designing agri-food and wood processing industries. In the study presented in this paper computational methods were developed and combined with experiments to enhance identification of agri-food and forest product properties, and to predict heat and water transport in such products. They were based on the finite element model of heat and water transport and supplemented with experimental data. Algorithms were proposed for image processing, geometry meshing, and inverse/direct finite element modelling. The resulting software system was composed of integrated subsystems for 3D geometry data acquisition and mesh generation, for 3D geometry modelling and visualization, and for inverse/direct problem computations for the heat and water transport processes. Auxiliary packages were developed to assess performance, accuracy and unification of data access. The software was validated by identifying selected properties and using the estimated values to predict the examined processes, and then comparing predictions to experimental data. The geometry, thermal conductivity, specific heat, coefficient of water diffusion, equilibrium water content and convective heat and water transfer coefficients in the boundary layer were analysed. The estimated values, used as an input for simulation of the examined processes, enabled reduction in the uncertainty associated with predictions.

  14. Computer simulation modeling of abnormal behavior: a program approach.

    Science.gov (United States)

    Reilly, K D; Freese, M R; Rowe, P B

    1984-07-01

    A need for modeling abnormal behavior on a comprehensive, systematic basis exists. Computer modeling and simulation tools offer especially good opportunities to establish such a program of studies. Issues concern deciding which modeling tools to use, how to relate models to behavioral data, what level of modeling to employ, and how to articulate theory to facilitate such modeling. Four levels or types of modeling, two qualitative and two quantitative, are identified. Their properties are examined and interrelated to include illustrative applications to the study of abnormal behavior, with an emphasis on schizophrenia.

  15. A new approach in CHP steam turbines thermodynamic cycles computations

    Directory of Open Access Journals (Sweden)

    Grković Vojin R.

    2012-01-01

    This paper presents a new approach to the mathematical modeling of thermodynamic cycles and electric power of utility district-heating and cogeneration steam turbines. The approach is based on the application of dimensionless mass flows, which describe the thermodynamic cycle of a combined heat and power steam turbine. The mass flows are calculated relative to the mass flow to the low-pressure turbine. The procedure introduces the extraction mass flow load parameter νh, which clearly indicates the energy transformation process, as well as the cogeneration turbine design features, but also its fitness for the electrical energy system requirements. The presented approach allows fast computations, as well as direct calculation of the selected energy efficiency indicators. The approach is exemplified with calculated results for the district-heat-to-electric-power ratio, as well as the cycle efficiency, versus νh. The influence of νh on the conformity of a combined heat and power turbine to the grid requirements is also analyzed and discussed. [Project of the Serbian Ministry of Science, no. 33049: Development of a CHP demonstration plant with gasification of biomass]

  16. Automated computation of arbor densities: a step toward identifying neuronal cell types.

    Science.gov (United States)

    Sümbül, Uygar; Zlateski, Aleksandar; Vishwanathan, Ashwin; Masland, Richard H; Seung, H Sebastian

    2014-01-01

    The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.

  17. Automated computation of arbor densities: a step toward identifying neuronal cell types

    Directory of Open Access Journals (Sweden)

    Uygar Sümbül

    2014-11-01

    The shape and position of a neuron convey information regarding its molecular and functional identity. The identification of cell types from structure, a classic method, relies on the time-consuming step of arbor tracing. However, as genetic tools and imaging methods make data-driven approaches to neuronal circuit analysis feasible, the need for automated processing increases. Here, we first establish that mouse retinal ganglion cell types can be as precise about distributing their arbor volumes across the inner plexiform layer as they are about distributing the skeletons of the arbors. Then, we describe an automated approach to computing the spatial distribution of the dendritic arbors, or arbor density, with respect to a global depth coordinate based on this observation. Our method involves three-dimensional reconstruction of neuronal arbors by a supervised machine learning algorithm, post-processing of the enhanced stacks to remove somata and isolate the neuron of interest, and registration of neurons to each other using automatically detected arbors of the starburst amacrine interneurons as fiducial markers. In principle, this method could be generalizable to other structures of the CNS, provided that they allow sparse labeling of the cells and contain a reliable axis of spatial reference.

  18. An engineering based approach for hydraulic computations in river flows

    Science.gov (United States)

    Di Francesco, S.; Biscarini, C.; Pierleoni, A.; Manciola, P.

    2016-06-01

    This paper presents an engineering-based approach to hydraulic risk evaluation. The aim of the research is to identify a criterion for choosing the simplest model that is appropriate to use in different scenarios, varying the characteristics of the main river channel. The complete flow field, generally expressed in terms of pressures, velocities, and accelerations, can be described through a three-dimensional approach that considers all flow properties varying in all directions. In many practical applications for river flow studies, however, the greatest changes occur only in two dimensions or even in one. In these cases the use of simplified approaches can lead to accurate results, with models that are easier to build and faster to run. The study has been conducted taking into account a dimensionless channel parameter, the ratio of the curvature radius to the width of the channel (R/B).

  19. An Approach for Location privacy in Pervasive Computing Environment

    Directory of Open Access Journals (Sweden)

    Sudheer Kumar Singh

    2010-05-01

    This paper focuses on location privacy in location-based services. Location privacy is a particular type of information privacy that can be defined as the ability to prevent others from learning one's current or past location. Many systems, such as GPS, implicitly and automatically give their users location privacy. Once a user sends his or her current location to the application server, however, the server stores that location in its database; users cannot delete or modify their location data after they have been sent. Addressing this problem, we give a theoretical concept for protecting location privacy in a pervasive computing environment, based on user-anonymity-based location privacy. Starting from a basic user-anonymity approach that uses a trusted proxy, we propose an improvement that uses dummy locations of users and also dummies of the services requested by users from the application server. The approach reduces the user's overhead in extracting the necessary information from the reply messages coming from the application server. The user sends a message containing the current location, ID, and requested service to the trusted proxy; the trusted proxy generates dummy locations related to the current location and a temporary pseudonym corresponding to the user's real ID. Analysis of this basic approach revealed a problem with the requested service; addressing it, we improve the method by also using dummies of the requested service, generated by the trusted proxy. The dummies (false positions) are generated by dummy-location algorithms.

  20. Genetic braid optimization: A heuristic approach to compute quasiparticle braids

    Science.gov (United States)

    McDonald, Ross B.; Katzgraber, Helmut G.

    2013-02-01

    In topologically protected quantum computation, quantum gates can be carried out by adiabatically braiding two-dimensional quasiparticles, reminiscent of entangled world lines. Bonesteel et al. [Phys. Rev. Lett. 95, 140503 (2005)], as well as Leijnse and Flensberg [Phys. Rev. B 86, 104511 (2012)], recently provided schemes for computing quantum gates from quasiparticle braids. Mathematically, the problem of executing a gate becomes that of finding a product of generators (matrices) from a given set that best approximates the gate, up to an error. To date, efficient methods to compute these gates only strive to optimize for accuracy. We explore the possibility of using a generic approach applicable to a variety of braiding problems based on evolutionary (genetic) algorithms. The method efficiently finds optimal braids while allowing the user to optimize for the relative utilities of accuracy and/or length. Furthermore, when optimizing for error only, the method can quickly produce efficient braids.
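
    A toy sketch of the genetic search; the generators below are plain SU(2) rotations rather than actual anyon braid matrices, so only the search strategy (evolving a word of generators whose product approximates a target gate) is illustrated.

```python
import numpy as np

def rot(theta, axis):
    """SU(2) rotation exp(-i theta/2 n.sigma) about unit axis n."""
    x, y, z = axis
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    n = x * sx + y * sy + z * sz
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * n

GENS = [rot(np.pi / 5, (1, 0, 0)), rot(np.pi / 5, (0, 0, 1))]  # toy generators
TARGET = rot(np.pi / 2, (0, 1, 0))                             # gate to approximate

def braid_matrix(word):
    m = np.eye(2, dtype=complex)
    for g in word:
        m = GENS[g] @ m
    return m

def braid_error(word):
    # Distance to the target gate, insensitive to a global phase
    overlap = abs(np.trace(TARGET.conj().T @ braid_matrix(word)))
    return 1.0 - overlap / 2.0

rng = np.random.default_rng(2)
pop = [rng.integers(0, 2, 20) for _ in range(60)]   # words of 20 generators
for _ in range(300):
    pop.sort(key=braid_error)
    parents = pop[:20]                              # truncation selection
    children = [p.copy() for p in parents for _ in range(2)]
    for c in children:
        c[rng.integers(0, len(c))] = rng.integers(0, 2)   # point mutation
    pop = parents + children

best = min(pop, key=braid_error)
print("best error:", round(braid_error(best), 4))
```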

  1. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing a software package based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  2. Computational approaches to understand cardiac electrophysiology and arrhythmias

    Science.gov (United States)

    Roberts, Byron N.; Yang, Pei-Chi; Behrens, Steven B.; Moreno, Jonathan D.

    2012-01-01

    Cardiac rhythms arise from electrical activity generated by precisely timed opening and closing of ion channels in individual cardiac myocytes. These impulses spread throughout the cardiac muscle to manifest as electrical waves in the whole heart. Regularity of electrical waves is critically important since they signal the heart muscle to contract, driving the primary function of the heart to act as a pump and deliver blood to the brain and vital organs. When electrical activity goes awry during a cardiac arrhythmia, the pump does not function, the brain does not receive oxygenated blood, and death ensues. For more than 50 years, mathematically based models of cardiac electrical activity have been used to improve understanding of basic mechanisms of normal and abnormal cardiac electrical function. Computer-based modeling approaches to understand cardiac activity are uniquely helpful because they allow for distillation of complex emergent behaviors into the key contributing components underlying them. Here we review the latest advances and novel concepts in the field as they relate to understanding the complex interplay between electrical, mechanical, structural, and genetic mechanisms during arrhythmia development at the level of ion channels, cells, and tissues. We also discuss the latest computational approaches to guiding arrhythmia therapy. PMID:22886409

  3. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows a reduction in computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be handled similarly using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
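
    The CVaR piece of the objective can be estimated from simulated costs as below; the cost distribution is synthetic and the strategy-parameter search around it is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 0.95

# Hypothetical simulated execution costs for one candidate strategy
costs = rng.lognormal(mean=0.0, sigma=0.4, size=100_000)

var = np.quantile(costs, alpha)       # Value-at-Risk at level alpha
cvar = costs[costs >= var].mean()     # CVaR: mean cost in the worst tail

print(f"expected cost: {costs.mean():.3f}")
print(f"VaR_95: {var:.3f}, CVaR_95: {cvar:.3f}")
# An optimizer would search the strategy's parameters to minimize, e.g.,
# expected cost + lambda * CVaR, re-running the simulation for each candidate.
```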

  4. A Kriging surrogate model coupled in simulation-optimization approach for identifying release history of groundwater sources

    Science.gov (United States)

    Zhao, Ying; Lu, Wenxi; Xiao, Chuanning

    2016-02-01

    As the incidence of groundwater pollution increases, many methods that identify the source characteristics of pollutants are being developed. In this study, a simulation-optimization approach was applied to determine the duration and magnitude of pollutant sources. Such problems are time-consuming because the optimization model requires thousands of runs of the simulation model. To address this challenge, a Kriging surrogate model was proposed to increase computational efficiency. The accuracy, time consumption, and robustness of the Kriging model were tested on both homogeneous and non-uniform media, as well as under steady-state and transient flow and transport conditions. The results of three hypothetical cases demonstrate that the Kriging model is able to solve groundwater contaminant source problems, such as those that could occur during field-site source identification, with a high degree of accuracy and short computation times, and is thus very robust.
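
    A simplified stand-in for the Kriging surrogate, assuming a cheap analytic function plays the expensive transport simulator; a real Kriging model additionally fits a trend and returns a predictive variance.

```python
import numpy as np

def simulator(x):
    """Placeholder for the expensive groundwater transport simulation."""
    return np.sin(3 * x) + 0.5 * x

def fit_gaussian_surrogate(X, y, length=0.5, nugget=1e-10):
    """Interpolate y(X) with Gaussian kernels centred on the training points."""
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * length ** 2))
    w = np.linalg.solve(K + nugget * np.eye(len(X)), y)

    def predict(x):
        k = np.exp(-(np.atleast_1d(x)[:, None] - X[None, :]) ** 2
                   / (2 * length ** 2))
        return k @ w

    return predict

X_train = np.linspace(0.0, 2.0, 8)        # 8 "expensive" model runs
surrogate = fit_gaussian_surrogate(X_train, simulator(X_train))

X_test = np.linspace(0.0, 2.0, 5)
print(np.max(np.abs(surrogate(X_test) - simulator(X_test))))  # small error
```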

  5. Outbreaks source: A new mathematical approach to identify their possible location

    Science.gov (United States)

    Buscema, Massimo; Grossi, Enzo; Breda, Marco; Jefferson, Tom

    2009-11-01

    Classical epidemiology has generally relied on the description and explanation of the occurrence of infectious diseases in relation to the time of occurrence of events rather than to the place of occurrence. In recent times, computer-generated dot maps have facilitated the modeling of the spread of infectious epidemic diseases, either with classical statistical approaches or with artificial “intelligent systems”. Few attempts, however, have been made so far to identify the origin of the epidemic spread, rather than its evolution, by mathematical topology methods. We report on the use of a new artificial intelligence method (the H-PST algorithm) and compare this new technique with other well-known algorithms to identify the source of three examples of infectious disease outbreaks derived from the literature. The H-PST algorithm is a new system able to project a distance matrix of points (events) into a bi-dimensional space, with the generation of a new point, named the hidden unit. This new hidden unit deforms the original Euclidean space and transforms it into a new space (cognitive space). The cost function of this transformation is the minimization of the differences between the original distance matrix among the assigned points and the distance matrix of the same points projected into the bi-dimensional map (or any different set of constraints). For many reasons we will discuss, the position of the hidden unit turns out to target the outbreak source in many epidemics much better than the other classic algorithms specifically designed for this task. Compared with the main algorithms known in location theory, the hidden unit was within yards of the outbreak source in the first example (the 2007 epidemic of Chikungunya fever in Italy). The hidden unit was located in the river between the two village epicentres of the spread, exactly where the index case was living. Equally in the second (the 1967 foot and mouth disease epidemic in England), and the third (1854 London Cholera epidemic

  6. Identifying Immune Drivers of Gulf War Illness Using a Novel Daily Sampling Approach

    Science.gov (United States)

    2015-10-01

    AWARD NUMBER: W81XWH-12-1-0557

  7. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001), and ischemia-causing lesions were identified by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation with invasively measured FFR was 0.729 (P < 0.001). Execution was also substantially faster with the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor.

  8. A Novel Approach of Load Balancing in Cloud Computing using Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Shabnam Sharma

    2016-02-01

    Nature-inspired meta-heuristic algorithms have proved to be beneficial for solving real-world combinatorial problems such as the minimum spanning tree, the knapsack problem, process planning problems, load balancing, and many more. In this research work, existing meta-heuristic approaches are discussed. Due to its astonishing feature of echolocation, the bat algorithm has drawn major attention in recent years and is applicable in different applications such as vehicle routing optimization, timetabling in railway optimization problems, load balancing in cloud computing, etc. Later, the biological behaviour of bats is explored and various areas of further research are discussed. Finally, the main objective of the research paper is to propose an algorithm for one of the most important applications, which is load balancing in a cloud computing environment.
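
    The core bat-algorithm loop looks roughly as follows on a toy continuous objective; for load balancing, the position vector would encode a task-to-VM assignment and the objective would be, e.g., makespan. Parameter values are typical defaults, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def objective(x):
    """Toy stand-in for a load-balancing cost such as makespan."""
    return float(np.sum(x ** 2))

n_bats, dim, iters = 20, 5, 200
f_min, f_max, loudness, pulse_rate = 0.0, 2.0, 0.5, 0.5

pos = rng.uniform(-5.0, 5.0, (n_bats, dim))
vel = np.zeros((n_bats, dim))
fitness = np.array([objective(p) for p in pos])
best = pos[fitness.argmin()].copy()

for _ in range(iters):
    for i in range(n_bats):
        freq = f_min + (f_max - f_min) * rng.random()  # echolocation frequency
        vel[i] += (pos[i] - best) * freq
        cand = pos[i] + vel[i]
        if rng.random() > pulse_rate:                  # local walk around best
            cand = best + 0.01 * rng.normal(size=dim)
        f_cand = objective(cand)
        if f_cand < fitness[i] and rng.random() < loudness:
            pos[i], fitness[i] = cand, f_cand          # accept improvement
        if fitness[i] < objective(best):
            best = pos[i].copy()

print(f"best cost after {iters} iterations: {objective(best):.6f}")
```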

  9. A computational technique to identify the optimal stiffness matrix for a discrete nuclear fuel assembly model

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu, E-mail: nkpark@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Joo, E-mail: kyoungjoo@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Kyoung-Hong, E-mail: kyounghong@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Suh, Jung-Min, E-mail: jmsuh@knfc.co.kr [R and D Center, KEPCO Nuclear Fuel Co., LTD., 493 Deokjin-dong, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)

    2013-02-15

    Highlights: ► An identification method of the optimal stiffness matrix for a fuel assembly structure is discussed. ► The least squares optimization method is introduced, and a closed form solution of the problem is derived. ► The method can be expanded to the system with a limited number of modes. ► Identification error due to the perturbed mode shape matrix is analyzed. ► Verification examples show that the proposed procedure leads to a reliable solution. -- Abstract: A reactor core structural model which is used to evaluate the structural integrity of the core contains nuclear fuel assembly models. Since the reactor core consists of many nuclear fuel assemblies, the use of a refined fuel assembly model leads to a considerable amount of computing time for performing nonlinear analyses such as the prediction of seismically induced vibration behaviors. The computational time could be reduced by replacing the detailed fuel assembly model with a simplified model that has fewer degrees of freedom, but the dynamic characteristics of the detailed model must be maintained in the simplified model. Such a model, based on an optimal design method, is proposed in this paper. That is, when a mass matrix and a mode shape matrix are given, the optimal stiffness matrix of a discrete fuel assembly model can be estimated by applying the least squares minimization method. The verification of the method is completed by comparing test results and simulation results. This paper shows that the simplified model's dynamic behaviors are quite similar to experimental results and that the suggested method is suitable for identifying a reliable mathematical model for fuel assemblies.
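
    A sketch of the closed-form least-squares step on a toy 3-DOF model: given the mass matrix M, retained mode shapes Phi, and eigenvalues Lambda, the identified stiffness is the minimum-norm solution of K Phi = M Phi Lambda. The matrices below are invented for illustration.

```python
import numpy as np

# Toy 3-DOF "detailed model": mass and true stiffness (illustrative values)
M = np.diag([2.0, 1.0, 1.0])
K_true = np.array([[ 4.0, -2.0,  0.0],
                   [-2.0,  4.0, -2.0],
                   [ 0.0, -2.0,  2.0]])

# Modal data as a test or FE analysis would supply them
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(M, K_true))
order = np.argsort(eigvals)
Lam = np.diag(eigvals[order][:2])   # keep only the two lowest modes
Phi = eigvecs[:, order][:, :2]

# Minimum-norm least-squares stiffness satisfying K Phi = M Phi Lam
K_id = M @ Phi @ Lam @ np.linalg.pinv(Phi)
print(np.allclose(K_id @ Phi, M @ Phi @ Lam))   # True: retained modes match
```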

  10. Identifying Learning Trajectories While Playing a Learning-to-Learn Computer Game in Different Children and Instruction Types

    NARCIS (Netherlands)

    de Koning-Veenstra, Baukje; Timmerman, Marieke; van Geert, Paul; van der Meulen, Bieuwe

    2014-01-01

    This research focuses on identifying learning trajectories expressed among children playing a learning-to-learn computer game and examining the relationships between the learning trajectories and individual characteristics such as developmental age, prior knowledge, and instruction type (adult- and/

  12. Knowledge-Assisted Approach to Identify Pathways with Differential Dependencies | Office of Cancer Genomics

    Science.gov (United States)

    We have previously developed a statistical method to identify gene sets enriched with condition-specific genetic dependencies. The method constructs gene dependency networks from bootstrapped samples in one condition and computes the divergence between distributions of network likelihood scores from different conditions. It was shown to be capable of sensitive and specific identification of pathways with phenotype-specific dysregulation, i.e., rewiring of dependencies between genes in different conditions.

  13. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  14. An operational modal analysis approach based on parametrically identified multivariable transmissibilities

    Science.gov (United States)

    Devriendt, Christof; De Sitter, Gert; Guillaume, Patrick

    2010-07-01

    In this contribution the approach to identify modal parameters from output-only (scalar) transmissibility measurements [C. Devriendt, P. Guillaume, The use of transmissibility measurements in output-only modal analysis, Mechanical Systems and Signal Processing 21 (7) (2007) 2689-2696] is generalized to multivariable transmissibilities. In general, the poles that are identified from (scalar as well as multivariable) transmissibility measurements do not correspond with the system's poles. However, by combining transmissibility measurements under different loading conditions, it is shown in this paper how modal parameters can be identified from multivariable transmissibility measurements.

  15. Local-basis-function approach to computed tomography

    Science.gov (United States)

    Hanson, K. M.; Wecksung, G. W.

    1985-12-01

    In the local basis-function approach, a reconstruction is represented as a linear expansion of basis functions, which are arranged on a rectangular grid and possess a local region of support. The basis functions considered here are positive and may overlap. It is found that basis functions based on cubic B-splines offer significant improvements in the calculational accuracy that can be achieved with iterative tomographic reconstruction algorithms. By employing repetitive basis functions, the computational effort involved in these algorithms can be minimized through the use of tabulated values for the line or strip integrals over a single-basis function. The local nature of the basis functions reduces the difficulties associated with applying local constraints on reconstruction values, such as upper and lower limits. Since a reconstruction is specified everywhere by a set of coefficients, display of a coarsely represented image does not require an arbitrary choice of an interpolation function.
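
    For concreteness, a Cox-de Boor evaluation of the cubic B-spline bases used as overlapping, positive local basis functions on a uniform grid (an illustrative sketch, not the authors' code):

```python
import numpy as np

def bspline_basis(i, k, knots, x):
    """Value of B-spline basis i of degree k at x (Cox-de Boor recursion)."""
    if k == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k] > knots[i]:
        left = (x - knots[i]) / (knots[i + k] - knots[i]) \
               * bspline_basis(i, k - 1, knots, x)
    if knots[i + k + 1] > knots[i + 1]:
        right = (knots[i + k + 1] - x) / (knots[i + k + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, k - 1, knots, x)
    return left + right

knots = np.arange(0.0, 9.0)            # uniform knot grid
x = np.linspace(0.0, 8.0, 5)
# A reconstruction f(x) = sum_i c_i B_i(x) over the local, positive bases
coeffs = [1.0, 2.0, 0.5, 1.5, 1.0]     # illustrative expansion coefficients
f = [sum(c * bspline_basis(i, 3, knots, xx) for i, c in enumerate(coeffs))
     for xx in x]
print(np.round(f, 3))
```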

  16. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated with prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set by using a high-confidence human protein interactome. Then, we explored the modes of action of the chemicals by integrating protein-disease information into the resulting protein networks. The dominating human adverse effects were reproductive disorders, followed by adrenal diseases. Our results indicated that prochloraz, tebuconazole...

  17. Computational approaches to substrate-based cell motility

    Science.gov (United States)

    Ziebert, Falko; Aranson, Igor S.

    2016-07-01

    Substrate-based crawling motility of eukaryotic cells is essential for many biological functions, in both developing and mature organisms. Motility dysfunctions are involved in several life-threatening pathologies such as cancer and metastasis. Motile cells are also a natural realisation of active, self-propelled 'particles', a popular research topic in nonequilibrium physics. Finally, from the materials perspective, assemblies of motile cells and evolving tissues constitute a class of adaptive self-healing materials that respond to the topography, elasticity and surface chemistry of the environment and react to external stimuli. Although a comprehensive understanding of substrate-based cell motility remains elusive, progress has been achieved recently in its modelling on the whole-cell level. Here we survey the most recent advances in computational approaches to cell movement and demonstrate how these models improve our understanding of complex self-organised systems such as living cells.

  18. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet, oral microbiome, and host mucosal transcriptome interactions. In particular we focus on systems biology of Porphyromonas gingivalis in the context of high throughput computational methods tightly integrated with translational systems medicine. Those approaches have applications ranging from basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, to human disease, where we can validate new mechanisms and biomarkers for prevention and treatment of chronic disorders.

  19. A Function-First Approach to Identifying Formulaic Language in Academic Writing

    Science.gov (United States)

    Durrant, Philip; Mathews-Aydinli, Julie

    2011-01-01

    There is currently much interest in creating pedagogically-oriented descriptions of formulaic language. Research in this area has typically taken what we call a "form-first" approach, in which formulas are identified as the most frequent recurrent forms in a relevant corpus. While this research continues to yield valuable results, the present…

  20. The Baby TALK Model: An Innovative Approach to Identifying High-Risk Children and Families

    Science.gov (United States)

    Villalpando, Aimee Hilado; Leow, Christine; Hornstein, John

    2012-01-01

    This research report examines the Baby TALK model, an innovative early childhood intervention approach used to identify, recruit, and serve young children who are at risk for developmental delays, mental health needs, and/or school failure, and their families. The report begins with a description of the model. This description is followed by an…

  1. Identifying Useful Auxiliary Variables for Incomplete Data Analyses: A Note on a Group Difference Examination Approach

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2014-01-01

    This research note contributes to the discussion of methods that can be used to identify useful auxiliary variables for analyses of incomplete data sets. A latent variable approach is discussed, which is helpful in finding auxiliary variables with the property that if included in subsequent maximum likelihood analyses they may enhance considerably…

  2. A Comprehensive Approach to Identifying Intervention Targets for Patient-Safety Improvement in a Hospital Setting

    Science.gov (United States)

    Cunningham, Thomas R.; Geller, E. Scott

    2012-01-01

    Despite differences in approaches to organizational problem solving, healthcare managers and organizational behavior management (OBM) practitioners share a number of practices, and connecting healthcare management with OBM may lead to improvements in patient safety. A broad needs-assessment methodology was applied to identify patient-safety…

  3. A goal-oriented approach to identify and engineer land use systems.

    NARCIS (Netherlands)

    Hengsdijk, H.; Ittersum, van M.K.

    2002-01-01

    This paper describes a formalized approach to identify and engineer future-oriented land use systems. Such land use systems can be used to explore options for strategic decision making with respect to land use policy and to do ex-ante assessment of land use alternatives to be further tested or

  4. Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study

    Science.gov (United States)

    Elbarbary, Rafik Said

    2015-01-01

    This study is based on the integrated approach as a concept framework to identify, categorize, and rank key components of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used a four-round Delphi technique to determine the importance rate of each component of the core competencies…

  5. A computational approach for deciphering the organization of glycosaminoglycans.

    Directory of Open Access Journals (Sweden)

    Jean L Spencer

    Full Text Available BACKGROUND: Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. METHODOLOGY/PRINCIPAL FINDINGS: To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. CONCLUSIONS/SIGNIFICANCE: This model establishes the conceptual framework for a new class of

  6. A computational approach for deciphering the organization of glycosaminoglycans.

    Science.gov (United States)

    Spencer, Jean L; Bernanke, Joel A; Buczek-Thomas, Jo Ann; Nugent, Matthew A

    2010-02-23

    Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. This model establishes the conceptual framework for a new class of computational tools to use to assess patterns of domain organization within
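
    The chainmaker/chainbreaker/chainsorter flow can be caricatured in a few lines of Python: generate candidate chains under a biosynthesis rule, digest them in silico under an enzyme rule, and keep the chains whose fragments match the observed composition. The two-letter alphabet and both rules below are deliberately toy simplifications of the paper's chemistry.

      import random

      def chainmaker(length, p_switch, rng):
          # Biosynthesis rule (toy): sulfated 'S' and unsulfated 'U'
          # disaccharides occur in runs, i.e. domains
          state = rng.random() < 0.5
          chain = []
          for _ in range(length):
              if rng.random() < p_switch:
                  state = not state
              chain.append("S" if state else "U")
          return "".join(chain)

      def chainbreaker(chain, site="SU"):
          # Enzyme rule (toy): cleave between S and U wherever the motif occurs
          return chain.replace(site, "S|U").split("|")

      def chainsorter(chains, observed_fragments):
          # Keep candidates whose in-silico digest matches the observed digest
          target = sorted(len(f) for f in observed_fragments)
          return [c for c in chains
                  if sorted(len(f) for f in chainbreaker(c)) == target]

      rng = random.Random(1)
      candidates = [chainmaker(12, 0.3, rng) for _ in range(20000)]
      consistent = chainsorter(candidates, ["SSSU", "SSSSU", "UUU"])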

  7. An Organic Computing Approach to Self-organising Robot Ensembles

    Directory of Open Access Journals (Sweden)

    Sebastian Albrecht von Mammen

    2016-11-01

    Full Text Available Similar to the Autonomic Computing initiative, which has mainly been advancing techniques for self-optimisation focussing on computing systems and infrastructures, Organic Computing (OC) has been driving the development of system design concepts and algorithms for self-adaptive systems at large. Examples of application domains include, for instance, traffic management and control, cloud services, communication protocols, and robotic systems. Such an OC system typically consists of a potentially large set of autonomous and self-managed entities, where each entity acts with a local decision horizon. By means of cooperation of the individual entities, the behaviour of the entire ensemble system is derived. In this article, we present our work on how autonomous, adaptive robot ensembles can benefit from OC technology. Our elaborations are aligned with the different layers of an observer/controller framework which provides the foundation for the individuals' adaptivity at the system design level. Relying on an extended Learning Classifier System (XCS) in combination with adequate simulation techniques, this basic system design empowers robot individuals to improve their individual and collaborative performances, e.g. by means of adapting to changing goals and conditions. Not only for the sake of generalisability, but also because of its enormous transformative potential, we stage our research in the domain of robot ensembles that are typically comprised of several quad-rotors and that organise themselves to fulfil spatial tasks such as maintenance of building facades or the collaborative search for mobile targets. Our elaborations detail the architectural concept, provide examples of individual self-optimisation as well as of the optimisation of collaborative efforts, and we show how the user can control the ensembles at multiple levels of abstraction. We conclude with a summary of our approach and an outlook on possible future steps.

  8. Identifying the Factors that Influence Computer Use in the Early Childhood Classroom

    Science.gov (United States)

    Edwards, Suzy

    2005-01-01

    Computers have become an increasingly accepted learning tool in the early childhood classroom. Despite initial concerns regarding the effect of computers on children's development, past research has indicated that computer use by young children can support their learning and developmental outcomes (Siraj-Blatchford & Whitebread, 2003; Yelland,…

  9. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    Full Text Available RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery) tools. It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction related information. It clusters the transcripts based on functional annotations and generates a tabular report for functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non coding RNA (ncRNA) by ab initio methods) helps to identify the non coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software for both NGS and array data. It helps users to characterize the de-novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs the functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is

  10. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA

    Science.gov (United States)

    Zuccaro, Antonio; Guarracino, Mario Rosario

    2015-01-01

    RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery) tools. It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction related information. It clusters the transcripts based on functional annotations and generates a tabular report for functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non coding RNA (ncRNA) by ab initio methods) helps to identify the non coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software for both NGS and array data. It helps users to characterize the de-novo assembled reads obtained from NGS experiments for non-referenced organisms, while it also performs the functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is freely
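
    The enrichment statistic behind such GO reports is typically a hypergeometric (one-sided Fisher) test; a minimal sketch with illustrative counts is shown below. We do not know Transcriptator's exact internals, since it delegates this step to the DAVID and QuickGO web services.

      from scipy.stats import hypergeom

      def go_enrichment_p(k_hits, n_submitted, k_background, n_background):
          # P(X >= k_hits) for the number of submitted transcripts carrying a
          # GO term, against the background annotation frequency
          return hypergeom.sf(k_hits - 1, n_background, k_background, n_submitted)

      # e.g. 12 of 200 submitted transcripts carry GO:0006955 (immune response),
      # versus 150 of 18,000 transcripts in the background
      print(go_enrichment_p(12, 200, 150, 18000))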

  11. Impact of Hybrid Intelligent Computing in Identifying Constructive Weather Parameters for Modeling Effective Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    M. Sudha

    2015-12-01

    Full Text Available Uncertain atmosphere is a prevalent factor affecting the existing prediction approaches. Rough set and fuzzy set theories, as proposed by Pawlak and Zadeh, have become an effective tool for handling vagueness and fuzziness in real world scenarios. This research work describes the impact of a Hybrid Intelligent System (HIS) for strategic decision support in meteorology. In this research a novel exhaustive-search-based Rough set reduct Selection using Genetic Algorithm (RSGA) is introduced to identify the significant input feature subset. The proposed model could identify the most effective weather parameters more efficiently than other existing input techniques. In the model evaluation phase two adaptive techniques were constructed and investigated. The proposed Artificial Neural Network based on Back Propagation learning (ANN-BP) and Adaptive Neuro Fuzzy Inference System (ANFIS) were compared with the existing Fuzzy Unordered Rule Induction Algorithm (FURIA), Structural Learning Algorithm on Vague Environment (SLAVE) and Particle Swarm Optimization (PSO). The proposed rainfall prediction models outperformed the alternatives when trained with the input generated using RSGA. A meticulous comparison of the performance indicates the ANN-BP model as a suitable HIS for effective rainfall prediction. The ANN-BP achieved 97.46% accuracy with a nominal misclassification rate of 0.0254%.
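
    A compact sketch of the RSGA idea follows, assuming a matrix X of discretised weather attributes and a vector y of rainfall classes. The fitness couples the rough-set dependency degree (consistency of decisions within equivalence classes) with a penalty on subset size, and a mutation-only genetic loop stands in for the paper's full exhaustive-search-based algorithm.

      import numpy as np

      def dependency(X, y, mask):
          # Rough-set dependency degree: fraction of rows whose equivalence
          # class under the selected attributes is consistent in the decision y
          if not mask.any():
              return 0.0
          rows = list(map(tuple, X[:, mask]))
          classes = {}
          for r, label in zip(rows, y):
              classes.setdefault(r, set()).add(label)
          return sum(len(classes[r]) == 1 for r in rows) / len(rows)

      def rsga(X, y, pop_size=30, generations=40, seed=0):
          rng = np.random.default_rng(seed)
          n = X.shape[1]
          pop = rng.random((pop_size, n)) < 0.5
          fitness = lambda m: dependency(X, y, m) - 0.01 * m.sum()
          for _ in range(generations):
              pop = pop[np.argsort([fitness(m) for m in pop])]
              parents = pop[pop_size // 2:]                        # keep the best half
              kids = parents[rng.integers(0, len(parents), pop_size - len(parents))]
              kids = kids ^ (rng.random(kids.shape) < 1.0 / n)     # bit-flip mutation
              pop = np.vstack([parents, kids])
          return max(pop, key=fitness)                             # best attribute mask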

  12. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

    Full Text Available People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces relies on the dimensions of both the person and the objects she or he operates. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
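
    Steps (1) and (2) amount to a clustering of obstacles by pairwise clearance; a small sketch with union-find is shown below, using vertex-to-vertex distance as a crude stand-in for the exact minimum distance between obstacle polygons.

      import numpy as np

      def min_gap(a, b):
          # Vertex-to-vertex minimum distance between two obstacles, each an
          # (n, 2) array of vertices (an upper bound on the true polygon gap)
          return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2).min()

      def group_obstacles(obstacles, clearance):
          # Union-find: merge obstacles whose gap is too narrow for the user
          parent = list(range(len(obstacles)))

          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]
                  i = parent[i]
              return i

          for i in range(len(obstacles)):
              for j in range(i + 1, len(obstacles)):
                  if min_gap(obstacles[i], obstacles[j]) < clearance:
                      parent[find(i)] = find(j)
          groups = {}
          for i in range(len(obstacles)):
              groups.setdefault(find(i), []).append(i)
          return list(groups.values())

      boxes = [np.array([[0, 0], [1, 0], [1, 1], [0, 1]]),
               np.array([[1.2, 0], [2.2, 0], [2.2, 1], [1.2, 1]]),
               np.array([[5, 5], [6, 5], [6, 6], [5, 6]])]
      print(group_obstacles(boxes, clearance=0.8))   # [[0, 1], [2]]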

  13. Identifying bioaccumulative halogenated organic compounds using a nontargeted analytical approach: seabirds as sentinels.

    Directory of Open Access Journals (Sweden)

    Christopher J Millow

    Full Text Available Persistent organic pollutants (POPs) are typically monitored via targeted mass spectrometry, which potentially identifies only a fraction of the contaminants actually present in environmental samples. With new anthropogenic compounds continuously introduced to the environment, novel and proactive approaches that provide a comprehensive alternative to targeted methods are needed in order to more completely characterize the diversity of known and unknown compounds likely to cause adverse effects. Nontargeted mass spectrometry attempts to extensively screen for compounds, providing a feasible approach for identifying contaminants that warrant future monitoring. We employed a nontargeted analytical method using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOF-MS) to characterize halogenated organic compounds (HOCs) in California Black skimmer (Rynchops niger) eggs. Our study identified 111 HOCs; 84 of these compounds were regularly detected via targeted approaches, while 27 were classified as typically unmonitored or unknown. Typically unmonitored compounds of note in bird eggs included tris(4-chlorophenyl)methane (TCPM), tris(4-chlorophenyl)methanol (TCPMOH), triclosan, permethrin, heptachloro-1'-methyl-1,2'-bipyrrole (MBP), as well as four halogenated unknown compounds that could not be identified through database searching or the literature. The presence of these compounds in Black skimmer eggs suggests they are persistent, bioaccumulative, potentially biomagnifying, and maternally transferring. Our results highlight the utility and importance of employing nontargeted analytical tools to assess true contaminant burdens in organisms, as well as to demonstrate the value in using environmental sentinels to proactively identify novel contaminants.

  14. Gene-based Association Approach Identify Genes Across Stress Traits in Fruit Flies

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Edwards, Stefan McKinnon; Sarup, Pernille Merete

    Identification of genes explaining variation in quantitative traits or genetic risk factors of human diseases requires both good phenotypic and genotypic data, but also efficient statistical methods. Genome-wide association studies may reveal association between phenotypic variation and variation...... at the nucleotide level, thus potentially identifying genetic variants. However, testing millions of polymorphic nucleotide positions requires conservative correction for multiple testing, which lowers the probability of finding genes with small to moderate effects. To alleviate this, we apply a gene based association...... approach grouping variants according to gene position, thus lowering the number of statistical tests performed and increasing the probability of identifying genes with small to moderate effects. Using this approach we identify numerous genes associated with different types of stresses in Drosophila...
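
    A minimal sketch of such a gene-based test is to pool the per-SNP p-values within each gene and combine them, so that the number of tests drops from millions of SNPs to thousands of genes. Fisher's method below assumes independent SNPs, which a real analysis must relax to account for linkage disequilibrium; the gene names and p-values are invented.

      from collections import defaultdict
      from scipy.stats import combine_pvalues

      def gene_based_pvalues(snp_results):
          # snp_results: iterable of (gene, p_value) pairs from per-SNP tests
          by_gene = defaultdict(list)
          for gene, p in snp_results:
              by_gene[gene].append(p)
          # One combined test per gene instead of one test per SNP
          return {gene: combine_pvalues(ps, method="fisher")[1]
                  for gene, ps in by_gene.items()}

      snps = [("Hsp70", 0.002), ("Hsp70", 0.04), ("Hsp70", 0.3), ("Frost", 0.6)]
      print(gene_based_pvalues(snps))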

  15. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    Energy Technology Data Exchange (ETDEWEB)

    Coupaud, Sylvie [University of Glasgow, Centre for Rehabilitation Engineering, Department of Mechanical Engineering, Glasgow (United Kingdom); Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom); McLean, Alan N.; Allan, David B. [Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom)

    2009-10-15

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with the lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury, at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than those with thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)
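
    The reported exponential decline lends itself to a simple nonlinear fit of BMD against time since injury; a sketch with made-up numbers (not the study's data) is shown below.

      import numpy as np
      from scipy.optimize import curve_fit

      def bmd_model(t, loss, tau, plateau):
          # Exponential decline towards a new steady state after injury
          return plateau + loss * np.exp(-t / tau)

      # Illustrative values only: years since injury vs trabecular BMD (mg/cm^3)
      years = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
      bmd = np.array([240.0, 210.0, 165.0, 120.0, 95.0, 90.0])

      (loss, tau, plateau), _ = curve_fit(bmd_model, years, bmd, p0=(150.0, 2.0, 90.0))
      print(f"half-time of trabecular bone loss ~ {tau * np.log(2):.1f} years")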

  16. A computational method to help identify and measure metal lines in high resolution QSO spectra

    Institute of Scientific and Technical Information of China (English)

    Xi-Heng Shi; David Tytler; Jin-Liang Hou; David Kirkman; Jeffery Lee; Benjamin Ou

    2011-01-01

    A computational code is developed to help identify metal absorption lines in high resolution QSO spectra, especially in the Lyα forest. The input to the code includes a list of line central wavelengths, column densities and Doppler widths. The code then searches for candidate metal absorption systems and assesses the probability that each system could be real. The framework of the strategy we employ is described in detail and we discuss how to estimate the errors in line profile fitting that are essential to identification. A series of artificial spectra is constructed to calibrate the performance of the code. Due to the effects of blending and noise on Voigt profile fitting, the completeness of the identification depends on the column density of absorbers. For intermediate and strong artificial metal absorbers, more than 90% could be confirmed by the code. The results of applying the code to the real spectra of QSOs HS0757+5218 and Q0100+1300 are also presented.
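
    One building block of such an identification code is doublet matching: two lines whose observed wavelength ratio equals a known rest-frame doublet ratio are a candidate metal system at a common redshift. A bare-bones sketch follows; the tolerance and the two-entry line list are illustrative.

      # Rest wavelengths (Angstrom) of two common metal doublets
      DOUBLETS = {"C IV": (1548.20, 1550.78), "Mg II": (2796.35, 2803.53)}

      def find_doublet_candidates(lines, rtol=2e-4):
          # lines: observed line centres; a pair (a, b) matching a doublet's
          # wavelength ratio implies a candidate system at redshift a/w1 - 1
          hits = []
          for ion, (w1, w2) in DOUBLETS.items():
              for a in lines:
                  for b in lines:
                      if b > a and abs(b / a - w2 / w1) < rtol:
                          hits.append((ion, a / w1 - 1.0, (a, b)))
          return hits

      # A C IV doublet redshifted to z = 1.5, plus one unrelated line
      print(find_doublet_candidates([3870.50, 3876.95, 4000.0]))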

  17. [Key effect genes responding to nerve injury identified by gene ontology and computer pattern recognition].

    Science.gov (United States)

    Pan, Qian; Peng, Jin; Zhou, Xue; Yang, Hao; Zhang, Wei

    2012-07-01

    In order to screen out important genes from the large datasets produced by gene microarrays after nerve injury, we combined gene ontology (GO) analysis with computer pattern recognition technology to find key genes responding to nerve injury, and then verified one of these screened-out genes. Data mining and gene ontology analysis of the gene chip data GSE26350 were carried out using MATLAB software. Cd44 was selected from the screened-out key gene molecular spectrum by comparing the genes' GO terms and their positions on the principal component score map. Function interference was employed to disturb the normal binding of Cd44 and one of its ligands, chondroitin sulfate C (CSC), in order to observe neurite extension. Gene ontology analysis showed that the top-ranked genes on the score map (marked by red *) were mainly distributed among molecular function GO terms such as molecular transducer activity, receptor activity, and protein binding. Cd44 is one of six effector protein genes, and attracted our attention with its functional diversity. After adding different reagents to the medium to interfere with the normal binding of CSC and Cd44, varying degrees of relief of CSC's inhibition of neurite extension were observed. CSC can inhibit neurite extension through binding Cd44 on the neuron membrane. This verifies that important genes in given physiological processes can be identified by gene ontology analysis of gene chip data.

  18. A Computational Approach to Politeness with Application to Social Factors

    CERN Document Server

    Danescu-Niculescu-Mizil, Cristian; Jurafsky, Dan; Leskovec, Jure; Potts, Christopher

    2013-01-01

    We propose a computational framework for identifying linguistic aspects of politeness. Our starting point is a new corpus of requests annotated for politeness, which we use to evaluate aspects of politeness theory and to uncover new interactions between politeness markers and context. These findings guide our construction of a classifier with domain-independent lexical and syntactic features operationalizing key components of politeness theory, such as indirection, deference, impersonalization and modality. Our classifier achieves close to human performance and is effective across domains. We use our framework to study the relationship between politeness and social power, showing that polite Wikipedia editors are more likely to achieve high status through elections, but, once elevated, they become less polite. We see a similar negative correlation between politeness and power on Stack Exchange, where users at the top of the reputation scale are less polite than those at the bottom. Finally, we apply our class...
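
    A strictly lexical baseline of such a classifier takes a few lines with scikit-learn; the four annotated requests below are invented stand-ins for the corpus, and the paper's domain-independent syntactic and politeness-theory features are omitted.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      requests = ["Could you possibly take a look at this?",
                  "Fix this now.",
                  "Would you mind reviewing my edit?",
                  "You broke the page, revert it."]
      polite = [1, 0, 1, 0]          # invented annotations

      # Unigram/bigram counts only; a real feature set would be richer
      clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
      clf.fit(requests, polite)
      print(clf.predict(["Would you possibly revert this?"]))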

  19. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    Science.gov (United States)

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted
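
    The net displacement method can be sketched by fitting a logistic curve to displacement from the capture site as a function of date: the midpoint parameter is the migration midpoint, and duration can be read off the slope parameter. The numbers below are simulated, not the study's telemetry data.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(day, asym, mid, scale):
          # Net displacement (km) vs day of year; 'mid' is the migration midpoint
          return asym / (1.0 + np.exp(-(day - mid) / scale))

      rng = np.random.default_rng(1)
      day = np.arange(280, 341)                       # an autumn window
      disp = logistic(day, 1200.0, 319.0, 4.0) + rng.normal(0.0, 25.0, day.size)

      (asym, mid, scale), _ = curve_fit(logistic, day, disp, p0=(1000.0, 310.0, 5.0))
      print(f"estimated migration midpoint: day {mid:.0f}")   # ~ day 319, mid-November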

  1. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    Directory of Open Access Journals (Sweden)

    William S Beatty

    Full Text Available The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result

  2. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Full Text Available Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next generation anti-HIV drugs.

  3. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or even tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which warrants a much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  4. Using an interdisciplinary approach to identify factors that affect clinicians' compliance with evidence-based guidelines.

    Science.gov (United States)

    Gurses, Ayse P; Marsteller, Jill A; Ozok, A Ant; Xiao, Yan; Owens, Sharon; Pronovost, Peter J

    2010-08-01

    Our objective was to identify factors that affect clinicians' compliance with evidence-based guidelines using an interdisciplinary approach, and to develop a conceptual framework that can provide a comprehensive and practical guide for designing effective interventions. A literature review and a brainstorming session with 11 researchers from a variety of scientific disciplines were used to identify theoretical and conceptual models describing clinicians' guideline compliance. MEDLINE, EMBASE, CINAHL, and the bibliographies of the papers identified were used as data sources for identifying the relevant theoretical and conceptual models. Thirteen different models that originated from various disciplines including medicine, rural sociology, psychology, human factors and systems engineering, organizational management, marketing, and health education were identified. Four main categories of factors that affect compliance emerged from our analysis: clinician characteristics, guideline characteristics, system characteristics, and implementation characteristics. Based on these findings, we developed an interdisciplinary conceptual framework that specifies the expected interrelationships among these four categories of factors and their impact on clinicians' compliance. An interdisciplinary approach is needed to improve clinicians' compliance with evidence-based guidelines. The conceptual framework from this research can provide a comprehensive and systematic guide to identify barriers to guideline compliance and design effective interventions to improve patient safety.

  5. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships between an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  6. Alternative approaches for identifying acute systemic toxicity: Moving from research to regulatory testing.

    Science.gov (United States)

    Hamm, Jon; Sullivan, Kristie; Clippinger, Amy J; Strickland, Judy; Bell, Shannon; Bhhatarai, Barun; Blaauboer, Bas; Casey, Warren; Dorman, David; Forsby, Anna; Garcia-Reyero, Natàlia; Gehen, Sean; Graepel, Rabea; Hotchkiss, Jon; Lowit, Anna; Matheson, Joanna; Reaves, Elissa; Scarano, Louis; Sprankle, Catherine; Tunkel, Jay; Wilson, Dan; Xia, Menghang; Zhu, Hao; Allen, David

    2017-06-01

    Acute systemic toxicity testing provides the basis for hazard labeling and risk management of chemicals. A number of international efforts have been directed at identifying non-animal alternatives for in vivo acute systemic toxicity tests. A September 2015 workshop, Alternative Approaches for Identifying Acute Systemic Toxicity: Moving from Research to Regulatory Testing, reviewed the state-of-the-science of non-animal alternatives for this testing and explored ways to facilitate implementation of alternatives. Workshop attendees included representatives from international regulatory agencies, academia, nongovernmental organizations, and industry. Resources identified as necessary for meaningful progress in implementing alternatives included compiling and making available high-quality reference data, training on use and interpretation of in vitro and in silico approaches, and global harmonization of testing requirements. Attendees particularly noted the need to characterize variability in reference data to evaluate new approaches. They also noted the importance of understanding the mechanisms of acute toxicity, which could be facilitated by the development of adverse outcome pathways. Workshop breakout groups explored different approaches to reducing or replacing animal use for acute toxicity testing, with each group crafting a roadmap and strategy to accomplish near-term progress. The workshop steering committee has organized efforts to implement the recommendations of the workshop participants. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. A model-based approach to identify binding sites in CLIP-Seq data.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Full Text Available Cross-linking immunoprecipitation coupled with high-throughput sequencing (CLIP-Seq) has made it possible to identify the targeting sites of RNA-binding proteins in various cell culture systems and tissue types on a genome-wide scale. Here we present a novel model-based approach (MiClip) to identify high-confidence protein-RNA binding sites from CLIP-seq datasets. This approach assigns a probability score for each potential binding site to help prioritize subsequent validation experiments. The MiClip algorithm has been tested in both HITS-CLIP and PAR-CLIP datasets. In the HITS-CLIP dataset, the signal/noise ratios of miRNA seed motif enrichment produced by the MiClip approach are between 17% and 301% higher than those by the ad hoc method for the top 10 most enriched miRNAs. In the PAR-CLIP dataset, the MiClip approach can identify ∼50% more validated binding targets than the original ad hoc method and two recently published methods. To facilitate the application of the algorithm, we have released an R package, MiClip (http://cran.r-project.org/web/packages/MiClip/index.html), and a public web-based graphical user interface software (http://galaxy.qbrc.org/tool_runner?tool_id=mi_clip) for customized analysis.

  8. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Full Text Available Traditionally, a top-down method has been used to identify prognostic features in cancer research. That is to say, genes differentially expressed in cancer versus normal tissue are identified first, to see if they possess survival prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes were selected first and additionally prioritized by their network topology; a small set of features can then be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set, and the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues and compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients, the robustness of each was validated in the testing set, and each predictive performance was better than that of gene expression signatures reported previously. Moreover, genes in these two prognostic signatures give some indications for drug repositioning in HCC: some approved drugs targeting these markers have alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.

  9. A cellular genetics approach identifies gene-drug interactions and pinpoints drug toxicity pathway nodes

    Directory of Open Access Journals (Sweden)

    Oscar Takeo Suzuki

    2014-08-01

    Full Text Available New approaches to toxicity testing have incorporated high-throughput screening across a broad range of in vitro assays to identify potential key events in response to chemical or drug treatment. To date, these approaches have primarily utilized repurposed drug discovery assays. In this study, we describe an approach that combines in vitro screening with genetic approaches for the experimental identification of genes and pathways involved in chemical or drug toxicity. Primary embryonic fibroblasts isolated from 32 genetically-characterized inbred mouse strains were treated in concentration-response format with 65 compounds, including pharmaceutical drugs, environmental chemicals, and compounds with known modes-of-action. Integrated cellular responses were measured at 24 and 72 hours using high-content imaging and included cell loss, membrane permeability, mitochondrial function, and apoptosis. Genetic association analysis of cross-strain differences in the cellular responses resulted in a collection of candidate loci potentially underlying the variable strain response to each chemical. As a demonstration of the approach, one candidate gene involved in rotenone sensitivity, Cybb, was experimentally validated in vitro and in vivo. Pathway analysis on the combined list of candidate loci across all chemicals identified a number of over-connected nodes that may serve as core regulatory points in toxicity pathways.

  10. Computed tomography vs magnetic resonance imaging for identifying acute lesions in pediatric traumatic brain injury.

    Science.gov (United States)

    Buttram, Sandra D W; Garcia-Filion, Pamela; Miller, Jeffrey; Youssfi, Mostafa; Brown, S Danielle; Dalton, Heidi J; Adelson, P David

    2015-02-01

    Pediatric traumatic brain injury (TBI) is a leading cause of morbidity and mortality in children. Computed tomography (CT) is the modality of choice to screen for brain injuries. MRI may provide more clinically relevant information. The purpose of this study was to compare lesion detection between CT and MRI after TBI. Retrospective cohort of children (0-21 years) with TBI between 2008 and 2010 at a Level 1 pediatric trauma center with a head CT scan on day of injury and a brain MRI scan within 2 weeks of injury. Agreement between CT and MRI was determined by κ statistic and stratified by injury mechanism. One hundred five children were studied. Of these, 78% had mild TBI. The MRI scan was obtained a median of 1 day (interquartile range, 1-2) after CT. Overall, CT and MRI demonstrated poor agreement (κ=-0.083; P=.18). MRI detected a greater number of intraparenchymal lesions (n=36; 34%) compared with CT (n=16; 15%) (P<.001). Among patients with abusive head trauma, MRI detected intraparenchymal lesions in 16 (43%), compared with only 4 (11%) detected with CT (P=.03). Of 8 subjects with a normal CT scan, 6 had abnormal lesions on MRI. Compared with CT, MRI identified significantly more intraparenchymal lesions in pediatric TBI, particularly in children with abusive head trauma. The prognostic value of identification of intraparenchymal lesions by MRI is unknown but warrants additional inquiry. Risks and benefits of early MRI (including sedation, time, and lack of radiation exposure) compared with CT should be weighed by clinicians. Copyright © 2015 by the American Academy of Pediatrics.
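
    Agreement statistics like the κ reported here are one-liners to compute; a sketch with invented per-patient lesion calls is shown below (1 = lesion detected, 0 = none).

      from sklearn.metrics import cohen_kappa_score

      # Invented paired readings, one entry per patient
      ct_calls  = [0, 1, 0, 0, 1, 0, 0, 0]
      mri_calls = [1, 1, 0, 1, 1, 1, 0, 1]

      # kappa near zero (or negative) indicates poor CT/MRI agreement
      print(cohen_kappa_score(ct_calls, mri_calls))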

  11. A multivariate and stochastic approach to identify key variables to rank dairy farms on profitability.

    Science.gov (United States)

    Atzori, A S; Tedeschi, L O; Cannas, A

    2013-05-01

    The economic efficiency of dairy farms is the main goal of farmers. The objective of this work was to use routinely available information at the dairy farm level to develop an index of profitability to rank dairy farms and to assist the decision-making process of farmers to increase the economic efficiency of the entire system. A stochastic modeling approach was used to study the relationships between inputs and profitability (i.e., income over feed cost; IOFC) of dairy cattle farms. The IOFC was calculated as: milk revenue + value of male calves + culling revenue - herd feed costs. Two databases were created. The first one was a development database, which was created from technical and economic variables collected in 135 dairy farms. The second one was a synthetic database (sDB) created from 5,000 synthetic dairy farms using the Monte Carlo technique and based on the characteristics of the development database data. The sDB was used to develop a ranking index as follows: (1) principal component analysis (PCA), excluding IOFC, was used to identify principal components (sPC); and (2) coefficient estimates of a multiple regression of the IOFC on the sPC were obtained. Then, the eigenvectors of the sPC were used to compute the principal component values for the original 135 dairy farms that were used with the multiple regression coefficient estimates to predict IOFC (dRI; ranking index from development database). The dRI was used to rank the original 135 dairy farms. The PCA explained 77.6% of the sDB variability and 4 sPC were selected. The sPC were associated with herd profile, milk quality and payment, poor management, and reproduction based on the significant variables of the sPC. The mean IOFC in the sDB was 0.1377 ± 0.0162 euros per liter of milk (€/L). The dRI explained 81% of the variability of the IOFC calculated for the 135 original farms. When the number of farms below and above 1 standard deviation (SD) of the dRI were calculated, we found that 21
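
    The construction of the ranking index can be sketched as three linear-algebra steps: PCA on the synthetic farms (with IOFC excluded), a regression of IOFC on the retained components, and a projection of the real farms through the same eigenvectors. The sketch below assumes numeric input matrices and uses our own variable names.

      import numpy as np

      def ranking_index(synth_X, synth_iofc, farm_X, n_pc=4):
          # 1) PCA (via SVD) on standardised synthetic farm inputs, IOFC excluded
          mu, sd = synth_X.mean(0), synth_X.std(0)
          Z = (synth_X - mu) / sd
          _, _, vt = np.linalg.svd(Z, full_matrices=False)
          pcs = Z @ vt[:n_pc].T
          # 2) regress IOFC on the principal components
          A = np.c_[np.ones(len(pcs)), pcs]
          beta, *_ = np.linalg.lstsq(A, synth_iofc, rcond=None)
          # 3) project the real farms with the same eigenvectors, predict IOFC
          farm_pcs = ((farm_X - mu) / sd) @ vt[:n_pc].T
          return np.c_[np.ones(len(farm_pcs)), farm_pcs] @ beta

      # Farms are then ranked by the returned index (higher predicted IOFC first)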

  12. Online-Based Approaches to Identify Real Journals and Publishers from Hijacked Ones

    DEFF Research Database (Denmark)

    Asadi, Amin; Rahbar, Nader; Asadi, Meisam

    2017-01-01

    The aim of the present paper was to introduce some online-based approaches to evaluate scientific journals and publishers and to differentiate them from hijacked ones, regardless of their disciplines. With the advent of open-access journals, many hijacked journals and publishers have...... deceitfully assumed the mantle of authenticity in order to take advantage of researchers and students. Although these hijacked journals and publishers can be identified by checking their advertisement techniques and their websites, such checks do not always result in identification. There exist...... certain online-based approaches, such as using the Master Journal List provided by Thomson Reuters, the Scopus database, and the DOI of a paper, to certify the authenticity of a journal or publisher. It is indispensable that inexperienced students and researchers know these methods so as to identify...

  13. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    Science.gov (United States)

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/ RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to only pay a cloud computing provider for what is used. Moreover, as well as financial efficiency, cloud computing is an ecologically-friendly technology, it enables efficient data-sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
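
    Flagging the suspect probes is straightforward once probe sequences are in hand; a minimal filter for G-spots is shown below, with the threshold of four consecutive guanines chosen here for illustration. The cloud-scale part of the study is the re-analysis of the large public array collections, not this filter.

      import re

      def has_g_spot(probe_seq, min_run=4):
          # Probes with a run of >= min_run guanines may form quadruplexes
          # on the chip surface and report expression unreliably
          return re.search("G{%d,}" % min_run, probe_seq.upper()) is not None

      probes = ["ATCGGGGGTACAAGTCCATAGCTAG", "ATCGATCGATCGATCGATCGATCGA"]
      print([p for p in probes if has_g_spot(p)])   # only the first is flagged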

  14. An Approach to Identify and Characterize a Subunit Candidate Shigella Vaccine Antigen.

    Science.gov (United States)

    Pore, Debasis; Chakrabarti, Manoj K

    2016-01-01

    Shigellosis remains a serious issue throughout the developing countries, particularly in children under the age of 5. Numerous strategies have been tested to develop vaccines targeting shigellosis; unfortunately despite several years of extensive research, no safe, effective, and inexpensive vaccine against shigellosis is available so far. Here, we illustrate in detail an approach to identify and establish immunogenic outer membrane proteins from Shigella flexneri 2a as subunit vaccine candidates.

  15. Scientific and methodical approach to identifying lags of the stabilisation financial policy

    OpenAIRE

    Savchenko Kostyantyn V.

    2013-01-01

    The article studies the problem of identifying lags when applying instruments of the stabilisation financial policy of the state. The author justifies a scientific and methodical approach to this identification. It offers a selection of instruments of monetary-credit and budget-tax policies, which exert the greatest influence upon the general indicator of the state of the economy. These instruments are as follows: average weighted rate by all instruments (average quarterly value); refinancing of banks by t...

  16. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about da...

  17. A Systematic Approach to Identify Promising New Items for Small to Medium Enterprises: A Case Study

    Directory of Open Access Journals (Sweden)

    Sukjae Jeong

    2016-11-01

    Full Text Available Despite the growing importance of identifying new business items for small and medium enterprises (SMEs), most previous studies focus on conglomerates. The paucity of empirical studies has also led to limited real-life applications. Hence, this study proposes a systematic approach to find new business items (NBIs) that help the prospective SMEs develop, evaluate, and select viable business items to survive the competitive environment. The proposed approach comprises two stages: (1) the classification of diversification of SMEs; and (2) the searching and screening of business items. In the first stage, SMEs are allocated to five groups, based on their internal technological competency and external market conditions. In the second stage, based on the types of SMEs identified in the first stage, a set of alternative business items is derived by combining the results of portfolio analysis and benchmarking analysis. After deriving new business items, a market and technology-driven matrix analysis is utilized to screen suitable business items, and the Bruce Merrifield-Ohe (BMO) method is used to categorize and identify prospective items based on market attractiveness and internal capability. To illustrate the applicability of the proposed approach, a case study is presented.

  18. Systems approaches in osteoarthritis: Identifying routes to novel diagnostic and therapeutic strategies.

    Science.gov (United States)

    Mueller, Alan J; Peffers, Mandy J; Proctor, Carole J; Clegg, Peter D

    2017-03-20

    Systems orientated research offers the possibility of identifying novel therapeutic targets and relevant diagnostic markers for complex diseases such as osteoarthritis. This review demonstrates that the osteoarthritis research community has been slow to incorporate systems orientated approaches into research studies, although a number of key studies reveal novel insights into the regulatory mechanisms that contribute both to joint tissue homeostasis and its dysfunction. The review introduces both top-down and bottom-up approaches employed in the study of osteoarthritis. A holistic and multiscale approach, where clinical measurements may predict dysregulation and progression of joint degeneration, should be a key objective in future research. The review concludes with suggestions for further research and emerging trends, not least of which is the coupled development of diagnostic tests and therapeutics as part of a concerted effort by the osteoarthritis research community to meet clinical needs.

  19. An approach to identify issues affecting ERP implementation in Indian SMEs

    Directory of Open Access Journals (Sweden)

    Rana Basu

    2012-06-01

    Full Text Available Purpose: The purpose of this paper is to present the findings of a study based on a comprehensive compilation of the literature and subsequent analysis of ERP implementation success issues in the context of Indian small and medium scale enterprises (SMEs). This paper explores the existing literature to highlight issues in ERP implementation, and the researchers then applied the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) method to prioritize the issues affecting successful ERP implementation. Design/methodology/approach: Based on the literature review, issues leading to successful ERP implementation were identified, and Pareto analysis (the 80-20 rule) was applied to extract the key issues. Following this extraction, a TOPSIS-based survey was carried out in Indian small and medium scale enterprises. Findings: Based on the literature review, 25 issues were identified; Pareto analysis was then used to extract the key issues, which were prioritized by applying the TOPSIS method. Research limitations/implications: Besides the identified issues, there may be other issues that need to be explored. There is scope to enhance this study by considering different types of industries and by extending the number of respondents. Practical implications: By identifying the key issues for SMEs, managers can better prioritize issues to make the implementation process smooth and free of disruption. ERP vendors can take inputs from this study to adapt their implementation approach when targeting small scale enterprises. Originality/value: No published literature is available that follows a similar approach to identifying the critical issues affecting ERP in small and mid-sized companies in India or in any other developing economy.
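
    A minimal sketch of the TOPSIS ranking step used to prioritize the issues, assuming all criteria are benefit criteria; the decision matrix and weights are illustrative placeholders, not survey data from the study.

        import numpy as np

        def topsis(decision_matrix, weights):
            """Rank alternatives by closeness to the ideal solution."""
            norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)
            v = norm * weights                              # weighted normalized matrix
            ideal, anti_ideal = v.max(axis=0), v.min(axis=0)
            d_pos = np.linalg.norm(v - ideal, axis=1)       # distance to ideal
            d_neg = np.linalg.norm(v - anti_ideal, axis=1)  # distance to anti-ideal
            return d_neg / (d_pos + d_neg)                  # closeness coefficient

        # Hypothetical example: 4 ERP issues scored against 3 criteria.
        scores = np.array([[7, 9, 8], [6, 5, 9], [8, 7, 6], [5, 8, 7]], dtype=float)
        weights = np.array([0.5, 0.3, 0.2])
        print(topsis(scores, weights))  # higher closeness = higher priority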

  20. A Near-Term Quantum Computing Approach for Hard Computational Problems in Space Exploration

    CERN Document Server

    Smelyanskiy, Vadim N; Knysh, Sergey I; Williams, Colin P; Johnson, Mark W; Thom, Murray C; Macready, William G; Pudenz, Kristen L

    2012-01-01

    In this article, we show how to map a sampling of the hardest artificial intelligence problems in space exploration onto equivalent Ising models that can then be attacked using quantum annealing implemented in the D-Wave machine. We overview the existing results as well as propose new Ising model implementations for quantum annealing. We review supervised and unsupervised learning algorithms for classification and clustering with applications to feature identification and anomaly detection. We introduce algorithms for data fusion and image matching for remote sensing applications. We overview planning problems for space exploration mission applications and algorithms for diagnostics and recovery with applications to deep space missions. We describe combinatorial optimization algorithms for task assignment in the context of autonomous unmanned exploration. Finally, we discuss ways to circumvent the limitations of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing. In this ...
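
    For orientation, the Ising objective such problems are mapped onto is the standard quantum-annealing form (quoted here as the generic textbook formulation, not a problem-specific mapping from the article):

        E(\mathbf{s}) = \sum_i h_i s_i + \sum_{i<j} J_{ij} s_i s_j, \qquad s_i \in \{-1, +1\},

    where the local fields h_i and couplings J_{ij} encode the problem instance and the annealer searches for the spin configuration minimizing E.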

  1. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to the limited resources. A feasible way is to identify the central and influential indicators to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating the evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors", and "number of operations/procedures" are significant influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions.
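
    A minimal sketch of the core DEMATEL computation, assuming a crisp direct-influence matrix has already been aggregated from the expert assessments (the evidential-reasoning and 2-tuple linguistic modelling steps are omitted here).

        import numpy as np

        def dematel(direct_influence):
            """Return prominence (D+R) and relation (D-R) for each indicator."""
            A = np.asarray(direct_influence, dtype=float)
            # Normalize by the largest row sum, then accumulate indirect
            # influences via the total-relation matrix T = N (I - N)^(-1).
            N = A / A.sum(axis=1).max()
            T = N @ np.linalg.inv(np.eye(len(A)) - N)
            D, R = T.sum(axis=1), T.sum(axis=0)  # dispatched vs. received influence
            return D + R, D - R                  # centrality and cause/effect role

        # Hypothetical 3-indicator matrix (0 = no influence ... 4 = very high).
        prominence, relation = dematel([[0, 3, 2], [1, 0, 4], [2, 1, 0]])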

  2. An Approach for Identifying Cytokines Based on a Novel Ensemble Classifier

    Directory of Open Access Journals (Sweden)

    Quan Zou

    2013-01-01

    Full Text Available It is meaningful and important in biology to identify cytokines and investigate their various functions and biochemical mechanisms. However, several issues remain, including the large scale of benchmark datasets, serious imbalance of data, and discovery of new gene families. In this paper, we employ a machine learning approach based on a novel ensemble classifier to predict cytokines. We directly selected amino acid sequences as research objects. First, we pretreated the benchmark data accurately. Next, we analyzed the physicochemical properties and distribution of whole amino acids and then extracted a group of 120-dimensional (120D) valid features to represent sequences. Third, in view of the serious imbalance in benchmark datasets, we utilized a sampling approach based on the synthetic minority oversampling technique algorithm and a K-means clustering undersampling algorithm to rebuild the training set. Finally, we built a library for dynamic selection and circulating combination based on clustering (LibD3C) and employed the new training set to realize cytokine classification. Experiments showed that the geometric mean of sensitivity and specificity obtained through our approach is as high as 93.3%, which proves that our approach is effective for identifying cytokines.
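
    A minimal sketch of the two-sided rebalancing step (synthetic minority oversampling plus K-means-based undersampling), assuming numeric feature matrices for the two classes; the cluster count and the scikit-learn/imbalanced-learn calls are assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.cluster import KMeans
        from imblearn.over_sampling import SMOTE

        def rebalance(X_majority, X_minority, n_clusters=100):
            """Replace the majority class with K-means centroids, then let
            SMOTE oversample the minority class up to the same size."""
            km = KMeans(n_clusters=n_clusters, n_init=10).fit(X_majority)
            X = np.vstack([km.cluster_centers_, X_minority])
            y = np.array([0] * n_clusters + [1] * len(X_minority))
            return SMOTE().fit_resample(X, y)  # balanced (X, y) pair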

  3. Identifying and assessing the application of ecosystem services approaches in environmental policies and decision making.

    Science.gov (United States)

    Van Wensem, Joke; Calow, Peter; Dollacker, Annik; Maltby, Lorraine; Olander, Lydia; Tuvendal, Magnus; Van Houtven, George

    2017-01-01

    The presumption is that ecosystem services (ES) approaches provide a better basis for environmental decision making than do other approaches because they make explicit the connection between human well-being and ecosystem structures and processes. However, the existing literature does not provide a precise description of ES approaches for environmental policy and decision making, nor does it assess whether these applications will make a difference in terms of changing decisions and improving outcomes. We describe 3 criteria that can be used to identify whether and to what extent ES approaches are being applied: 1) connect impacts all the way from ecosystem changes to human well-being, 2) consider all relevant ES affected by the decision, and 3) consider and compare the changes in well-being of different stakeholders. As a demonstration, we then analyze retrospectively whether and how the criteria were met in different decision-making contexts. For this assessment, we have developed an analysis format that describes the type of policy, the relevant scales, the decisions or questions, the decision maker, and the underlying documents. This format includes a general judgment of how far the 3 ES criteria have been applied. It shows that the criteria can be applied to many different decision-making processes, ranging from the supranational to the local scale and to different parts of decision-making processes. In conclusion, we suggest these criteria could be used for assessments of the extent to which ES approaches have been and should be applied, what benefits and challenges arise, and whether using ES approaches made a difference in the decision-making process, decisions made, or outcomes of those decisions. Results from such studies could inform future use and development of ES approaches, draw attention to where the greatest benefits and challenges are, and help to target integration of ES approaches into policies, where they can be most effective.

  4. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds which provides a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy above hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for the experimental efforts stabilizing new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework, and the data are hence publicly available.

  5. Lexical is as lexical does: computational approaches to lexical representation

    Science.gov (United States)

    Woollams, Anna M.

    2015-01-01

    In much of neuroimaging and neuropsychology, regions of the brain have been associated with ‘lexical representation’, with little consideration as to what this cognitive construct actually denotes. Within current computational models of word recognition, there are a number of different approaches to the representation of lexical knowledge. Structural lexical representations, found in original theories of word recognition, have been instantiated in modern localist models. However, such a representational scheme lacks neural plausibility in terms of economy and flexibility. Connectionist models have therefore adopted distributed representations of form and meaning. Semantic representations in connectionist models necessarily encode lexical knowledge. Yet when equipped with recurrent connections, connectionist models can also develop attractors for familiar forms that function as lexical representations. Current behavioural, neuropsychological and neuroimaging evidence shows a clear role for semantic information, but also suggests some modality- and task-specific lexical representations. A variety of connectionist architectures could implement these distributed functional representations, and further experimental and simulation work is required to discriminate between these alternatives. Future conceptualisations of lexical representations will therefore emerge from a synergy between modelling and neuroscience. PMID:25893204

  6. Dielectric properties of periodic heterostructures: A computational electrostatics approach

    Science.gov (United States)

    Brosseau, C.; Beroual, A.

    1999-04-01

    The dielectric properties of heterogeneous materials for various condensed-matter systems are important for several technologies, e.g. impregnated polymers for high-density capacitors, polymer carbon black mixtures for automotive tires and current limiters in circuit protection. These multiscale systems lead to challenging problems of connecting microstructural features (shape, spatial arrangement and size distribution of inclusions) to macroscopic materials response (permittivity, conductivity). In this paper, we briefly discuss an ab initio computational electrostatics approach, based either on the use of the field calculation package FLUX3D (or FLUX2D) and a conventional finite elements method, or the use of the field calculation package PHI3D and the resolution of boundary integral equations, for calculating the effective permittivity of two-component dielectric heterostructures. Numerical results concerning inclusions of permittivity ε1 with various geometrical shapes periodically arranged in a host matrix of permittivity ε2 are provided. Next we discuss these results in terms of phenomenological mixing laws, analytical theory and connectedness. During the pursuit of these activities, several interesting phenomena were discovered that will stimulate further investigation.
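
    For reference, one phenomenological mixing law such numerical results are commonly compared against is the Maxwell Garnett formula for spherical inclusions of permittivity ε1 at volume fraction f in a host of permittivity ε2 (quoted as the standard expression, not as a result of the paper):

        \varepsilon_{\mathrm{eff}} = \varepsilon_2 \, \frac{\varepsilon_1 + 2\varepsilon_2 + 2f(\varepsilon_1 - \varepsilon_2)}{\varepsilon_1 + 2\varepsilon_2 - f(\varepsilon_1 - \varepsilon_2)}.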

  7. Applying a cloud computing approach to storage architectures for spacecraft

    Science.gov (United States)

    Baldor, Sue A.; Quiroz, Carlos; Wood, Paul

    As sensor technologies, processor speeds, and memory densities increase, spacecraft command, control, processing, and data storage systems have grown in complexity to take advantage of these improvements and expand the possible missions of spacecraft. Spacecraft systems engineers are increasingly looking for novel ways to address this growth in complexity and mitigate associated risks. Looking to conventional computing, many solutions have been executed to solve both the problem of complexity and heterogeneity in systems. In particular, the cloud-based paradigm provides a solution for distributing applications and storage capabilities across multiple platforms. In this paper, we propose utilizing a cloud-like architecture to provide a scalable mechanism for providing mass storage in spacecraft networks that can be reused on multiple spacecraft systems. By presenting a consistent interface to applications and devices that request data to be stored, complex systems designed by multiple organizations may be more readily integrated. Behind the abstraction, the cloud storage capability would manage wear-leveling, power consumption, and other attributes related to the physical memory devices, critical components in any mass storage solution for spacecraft. Our approach employs SpaceWire networks and SpaceWire-capable devices, although the concept could easily be extended to non-heterogeneous networks consisting of multiple spacecraft and potentially the ground segment.

  8. Tailor-made Design of Chemical Blends using Decomposition-based Computer-aided Approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Manan, Zainuddin Abd.; Gernaey, Krist

    methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a nonlinear programming (NLP) model where the objective is to find the optimal blended gasoline or diesel product subject to blend chemicals and their compositions, a set...... to a specified priority. Finally, a short list of candidates, ordered in terms of specified performance criteria, is produced for final testing and selection. This systematic and computer-aided approach is illustrated through a case study involving the design of blends of gasoline with oxygenates from biomass...

  9. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a Mixed Integer Nonlinear Programming (MINLP) model where the objective is to find the optimal blended gasoline or diesel product subject to types of chemicals....... The application of this systematic and computer-aided approach is illustrated through a case study involving the design of blends of gasoline with oxygenated compounds resulting from degradation and fermentation of biomass for use in internal combustion engines. Emphasis is given here on the concepts used...

  10. Identifying Patients in the Acute Psychiatric Hospital Who May Benefit From a Palliative Care Approach.

    Science.gov (United States)

    Burton, M Caroline; Warren, Mark; Cha, Stephen S; Stevens, Maria; Blommer, Megan; Kung, Simon; Lapid, Maria I

    2016-04-01

    Identifying patients who will benefit from a palliative care approach is the first critical step in integrating palliative with curative therapy. Criteria are established that identify hospitalized medical patients who are near the end of life, yet there are no such criteria with respect to hospitalized patients with psychiatric disorders. The records of 276 consecutive patients admitted to a dedicated inpatient psychiatric unit were reviewed to identify prognostic criteria predictive of mortality. Mortality predictors were 2 or more admissions in the past year (P = .0114) and older age (P = .0006). Twenty-two percent of patients met National Hospice and Palliative Care Organization noncancer criteria for dementia. Palliative care intervention should be considered when treating inpatients with psychiatric disorders, especially older patients who have a previous hospitalization or history of dementia.

  11. Candidate gene linkage approach to identify DNA variants that predispose to preterm birth

    DEFF Research Database (Denmark)

    Bream, Elise N A; Leppellere, Cara R; Cooper, Margaret E

    2013-01-01

    Background:The aim of this study was to identify genetic variants contributing to preterm birth (PTB) using a linkage candidate gene approach.Methods:We studied 99 single-nucleotide polymorphisms (SNPs) for 33 genes in 257 families with PTBs segregating. Nonparametric and parametric analyses were...... used. Premature infants and mothers of premature infants were defined as affected cases in independent analyses.Results:Analyses with the infant as the case identified two genes with evidence of linkage: CRHR1 (P = 0.0012) and CYP2E1 (P = 0.0011). Analyses with the mother as the case identified four...... through the infant and/or the mother in the etiology of PTB....

  12. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  13. Examining the Roles of Blended Learning Approaches in Computer-Supported Collaborative Learning (CSCL) Environments: A Delphi Study

    Science.gov (United States)

    So, Hyo-Jeong; Bonk, Curtis J.

    2010-01-01

    In this study, a Delphi method was used to identify and predict the roles of blended learning approaches in computer-supported collaborative learning (CSCL) environments. The Delphi panel consisted of experts in online learning from different geographic regions of the world. This study discusses findings related to (a) pros and cons of blended…

  14. A computationally identified compound antagonizes excess FGF-23 signaling in renal tubules and a mouse model of hypophosphatemia.

    Science.gov (United States)

    Xiao, Zhousheng; Riccardi, Demian; Velazquez, Hector A; Chin, Ai L; Yates, Charles R; Carrick, Jesse D; Smith, Jeremy C; Baudry, Jerome; Quarles, L Darryl

    2016-11-22

    Fibroblast growth factor-23 (FGF-23) interacts with a binary receptor complex composed of α-Klotho (α-KL) and FGF receptors (FGFRs) to regulate phosphate and vitamin D metabolism in the kidney. Excess FGF-23 production, which causes hypophosphatemia, is genetically inherited or occurs with chronic kidney disease. Among other symptoms, hypophosphatemia causes vitamin D deficiency and the bone-softening disorder rickets. Current therapeutics that target the receptor complex have limited utility clinically. Using a computationally driven, structure-based, ensemble docking and virtual high-throughput screening approach, we identified four novel compounds predicted to selectively inhibit FGF-23-induced activation of the FGFR/α-KL complex. Additional modeling and functional analysis found that Zinc13407541 bound to FGF-23 and disrupted its interaction with the FGFR1/α-KL complex; experiments in a heterologous cell expression system showed that Zinc13407541 selectively inhibited α-KL-dependent FGF-23 signaling. Zinc13407541 also inhibited FGF-23 signaling in isolated renal tubules ex vivo and partially reversed the hypophosphatemic effects of excess FGF-23 in a mouse model. These chemical probes provide a platform to develop lead compounds to treat disorders caused by excess FGF-23.

  15. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made.

  16. The Metacognitive Approach to Computer Education: Making Explicit the Learning Journey

    Science.gov (United States)

    Phelps, Renata

    2007-01-01

    This paper presents a theoretical and practical exploration of a metacognitive approach to computer education, developed through a three-year action research project. It is argued that the approach contrasts significantly with often-employed directive and competency-based approaches to computer education and is more appropriate in addressing the…

  17. How damaged brains repeat words: a computational approach.

    Science.gov (United States)

    Nozari, Nazbanou; Dell, Gary S

    2013-09-01

    Two routes have been proposed for auditory repetition: a lexical route which activates a lexical item and retrieves its phonology, and a nonlexical route which maps input phonology directly onto output phonology. But when is the nonlexical route recruited? In a sample of 103 aphasic patients, we use computational models to select patients who do and do not recruit the nonlexical route, and compare them in light of three hypotheses: 1 - Lexical-phonological hypothesis: when the lexical route is weak, the nonlexical route is recruited. 2 - Nonlexical hypothesis: when the nonlexical route is weak, it is abandoned. 3 - Semantic-access hypothesis: when access to meaning fails, the nonlexical route is recruited. In neurocognitive terms, hypotheses 1 and 2 identify different aspects of the intactness of the dorsal stream, while the third hypothesis focuses on the ventral stream. Our findings (and a subsequent meta-analysis of four studies) support hypotheses 2 and 3. Ultimately, we claim that the choice about whether to recruit the nonlexical route is guided, not by assessment of production abilities that support repetition, but instead by relying on accessible cues, namely whether the speaker understands the word, or can remember its sequence of phonemes.

  18. A computational toy model for shallow landslides: Molecular Dynamics approach

    CERN Document Server

    Martelloni, Gianluca; Massaro, Emanuele

    2012-01-01

    The aim of this paper is to propose a 2D computational algorithm for modeling the triggering and propagation of shallow landslides caused by rainfall. We used a Molecular Dynamics (MD) inspired model, similar to the discrete element method (DEM), that is suitable for modeling granular material and for observing the trajectory of a single particle, so as to identify its dynamical properties. We consider that the triggering of shallow landslides is caused by the decrease of the static friction along the sliding surface due to water infiltration by rainfall. The triggering is thus governed by the two following conditions: (a) a threshold speed of the particles and (b) a condition on the static friction, between particles and slope surface, based on the Mohr-Coulomb failure criterion. The latter static condition is used in the geotechnical model to estimate the possibility of landslide triggering. Finally the interaction force between particles is defined through a potential that, in the absence of experimental data, we have mode...
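
    The static-friction condition referenced above is the standard Mohr-Coulomb failure criterion, under which sliding begins once the shear stress on the surface reaches

        \tau = c + \sigma_n \tan \varphi,

    where c is the cohesion, σ_n the effective normal stress, and φ the internal friction angle; rainfall infiltration raises pore water pressure, lowering the effective normal stress and hence the available frictional resistance.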

  19. The Ulam Index: Methods of Theoretical Computer Science Help in Identifying Chemical Substances

    Science.gov (United States)

    Beltran, Adriana; Salvador, James

    1997-01-01

    In this paper, we show how methods developed for solving the theoretical computer-science problem of graph isomorphism are used in structural chemistry. We also discuss potential applications of these methods to exobiology: the search for life outside Earth.

  20. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  1. A Monomial Chaos Approach for Efficient Uncertainty Quantification in Computational Fluid Dynamics

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2006-01-01

    A monomial chaos approach is proposed for efficient uncertainty quantification in nonlinear computational problems. Propagating uncertainty through nonlinear equations can still be computationally intensive for existing uncertainty quantification methods. It usually results in a set of nonlinear equ

  2. A physarum-inspired prize-collecting steiner tree approach to identify subnetworks for drug repositioning.

    Science.gov (United States)

    Sun, Yahui; Hameed, Pathima Nusrath; Verspoor, Karin; Halgamuge, Saman

    2016-12-05

    Drug repositioning can reduce the time, costs and risks of drug development by identifying new therapeutic effects for known drugs. It is challenging to reposition drugs as pharmacological data is large and complex. Subnetwork identification has already been used to simplify the visualization and interpretation of biological data, but it has not been applied to drug repositioning so far. In this paper, we fill this gap by proposing a new Physarum-inspired Prize-Collecting Steiner Tree algorithm to identify subnetworks for drug repositioning. Drug Similarity Networks (DSN) are generated using the chemical, therapeutic, protein, and phenotype features of drugs. In DSNs, vertex prizes and edge costs represent the similarities and dissimilarities between drugs respectively, and terminals represent drugs in the cardiovascular class, as defined in the Anatomical Therapeutic Chemical classification system. A new Physarum-inspired Prize-Collecting Steiner Tree algorithm is proposed in this paper to identify subnetworks. We apply both the proposed algorithm and the widely-used GW algorithm to identify subnetworks in our 18 generated DSNs. In these DSNs, our proposed algorithm identifies subnetworks with an average Rand Index of 81.1%, while the GW algorithm can only identify subnetworks with an average Rand Index of 64.1%. We select 9 subnetworks with high Rand Index to find drug repositioning opportunities. 10 frequently occurring drugs in these subnetworks are identified as candidates to be repositioned for cardiovascular diseases. We find evidence to support previous discoveries that nitroglycerin, theophylline and acarbose may be repositioned for cardiovascular diseases. Moreover, we identify seven previously unknown drug candidates that may also interact with the biological cardiovascular system. These discoveries show that our proposed Prize-Collecting Steiner Tree approach is a promising strategy for drug repositioning.
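
    For context, the Prize-Collecting Steiner Tree problem both algorithms address can be written as a search over subtrees T = (V_T, E_T) of the drug similarity network (this is the standard net-worth formulation, not notation from the paper):

        \max_{T} \; \sum_{v \in V_T} p(v) \; - \; \sum_{e \in E_T} c(e),

    where p(v) are the vertex prizes (drug similarities) and c(e) the edge costs (dissimilarities); depending on the variant, the terminal (cardiovascular) drugs are either required to be spanned by T or penalized when omitted.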

  3. Translational informatics approach for identifying the functional molecular communicators linking coronary artery disease, infection and inflammation.

    Science.gov (United States)

    Sharma, Ankit; Ghatge, Madankumar; Mundkur, Lakshmi; Vangala, Rajani Kanth

    2016-05-01

    Translational informatics approaches are required for the integration of diverse and accumulating data to enable the administration of effective translational medicine specifically in complex diseases such as coronary artery disease (CAD). In the current study, a novel approach for elucidating the association between infection, inflammation and CAD was used. Genes for CAD were collected from the CAD-gene database and those for infection and inflammation were collected from the UniProt database. The cytomegalovirus (CMV)-induced genes were identified from the literature and the CAD-associated clinical phenotypes were obtained from the Unified Medical Language System. A total of 55 gene ontologies (GO) termed functional communicator ontologies were identified in the gene sets linking clinical phenotypes in the diseasome network. The network topology analysis suggested that important functions including viral entry, cell adhesion, apoptosis, inflammatory and immune responses networked with clinical phenotypes. Microarray data was extracted from the Gene Expression Omnibus (dataset: GSE48060) for the highly networked disease myocardial infarction. Further analysis of differentially expressed genes and their GO terms suggested that CMV infection may trigger a xenobiotic response, oxidative stress, inflammation and immune modulation. Notably, the current study identified γ-glutamyl transferase (GGT)-5 as a potential biomarker with an odds ratio of 1.947, which increased to 2.561 following the addition of CMV and CMV-neutralizing antibody (CMV-NA) titers. The C-statistics increased from 0.530 for conventional risk factors (CRFs) to 0.711 for GGT in combination with the above mentioned infections and CRFs. Therefore, the translational informatics approach used in the current study identified a potential molecular mechanism for CMV infection in CAD, and a potential biomarker for risk prediction.

  4. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets.
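
    A minimal sketch of the Pareto-filter idea, assuming each candidate input set has a vector of per-target goodness-of-fit errors (lower is better); the data and names are illustrative, not from the calibration study.

        import numpy as np

        def pareto_frontier(errors):
            """Return indices of input sets not dominated on any target.

            errors: (n_input_sets, n_targets) array of GOF errors. Set i is
            dominated if some set j is at least as good on every target and
            strictly better on at least one."""
            keep = []
            for i in range(len(errors)):
                dominated = any(
                    np.all(errors[j] <= errors[i]) and np.any(errors[j] < errors[i])
                    for j in range(len(errors)) if j != i
                )
                if not dominated:
                    keep.append(i)
            return keep

        # Hypothetical: 4 candidate input sets, 2 calibration targets.
        gof = np.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0], [3.0, 3.0]])
        print(pareto_frontier(gof))  # [0, 1, 2]; the last set is dominated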

  5. Identifying comorbid depression and disruptive behavior disorders: comparison of two approaches used in adolescent studies.

    Science.gov (United States)

    Vander Stoep, Ann; Adrian, Molly C; Rhew, Isaac C; McCauley, Elizabeth; Herting, Jerald R; Kraemer, Helena C

    2012-07-01

    Interest in commonly co-occurring depression and disruptive behavior disorders in children has yielded a small body of research that estimates the prevalence of this comorbid condition and compares children with the comorbid condition and children with depression or disruptive behavior disorders alone with respect to antecedents and outcomes. Prior studies have used one of two different approaches to measure comorbid disorders: (1) meeting criteria for two DSM or ICD diagnoses or (2) scoring .5 SD above the mean or higher on two dimensional scales. This study compares two snapshots of comorbidity taken simultaneously in the same sample with each of the measurement approaches. The Developmental Pathways Project administered structured diagnostic interviews as well as dimensional scales to a community-based sample of 521 11-12 year olds to assess depression and disruptive behavior disorders. Clinical caseness indicators of children identified as "comorbid" by each method were examined concurrently and 3-years later. Cross-classification of adolescents via the two approaches revealed low agreement. When other indicators of caseness, including functional impairment, need for services, and clinical elevations on other symptom scales were examined, adolescents identified as comorbid via dimensional scales only were similar to those who were identified as comorbid via DSM-IV diagnostic criteria. Findings suggest that when relying solely on DSM diagnostic criteria for comorbid depression and disruptive behavior disorders, many adolescents with significant impairment will be overlooked. Findings also suggest that lower dimensional scale thresholds can be set when comorbid conditions, rather than single forms of psychopathology, are being identified.

  6. Identifying New Candidate Genes and Chemicals Related to Prostate Cancer Using a Hybrid Network and Shortest Path Approach

    Science.gov (United States)

    Yuan, Fei; Zhou, You; Wang, Meng; Yang, Jing; Wu, Kai; Lu, Changhong; Kong, Xiangyin; Cai, Yu-Dong

    2015-01-01

    Prostate cancer is a type of cancer that occurs in the male prostate, a gland in the male reproductive system. Because prostate cancer cells may spread to other parts of the body and can influence human reproduction, understanding the mechanisms underlying this disease is critical for designing effective treatments. The identification of as many genes and chemicals related to prostate cancer as possible will enhance our understanding of this disease. In this study, we proposed a computational method to identify new candidate genes and chemicals based on currently known genes and chemicals related to prostate cancer by applying a shortest path approach in a hybrid network. The hybrid network was constructed according to information concerning chemical-chemical interactions, chemical-protein interactions, and protein-protein interactions. Many of the obtained genes and chemicals are associated with prostate cancer. PMID:26504486

  7. Investigation of different modeling approaches for computational fluid dynamics simulation of high-pressure rocket combustors

    Science.gov (United States)

    Ivancic, B.; Riedmann, H.; Frey, M.; Knab, O.; Karl, S.; Hannemann, K.

    2016-07-01

    The paper summarizes technical results and first highlights of the cooperation between DLR and Airbus Defence and Space (DS) within the work package "CFD Modeling of Combustion Chamber Processes" conducted in the frame of the Propulsion 2020 Project. Within the addressed work package, DLR Göttingen and Airbus DS Ottobrunn have identified several test cases where adequate test data are available and which can be used for proper validation of the computational fluid dynamics (CFD) tools. In this paper, the first test case, the Penn State chamber (RCM1), is discussed. Presenting the simulation results from three different tools, it is shown that the test case can be computed properly with steady-state Reynolds-averaged Navier-Stokes (RANS) approaches. The achieved simulation results reproduce the measured wall heat flux as an important validation parameter very well but also reveal some inconsistencies in the test data which are addressed in this paper.

  8. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records.

    Science.gov (United States)

    Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.

  9. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research, a Mars precision landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by gravity turn to hover and then thrust vector to desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over

  10. Improving accuracy for identifying related PubMed queries by an integrated approach.

    Science.gov (United States)

    Lu, Zhiyong; Wilbur, W John

    2009-10-01

    PubMed is the most widely used tool for searching biomedical literature online. As with many other online search tools, a user often types a series of multiple related queries before retrieving satisfactory results to fulfill a single information need. Meanwhile, it is also a common phenomenon to see a user type queries on unrelated topics in a single session. In order to study PubMed users' search strategies, it is necessary to be able to automatically separate unrelated queries and group together related queries. Here, we report a novel approach combining both lexical and contextual analyses for segmenting PubMed query sessions and identifying related queries and compare its performance with the previous approach based solely on concept mapping. We experimented with our integrated approach on sample data consisting of 1539 pairs of consecutive user queries in 351 user sessions. The prediction results of 1396 pairs agreed with the gold-standard annotations, achieving an overall accuracy of 90.7%. This demonstrates that our approach is significantly better than the previously published method. By applying this approach to a one day query log of PubMed, we found that a significant proportion of information needs involved more than one PubMed query, and that most of the consecutive queries for the same information need are lexically related. Finally, the proposed PubMed distance is shown to be an accurate and meaningful measure for determining the contextual similarity between biological terms. The integrated approach can play a critical role in handling real-world PubMed query log data as is demonstrated in our experiments.
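
    As a hedged illustration of the lexical component only, consecutive queries can be compared with a simple token-overlap score; the tokenization and any threshold are assumptions, and the contextual (PubMed-distance) component of the integrated approach is not reproduced here.

        def lexical_overlap(query_a, query_b):
            """Jaccard overlap between the token sets of two queries."""
            a, b = set(query_a.lower().split()), set(query_b.lower().split())
            return len(a & b) / len(a | b) if a | b else 0.0

        # Hypothetical session: the first pair is lexically related, the second is not.
        print(lexical_overlap("breast cancer brca1", "brca1 mutation"))    # 0.25
        print(lexical_overlap("breast cancer brca1", "influenza vaccine")) # 0.0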

  11. Identifying overlapping and hierarchical thematic structures in networks of scholarly papers: a comparison of three approaches.

    Science.gov (United States)

    Havemann, Frank; Gläser, Jochen; Heinz, Michael; Struck, Alexander

    2012-01-01

    The aim of this paper is to introduce and assess three algorithms for the identification of overlapping thematic structures in networks of papers. We implemented three recently proposed approaches to the identification of overlapping and hierarchical substructures in graphs and applied the corresponding algorithms to a network of 492 information-science papers coupled via their cited sources. The thematic substructures obtained and overlaps produced by the three hierarchical cluster algorithms were compared to a content-based categorisation, which we based on the interpretation of titles, abstracts, and keywords. We defined sets of papers dealing with three topics located on different levels of aggregation: h-index, webometrics, and bibliometrics. We identified these topics with branches in the dendrograms produced by the three cluster algorithms and compared the overlapping topics they detected with one another and with the three predefined paper sets. We discuss the advantages and drawbacks of applying the three approaches to paper networks in research fields.

  12. Identifying overlapping and hierarchical thematic structures in networks of scholarly papers: a comparison of three approaches.

    Directory of Open Access Journals (Sweden)

    Frank Havemann

    Full Text Available The aim of this paper is to introduce and assess three algorithms for the identification of overlapping thematic structures in networks of papers. We implemented three recently proposed approaches to the identification of overlapping and hierarchical substructures in graphs and applied the corresponding algorithms to a network of 492 information-science papers coupled via their cited sources. The thematic substructures obtained and overlaps produced by the three hierarchical cluster algorithms were compared to a content-based categorisation, which we based on the interpretation of titles, abstracts, and keywords. We defined sets of papers dealing with three topics located on different levels of aggregation: h-index, webometrics, and bibliometrics. We identified these topics with branches in the dendrograms produced by the three cluster algorithms and compared the overlapping topics they detected with one another and with the three predefined paper sets. We discuss the advantages and drawbacks of applying the three approaches to paper networks in research fields.

  13. Identifying Overlapping and Hierarchical Thematic Structures in Networks of Scholarly Papers: A Comparison of Three Approaches

    CERN Document Server

    Havemann, Frank; Heinz, Michael; Struck, Alexander

    2011-01-01

    We implemented three recently proposed approaches to the identification of overlapping and hierarchical substructures in graphs and applied the corresponding algorithms to a network of 492 information-science papers coupled via their cited sources. The thematic substructures obtained and overlaps produced by the three hierarchical cluster algorithms were compared to a content-based categorisation, which we based on the interpretation of titles and keywords. We defined sets of papers dealing with three topics located on different levels of aggregation: h-index, webometrics, and bibliometrics. We identified these topics with branches in the dendrograms produced by the three cluster algorithms and compared the overlapping topics they detected with one another and with the three pre-defined paper sets. We discuss the advantages and drawbacks of applying the three approaches to paper networks in research fields.

  14. A Systematic Approach to Identify Candidate Transcription Factors that Control Cell Identity

    Directory of Open Access Journals (Sweden)

    Ana C. D’Alessio

    2015-11-01

    Full Text Available Hundreds of transcription factors (TFs) are expressed in each cell type, but cell identity can be induced through the activity of just a small number of core TFs. Systematic identification of these core TFs for a wide variety of cell types is currently lacking and would establish a foundation for understanding the transcriptional control of cell identity in development, disease, and cell-based therapy. Here, we describe a computational approach that generates an atlas of candidate core TFs for a broad spectrum of human cells. The potential impact of the atlas was demonstrated via cellular reprogramming efforts where candidate core TFs proved capable of converting human fibroblasts to retinal pigment epithelial-like cells. These results suggest that candidate core TFs from the atlas will prove a useful starting point for studying transcriptional control of cell identity and reprogramming in many human cell types.

  15. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Full Text Available Computer-vision based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should be possible therefore to use such approaches to select robust genotypes. However, plants are morphologically complex and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment and the computer routines for image processing and data analysis have been implemented using open source software. Source code for data analysis is written using the R package. The equations to calculate image descriptors have been also provided.

  16. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Science.gov (United States)

    Camargo, Anyela; Papadopoulou, Dimitra; Spyropoulou, Zoi; Vlachonasios, Konstantinos; Doonan, John H; Gay, Alan P

    2014-01-01

    Computer-vision based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open source software. Source code for the data analysis is written in R. The equations used to calculate the image descriptors are also provided.
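
    The authors distribute the analysis in R; purely to illustrate the shape of the pipeline (segmentation, shape features, PCA), here is a minimal Python sketch in which the greenness threshold and the synthetic test images are invented stand-ins for real rosette photographs.

        import numpy as np
        from skimage import measure
        from sklearn.decomposition import PCA

        def rosette_features(rgb):
            """Segment the rosette with a simple greenness index, return shape descriptors."""
            g = rgb[..., 1] - 0.5 * (rgb[..., 0] + rgb[..., 2])
            labels = measure.label(g > 10)                      # illustrative threshold
            r = max(measure.regionprops(labels), key=lambda p: p.area)
            return [r.area, r.perimeter, r.eccentricity, r.solidity, r.extent]

        # Synthetic stand-in images: green ellipses of varying shape on dark soil.
        rng = np.random.default_rng(0)
        yy, xx = np.mgrid[0:128, 0:128]
        images = []
        for _ in range(20):
            a, b = rng.uniform(15, 45, size=2)
            mask = ((xx - 64) / a) ** 2 + ((yy - 64) / b) ** 2 < 1
            img = np.zeros((128, 128, 3))
            img[..., 1] = 80 * mask
            images.append(img)

        X = np.array([rosette_features(img) for img in images])
        pca = PCA(n_components=5).fit(X)     # the paper reports ~5 PCs suffice
        print(pca.explained_variance_ratio_)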

  17. Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach

    Science.gov (United States)

    Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.

    2017-01-01

    The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area in the following year, after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat

  18. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have become aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop the new business pattern. Yet the traditional techniques for identifying CSCs, despite strong theoretical antecedents, seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and do not provide useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experiment environment. Finally, “Fixed Cost” is identified as the optimal policy under a stable market environment. The case study can help us to understand the workflow of applying the approach, and provide valuable decision support applications to industry.
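
    To make the flavour of such a computational experiment concrete, the toy Python loop below replays one simulated stable market under three invented charging policies and compares average profit; the policy forms and all numbers are placeholders, not the paper's model.

        import random

        POLICIES = {
            "fixed_cost": lambda volume: 100.0,
            "per_unit": lambda volume: 2.0 * volume,
            "tiered": lambda volume: 50.0 + (1.0 if volume > 40 else 2.0) * volume,
        }

        def run_experiment(policy, rounds=1000, seed=42):
            rng = random.Random(seed)         # identical demand stream per policy,
            profit = 0.0                      # i.e. the "same experiment environment"
            for _ in range(rounds):
                volume = rng.gauss(50, 10)    # stable market: stationary demand
                revenue = 3.0 * volume        # value created by pooled procurement
                profit += revenue - POLICIES[policy](volume)
            return profit / rounds

        for name in POLICIES:
            print(name, round(run_experiment(name), 2))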

  19. PATE, a gene expressed in prostate cancer, normal prostate, and testis, identified by a functional genomic approach

    Science.gov (United States)

    Bera, Tapan K.; Maitra, Rangan; Iavarone, Carlo; Salvatore, Giuliana; Kumar, Vasantha; Vincent, James J.; Sathyanarayana, B. K.; Duray, Paul; Lee, B. K.; Pastan, Ira

    2002-03-01

    To identify target antigens for prostate cancer therapy, we have combined computer-based screening of the human expressed sequence tag database and experimental expression analysis to identify genes that are expressed in normal prostate and prostate cancer but not in essential human tissues. Using this approach, we identified a gene that is expressed specifically in prostate cancer, normal prostate, and testis. The gene has a 1.5-kb transcript that encodes a protein of 14 kDa. We named this gene PATE (expressed in prostate and testis). In situ hybridization shows that PATE mRNA is expressed in the epithelial cells of prostate cancers and in normal prostate. Transfection of the PATE cDNA with a Myc epitope tag into NIH 3T3 cells and subsequent cell fractionation analysis shows that the PATE protein is localized in the membrane fraction of the cell. Analysis of the amino acid sequence of PATE shows that it has structural similarities to a group of proteins known as three-finger toxins, which includes the extracellular domain of the type β transforming growth factor receptor. Restricted expression of PATE makes it a potential candidate for the immunotherapy of prostate cancer.

  20. An interdisciplinary approach to identify adaptation strategies that enhance flood resilience and urban liveability

    DEFF Research Database (Denmark)

    Rogers, B. C.; Bertram, N.; Gunn, Alex

    This paper provides guidance on how to identify and design the most suitable climate adaptation strategies for enhancing the liveability and flood resilience of urban catchments. It presents findings from a case study of Elwood, a coastal Melbourne suburb regularly affected by flooding...... that ensuring a city’s flood resilience involves a range of measures to retreat from, adapt to and defend against flooding; this necessarily requires an integrated approach and interdisciplinary expertise to develop adaptation pathways that are grounded in community aspirations and priorities, inspired by novel...

  1. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    Full Text Available A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  2. Identifying the Critical Links in Road Transportation Networks: Centrality-based approach utilizing structural properties

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Supriya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Surface transportation road networks share structural properties with other complex networks (e.g., social networks, information networks, biological networks, and so on). This research investigates the structural properties of road networks for possible correlations with traffic characteristics such as link flows, which are determined independently. Additionally, we define a criticality index for the links of the road network that identifies their relative importance in the network. We tested our hypotheses with two sample road networks. Results show that correlation exists between link flows and the centrality measures of road links (a dual graph approach is followed), and the criticality index is found to be effective for one test network in identifying the vulnerable nodes.
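
    The dual-graph idea can be sketched in a few lines of Python/networkx: road links become nodes of the line graph, so node centrality of the dual ranks the original links. The toy network below is invented, and plain betweenness is used as a stand-in for the paper's criticality index.

        import networkx as nx

        roads = nx.Graph()
        roads.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

        dual = nx.line_graph(roads)                 # road links become dual-graph nodes
        criticality = nx.betweenness_centrality(dual)
        for link, score in sorted(criticality.items(), key=lambda kv: -kv[1]):
            print(link, round(score, 3))            # high score -> critical link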

  3. Morphology control in polymer blend fibers—a high throughput computing approach

    Science.gov (United States)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications, as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS:PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for a specific morphology class. We finally demonstrate the potential for time-dependent processing conditions to obtain desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.
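
    A schematic of such a sweep, with a made-up surrogate classifier standing in for the actual morphology simulation; the grid and the decision rule are purely illustrative.

        import numpy as np

        def morphology_class(blend_ratio, evap_rate):
            """Made-up surrogate: fast drying of balanced blends freezes fine mixing."""
            return "fine" if evap_rate > 0.5 and 0.35 < blend_ratio < 0.65 else "coarse"

        ratios = np.linspace(0.1, 0.9, 9)
        rates = np.linspace(0.1, 0.9, 9)
        # The condition -> class map is the "phase diagram" described above.
        phase_diagram = {(round(r, 2), round(e, 2)): morphology_class(r, e)
                         for r in ratios for e in rates}
        print(sum(v == "fine" for v in phase_diagram.values()), "of 81 points are 'fine'")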

  4. Identifying Students' Reasons for Selecting a Computer-Mediated or Lecture Class

    Science.gov (United States)

    Kinney, D. Patrick; Robertson, Douglas F.

    2005-01-01

    Students in this study were enrolled in either an Introductory Algebra or Intermediate Algebra class taught through computer-mediated instruction or lecture. In the first year of the study, students were asked what they believed helped them learn mathematics in the instructional format in which they were enrolled. They were also asked what they…

  5. Precision of identifying cephalometric landmarks with cone beam computed tomography in vivo

    NARCIS (Netherlands)

    Hassan, B.; Nijkamp, P.; Verheij, H.; Tairie, J.; Vink, C.; van der Stelt, P.; van Beek, H.

    2013-01-01

    The study aims were to assess the precision and time required to conduct cephalometric analysis with cone-beam computed tomography (CBCT) in vivo on both three-dimensional (3D) surface models and multi-planar reformations (MPR) images. Datasets from 10 patients scanned with CBCT were used to create

  6. A Computer Vision System for Locating and Identifying Internal Log Defects Using CT Imagery

    Science.gov (United States)

    Dongping Zhu; Richard W. Conners; Frederick Lamb; Philip A. Araman

    1991-01-01

    A number of researchers have shown the ability of magnetic resonance imaging (MRI) and computed tomography (CT) imaging to detect internal defects in logs. However, if these devices are ever to play a role in the forest products industry, automatic methods for analyzing data from these devices must be developed. This paper reports research aimed at developing a...

  7. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data was used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
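
    As a sketch of how GLUE probes identifiability (not the STREAM/LEW models themselves), the Python example below samples a single parameter of a toy linear reservoir, scores each sample with Nash-Sutcliffe efficiency against synthetic observations, and keeps the "behavioural" sets; a narrow behavioural range indicates a well-identified parameter.

        import numpy as np

        rng = np.random.default_rng(0)
        rain = rng.gamma(2.0, 2.0, size=200)

        def model(k):
            """Toy linear reservoir: outflow q[t] = k * storage."""
            s, q = 0.0, np.empty_like(rain)
            for t, p in enumerate(rain):
                s += p
                q[t] = k * s
                s -= q[t]
            return q

        observed = model(0.3) + rng.normal(0.0, 0.2, size=200)   # synthetic "truth"

        def nse(sim):
            """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
            return 1 - np.sum((sim - observed) ** 2) / np.sum((observed - observed.mean()) ** 2)

        samples = rng.uniform(0.05, 0.95, size=500)
        scores = np.array([nse(model(k)) for k in samples])
        behavioural = samples[scores > 0.5]        # retained "behavioural" sets
        print(len(behavioural), behavioural.min().round(2), behavioural.max().round(2))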

  8. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  9. Computer Science Contests for Secondary School Students: Approaches to Classification

    Directory of Open Access Journals (Sweden)

    Wolfgang POHL

    2006-04-01

    Full Text Available The International Olympiad in Informatics currently provides a model which is imitated by the majority of contests for secondary school students in Informatics or Computer Science. However, the IOI model can be criticized, and alternative contest models exist. To support the discussion about contests in Computer Science, several dimensions for characterizing and classifying contests are suggested.

  10. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  11. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  12. A systematic approach for identifying and presenting mechanistic evidence in human health assessments

    Science.gov (United States)

    Kushman, Mary E.; Kraft, Andrew D.; Guyton, Kathryn Z.; Chiu, Weihsueh A.; Makris, Susan L.; Rusyn, Ivan

    2013-01-01

    Clear documentation of literature search and presentation methodologies can improve transparency in chemical hazard assessments. We sought to improve clarity for the scientific support for cancer mechanisms of action using a systematic approach to literature retrieval, selection, and presentation of studies. The general question was “What are the mechanisms by which a chemical may cause carcinogenicity in the target tissue?” Di(2-ethylhexyl)phthalate was used as a case study chemical with a complex database of >3,000 publications. Relevant mechanistic events were identified from published reviews. The PubMed search strategy included relevant synonyms and wildcards for DEHP and its metabolites, mechanistic events, and species of interest. Tiered exclusion/inclusion criteria for study pertinence were defined, and applied to the retrieved literature. Manual curation was conducted for mechanistic events with large literature databases. Literature trees documented identification and selection of the literature evidence. The selected studies were summarized in evidence tables accompanied by succinct narratives. Primary publications were deposited into the Health and Environmental Research Online (http://hero.epa.gov/) database and identified by pertinence criteria and key terms to permit organized retrieval. This approach contributes to human health assessment by effectively managing a large volume of literature, improving transparency, and facilitating subsequent synthesis of information across studies. PMID:23959061
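
    A minimal sketch of the tiered inclusion/exclusion step, with invented records and criteria; real screening would run over the PubMed retrieval described above, recording each decision so the literature tree can be reconstructed.

        # Hypothetical records and ordered exclusion tiers.
        records = [
            {"id": 1, "abstract": "DEHP effects on PPAR-alpha activation in rat liver"},
            {"id": 2, "abstract": "Plasticizer market trends in Europe"},
            {"id": 3, "abstract": "MEHP and oxidative stress in primary hepatocytes"},
        ]
        tiers = [
            ("not chemical-relevant",
             lambda r: not any(k in r["abstract"].lower() for k in ("dehp", "mehp"))),
            ("no mechanistic event",
             lambda r: not any(k in r["abstract"].lower() for k in ("ppar", "oxidative stress"))),
        ]

        # Each record is stamped with the first tier that excludes it, if any.
        decisions = {r["id"]: next((label for label, excludes in tiers if excludes(r)),
                                   "included")
                     for r in records}
        print(decisions)  # {1: 'included', 2: 'not chemical-relevant', 3: 'included'}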

  13. Identifying ligands at orphan GPCRs: current status using structure-based approaches.

    Science.gov (United States)

    Ngo, Tony; Kufareva, Irina; Coleman, James Lj; Graham, Robert M; Abagyan, Ruben; Smith, Nicola J

    2016-10-01

    GPCRs are the most successful pharmaceutical targets in history. Nevertheless, the pharmacology of many GPCRs remains inaccessible as their endogenous or exogenous modulators have not been discovered. Tools that explore the physiological functions and pharmacological potential of these 'orphan' GPCRs, whether they are endogenous and/or surrogate ligands, are therefore of paramount importance. Rates of receptor deorphanization determined by traditional reverse pharmacology methods have slowed, indicating a need for the development of more sophisticated and efficient ligand screening approaches. Here, we discuss the use of structure-based ligand discovery approaches to identify small molecule modulators for exploring the function of orphan GPCRs. These studies have been buoyed by the growing number of GPCR crystal structures solved in the past decade, providing a broad range of template structures for homology modelling of orphans. This review discusses the methods used to establish the appropriate signalling assays to test orphan receptor activity and provides current examples of structure-based methods used to identify ligands of orphan GPCRs. Linked Articles This article is part of a themed section on Molecular Pharmacology of G Protein-Coupled Receptors. To view the other articles in this section visit http://onlinelibrary.wiley.com/doi/10.1111/bph.v173.20/issuetoc.

  14. A review of approaches to identifying patient phenotype cohorts using electronic health records.

    Science.gov (United States)

    Shivade, Chaitanya; Raghavan, Preethi; Fosler-Lussier, Eric; Embi, Peter J; Elhadad, Noemie; Johnson, Stephen B; Lai, Albert M

    2014-01-01

    To summarize literature describing approaches aimed at automatically identifying patients with a common phenotype. We performed a review of studies describing systems or reporting techniques developed for identifying cohorts of patients with specific phenotypes. Every full text article published in (1) Journal of American Medical Informatics Association, (2) Journal of Biomedical Informatics, (3) Proceedings of the Annual American Medical Informatics Association Symposium, and (4) Proceedings of Clinical Research Informatics Conference within the past 3 years was assessed for inclusion in the review. Only articles using automated techniques were included. Ninety-seven articles met our inclusion criteria. Forty-six used natural language processing (NLP)-based techniques, 24 described rule-based systems, 41 used statistical analyses, data mining, or machine learning techniques, while 22 described hybrid systems. Nine articles described the architecture of large-scale systems developed for determining cohort eligibility of patients. We observe that there is a rise in the number of studies associated with cohort identification using electronic medical records. Statistical analyses or machine learning, followed by NLP techniques, are gaining popularity over the years in comparison with rule-based systems. There are a variety of approaches for classifying patients into a particular phenotype. Different techniques and data sources are used, and good performance is reported on datasets at respective institutions. However, no system makes comprehensive use of electronic medical records addressing all of their known weaknesses.

  15. An innovative and integrated approach based on DNA walking to identify unauthorised GMOs.

    Science.gov (United States)

    Fraiture, Marie-Alice; Herman, Philippe; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H

    2014-03-15

    In the coming years, the frequency of unauthorised genetically modified organisms (GMOs) being present in the European food and feed chain will increase significantly. Therefore, we have developed a strategy to identify unauthorised GMOs containing a pCAMBIA family vector, frequently present in transgenic plants. This integrated approach is performed in two successive steps on Bt rice grains. First, the potential presence of unauthorised GMOs is assessed by the qPCR SYBR®Green technology targeting the terminator 35S pCAMBIA element. Second, its presence is confirmed via the characterisation of the junction between the transgenic cassette and the rice genome. To this end, a DNA walking strategy is applied using a first reverse primer followed by two semi-nested PCR rounds using primers that are each time nested relative to the previous reverse primer. This approach allows the transgene flanking region to be identified rapidly and can easily be implemented by enforcement laboratories. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends towards strengthening the general educational and worldview functions of computer science call for additional research on the…

  17. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

    In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues are examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and, then, categorize their severity. The described approach targets minimum human intervention, for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization, and in the use of the modified census transformation for stereo matching along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. Promisingly, the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
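
    The modified census transformation at the heart of the stereo-matching step can be sketched compactly: each pixel is encoded by comparing its neighbourhood to the window mean (rather than the centre pixel), and the matching cost is the Hamming distance between codes, which is what confers robustness to radiometric differences. A toy Python version, assuming wrap-around shifts and a synthetic stereo pair:

        import numpy as np

        def modified_census(img, radius=1):
            """Encode each pixel by comparing its neighbourhood to the window mean."""
            window = img.astype(float)
            bits = []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    bits.append(np.roll(np.roll(window, dy, axis=0), dx, axis=1))
            stack = np.stack(bits)                  # neighbourhood values, per pixel
            mean = stack.mean(axis=0)               # "modified": reference is the mean
            return (stack > mean).astype(np.uint8)  # one bit per neighbour

        def hamming_cost(census_left, census_right, disparity):
            """Per-pixel matching cost at a candidate disparity."""
            shifted = np.roll(census_right, disparity, axis=2)
            return np.sum(census_left != shifted, axis=0)

        left = np.random.randint(0, 255, (64, 64))
        right = np.roll(left, -3, axis=1)           # toy rectified pair, disparity 3
        cl, cr = modified_census(left), modified_census(right)
        print(hamming_cost(cl, cr, 3).mean())       # ~0 at the true disparity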

  18. Blind SELEX Approach Identifies RNA Aptamers That Regulate EMT and Inhibit Metastasis.

    Science.gov (United States)

    Yoon, Sorah; Armstrong, Brian; Habib, Nagy; Rossi, John J

    2017-07-01

    Identifying targets that are exposed on the plasma membrane of tumor cells, but expressed internally in normal cells, is a fundamental issue for improving the specificity and efficacy of anticancer therapeutics. Using blind cell Systematic Evolution of Ligands by EXponential enrichment (SELEX), which is untargeted SELEX, we have identified an aptamer, P15, which specifically bound to human pancreatic adenocarcinoma cells. To identify the aptamer-binding plasma membrane protein, liquid chromatography tandem mass spectrometry (LC-MS/MS) was used. The results of this unbiased proteomic mass spectrometry approach identified the target of P15 as the intermediate filament vimentin, a biomarker of epithelial-mesenchymal transition (EMT), which is an intracellular protein but is specifically expressed on the plasma membrane of cancer cells. As EMT plays a pivotal role in the transition of cancer cells to invasive cells, tumor cell metastasis assays were performed in vitro. P15-treated pancreatic cancer cells showed significant inhibition of tumor metastasis. To investigate the downstream effects of P15, EMT-related gene expression analysis was performed to identify differentially expressed genes (DEGs). Among five DEGs, P15-treated cells showed downregulated expression of matrix metallopeptidase 3 (MMP3), which is involved in cancer invasion. These results, for the first time, demonstrate that P15 binding to cell-surface vimentin inhibits tumor cell invasion and is associated with reduced MMP3 expression, suggesting that P15 has potential as an anti-metastatic therapy in pancreatic cancer. Implications: This study reveals that anti-vimentin RNA aptamers selected via blind SELEX inhibit tumor cell metastasis. Mol Cancer Res; 15(7); 811-20. ©2017 AACR. ©2017 American Association for Cancer Research.

  19. Use of cone beam computed tomography in identifying postmenopausal women with osteoporosis.

    Science.gov (United States)

    Brasileiro, C B; Chalub, L L F H; Abreu, M H N G; Barreiros, I D; Amaral, T M P; Kakehasi, A M; Mesquita, R A

    2017-12-01

    The aim of this study is to correlate radiometric indices from cone beam computed tomography (CBCT) images and bone mineral density (BMD) in postmenopausal women. Quantitative CBCT indices can be used to screen for women with low BMD. Osteoporosis is a disease characterized by the deterioration of bone tissue and the consequent decrease in BMD and increase in bone fragility. Several studies have been performed to assess radiometric indices in panoramic images as low-BMD predictors. The aim of this study is to correlate radiometric indices from CBCT images and BMD in postmenopausal women. Sixty postmenopausal women with indications for dental implants and CBCT evaluation were selected. Dual-energy X-ray absorptiometry (DXA) was performed, and the patients were divided into normal, osteopenia, and osteoporosis groups, according to the World Health Organization (WHO) criteria. Cross-sectional images were used to evaluate the computed tomography mandibular index (CTMI), the computed tomography index (inferior) (CTI (I)), and the computed tomography index (superior) (CTI (S)). Student's t test was used to compare the differences between the indices of the groups, and the intraclass correlation coefficient (ICC) was used to assess measurement reliability. Statistical analysis showed a high degree of interobserver and intraobserver agreement for all measurements (ICC > 0.80). The mean values of CTMI, CTI (S), and CTI (I) were lower in the osteoporosis group than in osteopenia and normal patients (p < 0.05). In comparing normal patients and women with osteopenia, there was no statistically significant difference in the mean value of CTI (I) (p = 0.075). Quantitative CBCT indices may help dentists to screen for women with low spinal and femoral bone mineral density so that they can refer postmenopausal women for bone densitometry.
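
    The group comparison reduces to an ordinary two-sample test; a minimal Python sketch with hypothetical index values:

        from scipy import stats

        # Hypothetical CTMI values for two diagnostic groups.
        ctmi_normal = [0.42, 0.39, 0.45, 0.41, 0.38, 0.44]
        ctmi_osteoporosis = [0.30, 0.28, 0.33, 0.27, 0.31, 0.29]

        t, p = stats.ttest_ind(ctmi_normal, ctmi_osteoporosis)
        print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 -> group means differ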

  20. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  1. Loss tolerant one-way quantum computation -- a horticultural approach

    CERN Document Server

    Varnava, M; Rudolph, T; Varnava, Michael; Browne, Daniel E.; Rudolph, Terry

    2005-01-01

    We introduce a scheme for fault-tolerantly dealing with losses in cluster-state computation that can tolerate up to 50% qubit loss. This is achieved passively: no coherent measurements or coherent correction is required. We then use this procedure within a specific linear optical quantum computation proposal to show that: (i) given perfect sources, detector inefficiencies of up to 50% can be tolerated and (ii) given perfect detectors, the purity of the photon source (overlap of the photonic wavefunction with the desired single mode) need only be greater than 66.6% for efficient computation to be possible.

  2. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced met
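
    A small taste of the sampling-based methods such a book introduces: a plain Monte Carlo estimate of E[X^2] for X ~ Uniform(0, 1), whose exact value is 1/3.

        import random

        random.seed(0)
        n = 100_000
        # Average of g(X) over uniform samples approximates the integral of g on [0, 1].
        estimate = sum(random.random() ** 2 for _ in range(n)) / n
        print(estimate)   # converges to 1/3 as n grows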

  3. A complementary approach to promoting professionalism: identifying, measuring, and addressing unprofessional behaviors.

    Science.gov (United States)

    Hickson, Gerald B; Pichert, James W; Webb, Lynn E; Gabbe, Steven G

    2007-11-01

    Vanderbilt University School of Medicine (VUSM) employs several strategies for teaching professionalism. This article, however, reviews VUSM's alternative, complementary approach: identifying, measuring, and addressing unprofessional behaviors. The key to this alternative approach is a supportive infrastructure that includes VUSM leadership's commitment to addressing unprofessional/disruptive behaviors, a model to guide intervention, supportive institutional policies, surveillance tools for capturing patients' and staff members' allegations, review processes, multilevel training, and resources for addressing disruptive behavior.Our model for addressing disruptive behavior focuses on four graduated interventions: informal conversations for single incidents, nonpunitive "awareness" interventions when data reveal patterns, leader-developed action plans if patterns persist, and imposition of disciplinary processes if the plans fail. Every physician needs skills for conducting informal interventions with peers; therefore, these are taught throughout VUSM's curriculum. Physician leaders receive skills training for conducting higher-level interventions. No single strategy fits every situation, so we teach a balance beam approach to understanding and weighing the pros and cons of alternative intervention-related communications. Understanding common excuses, rationalizations, denials, and barriers to change prepares physicians to appropriately, consistently, and professionally address the real issues. Failing to address unprofessional behavior simply promotes more of it. Besides being the right thing to do, addressing unprofessional behavior can yield improved staff satisfaction and retention, enhanced reputation, professionals who model the curriculum as taught, improved patient safety and risk-management experience, and better, more productive work environments.

  4. Identifying functional reorganization of spelling networks: An Individual Peak Probability Comparison Approach.

    Directory of Open Access Journals (Sweden)

    Jeremy Joseph Purcell

    2013-12-01

    Full Text Available Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual’s activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group’s peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual’s peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual’s activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual’s activation pattern with that of a set of other individuals.

  5. Evaluation of approaches to identify the targets of cellular immunity on a proteome-wide scale.

    Directory of Open Access Journals (Sweden)

    Fernanda C Cardoso

    Full Text Available BACKGROUND: Vaccine development against malaria and other complex diseases remains a challenge for the scientific community. The recent elucidation of the genome, proteome and transcriptome of many of these complex pathogens provides the basis for rational vaccine design by identifying, on a proteome-wide scale, novel target antigens that are recognized by T cells and antibodies from exposed individuals. However, there is currently no algorithm to effectively identify important target antigens from genome sequence data; this is especially challenging for T cell targets. Furthermore, for some of these pathogens, such as Plasmodium, protein expression using conventional platforms has been problematic, but cell-free in vitro transcription translation (IVTT) strategies have recently proved successful. Herein, we report a novel approach for proteome-wide scale identification of the antigenic targets of T cell responses using IVTT products. PRINCIPAL FINDINGS: We conducted a series of in vitro and in vivo experiments using IVTT proteins either unpurified, absorbed to carboxylated polybeads, or affinity purified through nickel resin or magnetic beads. In vitro studies in humans using CMV, EBV, and Influenza A virus proteins showed antigen-specific cytokine production in ELIspot and Cytometric Bead Array assays with cells stimulated with purified or unpurified IVTT antigens. In vitro and in vivo studies in mice immunized with the Plasmodium yoelii circumsporozoite DNA vaccine with or without IVTT protein boost showed antigen-specific cytokine production using purified IVTT antigens only. Overall, the nickel resin method of IVTT antigen purification proved optimal in both human and murine systems. CONCLUSIONS: This work provides proof of concept for the potential of high-throughput approaches to identify T cell targets of complex parasitic, viral or bacterial pathogens from genomic sequence data, for rational vaccine development against emerging and re

  6. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach.

    Science.gov (United States)

    Purcell, Jeremy J; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals.
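
    Stage 2 can be sketched directly in Python: each control peak is compared to the nearest patient peak, in Mahalanobis units derived from the controls' covariance. All coordinates below are hypothetical stand-ins for fMRI peak locations.

        import numpy as np
        from scipy.spatial.distance import mahalanobis

        # Hypothetical peak coordinates (mm): five controls vs. one patient.
        control_peaks = np.array([[-44, -60, -8], [-46, -57, -11], [-42, -62, -6],
                                  [-45, -59, -10], [-41, -63, -7]])
        patient_peaks = np.array([[-30, -70, 2], [-48, -55, -12]])

        VI = np.linalg.inv(np.cov(control_peaks.T))   # inverse covariance of controls
        for c in control_peaks:
            d = min(mahalanobis(c, p, VI) for p in patient_peaks)
            print(c, round(d, 2))   # large d -> no nearby patient peak (atypical)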

  7. Reflections on John Monaghan's "Computer Algebra, Instrumentation, and the Anthropological Approach"

    Science.gov (United States)

    Blume, Glen

    2007-01-01

    Reactions to John Monaghan's "Computer Algebra, Instrumentation and the Anthropological Approach" focus on a variety of issues related to the ergonomic approach (instrumentation) and anthropological approach to mathematical activity and practice. These include uses of the term technique; several possibilities for integration of the two approaches;…

  8. A computational approach to understand in vitro alveolar morphogenesis.

    Directory of Open Access Journals (Sweden)

    Sean H J Kim

    Full Text Available Primary human alveolar type II (AT II) epithelial cells maintained in Matrigel cultures form alveolar-like cysts (ALCs) using a cytogenesis mechanism that is different from that of other studied epithelial cell types: neither proliferation nor death is involved. During ALC formation, AT II cells engage simultaneously in fundamentally different, but not fully characterized activities. Mechanisms enabling these activities and the roles they play during different process stages are virtually unknown. Identifying, characterizing, and understanding the activities and mechanisms are essential to achieving deeper insight into this fundamental feature of morphogenesis. That deeper insight is needed to answer important questions. When and how does an AT cell choose to switch from one activity to another? Why does it choose one action rather than another? We report obtaining plausible answers using a rigorous, multi-attribute modeling and simulation approach that leveraged earlier efforts by using new, agent and object-oriented capabilities. We discovered a set of cell-level operating principles that enabled in silico cells to self-organize and generate systemic cystogenesis phenomena that are quantitatively indistinguishable from those observed in vitro. Success required that the cell components be quasi-autonomous. As simulation time advances, each in silico cell autonomously updates its environment information to reclassify its condition. It then uses the axiomatic operating principles to execute just one action for each possible condition. The quasi-autonomous actions of individual in silico cells were sufficient for developing stable cyst-like structures. The results strengthen in silico to in vitro mappings at three levels: mechanisms, behaviors, and operating principles, thereby achieving a degree of validation and enabling answering the questions posed. We suggest that the in silico operating principles presented may have a biological counterpart

  9. A sequence-based approach to identify reference genes for gene expression analysis

    Directory of Open Access Journals (Sweden)

    Chari Raj

    2010-08-01

    Full Text Available Abstract Background An important consideration when analyzing both microarray and quantitative PCR expression data is the selection of appropriate genes as endogenous controls or reference genes. This step is especially critical when identifying genes differentially expressed between datasets. Moreover, reference genes suitable in one context (e.g. lung cancer) may not be suitable in another (e.g. breast cancer). Currently, the main approach to identify reference genes involves the mining of expression microarray data for highly expressed and relatively constant transcripts across a sample set. A caveat here is the requirement for transcript normalization prior to analysis, and measurements obtained are relative, not absolute. Alternatively, as sequencing-based technologies provide digital quantitative output, absolute quantification ensues, and reference gene identification becomes more accurate. Methods Serial analysis of gene expression (SAGE) profiles of non-malignant and malignant lung samples were compared using a permutation test to identify the most stably expressed genes across all samples. Subsequently, the specificity of the reference genes was evaluated across multiple tissue types, their constancy of expression was assessed using quantitative RT-PCR (qPCR), and their impact on differential expression analysis of microarray data was evaluated. Results We show that (i) conventional reference genes such as ACTB and GAPDH are highly variable between cancerous and non-cancerous samples, (ii) reference genes identified for lung cancer do not perform well for other cancer types (breast and brain), (iii) reference genes identified through SAGE show low variability using qPCR in a different cohort of samples, and (iv) normalization of a lung cancer gene expression microarray dataset with or without our reference genes yields different results for differential gene expression and subsequent analyses. Specifically, key established pathways in lung
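
    A schematic of the permutation-test idea in Python, with synthetic counts standing in for SAGE libraries: genes whose expression variability falls far below a shuffled null are flagged as candidate reference genes.

        import numpy as np

        rng = np.random.default_rng(1)
        counts = rng.poisson(lam=50, size=(100, 20)).astype(float)  # 100 genes x 20 samples
        counts[0] = 50 + rng.normal(0.0, 1.0, size=20)              # plant one very stable gene

        cv = counts.std(axis=1) / counts.mean(axis=1)   # per-gene coefficient of variation

        # Null distribution: shuffle counts across genes within every sample.
        null = []
        for _ in range(200):
            shuffled = rng.permuted(counts, axis=0)
            null.append(shuffled.std(axis=1) / shuffled.mean(axis=1))
        null = np.concatenate(null)

        p = np.array([(null <= c).mean() for c in cv])  # small p -> unusually stable
        print("most stable gene:", p.argmin(), "p =", p.min())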

  10. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Directory of Open Access Journals (Sweden)

    Victor Trevino

    2016-04-01

    Full Text Available The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell

  11. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells

    Science.gov (United States)

    Trevino, Victor; Cassese, Alberto; Nagy, Zsuzsanna; Zhuang, Xiaodong; Herbert, John; Antzack, Philipp; Clarke, Kim; Davies, Nicholas; Rahman, Ayesha; Campbell, Moray J.; Bicknell, Roy; Vannucci, Marina; Falciani, Francesco

    2016-01-01

    Abstract The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell communication

  12. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Science.gov (United States)

    Trevino, Victor; Cassese, Alberto; Nagy, Zsuzsanna; Zhuang, Xiaodong; Herbert, John; Antczak, Philipp; Clarke, Kim; Davies, Nicholas; Rahman, Ayesha; Campbell, Moray J; Guindani, Michele; Bicknell, Roy; Vannucci, Marina; Falciani, Francesco

    2016-04-01

    The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell communication networks

  13. Parallel MMF: a Multiresolution Approach to Matrix Computation

    OpenAIRE

    Kondor, Risi; Teneva, Nedelina; Mudrakarta, Pramod K.

    2015-01-01

    Multiresolution Matrix Factorization (MMF) was recently introduced as a method for finding multiscale structure and defining wavelets on graphs/matrices. In this paper we derive pMMF, a parallel algorithm for computing the MMF factorization. Empirically, the running time of pMMF scales linearly in the dimension for sparse matrices. We argue that this makes pMMF a valuable new computational primitive in its own right, and present experiments on using pMMF for two distinct purposes: compressing...

  14. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a "Cluster" architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the software resource manager SLURM and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 Personal Computers with Quad-Core CPUs able to reach a computing power of 300 Giga Flops (300x10^9 floating point operations per second), with 120 GB of RAM and 7.5 Tera Bytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and increasing every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this, we have developed tools that allow flexible use of the scientific software and quality control of on-line data storing. The AVES software package consists of about 50 specific programs. The overall computing time, compared to that of a single-processor Personal Computer, has thus been improved by up to a factor of 70.
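
    The workload-splitting idea is independent of the analysis package; a minimal Python illustration follows, with function and job names invented (not the OSA interface): a monolithic analysis that cannot be parallelised internally is instead run as N independent jobs scheduled across cores.

        from multiprocessing import Pool

        def analyse(observation_id):
            """Stand-in for launching one monolithic OSA-style analysis run."""
            return observation_id, f"results/{observation_id}.fits"

        observations = [f"scw_{i:04d}" for i in range(30)]   # hypothetical job list

        if __name__ == "__main__":
            with Pool(processes=4) as pool:        # N cores -> N concurrent jobs
                for obs, path in pool.map(analyse, observations):
                    print(obs, "->", path)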

  15. Match and Move, an Approach to Data Parallel Computing

    Science.gov (United States)

    1992-10-01


  16. Computational challenges of structure-based approaches applied to HIV.

    Science.gov (United States)

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  17. A Computational Protein Phenotype Prediction Approach to Analyze the Deleterious Mutations of Human MED12 Gene.

    Science.gov (United States)

    Banaganapalli, Babajan; Mohammed, Kaleemuddin; Khan, Imran Ali; Al-Aama, Jumana Y; Elango, Ramu; Shaik, Noor Ahmad

    2016-09-01

    Genetic mutations in MED12, a subunit of the Mediator complex, are seen in a broad spectrum of human diseases. However, the underlying basis of how these pathogenic mutations elicit protein phenotype changes in terms of 3D structure, stability, and protein binding sites remains unknown. Therefore, we aimed to investigate the structural and functional impacts of MED12 mutations, using computational methods as an alternative to traditional in vivo and in vitro approaches. The MED12 gene mutation details and their corresponding clinical associations were collected from different databases and by text-mining. Initially, diverse computational approaches were applied to categorize the different classes of mutations based on their deleterious impact on MED12. Then, protein structures for wild-type and mutant forms built by integrative modeling were analyzed for structural divergence, solvent accessibility, stability, and functional interaction deformities. Finally, this study was able to identify that genetic mutations mapped to the exon-2 region and the highly conserved LCEWAV and Catenin domains induce biochemically severe amino acid changes which alter the protein phenotype as well as the stability of MED12-CYCC interactions. To better understand the deleterious nature of FS-IDs and Indels, this study asserts the utility of computational screening based on their propensity towards nonsense-mediated decay. The current study findings may help to narrow down the number of MED12 mutations to be screened for Mediator complex dysfunction-associated genetic diseases. This study supports computational methods as a primary filter to verify the plausible impact of pathogenic mutations from the perspective of evolution, expression, and phenotype of proteins. J. Cell. Biochem. 117: 2023-2035, 2016. © 2016 Wiley Periodicals, Inc.

  18. An affinity pull-down approach to identify the plant cyclic nucleotide interactome

    KAUST Repository

    Donaldson, Lara Elizabeth

    2013-09-03

    Cyclic nucleotides (CNs) are intracellular second messengers that play an important role in mediating physiological responses to environmental and developmental signals, in species ranging from bacteria to humans. In response to these signals, CNs are synthesized by nucleotidyl cyclases and then act by binding to and altering the activity of downstream target proteins known as cyclic nucleotide-binding proteins (CNBPs). A number of CNBPs have been identified across kingdoms, including transcription factors, protein kinases, phosphodiesterases, and channels, all of which harbor conserved CN-binding domains. In plants, however, few CNBPs have been identified, as homology searches fail to return plant sequences with significant matches to known CNBPs. Recently, affinity pull-down techniques have been successfully used to identify CNBPs in animals and have provided new insights into CN signaling. The application of these techniques to plants has not yet been extensively explored and offers an alternative approach toward the unbiased discovery of novel CNBP candidates in plants. Here, an affinity pull-down technique for the identification of the plant CN interactome is presented. In summary, the method involves the preparation of a plant protein extract, which is incubated with a CN-bait, followed by a series of increasingly stringent elutions that eliminates proteins in a sequential manner according to their affinity for the bait. The eluted and bait-bound proteins are separated by one-dimensional gel electrophoresis, excised, and digested with trypsin, after which the resultant peptides are identified by mass spectrometry; these techniques are commonplace in proteomics experiments. The discovery of plant CNBPs promises to provide valuable insight into the mechanism of CN signal transduction in plants. © Springer Science+Business Media New York 2013.

  19. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    This paper proposes a cloud computing framework in a smart grid environment, creating a small integrated energy hub that supports real-time computing for handling huge volumes of data. A stochastic programming model is developed within the cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI and the Gurobi optimizer in Matlab, in order to reduce the electricity demand by creating energy networks in a smart hub approach.
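
    The scheduling problem underlying such a DSM scheme can be illustrated with a toy deterministic version: shift a deferrable load toward cheap hours, subject to total-demand and hourly-capacity constraints. The sketch below uses scipy's linear-programming solver as a simple stand-in for the paper's stochastic program and Gurobi/Matlab setup; all prices and limits are invented.

```python
# Toy demand-side-management sketch: schedule a deferrable load across hours
# to minimize cost under time-varying prices, while serving total demand.
# A simple LP stand-in for the paper's stochastic programming model.
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.30, 0.28, 0.15, 0.12, 0.20, 0.35])  # $/kWh, hypothetical
total_demand = 10.0                                       # kWh to schedule
hourly_cap = 3.0                                          # max kWh per hour

# minimize prices @ x  s.t.  sum(x) == total_demand, 0 <= x <= hourly_cap
res = linprog(
    c=prices,
    A_eq=np.ones((1, len(prices))),
    b_eq=[total_demand],
    bounds=[(0.0, hourly_cap)] * len(prices),
)
print("schedule (kWh/hour):", np.round(res.x, 2))
print("total cost: $%.2f" % res.fun)
```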

  20. Identifying a few foot-and-mouth disease virus signature nucleotide strings for computational genotyping

    Directory of Open Access Journals (Sweden)

    Xu Lizhe

    2008-06-01

    Background: Serotypes of the foot-and-mouth disease viruses (FMDVs) have generally been determined by biological experiments. Computational genotyping is not well studied, even with the availability of whole viral genomes, owing to uneven evolution among genes as well as frequent genetic recombination. Naively using sequence comparison for genotyping achieves only a limited extent of success. Results: We used 129 FMDV strains with known serotype as training strains to select as many as 140 of the most serotype-specific nucleotide strings. We then constructed a linear-kernel Support Vector Machine classifier using these 140 strings. Under the leave-one-out cross-validation scheme, this classifier assigned the correct serotype to 127 of these 129 strains, achieving 98.45% accuracy. It also assigned serotypes correctly to an independent test set of 83 other FMDV strains downloaded separately from NCBI GenBank. Conclusion: Computational genotyping is much faster and much cheaper than wet-lab biological experiments, given the availability of detailed molecular sequences. The high accuracy of our proposed method suggests the potential of utilizing a few signature nucleotide strings, instead of whole genomes, to determine the serotypes of novel FMDV strains.
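
    The classification step can be prototyped with scikit-learn: encode each genome by the presence or absence of short nucleotide strings and evaluate a linear SVM under leave-one-out cross-validation. In the sketch below, generic 4-mers stand in for the 140 selected signature strings, and the sequences and serotype labels are tiny synthetic examples, not FMDV data.

```python
# Sketch of signature-string genotyping: represent each genome by the
# presence/absence of short nucleotide strings and train a linear SVM.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import LinearSVC

genomes = ["ACGTACGGTCA", "ACGTACGGTGA", "TTGCACGGACA",
           "TTGCACGCACA", "ACGGACGGTCA", "TTGCACGGACT"]
serotypes = ["O", "O", "A", "A", "O", "A"]   # synthetic labels

# k-mer presence features (k=4), a stand-in for the 140 selected strings
vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4), binary=True)
X = vectorizer.fit_transform(genomes)

scores = cross_val_score(LinearSVC(), X, serotypes, cv=LeaveOneOut())
print("leave-one-out accuracy: %.2f" % scores.mean())
```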

  1. Challenges and possible approaches: towards the petaflops computers

    Institute of Scientific and Technical Information of China (English)

    Depei QIAN; Danfeng ZHU

    2009-01-01

    In parallel with the R&D efforts in the USA and Europe, China's National High-tech R&D program has set up its goal of developing petaflops computers. Researchers and engineers worldwide are looking for appropriate methods and technologies to achieve the petaflops computer system. Based on a discussion of important design issues in developing the petaflops computer, this paper raises the major technological challenges, including the memory wall, low-power system design, interconnects, and programming support. Current efforts in addressing some of these challenges and in pursuing possible solutions for developing petaflops systems are presented. Several existing systems are briefly introduced as examples, including Roadrunner, Cray XT5 Jaguar, Dawning 5000A/6000, and Lenovo DeepComp 7000. Architectures proposed by Chinese researchers for implementing the petaflops computer are also introduced. Advantages of the architecture as well as the difficulties in its implementation are discussed. Finally, future research directions in the development of high-productivity computing systems are discussed.

  2. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups, in order to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, as demonstrated experimentally. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified.
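
    The geometric measure reported by SABER, an RMSD between catalytic atom arrangements after optimal superposition, can be computed with the Kabsch algorithm. A minimal numpy sketch with hypothetical coordinates follows; the actual SABER pipeline first uses geometric hashing to find the candidate matches.

```python
# Sketch of the geometric comparison underlying SABER-style active-site
# matching: superimpose two sets of catalytic atom coordinates (Kabsch
# algorithm) and report the RMSD. Coordinates are hypothetical.
import numpy as np

def kabsch_rmsd(P: np.ndarray, Q: np.ndarray) -> float:
    P = P - P.mean(axis=0)                    # center both point sets
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # correct for reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
    diff = P @ R.T - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))

site_a = np.array([[0.0, 0.0, 0.0], [1.5, 0.2, 0.1], [0.3, 1.4, 0.2]])
site_b = np.array([[0.1, 0.0, 0.1], [1.6, 0.1, 0.0], [0.2, 1.5, 0.3]])
print("catalytic-atom RMSD: %.2f Å" % kabsch_rmsd(site_a, site_b))
```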

  3. Making sense of deviance: identifying dissociating cases within the case series approach.

    Science.gov (United States)

    Fischer-Baum, Simon

    2013-01-01

    The case series approach in cognitive neuropsychology provides a means to test theories that make quantitative predictions about associations between different components of the cognitive system [Schwartz, M. F., & Dell, G. S. (2010). Case series investigations in cognitive neuropsychology. Cognitive Neuropsychology, 27, 477-494]. However, even when the predicted association is borne out, the study may include outliers: observations that deviate significantly from the rest of the data. These outliers may reveal individual cases whose cognitive impairments dissociate from the other cases included in the study. These dissociating cases can pose a significant challenge to the theory being tested. Using a recent case series that investigated the underlying causes of letter perseveration in spelling [Fischer-Baum, S., & Rapp, B. (2012). Underlying cause(s) of letter perseveration errors. Neuropsychologia, 50, 305-318], I discuss statistical and theoretical issues that arise when using outlier detection techniques to identify dissociating cases in a case series study.
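
    One simple version of the outlier-detection step is to fit the group-level association between two measures and flag cases with large standardized residuals. The sketch below, with synthetic scores and an arbitrary 2.5 SD cutoff, illustrates the general idea rather than the specific techniques discussed in the paper.

```python
# Sketch of flagging dissociating cases in a case series: fit the group-level
# regression relating two task scores, then flag cases whose standardized
# residual exceeds a cutoff. Data are synthetic stand-ins for patient scores.
import numpy as np

rng = np.random.default_rng(0)
task_a = rng.uniform(20, 90, size=25)                 # e.g., spelling accuracy
task_b = 0.8 * task_a + rng.normal(0, 4, size=25)     # associated measure
task_b[3] -= 30.0                                     # one dissociating case

slope, intercept = np.polyfit(task_a, task_b, deg=1)  # group-level association
residuals = task_b - (slope * task_a + intercept)
z = (residuals - residuals.mean()) / residuals.std(ddof=1)

outliers = np.flatnonzero(np.abs(z) > 2.5)
print("candidate dissociating cases:", outliers)      # expect case index 3
```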

  4. Identifying Liver Cancer and Its Relations with Diseases, Drugs, and Genes: A Literature-Based Approach

    Science.gov (United States)

    Song, Min

    2016-01-01

    In biomedicine, scientific literature is a valuable source for knowledge discovery. Mining knowledge from textual data has become an ever more important task as the volume of scientific literature grows at an unprecedented rate. In this paper, we propose a framework for examining a certain disease based on existing information provided by the scientific literature. Disease-related entities, including diseases, drugs, and genes, are systematically extracted and analyzed using a three-level network-based approach. A paper-entity network and an entity co-occurrence network (macro-level) are explored and used to construct six entity-specific networks (meso-level). Important diseases, drugs, and genes as well as salient entity relations (micro-level) are identified from these networks. Results obtained from this literature-based mining can serve to assist clinical applications. PMID:27195695
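
    The macro-level entity co-occurrence network can be built by counting how often two entities are mentioned in the same paper and keeping frequent pairs as weighted edges. The sketch below uses only the standard library; the entity sets are invented stand-ins for what an entity extractor would return.

```python
# Sketch of a meso/macro-level entity co-occurrence network: count how often
# two entities appear in the same paper and keep frequent pairs as edges.
from collections import Counter
from itertools import combinations

papers = [
    {"liver cancer", "sorafenib", "TP53"},
    {"liver cancer", "sorafenib", "VEGF"},
    {"liver cancer", "TP53", "hepatitis B"},
]

edge_counts = Counter()
for entities in papers:
    for pair in combinations(sorted(entities), 2):
        edge_counts[pair] += 1

# keep pairs co-occurring in at least 2 papers
network = {pair: n for pair, n in edge_counts.items() if n >= 2}
print(network)
```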

  5. Identifying Risk and Protective Factors in Recidivist Juvenile Offenders: A Decision Tree Approach

    Science.gov (United States)

    Ortega-Campos, Elena; García-García, Juan; Gil-Fenoy, Maria José; Zaldívar-Basurto, Flor

    2016-01-01

    Research on juvenile justice aims to identify profiles of risk and protective factors in juvenile offenders. This paper presents a study of profiles of risk factors that influence young offenders toward committing sanctionable antisocial behavior (S-ASB). Decision tree analysis is used as a multivariate approach to the phenomenon of repeated sanctionable antisocial behavior in juvenile offenders in Spain. The study sample was made up of the set of juveniles who were charged in a court case in the Juvenile Court of Almeria (Spain). The period of study of recidivism was two years from the baseline. The phenomenon under study is characterized through the implementation of a decision tree. Two profiles of risk and protective factors are found. The risk factors associated with higher rates of recidivism are antisocial peers, age at baseline S-ASB, problems in school, and criminality in family members. PMID:27611313
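
    A decision-tree profile analysis of this kind is straightforward to prototype. The sketch below trains scikit-learn's DecisionTreeClassifier on synthetic stand-ins for the court variables and prints the learned splits; the variable names and the data-generating rule are invented, not the study's data.

```python
# Sketch of a decision-tree profile analysis: predict recidivism from risk
# factors and read off the learned splits. Data are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
# columns: antisocial peers (0/1), age at baseline, school problems (0/1)
X = np.column_stack([
    rng.integers(0, 2, 200),
    rng.integers(14, 18, 200),
    rng.integers(0, 2, 200),
])
# toy rule: recidivism more likely with antisocial peers and school problems
y = ((X[:, 0] + X[:, 2] + rng.random(200)) > 1.5).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["peers", "age", "school"]))
```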

  6. Identifying approaches for assessing methodological and reporting quality of systematic reviews

    DEFF Research Database (Denmark)

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle

    2017-01-01

    BACKGROUND: The methodological quality and completeness of reporting of systematic reviews (SRs) are fundamental to the optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time... CONCLUSIONS: The methods used to assess the quality of SRs are diverse, and none has become universally accepted. The most commonly used quality assessment tools are AMSTAR, OQAQ, and PRISMA. As new tools and guidelines are developed to improve both the MQ and RQ of SRs, authors of methodological studies...

  7. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    Science.gov (United States)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

    Stock evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there is a variety of financial ratios that can be considered, and if all ratios in the set are placed into the evaluation process, data collection becomes more difficult and time-consuming. Thus, the objective of this paper is to identify the most important financial ratios upon which to focus in order to evaluate a stock's performance. For this purpose, a survey was carried out using an approach based on expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share, and debt to equity are the most important ratios.
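
    One common Fuzzy Delphi aggregation step works as follows: each expert rates a ratio's importance as a triangular fuzzy number (l, m, u), the ratings are aggregated, defuzzified, and compared against a threshold. Aggregation rules and thresholds vary across FDM variants, so the sketch below shows one simple convention with made-up expert ratings, not the paper's exact procedure.

```python
# Sketch of a Fuzzy Delphi screening step: aggregate triangular fuzzy ratings
# per financial ratio, defuzzify, and keep ratios above a threshold.
# One common convention among several FDM variants; ratings are invented.
import numpy as np

ratings = {                      # (l, m, u) per expert, scale 0-1
    "return on equity": [(0.6, 0.8, 1.0), (0.7, 0.9, 1.0), (0.5, 0.7, 0.9)],
    "inventory turnover": [(0.1, 0.3, 0.5), (0.2, 0.4, 0.6), (0.1, 0.2, 0.4)],
}

threshold = 0.7                  # arbitrary acceptance threshold
for ratio, tfns in ratings.items():
    tfns = np.array(tfns)
    l, m, u = tfns[:, 0].min(), tfns[:, 1].mean(), tfns[:, 2].max()
    score = (l + m + u) / 3.0    # simple centroid defuzzification
    keep = "keep" if score >= threshold else "drop"
    print(f"{ratio}: score={score:.2f} -> {keep}")
```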

  8. The use of computational approaches in inhaler development.

    Science.gov (United States)

    Wong, William; Fletcher, David F; Traini, Daniela; Chan, Hak-Kim; Young, Paul M

    2012-03-30

    Computational Fluid Dynamics (CFD) and Discrete Element Modelling (DEM) studies relevant to inhaled drug delivery are reviewed. CFD is widely used in device design to determine airflow patterns and turbulence levels. CFD is also used to simulate particles and droplets, which are subjected to various forces, turbulence and wall interactions. These studies can now be performed routinely because of the availability of commercial software containing high quality turbulence and particle models. DEM allows for the modelling of agglomerate break-up upon interaction with a wall or due to shear in the flow. However, the computational cost is high and the number of particles that can be simulated is minimal compared with the number present in typical inhaled formulations. Therefore DEM is currently limited to fundamental studies of break-up mechanisms. With decreasing computational limitations, simulations combining CFD and DEM that can address outstanding issues in agglomerate break-up and dispersion will be possible.

  9. Computer Mediated Learning: An Example of an Approach.

    Science.gov (United States)

    Arcavi, Abraham; Hadas, Nurit

    2000-01-01

    There are several possible approaches in which dynamic computerized environments play a significant and possibly unique role in supporting innovative learning trajectories in mathematics in general and geometry in particular. Describes an approach based on a problem situation and some experiences using it with students and teachers. (Contains 15…

  10. Development of a computationally efficient urban modeling approach

    DEFF Research Database (Denmark)

    Wolfs, Vincent; Murla, Damian; Ntegeka, Victor

    2016-01-01

    This paper presents a parsimonious and data-driven modelling approach to simulate urban floods. Flood levels simulated by detailed 1D-2D hydrodynamic models can be emulated using the presented conceptual modelling approach with a very short calculation time. In addition, the model detail can be a...

  11. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory, and describes the basic theory of gPC methods...

  12. The waveform similarity approach to identify dependent events in instrumental seismic catalogues

    Science.gov (United States)

    Barani, S.; Ferretti, G.; Massa, M.; Spallarossa, D.

    2007-01-01

    In this paper, waveform similarity analysis is adapted and implemented in a declustering procedure to identify foreshocks and aftershocks, to obtain instrumental catalogues that are cleaned of dependent events, and to perform an independent check of the results of traditional declustering techniques. Unlike traditional declustering methods (i.e. windowing techniques), the application of cross-correlation analysis allows the definition of groups of dependent events (multiplets) characterized by similar location, fault mechanism, and propagation pattern. In this way the chain of intervening related events is governed by the seismogenetic features of the earthquakes. Furthermore, a time-selection criterion is used to define time-independent seismic episodes eventually joined (on the basis of waveform similarity) into a single multiplet. The results, obtained by applying our procedure to a test data set, show that the declustered catalogue is drawn from the Poisson distribution with a higher degree of confidence than with the Gardner and Knopoff method. The catalogues declustered with these two approaches are similar with respect to the frequency-magnitude distribution and the number of earthquakes. Nevertheless, the application of our approach leads to declustered catalogues properly related to the seismotectonic background and the rheology of the investigated area, and the success of the procedure is ensured by the independence of the results from the estimated location errors of the events collected in the raw catalogue.
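
    The similarity measure at the core of such a procedure is the normalized cross-correlation between two waveforms; event pairs whose maximum correlation exceeds a chosen threshold are grouped into the same multiplet. A numpy sketch with synthetic waveforms and an arbitrary 0.8 threshold:

```python
# Sketch of the waveform-similarity core: normalized cross-correlation of two
# seismograms; pairs above a threshold would join the same multiplet.
import numpy as np

def max_normalized_xcorr(x: np.ndarray, y: np.ndarray) -> float:
    x = (x - x.mean()) / (x.std() * len(x))   # normalize so a perfect
    y = (y - y.mean()) / y.std()              # match yields ~1.0
    return float(np.max(np.correlate(x, y, mode="full")))

t = np.linspace(0, 1, 500)
wave1 = np.sin(40 * t) * np.exp(-3 * t)       # synthetic template event
wave2 = np.roll(wave1, 12) + 0.1 * np.random.default_rng(0).normal(size=500)

cc = max_normalized_xcorr(wave1, wave2)
verdict = "same multiplet" if cc > 0.8 else "independent"
print("max correlation: %.2f -> %s" % (cc, verdict))
```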

  13. A non-target approach to identify disinfection byproducts of structurally similar sulfonamide antibiotics.

    Science.gov (United States)

    Wang, Mian; Helbling, Damian E

    2016-10-01

    There is growing concern over the formation of new types of disinfection byproducts (DBPs) from pharmaceuticals and other emerging contaminants during drinking water production. Free chlorine is a widely used disinfectant that reacts non-selectively with organic molecules to form a variety of byproducts. In this research, we aimed to investigate the DBPs formed from three structurally similar sulfonamide antibiotics (sulfamethoxazole, sulfathiazole, and sulfadimethoxine) to determine how chemical structure influences the types of chlorination reactions observed. We conducted free chlorination experiments and developed a non-target approach to extract masses from the experimental dataset that represent the masses of candidate DBPs. Structures were assigned to the candidate DBPs based on analytical data and knowledge of chlorine chemistry. Confidence levels were assigned to each proposed structure according to conventions in the field. In total, 11, 12, and 15 DBP structures were proposed for sulfamethoxazole, sulfathiazole, and sulfadimethoxine, respectively. The structures of the products suggest a variety of reaction types, including chlorine substitution, S–C cleavage, S–N hydrolysis, desulfonation, oxidation/hydroxylation, and conjugation reactions. Some reaction types were common to all of the sulfonamide antibiotics, but unique reaction types were also observed for each sulfonamide antibiotic, suggesting that selective prediction of the DBP structures of other sulfonamide antibiotics based on chemical structure alone is unlikely to be possible from these data. This research offers an approach to comprehensively identify DBPs of organic molecules and fills in much-needed data on the formation of specific DBPs from three environmentally relevant sulfonamide antibiotics.

  14. Chemical proteomics approaches for identifying the cellular targets of natural products.

    Science.gov (United States)

    Wright, M H; Sieber, S A

    2016-05-01

    Covering: 2010 up to 2016. Deconvoluting the mode of action of natural products and drugs remains one of the biggest challenges in chemistry and biology today. Chemical proteomics is a growing area of chemical biology that seeks to design small molecule probes to understand protein function. In the context of natural products, chemical proteomics can be used to identify the protein binding partners or targets of small molecules in live cells. Here, we highlight recent examples of chemical probes based on natural products and their application for target identification. The review focuses on probes that can be covalently linked to their target proteins (either via intrinsic chemical reactivity or via the introduction of photocrosslinkers), and can be applied "in situ" - in living systems rather than cell lysates. We also focus here on strategies that employ a click reaction, the copper-catalysed azide-alkyne cycloaddition reaction (CuAAC), to allow minimal functionalisation of natural product scaffolds with an alkyne or azide tag. We also discuss 'competitive mode' approaches that screen for natural products that compete with a well-characterised chemical probe for binding to a particular set of protein targets. Fuelled by advances in mass spectrometry instrumentation and bioinformatics, many modern strategies are now embracing quantitative proteomics to help define the true interacting partners of probes, and we highlight the opportunities this rapidly evolving technology provides in chemical proteomics. Finally, some of the limitations and challenges of chemical proteomics approaches are discussed.

  15. Toward an efficient approach to identify molecular scaffolds possessing selective or promiscuous compounds.

    Science.gov (United States)

    Yongye, Austin B; Medina-Franco, José L

    2013-10-01

    The concept of a recurrent scaffold present in a series of structures is common in medicinal drug discovery. We present a scaffold analysis of compounds screened across 100 sequence-unrelated proteins to identify scaffolds that drive promiscuity or selectivity. Selectivity and promiscuity play a major role in traditional and poly-pharmacological drug design considerations. The collection employed here is the first publicly available data set containing the complete screening profiles of more than 15 000 compounds from different sources. In addition, no scaffold analysis of this data set has been reported. The protocol described here employs the Molecular Equivalence Index tool to facilitate the selection of Bemis-Murcko frameworks in the data set, which contain at least five compounds and Scaffold Hunter to generate a hierarchical tree of scaffolds. The annotation of the scaffold tree with protein-binding profile data enabled the successful identification of mostly highly specific compounds, due to data set constraints. We also applied this approach to a public set of 1497 small molecules screened non-uniformly across a panel of 172 protein kinases. The approach is general and can be applied to any other data sets and activity readout.
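
    The framework-extraction step can be prototyped with RDKit, which provides Bemis-Murcko scaffold extraction out of the box. The sketch below is a generic illustration, not the Molecular Equivalence Index or Scaffold Hunter tooling, and the SMILES strings are arbitrary examples.

```python
# Sketch of the framework-extraction step: reduce molecules to their
# Bemis-Murcko scaffolds with RDKit and group compounds by scaffold.
# Requires the rdkit package; SMILES strings are arbitrary examples.
from collections import defaultdict
from rdkit import Chem
from rdkit.Chem.Scaffolds import MurckoScaffold

smiles = ["CCOc1ccccc1C(=O)N", "COc1ccccc1C(=O)O", "CCN(CC)c1ccc2ccccc2n1"]

by_scaffold = defaultdict(list)
for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    scaffold = MurckoScaffold.GetScaffoldForMol(mol)
    by_scaffold[Chem.MolToSmiles(scaffold)].append(smi)

for scaf, members in by_scaffold.items():
    print(f"{scaf}: {len(members)} compound(s)")
```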

  16. Identifying Potential Areas for Siting Interim Nuclear Waste Facilities Using Map Algebra and Optimization Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Omitaomu, Olufemi A [ORNL; Liu, Cheng [ORNL; Cetiner, Sacit M [ORNL; Belles, Randy [ORNL; Mays, Gary T [ORNL; Tuttle, Mark A [ORNL

    2013-01-01

    The renewed interest in siting new nuclear power plants in the United States has brought to center stage the need to site interim facilities for long-term management of spent nuclear fuel (SNF). In this paper, a two-stage approach for identifying potential areas for siting interim SNF facilities is presented. In the first stage, the land area is discretized into grids of uniform size (e.g., 100 m × 100 m grids). For the continental United States, this process resulted in a data matrix of about 700 million cells. Each cell of the matrix is then characterized by a binary decision variable to indicate whether an exclusion criterion is satisfied or not. A binary data matrix is created for each of the 25 siting criteria considered in this study. Using a map algebra approach, cells that satisfy all criteria are clustered and regarded as potential siting areas. In the second stage, an optimization problem is formulated as a p-median problem on a rail network, such that the sum of the shortest distances between nuclear power plants holding SNF and the potential storage sites from the first stage is minimized. The implications of the obtained results for energy policies are presented and discussed.
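
    The stage-one map algebra reduces to a cell-wise AND across binary criterion layers. The numpy sketch below uses tiny random grids as stand-ins for the 25 national-scale layers; only cells passing every criterion survive as candidates.

```python
# Sketch of the stage-one map algebra: each siting criterion is a binary grid,
# and candidate cells are those passing every criterion (a cell-wise AND).
import numpy as np

rng = np.random.default_rng(42)
n_criteria, shape = 4, (6, 6)    # toy stand-ins for 25 national-scale layers

# 1 = cell satisfies the criterion, 0 = excluded
layers = rng.integers(0, 2, size=(n_criteria, *shape))

suitable = np.logical_and.reduce(layers.astype(bool))  # cell-wise AND
print("candidate cells (row, col):")
print(np.argwhere(suitable))
```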

  17. Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction

    Directory of Open Access Journals (Sweden)

    Peter E Larsen

    2016-01-01

    In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root-mycorrhizal fungus sensor systems, we analyzed in vitro experiments on the interaction of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungus) and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120,000 experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. This multi-omic computational modeling approach for predicting complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.

  18. A Systems Approach Identifies Essential FOXO3 Functions at Key Steps of Terminal Erythropoiesis.

    Directory of Open Access Journals (Sweden)

    Raymond Liang

    2015-10-01

    Circulating red blood cells (RBCs) are essential for tissue oxygenation and homeostasis. Defective terminal erythropoiesis contributes to decreased generation of RBCs in many disorders. Specifically, ineffective nuclear expulsion (enucleation) during terminal maturation is an obstacle to therapeutic RBC production in vitro. To obtain mechanistic insights into terminal erythropoiesis we focused on FOXO3, a transcription factor implicated in erythroid disorders. Using an integrated computational and experimental systems biology approach, we show that FOXO3 is essential for correct temporal gene expression during terminal erythropoiesis. We demonstrate that the FOXO3-dependent genetic network has critical physiological functions at key steps of terminal erythropoiesis, including the enucleation and mitochondrial clearance processes. FOXO3 loss deregulated transcription of genes implicated in cell polarity, nucleosome assembly, and DNA packaging-related processes, and compromised erythroid enucleation. Using high-resolution confocal microscopy and imaging flow cytometry we show that cell polarization is impaired, leading to multilobulated Foxo3-/- erythroblasts defective in nuclear expulsion. Ectopic FOXO3 expression rescued Foxo3-/- erythroblast enucleation-related gene transcription, enucleation defects, and terminal maturation. Remarkably, FOXO3 ectopic expression increased wild-type erythroblast maturation and enucleation, suggesting that enhancing FOXO3 activity may improve RBC production. Altogether these studies uncover FOXO3 as a novel regulator of erythroblast enucleation and terminal maturation, suggesting FOXO3 modulation might be therapeutic in disorders with defective erythroid maturation.

  19. An integrated approach for identifying wrongly labelled samples when performing classification in microarray data.

    Directory of Open Access Journals (Sweden)

    Yuk Yee Leung

    Background: Using a hybrid approach for gene selection and classification is common, as the results obtained are generally better than performing the two tasks independently. Yet, for some microarray datasets, both the classification accuracy and the stability of the gene sets obtained still have room for improvement. This may be due to the presence of samples with wrong class labels (i.e. outliers). Outlier detection algorithms proposed so far are either not suitable for microarray data, or only solve the outlier detection problem on its own. Results: We tackle the outlier detection problem based on a previously proposed Multiple-Filter-Multiple-Wrapper (MFMW) model, which was demonstrated to yield promising results when compared to other hybrid approaches (Leung and Hung, 2010). To incorporate outlier detection and overcome limitations of the existing MFMW model, three new features are introduced in our proposed MFMW-outlier approach: (1) an unbiased external leave-one-out cross-validation framework is developed to replace the internal cross-validation of the previous MFMW model; (2) wrongly labelled samples are identified within the MFMW-outlier model; and (3) a stable set of genes is selected using an L1-norm SVM that removes any redundant genes present. Six binary-class microarray datasets were tested. Compared with outlier detection studies on the same datasets, MFMW-outlier could detect all the outliers found in the original paper (for which the data was provided for analysis), and the genes selected after outlier removal were proven to have biological relevance. We also compared MFMW-outlier with PRAPIV (Zhang et al., 2006) on the same synthetic datasets. MFMW-outlier gave better average precision and recall values in three different settings. Lastly, artificially flipped microarray datasets were created by removing our detected outliers and flipping some of the remaining samples' labels. Almost all the 'wrong' (artificially flipped) samples were detected, suggesting...
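
    The external leave-one-out idea can be sketched simply: hold each sample out of the entire pipeline, and flag samples that a model trained on the rest misclassifies as possible label errors. The snippet below uses a plain linear SVM on synthetic data; the real MFMW-outlier pipeline (multiple filters, wrappers, and an L1-norm SVM) is far more elaborate.

```python
# Sketch of external leave-one-out flagging of wrongly labelled samples:
# hold each sample out, train on the rest, and flag misclassified hold-outs.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 30)), rng.normal(1, 1, (20, 30))])
y = np.array([0] * 20 + [1] * 20)
y[5] = 1                      # deliberately flip one label (the "outlier")

flagged = []
for train, test in LeaveOneOut().split(X):
    clf = LinearSVC().fit(X[train], y[train])
    if clf.predict(X[test])[0] != y[test][0]:
        flagged.append(test[0])
print("possible wrongly labelled samples:", flagged)   # expect index 5
```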

  20. A statistical approach for identifying the ionospheric footprint of magnetospheric boundaries from SuperDARN observations

    Directory of Open Access Journals (Sweden)

    G. Lointier

    2008-02-01

    Identifying and tracking the projection of magnetospheric regions on the high-latitude ionosphere is of primary importance for studying the Solar Wind-Magnetosphere-Ionosphere system and for space weather applications. Through its unique spatial coverage and temporal resolution, the Super Dual Auroral Radar Network (SuperDARN) provides key parameters, such as the Doppler spectral width, which allow the monitoring of the ionospheric footprint of some magnetospheric boundaries in near real-time. In this study, we present the first results of a statistical approach for monitoring these magnetospheric boundaries. The singular value decomposition is used as a data reduction tool to describe the backscattered echoes with a small set of parameters. One of these is strongly correlated with the Doppler spectral width, and can thus be used as a proxy for it. Based on this, we propose a Bayesian classifier for identifying the spectral width boundary, which is classically associated with the Polar Cap boundary. The results are in good agreement with previous studies. Two advantages of the method are the possibility of applying it in near real-time, and its capacity to select the appropriate threshold level for the boundary detection.
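
    The SVD data-reduction step can be illustrated with numpy: decompose a matrix of observations so that each record is summarized by a few leading-component weights. The spectra below are synthetic stand-ins for SuperDARN backscatter parameters, and the choice of three components is arbitrary.

```python
# Sketch of SVD data reduction: each record is summarized by the weights of
# a few leading components instead of its full set of bins.
import numpy as np

rng = np.random.default_rng(0)
n_records, n_bins = 200, 50
base = np.sin(np.linspace(0, np.pi, n_bins))        # common spectral shape
spectra = np.outer(rng.uniform(0.5, 2.0, n_records), base)
spectra += 0.05 * rng.normal(size=spectra.shape)    # measurement noise

U, S, Vt = np.linalg.svd(spectra - spectra.mean(axis=0), full_matrices=False)
weights = U[:, :3] * S[:3]      # 3 parameters per record instead of 50 bins
print("first record weights:", np.round(weights[0], 2))
print("variance captured by 3 components: %.1f%%"
      % (100 * (S[:3] ** 2).sum() / (S ** 2).sum()))
```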

  1. Multi-compartment approach to identify minimal flow and maximal recreational use of a lowland river

    Science.gov (United States)

    Pusch, Martin; Lorenz, Stefan

    2013-04-01

    Most approaches to establishing a minimum flow rate for river sections subjected to water abstraction focus on the flow requirements of fish and benthic invertebrates. However, artificial reduction of river flow will always affect additional key ecosystem features, such as sediment properties and the metabolism of matter in these ecosystems, and may even influence adjacent floodplains. Significant effects are therefore to be expected, e.g. on the dissolved oxygen content of river water, on habitat conditions in the benthic zone, and on water levels in the floodplain. We therefore chose a multiple-compartment method to identify minimum flow requirements in a lowland river in northern Germany (Spree River), selecting the minimal required flow level out of all compartments studied. Results showed that the minimal flow levels necessary to keep key ecosystem features at a 'good' state depended significantly on actual water quality and on river channel morphology. The water quality of the Spree is potentially influenced by recreational boating activity, which causes mussels to stop filter-feeding and thus impedes self-purification. Disturbance of mussel feeding was shown to depend directly on boat type and speed, with substantial differences among mussel species. Thus, a maximal recreational boating intensity could be derived that does not significantly affect self-purification. We conclude that minimal flow levels should be identified not only based on the flow preferences of target species, but also considering channel morphology, ecological functions, and the intensity of other human uses of the river section.

  2. Identifying potential ground movement as a landslide mitigation approach using resistivity method

    Science.gov (United States)

    Izzati, F. N.; Laksmana, Z. S.; Marcelina, B.; Hutabarat, S. S.; Widodo

    2017-07-01

    Landslide is defined as a form of ground movement in which a land mass suddenly fails downward on a slope as a result of gravitational pull. One mitigative approach to investigating landslides is to identify a potential slip zone using the resistivity method. In this study, the array chosen to acquire the resistivity data was the Wenner array, as it provides robust resolution in mapping lateral resistivity variations. This method generates a contour map portraying the distribution of resistivity values in the subsurface. Beforehand, a 2-dimensional forward modeling was conducted to acquire an expected ideal result for a possible potential slip zone. Landslides themselves are affiliated with a low resistivity zone located between two high resistivity zones. This study was conducted at a ground slump in Jalan Citra Green, northern Bandung, which is comprised mostly of unconsolidated soil. By applying a least-squares inversion to the resistivity data obtained, resistivity values of 10-200 Ωm are attained. Based on the inversion result, a low resistivity zone of 10-20 Ωm is identified, spanning from the surface to approximately 10 meters deep. In conclusion, further investigations are needed to determine whether the low resistivity zone is associated with a potential slip zone, as our data are limited to a single line.

  3. A Neural Network Approach for Identifying Particle Pitch Angle Distributions in Van Allen Probes Data

    Science.gov (United States)

    Souza, V. M.; Vieira, L. E. A.; Medeiros, C.; Da Silva, L. A.; Alves, L. R.; Koga, D.; Sibeck, D. G.; Walsh, B. M.; Kanekal, S. G.; Jauer, P. R.

    2016-01-01

    Analysis of particle pitch angle distributions (PADs) has been used as a means to comprehend a multitude of different physical mechanisms that lead to flux variations in the Van Allen belts and also to particle precipitation into the upper atmosphere. In this work we developed a neural network-based data clustering methodology that automatically identifies distinct PAD types in an unsupervised way using particle flux data. One can promptly identify and locate three well-known PAD types in both time and radial distance, namely 90°-peaked, butterfly, and flattop distributions. In order to illustrate the applicability of our methodology, we used relativistic electron flux data from the whole month of November 2014, acquired from the Relativistic Electron-Proton Telescope instrument on board the Van Allen Probes, but it is emphasized that our approach can also be used with multiplatform spacecraft data. Our PAD classification results are in reasonably good agreement with those obtained by standard statistical fitting algorithms. The proposed methodology has potential use for Van Allen belt monitoring.

  4. Integrative screening approach identifies regulators of polyploidization and targets for acute megakaryocytic leukemia

    Science.gov (United States)

    Wen, Qiang; Goldenson, Benjamin; Silver, Serena J.; Schenone, Monica; Dancik, Vladimir; Huang, Zan; Wang, Ling-Zhi; Lewis, Timothy; An, W. Frank; Li, Xiaoyu; Bray, Mark-Anthony; Thiollier, Clarisse; Diebold, Lauren; Gilles, Laure; Vokes, Martha S.; Moore, Christopher B.; Bliss-Moreau, Meghan; VerPlank, Lynn; Tolliday, Nicola J.; Mishra, Rama; Vemula, Sasidhar; Shi, Jianjian; Wei, Lei; Kapur, Reuben; Lopez, Cécile K.; Gerby, Bastien; Ballerini, Paola; Pflumio, Francoise; Gilliland, D. Gary; Goldberg, Liat; Birger, Yehudit; Izraeli, Shai; Gamis, Alan S.; Smith, Franklin O.; Woods, William G.; Taub, Jeffrey; Scherer, Christina A.; Bradner, James; Goh, Boon-Cher; Mercher, Thomas; Carpenter, Anne E.; Gould, Robert J.; Clemons, Paul A.; Carr, Steven A.; Root, David E.; Schreiber, Stuart L.; Stern, Andrew M.; Crispino, John D.

    2012-01-01

    The mechanism by which cells decide to skip mitosis to become polyploid is largely undefined. Here we used a high-content image-based screen to identify small-molecule probes that induce polyploidization of megakaryocytic leukemia cells and serve as perturbagens to help understand this process. We found that dimethylfasudil (diMF, H-1152P) selectively increased polyploidization, mature cell-surface marker expression, and apoptosis of malignant megakaryocytes. A broadly applicable, highly integrated target identification approach employing proteomic and shRNA screening revealed that a major target of diMF is Aurora A kinase (AURKA), which has not been studied extensively in megakaryocytes. Moreover, we discovered that MLN8237 (Alisertib), a selective inhibitor of AURKA, induced polyploidization and expression of mature megakaryocyte markers in AMKL blasts and displayed potent anti-AMKL activity in vivo. This research provides the rationale to support clinical trials of MLN8237 and other inducers of polyploidization in AMKL. Finally, we have identified five networks of kinases that regulate the switch to polyploidy. PMID:22863010

  5. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

    Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate gene-identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of

  6. Computational prediction of riboswitch tertiary structures including pseudoknots by RAGTOP: a hierarchical graph sampling approach.

    Science.gov (United States)

    Kim, Namhee; Zahran, Mai; Schlick, Tamar

    2015-01-01

    The modular organization of RNA structure has been exploited in various computational and theoretical approaches to identify RNA tertiary (3D) motifs and assemble RNA structures. Riboswitches exemplify this modularity in terms of both structural and functional adaptability of RNA components. Here, we extend our computational approach based on tree graph sampling to the prediction of riboswitch topologies by defining additional edges to mimic pseudoknots. Starting from a secondary (2D) structure, we construct an initial graph deduced from junction topologies predicted by our data-mining algorithm RNAJAG trained on known RNAs; we sample these graphs in 3D space guided by knowledge-based statistical potentials derived from bending and torsion measures of internal loops as well as radii of gyration for known RNAs. We present graph sampling results for 10 representative riboswitches, 6 of them with pseudoknots, and compare our predictions to solved structures based on global and local RMSD measures. Our results indicate that the helical arrangements in riboswitches can be approximated using our combination of modified 3D tree graph representations for pseudoknots, junction prediction, graph moves, and scoring functions. Future challenges in the field of riboswitch prediction and design are also discussed.

  7. An algorithmic calibration approach to identify globally optimal parameters for constraining the DayCent model

    Energy Technology Data Exchange (ETDEWEB)

    Rafique, Rashid; Kumar, Sandeep; Luo, Yiqi; Kiely, Gerard; Asrar, Ghassem R.

    2015-02-01

    The accurate calibration of complex biogeochemical models is essential for the robust estimation of soil greenhouse gases (GHG) as well as other environmental conditions and parameters that are used in research and policy decisions. DayCent is a popular biogeochemical model used both nationally and internationally for this purpose. Despite DayCent's popularity, its complex parameter estimation is often based on experts' knowledge, which is somewhat subjective. In this study we used the inverse modelling parameter estimation software PEST to calibrate the DayCent model based on sensitivity and identifiability analysis. Using previously published N2O and crop yield data as the basis of our calibration approach, we found that half of the 140 parameters used in this study were the primary drivers of calibration differences (i.e. the most sensitive), and the remaining parameters could not be identified given the data set and parameter ranges we used in this study. The post-calibration results showed improvement over the pre-calibration parameter set, based on a decrease in residual differences (79% for N2O fluxes and 84% for crop yield) and an increase in the coefficient of determination (63% for N2O fluxes and 72% for corn yield). The results of our study suggest that future studies need to better characterize germination temperature, the number of degree-days, and the temperature dependency of plant growth; these processes were highly sensitive and could not be adequately constrained by the data used in our study. Furthermore, the sensitivity and identifiability analysis was helpful in providing deeper insight into important processes and associated parameters that can lead to further improvement in the calibration of the DayCent model.

  8. Identifying genetic risk variants for coronary heart disease in familial hypercholesterolemia: an extreme genetics approach

    Science.gov (United States)

    Versmissen, Jorie; Oosterveer, Daniëlla M; Yazdanpanah, Mojgan; Dehghan, Abbas; Hólm, Hilma; Erdman, Jeanette; Aulchenko, Yurii S; Thorleifsson, Gudmar; Schunkert, Heribert; Huijgen, Roeland; Vongpromek, Ranitha; Uitterlinden, André G; Defesche, Joep C; van Duijn, Cornelia M; Mulder, Monique; Dadd, Tony; Karlsson, Hróbjartur D; Ordovas, Jose; Kindt, Iris; Jarman, Amelia; Hofman, Albert; van Vark-van der Zee, Leonie; Blommesteijn-Touw, Adriana C; Kwekkeboom, Jaap; Liem, Anho H; van der Ouderaa, Frans J; Calandra, Sebastiano; Bertolini, Stefano; Averna, Maurizio; Langslet, Gisle; Ose, Leiv; Ros, Emilio; Almagro, Fátima; de Leeuw, Peter W; Civeira, Fernando; Masana, Luis; Pintó, Xavier; Simoons, Maarten L; Schinkel, Arend FL; Green, Martin R; Zwinderman, Aeilko H; Johnson, Keith J; Schaefer, Arne; Neil, Andrew; Witteman, Jacqueline CM; Humphries, Steve E; Kastelein, John JP; Sijbrands, Eric JG

    2015-01-01

    Mutations in the low-density lipoprotein receptor (LDLR) gene cause familial hypercholesterolemia (FH), a disorder characterized by coronary heart disease (CHD) at a young age. We aimed to apply an extreme sampling method to enhance the statistical power to identify novel genetic risk variants for CHD in individuals with FH. We selected cases and controls with an extreme contrast in CHD risk from 17 000 FH patients from the Netherlands, whose functional LDLR mutation was unequivocally established. The genome-wide association (GWA) study was performed on 249 very young FH cases with CHD and 217 old FH controls without CHD (above 65 years of age for males and 70 for females) using the Illumina HumanHap550K chip. In the next stage, two independent samples (one from the Netherlands and one from Italy, Norway, Spain, and the United Kingdom) of FH patients were used as replication samples. In the initial GWA analysis, we identified 29 independent single nucleotide polymorphisms (SNPs) with suggestive associations with premature CHD (P<1 × 10−4). We examined the association of these SNPs with CHD risk in the replication samples. After Bonferroni correction, none of the SNPs either replicated or reached genome-wide significance after combining the discovery and replication samples. Therefore, we conclude that the genetics of CHD risk in FH is complex and that, even applying an 'extreme genetics' approach, we did not identify new genetic risk variants. Most likely, this method is not as effective in leveraging effect size as anticipated, and may, therefore, not lead to significant gains in statistical power. PMID:24916650

  9. A functional genomics approach identifies candidate effectors from the aphid species Myzus persicae (green peach aphid).

    Science.gov (United States)

    Bos, Jorunn I B; Prince, David; Pitino, Marco; Maffei, Massimo E; Win, Joe; Hogenhout, Saskia A

    2010-11-18

    Aphids are amongst the most devastating sap-feeding insects of plants. Like most plant parasites, aphids require intimate associations with their host plants to gain access to nutrients. Aphid feeding induces responses such as clogging of phloem sieve elements and callose formation, which are suppressed by unknown molecules, probably proteins, in aphid saliva. Therefore, it is likely that aphids, like plant pathogens, deliver proteins (effectors) inside their hosts to modulate host cell processes, suppress plant defenses, and promote infestation. We exploited publicly available aphid salivary gland expressed sequence tags (ESTs) to apply a functional genomics approach for identification of candidate effectors from Myzus persicae (green peach aphid), based on common features of plant pathogen effectors. A total of 48 effector candidates were identified, cloned, and subjected to transient overexpression in Nicotiana benthamiana to assay for elicitation of a phenotype, suppression of the Pathogen-Associated Molecular Pattern (PAMP)-mediated oxidative burst, and effects on aphid reproductive performance. We identified one candidate effector, Mp10, which specifically induced chlorosis and local cell death in N. benthamiana and conferred avirulence to recombinant Potato virus X (PVX) expressing Mp10, PVX-Mp10, in N. tabacum, indicating that this protein may trigger plant defenses. The ubiquitin-ligase associated protein SGT1 was required for the Mp10-mediated chlorosis response in N. benthamiana. Mp10 also suppressed the oxidative burst induced by flg22, but not by chitin. Aphid fecundity assays revealed that in planta overexpression of Mp10 and Mp42 reduced aphid fecundity, whereas another effector candidate, MpC002, enhanced aphid fecundity. Thus, these results suggest that, although Mp10 suppresses flg22-triggered immunity, it triggers a defense response, resulting in an overall decrease in aphid performance in the fecundity assays. Overall, we identified aphid

  10. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, D; Yang, Y; Cao, M; Hu, P; Low, D [UCLA, Los Angeles, CA (United States)

    2014-06-01

    Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation to electron density has placed it as an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual + bulk assignment or registration to identify bones, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with an MR image patch, and the corresponding label is extracted (during training) from simulation CT aligned to the UTE. Within each class (bone vs. non-bone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both the bone and non-bone dictionaries, and subsequently assigned to the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and has shown generally good performance, with a Dice similarity coefficient of greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust towards consistent foreign objects (e.g., headset) and artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme...
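
    The two-dictionary decision rule can be sketched with scikit-learn: learn one dictionary per class, sparse-code a test patch against both, and assign the class with the smaller reconstruction residual. The patches below are synthetic stand-ins for UTE-MRI patches, and the dictionary sizes and sparsity level are arbitrary.

```python
# Sketch of the two-dictionary classification rule: learn a dictionary per
# class, sparse-code each test patch against both, pick the smaller residual.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
bone = rng.normal(2.0, 0.5, (100, 16))       # toy "bone" patches (4x4)
soft = rng.normal(0.0, 0.5, (100, 16))       # toy "non-bone" patches

def fit_dict(patches):
    return DictionaryLearning(n_components=8, transform_algorithm="omp",
                              transform_n_nonzero_coefs=3,
                              random_state=0).fit(patches)

d_bone, d_soft = fit_dict(bone), fit_dict(soft)

def residual(model, patch):
    code = model.transform(patch.reshape(1, -1))     # sparse regression (OMP)
    return np.linalg.norm(patch - code @ model.components_)

test_patch = rng.normal(2.0, 0.5, 16)                # should look like bone
label = ("bone" if residual(d_bone, test_patch) < residual(d_soft, test_patch)
         else "non-bone")
print("predicted class:", label)
```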

  11. A novel approach identifying hybrid sterility QTL on the autosomes of Drosophila simulans and D. mauritiana.

    Science.gov (United States)

    Dickman, Christopher T D; Moehring, Amanda J

    2013-01-01

    When species interbreed, the hybrid offspring that are produced are often sterile. If only one hybrid sex is sterile, it is almost always the heterogametic (XY or ZW) sex. Taking this trend into account, the predominant model used to explain the genetic basis of F1 sterility involves a deleterious interaction between recessive sex-linked loci from one species and dominant autosomal loci from the other species. This model is difficult to evaluate, however, as only a handful of loci influencing interspecies hybrid sterility have been identified, and their autosomal genetic interactors have remained elusive. One hindrance to their identification has been the overwhelming effect of the sex chromosome in mapping studies, which could 'mask' the ability to accurately map autosomal factors. Here, we use a novel approach employing attached-X chromosomes to create reciprocal backcross interspecies hybrid males that have a non-recombinant sex chromosome and recombinant autosomes. The heritable variation in phenotype is thus solely caused by differences in the autosomes, thereby allowing us to accurately identify the number and location of autosomal sterility loci. In one direction of backcross, all males were sterile, indicating that sterility could be entirely induced by the sex chromosome complement in these males. In the other direction, we identified nine quantitative trait loci that account for a surprisingly large amount (56%) of the autosome-induced phenotypic variance in sterility, with a large contribution of autosome-autosome epistatic interactions. These loci are capable of acting dominantly, and thus could contribute to F1 hybrid sterility.

  12. A phase coherence approach to identifying co-located earthquakes and tremor

    Science.gov (United States)

    Hawthorne, J. C.; Ampuero, J.-P.

    2017-01-01

    We present and use a phase coherence approach to identify seismic signals that have similar path effects but different source time functions: co-located earthquakes and tremor. The method used is a phase coherence-based implementation of empirical matched field processing, modified to suit tremor analysis. It works by comparing the frequency-domain phases of waveforms generated by two sources recorded at multiple stations. We first cross-correlate the records of the two sources at a single station. If the sources are co-located, this cross-correlation eliminates the phases of the Green's function. It leaves the relative phases of the source time functions, which should be the same across all stations so long as the spatial extent of the sources is small compared with the seismic wavelength. We therefore search for cross-correlation phases that are consistent across stations as an indication of co-located sources. We also introduce a method to obtain relative locations between the two sources, based on back-projection of inter-station phase coherence. We apply this technique to analyze two tremor-like signals that are thought to be composed of a number of earthquakes. First, we analyze a 20-second-long seismic precursor to a M 3.9 earthquake in central Alaska. The analysis locates the precursor to within 2 km of the mainshock, and it identifies several bursts of energy (potentially foreshocks or groups of foreshocks) within the precursor. Second, we examine several minutes of volcanic tremor prior to an eruption at Redoubt Volcano. We confirm that the tremor source is located close to repeating earthquakes identified earlier in the tremor sequence. The amplitude of the tremor diminishes about 30 seconds before the eruption, but the phase coherence results suggest that the tremor may persist at some level through this final interval.
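
    The station-consistency test at the heart of the method can be sketched compactly. Assuming time-aligned waveform windows for the two sources at each station, the cross-correlation phase is formed in the frequency domain, and coherence is the magnitude of the station-averaged unit phasors (the published implementation adds smoothing and normalization that are omitted here):

    ```python
    import numpy as np

    def phase_coherence(recs_a, recs_b):
        """recs_a, recs_b: (n_stations, n_samples) windows for sources A and B.
        Returns per-frequency coherence in [0, 1]: values near 1 mean the
        cross-correlation phases agree across stations (co-located sources)."""
        cross = np.fft.rfft(recs_a) * np.conj(np.fft.rfft(recs_b))
        phasors = cross / (np.abs(cross) + 1e-20)   # unit phasors per station
        return np.abs(phasors.mean(axis=0))
    ```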

  13. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and…

  14. Exploring polymorphism in molecular crystals with a computational approach

    NARCIS (Netherlands)

    Ende, J.A. van den

    2016-01-01

    Different crystal structures can possess different properties and therefore the control of polymorphism in molecular crystals is a goal in multiple industries, e.g. the pharmaceutical industry. Part I of this thesis is a computational study at the molecular scale of a particular solid-solid polymorp…

  15. Linguistics, Computers, and the Language Teacher. A Communicative Approach.

    Science.gov (United States)

    Underwood, John H.

    This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…

  16. Individual Differences in Learning Computer Programming: A Social Cognitive Approach

    Science.gov (United States)

    Akar, Sacide Guzin Mazman; Altun, Arif

    2017-01-01

    The purpose of this study is to investigate and conceptualize the ranks of importance of social cognitive variables on university students' computer programming performances. Spatial ability, working memory, self-efficacy, gender, prior knowledge and the universities students attend were taken as variables to be analyzed. The study has been…

  17. A Knowledge-Intensive Approach to Computer Vision Systems

    NARCIS (Netherlands)

    Koenderink-Ketelaars, N.J.J.P.

    2010-01-01

    This thesis focusses on the modelling of knowledge-intensive computer vision tasks. Knowledge-intensive tasks are tasks that require a high level of expert knowledge to be performed successfully. Such tasks are generally performed by a task expert. Task experts have a lot of experience in performing…

  19. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    Science.gov (United States)

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  20. R for cloud computing an approach for data scientists

    CERN Document Server

    Ohri, A

    2014-01-01

    R for Cloud Computing looks at some of the tasks performed by business analysts on the desktop (PC era) and helps the user navigate the wealth of information in R and its 4000 packages, as well as transition the same analytics to the cloud. With this information the reader can select both cloud vendors and the R packages that can help process the analytical tasks with minimum effort and cost, and maximum usefulness and customization, while making sense of the sometimes confusing cloud ecosystem. The use of graphical user interfaces (GUIs) and step-by-step screenshot tutorials is emphasized in this book to lessen the famous learning curve of R and some of the needless confusion created in cloud computing that hinders its widespread adoption. This will help you kick-start analytics on the cloud, including chapters on cloud computing, R, common tasks performed in analytics, scrutiny of big data analytics, and setting up and navigating cloud providers. Readers are exposed to a breadth of cloud computing ch...

  1. A New Approach: Computer-Assisted Problem-Solving Systems

    Science.gov (United States)

    Gok, Tolga

    2010-01-01

    Computer-assisted problem solving systems are rapidly growing in educational use and with the advent of the Internet. These systems allow students to do their homework and solve problems online with the help of programs like Blackboard, WebAssign and LON-CAPA program etc. There are benefits and drawbacks of these systems. In this study, the…

  2. Affective brain-computer interfaces: neuroscientific approaches to affect detection

    NARCIS (Netherlands)

    Mühl, C.; Heylen, Dirk K.J.; Nijholt, Antinus; Calvo, Rafael; D'Mello, Sidney K.; Gratch, Jonathan; Kappas, Arvid

    The brain is involved in the registration, evaluation, and representation of emotional events and in the subsequent planning and execution of appropriate actions. Novel interface technologies—so-called affective brain-computer interfaces (aBCI)—can use this rich neural information, occurring in…

  4. Nested Transactions: An Approach to Reliable Distributed Computing.

    Science.gov (United States)

    1981-04-01

    Undoubtedly such universal use of computers and rapid exchange of information will have a dramatic impact: social, economic, and political. Distributed... level transaction, these committed inferiors are successful inferiors of the top-level transaction, too. Therefore q will indeed get a commit

  5. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.

  6. Discovery of new M. tuberculosis proteasome inhibitors using a knowledge-based computational screening approach.

    Science.gov (United States)

    Mehra, Rukmankesh; Chib, Reena; Munagala, Gurunadham; Yempalla, Kushalava Reddy; Khan, Inshad Ali; Singh, Parvinder Pal; Khan, Farrah Gul; Nargotra, Amit

    2015-11-01

    Mycobacterium tuberculosis bacteria cause deadly infections in patients. The rise of multidrug resistance associated with tuberculosis further makes the situation worse in treating the disease. The M. tuberculosis proteasome is necessary for the pathogenesis of the bacterium and has been validated as an anti-tubercular target, making it an attractive enzyme for designing Mtb inhibitors. In this study, a computational screening approach was applied to identify new proteasome inhibitor candidates from a library of 50,000 compounds. This chemical library was procured from the ChemBridge (20,000 compounds) and the ChemDiv (30,000 compounds) databases. After a detailed analysis of the computational screening results, 50 in silico hits were retrieved and tested in vitro, finding 15 compounds with IC50 values ranging from 35.32 to 64.15 µM on lysate. A structural analysis of these hits revealed that 14 of these compounds probably have a non-covalent mode of binding to the target and have not been reported for anti-tubercular or anti-proteasome activity. The binding interactions of all 14 protein-inhibitor complexes were analyzed using molecular docking studies. Further, molecular dynamics simulations of the protein in complex with the two most promising hits were carried out so as to identify the key interactions and validate the structural stability.

  7. Multistep greedy algorithm identifies community structure in real-world and computer-generated networks

    CERN Document Server

    Schuetz, Philipp

    2008-01-01

    We have recently introduced a multistep extension of the greedy algorithm for modularity optimization. The extension is based on the idea that merging l pairs of communities (l>1) at each iteration prevents premature condensation into few large communities. Here, an empirical formula is presented for the choice of the step width l that generates partitions with (close to) optimal modularity for 17 real-world and 1100 computer-generated networks. Furthermore, an in-depth analysis of the communities of two real-world networks (the metabolic network of the bacterium E. coli and the graph of coappearing words in the titles of papers coauthored by Martin Karplus) provides evidence that the partition obtained by the multistep greedy algorithm is superior to the one generated by the original greedy algorithm not only with respect to modularity but also according to objective criteria. In other words, the multistep extension of the greedy algorithm reduces the danger of getting trapped in local optima of modularity a...
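
    The multistep idea, merging the l best community pairs per iteration instead of only the best one, can be reproduced on toy graphs with a naive implementation. The sketch below recomputes modularity from scratch for every candidate merge, so it is didactic only; the paper's algorithm maintains incremental gain updates:

    ```python
    import networkx as nx
    from networkx.algorithms.community import modularity

    def multistep_greedy(G, l=2):
        comms = [{n} for n in G.nodes]
        while len(comms) > 1:
            base = modularity(G, comms)
            gains = []
            for i in range(len(comms)):
                for j in range(i + 1, len(comms)):
                    trial = [c for k, c in enumerate(comms) if k not in (i, j)]
                    trial.append(comms[i] | comms[j])
                    gains.append((modularity(G, trial) - base, i, j))
            gains.sort(reverse=True)
            used, merges = set(), []
            for gain, i, j in gains:        # take the l best disjoint merges
                if gain > 0 and i not in used and j not in used:
                    merges.append((i, j))
                    used.update((i, j))
                if len(merges) == l:
                    break
            if not merges:
                break                       # no positive-gain merge left
            for i, j in merges:
                comms[i] = comms[i] | comms[j]
            dropped = {j for _, j in merges}
            comms = [c for k, c in enumerate(comms) if k not in dropped]
        return comms

    # e.g. multistep_greedy(nx.karate_club_graph(), l=3)
    ```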

  8. A computational approach for phenotypic comparisons of cell populations in high-dimensional cytometry data.

    Science.gov (United States)

    Platon, Ludovic; Pejoski, David; Gautreau, Guillaume; Targat, Brice; Le Grand, Roger; Beignon, Anne-Sophie; Tchitchek, Nicolas

    2017-09-14

    Cytometry is an experimental technique used to measure molecules expressed by cells at single-cell resolution. Recently, several technological improvements have made it possible to greatly increase the number of cell markers that can be simultaneously measured. Many computational methods have been proposed to identify clusters of cells having similar phenotypes. Nevertheless, only a limited number of computational methods permit comparing the phenotypes of the cell clusters identified by different clustering approaches. These phenotypic comparisons are necessary to choose the appropriate clustering methods and settings. Because of this lack of tools, comparisons of cell cluster phenotypes are often performed manually, a highly biased and time-consuming process. We designed CytoCompare, an R package that performs comparisons between the phenotypes of cell clusters with the purpose of identifying similar and different ones, based on the distribution of marker expressions. For each phenotype comparison of two cell clusters, CytoCompare provides a distance measure as well as a p-value asserting the statistical significance of the difference. CytoCompare can import clustering results from various algorithms including SPADE, viSNE/ACCENSE, and Citrus, currently the most widely used algorithms. Additionally, CytoCompare can generate parallel coordinates, parallel heatmaps, multidimensional scaling or circular graph representations to easily visualize cell cluster phenotypes and the comparison results. CytoCompare is a flexible analysis pipeline for comparing the phenotypes of cell clusters identified by automatic gating algorithms in high-dimensional cytometry data. This R package is ideal for benchmarking different clustering algorithms and associated parameters. CytoCompare is freely distributed under the GPL-3 license and is available on https://github.com/tchitchek-lab/CytoCompare. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
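
    As a stand-in for the package's distance-plus-significance output, the sketch below compares two clusters marker by marker with a Kolmogorov-Smirnov test (the actual CytoCompare statistics differ; this only illustrates the shape of the result):

    ```python
    from scipy.stats import ks_2samp

    def compare_clusters(cluster_a, cluster_b, marker_names):
        """cluster_a, cluster_b: (n_cells, n_markers) expression matrices.
        Returns, per marker, a distance and a p-value for the difference."""
        results = {}
        for j, name in enumerate(marker_names):
            stat, p = ks_2samp(cluster_a[:, j], cluster_b[:, j])
            results[name] = {'distance': stat, 'p_value': p}
        return results
    ```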

  9. Computational morphology a computational geometric approach to the analysis of form

    CERN Document Server

    Toussaint, GT

    1988-01-01

    Computational Geometry is a new discipline of computer science that deals with the design and analysis of algorithms for solving geometric problems. There are many areas of study in different disciplines which, while being of a geometric nature, have as their main component the extraction of a description of the shape or form of the input data. This notion is more imprecise and subjective than pure geometry. Such fields include cluster analysis in statistics, computer vision and pattern recognition, and the measurement of form and form-change in such areas as stereology and developmental biology…

  10. TOWARD HIGHLY SECURE AND AUTONOMIC COMPUTING SYSTEMS: A HIERARCHICAL APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hsien-Hsin S

    2010-05-11

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under the circumstances of improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) System-wide, unified introspection techniques for autonomic systems, (2) Secure information-flow microarchitecture, (3) Memory-centric security architecture, (4) Authentication control and its implication to security, (5) Digital rights management, (6) Microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  11. Computational Model of Music Sight Reading: A Reinforcement Learning Approach

    CERN Document Server

    Yahya, Keyvan

    2010-01-01

    Although the music sight reading process has usually been studied from cognitive or neurological viewpoints, computational learning methods such as reinforcement learning have not yet been used to model such processes. In this paper, with regard to the essential properties of our specific problem, we consider the value function concept and show that the optimum policy can be obtained by the method we offer without getting involved in computing the complex value functions, which are in most cases inexact. Also, the algorithm we offer here is a PDE-based algorithm associated with stochastic optimization programming, and we consider that in this case it is more applicable than normative algorithms like the temporal difference method.

  12. Distance Based Asynchronous Recovery Approach In Mobile Computing Environment

    Directory of Open Access Journals (Sweden)

    Yogita Khatri

    2012-06-01

    Full Text Available A mobile computing system is a distributed system in which at least one of the processes is mobile. Such systems are constrained by lack of stable storage, low network bandwidth, mobility, frequent disconnection and limited battery life. Checkpointing is one of the commonly used techniques to provide fault tolerance in a mobile computing environment. In order to suit the mobile environment, a distance-based recovery scheme is proposed which is based on checkpointing and message logging. After the system recovers from failures, only the failed processes roll back and restart from their respective recent checkpoints, independently of the others. The salient feature of this scheme is to reduce the transfer and recovery cost. While the mobile host moves within a specific range, recovery information is not moved; it is transferred to a nearby node only if the mobile host moves out of that range.

  13. Influence of intracanal post on apical periodontitis identified by cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Estrela, Carlos; Porto, Olavo Cesar Lyra; Rodrigues, Cleomar Donizeth [Federal University of Goias (UFG), Goiania, GO (Brazil). Dental School; Bueno, Mike Reis [University of Cuiaba (UNIC), MT (Brazil). Dental School; Pecora, Jesus Djalma, E-mail: estrela3@terra.com.b [University of Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Dental School

    2009-07-01

    The determination of the success of endodontic treatment has often been discussed based on outcomes obtained by periapical radiography. The aim of this study was to verify the influence of intracanal posts on apical periodontitis detected by cone-beam computed tomography (CBCT). A consecutive sample of 1020 images (periapical radiographs and CBCT scans) taken from 619 patients (245 men; mean age, 50.1 years) between February 2008 and September 2009 was used in this study. Presence and length of intracanal posts (short, medium and long) were associated with apical periodontitis (AP). The chi-square test was used for statistical analyses. The significance level was set at p<0.01. The kappa value was used to assess examiner variability. From a total of 591 intracanal posts, AP was observed on periapical radiographs in 15.06%, 18.78% and 7.95% of cases for the short, medium and long post lengths, respectively (p=0.466). For the same post lengths, AP was observed on CBCT scans in 24.20%, 26.40% and 11.84% of cases, respectively (p=0.154). From the total of 1,020 teeth used in this study, AP was detected in 397 (38.92%) by periapical radiography and in 614 (60.19%) by CBCT scans (p<0.001). The distribution of intracanal posts across the different dental groups showed the highest prevalence in maxillary anterior teeth (54.79%). Intracanal post length did not influence AP. AP was detected more frequently when the CBCT method was used. (author)
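
    The headline comparison (AP in 397/1020 teeth by radiography versus 614/1020 by CBCT) can be re-checked with the chi-square test named in the abstract; the counts below are taken directly from it:

    ```python
    from scipy.stats import chi2_contingency

    table = [[397, 1020 - 397],   # radiography: AP detected, not detected
             [614, 1020 - 614]]   # CBCT:        AP detected, not detected
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.1e}")  # p << 0.001, as reported
    ```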

  14. A computational study of the Warburg effect identifies metabolic targets inhibiting cancer migration.

    Science.gov (United States)

    Yizhak, Keren; Le Dévédec, Sylvia E; Rogkoti, Vasiliki Maria; Baenke, Franziska; de Boer, Vincent C; Frezza, Christian; Schulze, Almut; van de Water, Bob; Ruppin, Eytan

    2014-08-01

    Over the last decade, the field of cancer metabolism has mainly focused on studying the role of tumorigenic metabolic rewiring in supporting cancer proliferation. Here, we perform the first genome-scale computational study of the metabolic underpinnings of cancer migration. We build genome-scale metabolic models of the NCI-60 cell lines that capture the Warburg effect (aerobic glycolysis) typically occurring in cancer cells. The extent of the Warburg effect in each of these cell line models is quantified by the ratio of glycolytic to oxidative ATP flux (AFR), which is found to be highly positively associated with cancer cell migration. We hence predicted that targeting genes that mitigate the Warburg effect by reducing the AFR may specifically inhibit cancer migration. By testing the anti-migratory effects of silencing the 17 top predicted genes in four breast and lung cancer cell lines, we find that up to 13 of these novel predictions significantly attenuate cell migration either in all cell lines or in one cell line only, while having almost no effect on cell proliferation. Furthermore, in accordance with the predictions, a significant reduction is observed in the ratio between experimentally measured ECAR and OCR levels following these perturbations. Inhibiting anti-migratory targets is a promising future avenue in treating cancer since it may decrease the cytotoxicity-related side effects that plague current anti-proliferative treatments. Furthermore, it may reduce cytotoxicity-related clonal selection of more aggressive cancer cells and the likelihood of emerging resistance. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.
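
    The key quantity, AFR, is simply a per-model flux ratio correlated against a migration readout. A sketch with stand-in numbers (the NCI-60 fluxes and migration measurements are not reproduced here):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    glycolytic_atp = rng.uniform(1.0, 10.0, 60)       # stand-in flux values
    oxidative_atp = rng.uniform(1.0, 10.0, 60)        # (one per cell-line model)
    afr = glycolytic_atp / oxidative_atp              # Warburg-effect proxy
    migration = 2.0 * afr + rng.normal(0.0, 0.5, 60)  # stand-in migration assay
    rho, p = spearmanr(afr, migration)
    print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")
    ```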

  15. Identifying Mechanisms Behind the Tullio Phenomenon: a Computational Study Based on First Principles.

    Science.gov (United States)

    Grieser, Bernhard J; Kleiser, Leonhard; Obrist, Dominik

    2016-04-01

    Patients with superior canal dehiscence (SCD) suffer from events of dizziness and vertigo in response to sound, also known as Tullio phenomenon (TP). The present work seeks to explain the fluid-dynamical mechanisms behind TP. In accordance with the so-called third window theory, we developed a computational model for the vestibular signal pathway between stapes and SCD. It is based on first principles and accounts for fluid-structure interactions arising between endolymph, perilymph, and membranous labyrinth. The simulation results reveal a wave propagation phenomenon in the membranous canal, leading to two flow phenomena within the endolymph which are in close interaction. First, the periodic deformation of the membranous labyrinth causes oscillating endolymph flow which forces the cupula to oscillate in phase with the sound stimulus. Second, these primary oscillations of the endolymph induce a steady flow component by a phenomenon known as steady streaming. We find that this steady flow of the endolymph is typically in ampullofugal direction. This flow leads to a quasi-steady deflection of the cupula which increases until the driving forces of the steady streaming are balanced by the elastic reaction forces of the cupula, such that the cupula attains a constant deflection amplitude which lasts as long as the sound stimulus. Both response types have been observed in the literature. In a sensitivity study, we obtain an analytical fit which very well matches our simulation results in a relevant parameter range. Finally, we correlate the corresponding eye response (vestibulo-ocular reflex) with the fluid dynamics by a simplified model of lumped system constants. The results reveal a "sweet spot" for TP within the audible sound spectrum. We find that the underlying mechanisms which lead to TP originate primarily from Reynolds stresses in the fluid, which are weaker at lower sound frequencies.

  16. Modeling Cu2+-Aβ complexes from computational approaches

    Science.gov (United States)

    Alí-Torres, Jorge; Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona

    2015-09-01

    Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox active Cu2+ metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu2+-Aβ complexes is thus important to get a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with the Cu2+-Aβ coordination and build plausible Cu2+-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  17. Modeling Cu2+-Aβ complexes from computational approaches

    Directory of Open Access Journals (Sweden)

    Jorge Alí-Torres

    2015-09-01

    Full Text Available Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox active Cu2+ metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu2+-Aβ complexes is thus important to get a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with the Cu2+-Aβ coordination and build plausible Cu2+-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  18. Modeling Cu2+-Aβ complexes from computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Alí-Torres, Jorge [Departamento de Química, Universidad Nacional de Colombia- Sede Bogotá, 111321 (Colombia); Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona, E-mail: Mariona.Sodupe@uab.cat [Departament de Química, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2015-09-15

    Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox active Cu2+ metal cation with Aβ has been found to interfere in amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu2+-Aβ complexes is thus important to get a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with the Cu2+-Aβ coordination and build plausible Cu2+-Aβ models that will afterwards allow determining physicochemical properties of interest, such as their redox potential.

  19. Computational Approach to Diarylprolinol-Silyl Ethers in Aminocatalysis.

    Science.gov (United States)

    Halskov, Kim Søholm; Donslund, Bjarke S; Paz, Bruno Matos; Jørgensen, Karl Anker

    2016-05-17

    Asymmetric organocatalysis has witnessed a remarkable development since its "re-birth" at the beginning of the millennium. In this rapidly growing field, computational investigations have proven to be an important contribution for the elucidation of mechanisms and rationalizations of the stereochemical outcomes of many of the reaction concepts developed. The improved understanding of mechanistic details has facilitated the further advancement of the field. The diarylprolinol-silyl ethers have since their introduction been one of the most applied catalysts in asymmetric aminocatalysis due to their robustness and generality. Although aminocatalytic methods at first glance appear to follow relatively simple mechanistic principles, more comprehensive computational studies have shown that this notion is in some cases deceiving and that more complex pathways might be operating. In this Account, the application of density functional theory (DFT) and other computational methods on systems catalyzed by the diarylprolinol-silyl ethers is described. It will be illustrated how computational investigations have shed light on the structure and reactivity of important intermediates in aminocatalysis, such as enamines and iminium ions formed from aldehydes and α,β-unsaturated aldehydes, respectively. Enamine and iminium ion catalysis can be classified as HOMO-raising and LUMO-lowering activation modes. In these systems, the exclusive reactivity through one of the possible intermediates is often a requisite for achieving high stereoselectivity; therefore, the appreciation of subtle energy differences has been vital for the efficient development of new stereoselective reactions. The diarylprolinol-silyl ethers have also allowed for novel activation modes for unsaturated aldehydes, which have opened up avenues for the development of new remote functionalization reactions of poly-unsaturated carbonyl compounds via di-, tri-, and tetraenamine intermediates and vinylogous iminium ions…

  20. Security approaches in using tablet computers for primary data collection in clinical research.

    Science.gov (United States)

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  1. A computational approach to identify predictive gene signatures in Triple Negative Breast Cancer

    OpenAIRE

    Nuzzo, Simona

    2014-01-01

    Microarray technology has been extensively used to detect patterns in gene expression that stem from regulatory interactions. Seminal studies demonstrated that the synergistic use of microarray-based techniques and bioinformatics analysis of genomic data might not only further the understanding of pathological phenotypes, but also provide lists of genes to dissect a disease into distinct groups, with different diagnostic or prognostic characteristics. Nonetheless, optimism for microarray-base...

  2. Probing the dynamics of identified neurons with a data-driven modeling approach.

    Directory of Open Access Journals (Sweden)

    Thomas Nowotny

    Full Text Available In controlling animal behavior the nervous system has to perform within the operational limits set by the requirements of each specific behavior. The implications for the corresponding range of suitable network, single neuron, and ion channel properties have remained elusive. In this article we approach the question of how well constrained the properties of neuronal systems may be at the neuronal level. We used large data sets of the activity of isolated invertebrate identified cells and built an accurate conductance-based model for this cell type using customized automated parameter estimation techniques. By direct inspection of the data we found that the variability of the neurons is larger when they are isolated from the circuit than when in the intact system. Furthermore, the responses of the neurons to perturbations appear to be more consistent than their autonomous behavior under stationary conditions. In the developed model, the constraints on different parameters that enforce appropriate model dynamics vary widely, from some very tightly controlled parameters to others that are almost arbitrary. The model also allows predictions for the effect of blocking selected ionic currents and proves that the origin of irregular dynamics in the neuron model is proper chaoticity, and that this chaoticity is typical in an appropriate sense. Our results indicate that data-driven models are useful tools for the in-depth analysis of neuronal dynamics. The better consistency of responses to perturbations, in the real neurons as well as in the model, suggests a paradigm shift away from measuring autonomous dynamics alone towards protocols of controlled perturbations. Our predictions for the impact of channel blockers on the neuronal dynamics and the proof of chaoticity underscore the wide scope of our approach.

  3. Computational modeling of bone density profiles in response to gait: a subject-specific approach.

    Science.gov (United States)

    Pang, Henry; Shiwalkar, Abhishek P; Madormo, Chris M; Taylor, Rebecca E; Andriacchi, Thomas P; Kuhl, Ellen

    2012-03-01

    The goal of this study is to explore the potential of computational growth models to predict bone density profiles in the proximal tibia in response to gait-induced loading. From a modeling point of view, we design a finite element-based computational algorithm using the theory of open system thermodynamics. In this algorithm, the biological problem, the balance of mass, is solved locally on the integration point level, while the mechanical problem, the balance of linear momentum, is solved globally on the node point level. Specifically, the local bone mineral density is treated as an internal variable, which is allowed to change in response to mechanical loading. From an experimental point of view, we perform a subject-specific gait analysis to identify the relevant forces during walking using an inverse dynamics approach. These forces are directly applied as loads in the finite element simulation. To validate the model, we take a Dual-Energy X-ray Absorptiometry scan of the subject's right knee from which we create a geometric model of the proximal tibia. For qualitative validation, we compare the computationally predicted density profiles to the bone mineral density extracted from this scan. For quantitative validation, we adopt the region of interest method and determine the density values at fourteen discrete locations using standard and custom-designed image analysis tools. Qualitatively, our two- and three-dimensional density predictions are in excellent agreement with the experimental measurements. Quantitatively, errors are less than 3% for the two-dimensional analysis and less than 10% for the three-dimensional analysis. The proposed approach has the potential to ultimately improve the long-term success of possible treatment options for chronic diseases such as osteoarthritis on a patient-specific basis by accurately addressing the complex interactions between ambulatory loads and tissue changes.
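
    The "density as internal variable" update solved at each integration point can be illustrated with a classic strain-energy-driven remodeling law; the rule and constants below are generic textbook choices, not the authors' calibrated model:

    ```python
    def update_density(rho, psi, psi_ref=0.01, k=1.0, dt=0.1,
                       rho_min=0.05, rho_max=2.0):
        """One explicit Euler step of d(rho)/dt = k * (psi / rho - psi_ref),
        where rho is the local bone density and psi the strain energy density;
        the density is clamped to physiological bounds."""
        drho = k * (psi / rho - psi_ref)
        return min(max(rho + dt * drho, rho_min), rho_max)
    ```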

  4. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    Science.gov (United States)

    Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections, achieved by new methods such as Fingerprint and Similarity Thresholding (FAST), promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in the number of detections and the associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and to combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and to assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and a non-induced earthquake sequence.
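
    One concrete reading of the "sparse similarity graph" view is to group candidate detections into events via connected components and score each group by its total edge weight; thresholds and the scoring rule below are illustrative:

    ```python
    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components

    def group_detections(i_idx, j_idx, sims, n_candidates, min_sim=0.2):
        """Group pairwise-similar candidate detections into events.
        i_idx, j_idx, sims: edge lists of the similarity graph."""
        keep = sims >= min_sim
        graph = coo_matrix((sims[keep], (i_idx[keep], j_idx[keep])),
                           shape=(n_candidates, n_candidates))
        n_groups, labels = connected_components(graph, directed=False)
        # Confidence score per group: total similarity among its members.
        scores = np.zeros(n_groups)
        for a, s in zip(i_idx[keep], sims[keep]):
            scores[labels[a]] += s
        return labels, scores
    ```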

  5. Novel phenotypes and loci identified through clinical genomics approaches to pediatric cataract.

    Science.gov (United States)

    Patel, Nisha; Anand, Deepti; Monies, Dorota; Maddirevula, Sateesh; Khan, Arif O; Algoufi, Talal; Alowain, Mohammed; Faqeih, Eissa; Alshammari, Muneera; Qudair, Ahmed; Alsharif, Hadeel; Aljubran, Fatimah; Alsaif, Hessa S; Ibrahim, Niema; Abdulwahab, Firdous M; Hashem, Mais; Alsedairy, Haifa; Aldahmesh, Mohammed A; Lachke, Salil A; Alkuraya, Fowzan S

    2017-02-01

    Pediatric cataract is highly heterogeneous clinically and etiologically. While mostly isolated, cataract can be part of many multisystem disorders, further complicating the diagnostic process. In this study, we applied genomic tools in the form of a multi-gene panel as well as whole-exome sequencing to an unselected cohort of pediatric cataract (166 patients from 74 families). Mutations in previously reported cataract genes were identified in 58% of families, for a total of 43 mutations, including 15 that are novel. GEMIN4 was independently mutated in families with a syndrome of cataract and global developmental delay, with or without renal involvement. We also highlight a recognizable syndrome that resembles galactosemia (a fulminant infantile liver disease with cataract) caused by biallelic mutations in CYP51A1. A founder mutation in RIC1 (KIAA1432) was identified in patients with cataract, brain atrophy, and microcephaly with or without cleft lip and palate. For non-syndromic pediatric cataract, we map a novel locus in a multiplex consanguineous family to 4p15.32, where exome sequencing revealed a homozygous truncating mutation in TAPT1. We report two further candidates that are biallelically inactivated each in a single cataract family: TAF1A (cataract with global developmental delay) and WDR87 (non-syndromic cataract). In addition to positional mapping data, we use iSyTE developmental lens expression and gene-network analysis to corroborate the proposed link between the novel candidate genes and cataract. Our study expands the phenotypic, allelic and locus heterogeneity of pediatric cataract. The high diagnostic yield of clinical genomics supports the adoption of this approach in this patient group.

  6. A new computational approach to simulate pattern formation in Paenibacillus dendritiformis bacterial colonies

    Science.gov (United States)

    Tucker, Laura Jane

    Under the harsh conditions of limited nutrient and hard growth surface, Paenibacillus dendritiformis in agar plates form two classes of patterns (morphotypes). The first class, called the dendritic morphotype, has radially directed branches. The second class, called the chiral morphotype, exhibits uniform handedness. The dendritic morphotype has been modeled successfully using a continuum model on a regular lattice; however, a suitable computational approach was not known to solve a continuum chiral model. This work details a new computational approach to solving the chiral continuum model of pattern formation in P. dendritiformis. The approach utilizes a random computational lattice and new methods for calculating certain derivative terms found in the model.

  7. Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.

    Science.gov (United States)

    Fong, Stephen S

    2014-08-01

    Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
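
    The prospective, model-driven approach referred to above typically rests on flux balance analysis: maximize a target flux subject to steady-state mass balance and flux bounds. A toy network makes the linear program explicit (the network and bounds are invented for illustration):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Columns: v_uptake, v_growth, v_product, v_export
    S = np.array([[1, -1, -1,  0],    # metabolite A balance
                  [0,  1,  0, -1]])   # metabolite B balance
    c = np.array([0, 0, -1, 0])       # maximize v_product (linprog minimizes)
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=[(0, 10)] * 4)
    print("optimal product flux:", -res.fun)
    ```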

  8. A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties

    Science.gov (United States)

    1992-05-01

    AL-TR-1992-0006 (AD-A252 322): A Computer Simulation Modeling Approach to Estimating Utility in Several Air Force Specialties, by Brice M. Stone et al. Only the scanned report documentation form survives in this record; the abstract text is not recoverable.

  9. Computational approach for calculating bound states in quantum field theory

    Science.gov (United States)

    Lv, Q. Z.; Norris, S.; Brennan, R.; Stefanovich, E.; Su, Q.; Grobe, R.

    2016-09-01

    We propose a nonperturbative approach to calculate bound-state energies and wave functions for quantum field theoretical models. It is based on the direct diagonalization of the corresponding quantum field theoretical Hamiltonian in an effectively discretized and truncated Hilbert space. We illustrate this approach for a Yukawa-like interaction between fermions and bosons in one spatial dimension and show where it agrees with the traditional method based on the potential picture and where it deviates due to recoil and radiative corrections. This method permits us also to obtain some insight into the spatial characteristics of the distribution of the fermions in the ground state, such as the bremsstrahlung-induced widening.
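
    The direct-diagonalization strategy can be illustrated on a far simpler cousin of the paper's problem: one quantum particle in 1D bound by a softened Yukawa-like potential, discretized by finite differences (grid, coupling, and screening values are arbitrary demo choices):

    ```python
    import numpy as np

    n, L, g, mu = 400, 40.0, 1.0, 1.0   # grid points, box size, coupling, screening
    x = np.linspace(-L / 2, L / 2, n)
    dx = x[1] - x[0]
    V = -g * np.exp(-mu * np.abs(x)) / np.sqrt(x**2 + 0.25)  # softened Yukawa well

    # Kinetic term -(1/2) d^2/dx^2 by central finite differences (hbar = m = 1).
    T = (np.diag(np.full(n, 1.0)) - 0.5 * np.diag(np.ones(n - 1), 1)
         - 0.5 * np.diag(np.ones(n - 1), -1)) / dx**2
    H = T + np.diag(V)

    energies, states = np.linalg.eigh(H)
    print("lowest eigenvalues (bound states are negative):", energies[:3])
    ```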

  10. Computational intelligent gait-phase detection system to identify pathological gait.

    Science.gov (United States)

    Senanayake, Chathuri M; Senanayake, S M N Arosha

    2010-09-01

    An intelligent gait-phase detection algorithm based on kinematic and kinetic parameters is presented in this paper. The gait parameters do not vary distinctly for each gait phase; therefore, it is complex to differentiate gait phases with respect to a threshold value. To overcome this intricacy, the concept of fuzzy logic was applied to detect gait phases with respect to fuzzy membership values. A real-time data-acquisition system was developed consisting of four force-sensitive resistors and two inertial sensors to obtain foot-pressure patterns and knee flexion/extension angle, respectively. The detected gait phases can be further analyzed to identify abnormality occurrences, and the system is hence applicable for determining accurate timing for feedback. The large amount of data required for quality gait analysis necessitates the utilization of information technology to store, manage, and extract required information. Therefore, a software application was developed for real-time acquisition of sensor data, data processing, database management, and a user-friendly graphical user interface as a tool to simplify the task of clinicians. The experiments carried out to validate the proposed system are presented along with the results analysis for normal and pathological walking patterns.
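
    A toy version of the fuzzy rule base: trapezoidal memberships over normalized heel/toe force-sensitive-resistor readings and knee angle, with the phase taken as the rule of highest firing strength. The breakpoints are invented for illustration, not the calibrated values of the system:

    ```python
    def trap(x, a, b, c, d):
        """Trapezoidal membership with optional open shoulders (a==b or c==d)."""
        rising = (x - a) / (b - a) if b > a else 1.0
        falling = (d - x) / (d - c) if d > c else 1.0
        return max(0.0, min(1.0, rising, falling))

    def gait_phase(heel_fsr, toe_fsr, knee_deg):
        """heel_fsr, toe_fsr normalized to [0, 1]; knee flexion in degrees."""
        strength = {
            'heel-strike': min(trap(heel_fsr, 0.5, 0.8, 1.0, 1.0),
                               trap(toe_fsr, 0.0, 0.0, 0.1, 0.3)),
            'mid-stance':  min(trap(heel_fsr, 0.3, 0.6, 1.0, 1.0),
                               trap(toe_fsr, 0.3, 0.6, 1.0, 1.0)),
            'toe-off':     min(trap(toe_fsr, 0.5, 0.8, 1.0, 1.0),
                               trap(heel_fsr, 0.0, 0.0, 0.1, 0.3)),
            'swing':       min(trap(heel_fsr, 0.0, 0.0, 0.05, 0.2),
                               trap(knee_deg, 30.0, 45.0, 90.0, 90.0)),
        }
        return max(strength, key=strength.get)  # winner-take-all defuzzification
    ```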

  11. An observation-based approach to identify local natural dust events from routine aerosol ground monitoring

    Directory of Open Access Journals (Sweden)

    D. Q. Tong

    2012-02-01

    Full Text Available Dust is a major component of atmospheric aerosols in many parts of the world. Although there exist many routine aerosol monitoring networks, it is often difficult to obtain dust records from these networks, because these monitors are either deployed far away from dust active regions (most likely collocated with dense population) or contaminated by anthropogenic sources and other natural sources, such as wildfires and vegetation detritus. Here we propose a new approach to identify local dust events relying solely on aerosol mass and composition from general-purpose aerosol measurements. Through analyzing the chemical and physical characteristics of aerosol observations during satellite-detected dust episodes, we select five indicators to be used to identify local dust records: (1) high PM10 concentrations; (2) low PM2.5/PM10 ratio; (3) higher concentrations and percentage of crustal elements; (4) lower percentage of anthropogenic pollutants; and (5) low enrichment factors of anthropogenic elements. After establishing these identification criteria, we conduct hierarchical cluster analysis for all validated aerosol measurement data over 68 IMPROVE sites in the Western United States. A total of 182 local dust events were identified over 30 of the 68 locations from 2000 to 2007. These locations are either close to the four US Deserts, namely the Great Basin Desert, the Mojave Desert, the Sonoran Desert, and the Chihuahuan Desert, or in the high wind power region (Colorado). During the eight-year study period, the total number of dust events displays an interesting four-year activity cycle (one in 2000–2003 and the other in 2004–2007). The years of 2003, 2002 and 2007 are the three most active dust periods, with 46, 31 and 24 recorded dust events, respectively, while the years of 2000, 2004 and 2005 are the calmest periods, all with single-digit dust records. Among these deserts, the Chihuahuan Desert (59 cases) and the…
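
    The five indicators translate naturally into a row filter over a site-day table; the thresholds below are placeholders rather than the values calibrated in the study:

    ```python
    import pandas as pd

    def flag_dust_events(df: pd.DataFrame) -> pd.Series:
        """df columns assumed: pm10, pm25, crustal_frac, anthro_frac, ef_anthro."""
        return ((df.pm10 > 100.0)                 # 1) high PM10 (ug/m3)
                & (df.pm25 / df.pm10 < 0.35)      # 2) low PM2.5/PM10 ratio
                & (df.crustal_frac > 0.5)         # 3) crustal-element dominated
                & (df.anthro_frac < 0.1)          # 4) little anthropogenic mass
                & (df.ef_anthro < 5.0))           # 5) low enrichment factors
    ```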

  12. Electromagnetic space-time crystals. II. Fractal computational approach

    OpenAIRE

    Borzdov, G. N.

    2014-01-01

    A fractal approach to numerical analysis of electromagnetic space-time crystals, created by three standing plane harmonic waves with mutually orthogonal phase planes and the same frequency, is presented. Finite models of electromagnetic crystals are introduced, which make it possible to obtain various approximate solutions of the Dirac equation. A criterion for evaluating the accuracy of these approximate solutions is suggested.

  14. Synergy between experimental and computational approaches to homogeneous photoredox catalysis.

    Science.gov (United States)

    Demissie, Taye B; Hansen, Jørn H

    2016-07-01

    In this Frontiers article, we highlight how state-of-the-art density functional theory calculations can contribute to the field of homogeneous photoredox catalysis. We discuss challenges in the field and potential solutions to be found at the interface between theory and experiment. The exciting opportunities and insights that can arise through such an interdisciplinary approach are highlighted.

  15. Diffusive Wave Approximation to the Shallow Water Equations: Computational Approach

    KAUST Repository

    Collier, Nathan

    2011-05-14

    We discuss the use of time adaptivity applied to the one-dimensional diffusive wave approximation to the shallow water equations. A simple and computationally economical error estimator is discussed which enables time-step size adaptivity. This robust adaptive time discretization corrects the initial time step size to achieve a user-specified bound on the discretization error, and allows time step size variations of several orders of magnitude. In particular, the one-dimensional results presented in this work feature a change of four orders of magnitude in the time step over the entire simulation.
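
    The controller behind such schemes is small: grow or shrink the step so a local error estimate tracks the user tolerance. A generic sketch, assuming a first-order error estimator (the paper's estimator is specific to the diffusive wave discretization):

    ```python
    def adapt_dt(dt, err, tol, order=1, fmin=0.1, fmax=10.0, safety=0.9):
        """Return the next step size given a local error estimate 'err' and a
        user tolerance 'tol'; the growth factor is clamped to [fmin, fmax]."""
        factor = safety * (tol / max(err, 1e-14)) ** (1.0 / (order + 1))
        return dt * min(max(factor, fmin), fmax)
    ```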

  16. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms. Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures…

  17. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  18. Oligomerization states of Bowman-Birk inhibitor by atomic force microscopy and computational approaches.

    Science.gov (United States)

    Silva, Luciano P; Azevedo, Ricardo B; Morais, Paulo C; Ventura, Manuel M; Freitas, Sonia M

    2005-11-15

    Several methods have been applied to study protein-protein interactions from structural and thermodynamic points of view. The present study reveals that atomic force microscopy (AFM), molecular modeling, and docking approaches represent alternative methods offering a new strategy to investigate structural aspects of the oligomerization process of proteinase inhibitors. The topography of the black-eyed pea trypsin/chymotrypsin inhibitor (BTCI) was recorded by AFM and compared with computational rigid-body docking approaches. Multimeric states of BTCI identified from AFM analysis showed globular-ellipsoidal shapes. Monomers, dimers, trimers, and hexamers were the most prominent molecular arrays observed in AFM images, as evaluated by molecular volume calculations and corroborated by in silico docking and theoretical approaches. We therefore propose that BTCI adopts stable and well-packed self-assembled states in monomer-dimer-trimer-hexamer equilibrium. Although there is no correlation between specificity and packing efficiency among proteinases and proteinase inhibitors, the AFM and docked BTCI analyses suggest that these assemblies may exist in situ to play their potential function in the oligomerization process.

  19. Machine learning approach identifies new pathways associated with demyelination in a viral model of multiple sclerosis

    Science.gov (United States)

    Ulrich, Reiner; Kalkuhl, Arno; Deschl, Ulrich; Baumgärtner, Wolfgang

    2010-01-01

    Theiler's murine encephalomyelitis is an experimentally virus-induced inflammatory demyelinating disease of the spinal cord, displaying clinical and pathological similarities to chronic progressive multiple sclerosis. The aim of this study was to identify pathways associated with chronic demyelination using an assumption-free combined microarray and immunohistology approach. Movement control as determined by rotarod assay significantly worsened in Theiler's murine encephalomyelitis virus-infected SJL/J mice from 42 to 196 days after infection (dpi). In the spinal cords, inflammatory changes were detected 14 to 196 dpi, and demyelination progressively increased from 42 to 196 dpi. Microarray analysis revealed 1001 differentially expressed genes over the study period. The dominating changes as revealed by k-means and functional annotation clustering included up-regulations related to intrathecal antibody production and antigen processing and presentation via major histocompatibility class II molecules. A random forest machine learning algorithm revealed that down-regulated lipid and cholesterol biosynthesis, differentially expressed neurite morphogenesis, and up-regulated toll-like receptor-4-induced pathways were intimately associated with demyelination as measured by immunohistology. In conclusion, although transcriptional changes were dominated by the adaptive immune response, the main pathways associated with demyelination included up-regulation of toll-like receptor 4 and down-regulation of cholesterol biosynthesis. Cholesterol biosynthesis is a rate-limiting step of myelination, and its down-regulation is suggested to be involved in chronic demyelination by an inhibition of remyelination. PMID:19183246
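
    The random-forest step, linking pathway-level predictors to a histological readout and ranking them, can be sketched with scikit-learn's permutation importance; the data shapes and synthetic values below are illustrative only:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 12))      # 40 samples x 12 pathway scores (synthetic)
    y = X[:, 3] - X[:, 7] + rng.normal(scale=0.3, size=40)  # stand-in histology score

    forest = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
    imp = permutation_importance(forest, X, y, n_repeats=30, random_state=0)
    ranking = np.argsort(imp.importances_mean)[::-1]
    print("pathways ranked by association with demyelination:", ranking[:4])
    ```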

  20. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast.

    Science.gov (United States)

    Poos, Alexandra M; Maicher, André; Dieckmann, Anna K; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-06-02

    Understanding telomere length maintenance mechanisms is central in cancer biology as their dysregulation is one of the hallmarks for immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine learning based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1 and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We compiled our machine learning method into a user-friendly package for R, which can straightforwardly be applied to similar problems integrating gene-regulator binding information and expression profiles of samples of, e.g., different phenotypes, diseases or treatments.
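
    Although the published tool is an R package, the MILP flavor of the approach is easy to convey in miniature: choose the fewest regulators whose documented bindings cover every gene with aberrant expression (a set-cover program). The binding matrix below is invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    B = np.array([[1, 0, 1],     # B[g, r] = 1 if regulator r binds gene g
                  [1, 1, 0],
                  [0, 1, 1]])
    n_reg = B.shape[1]
    res = milp(c=np.ones(n_reg),                        # minimize regulator count
               constraints=LinearConstraint(B, lb=1),   # cover every gene
               integrality=np.ones(n_reg),              # binary selection
               bounds=Bounds(0, 1))
    print("selected regulators:", np.flatnonzero(res.x > 0.5))
    ```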

  1. Identifying medication error chains from critical incident reports: a new analytic approach.

    Science.gov (United States)

    Huckels-Baumgart, Saskia; Manser, Tanja

    2014-10-01

    Research into the distribution of medication errors usually focuses on isolated stages within the medication use process. Our study aimed to provide a novel process-oriented approach to medication incident analysis focusing on medication error chains. Our study was conducted across a 900-bed teaching hospital in Switzerland. All 1,591 medication errors reported from 2009 to 2012 were categorized using the Medication Error Index NCC MERP and the WHO Classification for Patient Safety Methodology. In order to identify medication error chains, each reported medication incident was allocated to the relevant stage of the hospital medication use process. Only 25.8% of the reported medication errors were detected before they propagated through the medication use process. The majority of medication errors (74.2%) formed an error chain encompassing two or more stages. The most frequent error chain comprised preparation up to and including medication administration (45.2%). "Non-consideration of documentation/prescribing" during drug preparation was the most frequent contributor to "wrong dose" during the administration of medication. Medication error chains provide important insights for detecting and stopping medication errors before they reach the patient. Existing and new safety barriers need to be extended to interrupt error chains and to improve patient safety.
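
    A minimal sketch of the chain-detection logic described above, assuming incident records already annotated with the ordered stages each error passed through (stage names and records below are hypothetical):

```python
from collections import Counter

# Hypothetical incident records: the ordered medication-use stages
# (prescribing, transcription, preparation, administration) that each
# reported error passed through before being detected.
incidents = [
    ["preparation", "administration"],
    ["prescribing"],
    ["prescribing", "transcription", "preparation", "administration"],
    ["preparation", "administration"],
]

chains = [tuple(i) for i in incidents if len(i) >= 2]  # a chain spans >= 2 stages
print(f"{len(chains) / len(incidents):.0%} of errors formed a chain")
print("most frequent chain:", Counter(chains).most_common(1))
```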

  2. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast

    Science.gov (United States)

    Poos, Alexandra M.; Maicher, André; Dieckmann, Anna K.; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-01-01

    Understanding telomere length maintenance mechanisms is central to cancer biology, as their dysregulation is one of the hallmarks of the immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine-learning-based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1, and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We compiled our machine learning method into a user-friendly R package that can be applied straightforwardly to similar problems integrating gene regulator binding information and expression profiles of samples from, e.g., different phenotypes, diseases, or treatments. PMID:26908654

  3. A Novel Approach for Identifying the Heme-Binding Proteins from Mouse Tissues

    Institute of Scientific and Technical Information of China (English)

    Xiaolei Li; Rong Wang; Zhongsheng Sun; Zuyuan Xu; Jingyue Bao; Xiuqing Zhang; Xiaoli Feng; Siqi Liu; Xiaoshan Wang; Kang Zhao; Zhengfeng Zhou; Caifeng Zhao; Ren Yan; Liang Lin; Tingting Lei; Jianning Yin

    2003-01-01

    Heme is a key cofactor in aerobic life, both in eukaryotes and prokaryotes. Because of the high reactivity of ferrous protoporphyrin IX, the reactions of heme in cells are often carried out through heme-protein complexes. Traditionally, studies of heme-binding proteins have been approached on a case-by-case basis, so there is a limited global view of the distribution of heme-binding proteins in different cells or tissues. The procedure described here is aimed at profiling heme-binding proteins in mouse tissues sequentially by 1) purification of heme-binding proteins on heme-agarose, an affinity chromatographic resin; 2) separation of heme-binding proteins by SDS-PAGE or two-dimensional electrophoresis; 3) identification of heme-binding proteins by mass spectrometry. In five mouse tissues, over 600 protein spots were visualized on 2DE gels stained with Coomassie blue, and 154 proteins were identified by MALDI-TOF, most of which were heme-related. This methodology makes it possible to globally characterize the heme-binding proteins in a biological system.

  4. Identifying an appropriate measurement modeling approach for the Mini-Mental State Examination.

    Science.gov (United States)

    Rubright, Jonathan D; Nandakumar, Ratna; Karlawish, Jason

    2016-02-01

    The Mini-Mental State Examination (MMSE) is a 30-item, dichotomously scored test of general cognition. A number of benefits could be gained by modeling the MMSE in an item response theory (IRT) framework, as opposed to the currently used classical additive approach. However, the test, which is built from groups of items related to separate cognitive subdomains, may violate a key assumption of IRT: local item independence. This study aimed to identify the most appropriate measurement model for the MMSE: a unidimensional IRT model, a testlet response theory model, or a bifactor model. Local dependence analysis using nationally representative data showed a meaningful violation of the local item independence assumption, indicating multidimensionality. In addition, the testlet and bifactor models displayed superior fit indices over a unidimensional IRT model. Statistical comparisons showed that the bifactor model fit MMSE respondent data significantly better than the other models considered. These results suggest that application of a traditional unidimensional IRT model is inappropriate in this context. Instead, a bifactor model is suggested for future modeling of MMSE data as it more accurately represents the multidimensional nature of the scale.
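
    For reference, a schematic of the bifactor item response function favored by this study (a standard two-parameter bifactor parameterization; the symbols are generic, not taken from the paper):

```latex
% Bifactor 2PL: every MMSE item i loads on a general cognition factor
% \theta_g and on exactly one subdomain-specific factor \theta_{s(i)}
P\bigl(X_i = 1 \mid \theta_g, \theta_{s(i)}\bigr)
  = \frac{1}{1 + \exp\!\bigl[-\bigl(a_i\,\theta_g + a_{is}\,\theta_{s(i)} - b_i\bigr)\bigr]}
```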

  5. Identifying contamination with advanced visualization and analysis practices: metagenomic approaches for eukaryotic genome assemblies

    Directory of Open Access Journals (Sweden)

    Tom O. Delmont

    2016-03-01

    Full Text Available High-throughput sequencing provides a fast and cost-effective means to recover genomes of organisms from all domains of life. However, adequate curation of the assembly results against potential contamination by non-target organisms requires advanced bioinformatics approaches and practices. Here, we re-analyzed the sequencing data generated for the tardigrade Hypsibius dujardini and created a holistic display of the eukaryotic genome assembly using DNA data originating from two groups and eleven sequencing libraries. By using bacterial single-copy genes, k-mer frequencies, and coverage values of scaffolds, we could identify and characterize multiple near-complete bacterial genomes from the raw assembly and curate a 182 Mbp draft genome for H. dujardini supported by RNA-Seq data. Our results indicate that most contaminant scaffolds were assembled from Moleculo long-read libraries, and that most of these contaminants differed between library preparations. Our re-analysis shows that visualization and curation of eukaryotic genome assemblies can benefit from tools designed to address the needs of today’s microbiologists, who are constantly challenged by the difficulties associated with the identification of distinct microbial genomes in complex environmental metagenomes.
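
    A toy sketch of two of the curation signals named above, k-mer composition and coverage (the sequences and coverage values below are fabricated placeholders; real curation combines these with single-copy gene counts and interactive visualization):

```python
from itertools import product
import numpy as np

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
INDEX = {k: i for i, k in enumerate(KMERS)}

def tetranucleotide_freq(seq):
    """Normalised sliding-window 4-mer frequency vector for one scaffold."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - 3):
        j = INDEX.get(seq[i:i + 4])
        if j is not None:
            counts[j] += 1
    return counts / max(counts.sum(), 1.0)

# Fabricated scaffolds: (sequence, mean read coverage)
scaffolds = {"scf1": ("ACGT" * 500, 38.0),
             "scf2": ("ACGG" * 500, 41.0),
             "scf3": ("GGCC" * 500, 310.0)}

profiles = {name: tetranucleotide_freq(s) for name, (s, _) in scaffolds.items()}
ref = profiles["scf1"]
for name, p in profiles.items():
    print(name, "k-mer distance to scf1:", round(float(np.linalg.norm(p - ref)), 3))

# Scaffolds whose coverage departs strongly from the assembly median are
# candidate contaminant bins, to be confirmed with the other signals.
median = np.median([c for _, c in scaffolds.values()])
for name, (_, c) in scaffolds.items():
    if c > 3 * median or c < median / 3:
        print(f"{name}: coverage {c:.0f}x vs median {median:.0f}x -> inspect")
```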

  6. Identifying contamination with advanced visualization and analysis practices: metagenomic approaches for eukaryotic genome assemblies

    Science.gov (United States)

    Delmont, Tom O.

    2016-01-01

    High-throughput sequencing provides a fast and cost-effective means to recover genomes of organisms from all domains of life. However, adequate curation of the assembly results against potential contamination by non-target organisms requires advanced bioinformatics approaches and practices. Here, we re-analyzed the sequencing data generated for the tardigrade Hypsibius dujardini and created a holistic display of the eukaryotic genome assembly using DNA data originating from two groups and eleven sequencing libraries. By using bacterial single-copy genes, k-mer frequencies, and coverage values of scaffolds, we could identify and characterize multiple near-complete bacterial genomes from the raw assembly and curate a 182 Mbp draft genome for H. dujardini supported by RNA-Seq data. Our results indicate that most contaminant scaffolds were assembled from Moleculo long-read libraries, and that most of these contaminants differed between library preparations. Our re-analysis shows that visualization and curation of eukaryotic genome assemblies can benefit from tools designed to address the needs of today’s microbiologists, who are constantly challenged by the difficulties associated with the identification of distinct microbial genomes in complex environmental metagenomes. PMID:27069789

  7. A Novel Approach for Identifying the Heme—Binding Proteins from Mouse Tissues

    Institute of Scientific and Technical Information of China (English)

    Xiaolei Li; Xiaoshan Wang; Kang Zhao; Zhengfeng Zhou; Caifeng Zhao; Ren Yan; Liang Lin; Tingting Lei; Jianning Yin; Rong Wang; Zhongsheng Sun; Zuyuan Xu; Jingyue Bao; Xiuqing Zhang; Xiaoli Feng; Siqi Liu

    2003-01-01

    Heme is a key cofactor in aerobic life, both in eukaryotes and prokaryotes. Because of the high reactivity of ferrous protoporphyrin IX, the reactions of heme in cells are often carried out through heme-protein complexes. Traditionally, studies of heme-binding proteins have been approached on a case-by-case basis, so there is a limited global view of the distribution of heme-binding proteins in different cells or tissues. The procedure described here is aimed at profiling heme-binding proteins in mouse tissues sequentially by 1) purification of heme-binding proteins on heme-agarose, an affinity chromatographic resin; 2) separation of heme-binding proteins by SDS-PAGE or two-dimensional electrophoresis; 3) identification of heme-binding proteins by mass spectrometry. In five mouse tissues, over 600 protein spots were visualized on 2DE gels stained with Coomassie blue, and 154 proteins were identified by MALDI-TOF, most of which were heme-related. This methodology makes it possible to globally characterize the heme-binding proteins in a biological system.

  8. A statistical approach for identifying nuclear waste glass compositions that will meet quality and processability requirements

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, G.F.

    1990-09-01

    Borosilicate glass provides a solid, stable medium for the disposal of high-level radioactive wastes resulting from the production of nuclear materials for United States defense needs. The glass must satisfy various quality and processability requirements on properties such as chemical durability, viscosity, and electrical conductivity. These properties depend on the composition of the waste glass, which will vary during production due to variations in nuclear waste composition and variations in the glass-making process. This paper discusses the experimentally-based statistical approach being used in the Hanford Waste Vitrification Plant (HWVP) Composition Variability Study (CVS). The overall goal of the CVS is to identify the composition region of potential HWVP waste glasses that satisfy with high confidence the applicable quality and processability requirements. This is being accomplished by melting and obtaining property data for simulated nuclear waste glasses of various compositions, and then statistically developing models and other tools needed to meet the goal. 6 refs., 1 fig., 5 tabs.
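
    Property-composition models of this kind are commonly fit as linear mixture models in the component mass fractions (no intercept term, since the fractions sum to one). A hedged sketch with fabricated compositions and responses:

```python
import numpy as np

# Fabricated data: rows are glass compositions (mass fractions of, say,
# SiO2, B2O3, Na2O, and "others"; each row sums to 1), y is ln(viscosity).
X = np.array([
    [0.50, 0.10, 0.10, 0.30],
    [0.45, 0.15, 0.12, 0.28],
    [0.55, 0.08, 0.09, 0.28],
    [0.48, 0.12, 0.15, 0.25],
    [0.52, 0.11, 0.08, 0.29],
])
y = np.array([4.1, 3.6, 4.6, 3.4, 4.3])

# Scheffe-style linear mixture model: ln(property) = sum_i b_i * x_i
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("component coefficients:", b.round(2))
print("predicted ln(viscosity):", (X @ b).round(2))
```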

  9. What influences motivation in Physical Education? A multilevel approach for identifying climate determinants of achievement motivation

    Directory of Open Access Journals (Sweden)

    Benjamin Niederkofler

    2015-03-01

    Full Text Available The present research tested the longitudinal and hierarchical influences of students' climate perception on the development of achievement motives in Physical Education (PE). Students from Switzerland (N = 919; 45 classes; 50.1% female; age: M = 13.2, SD = 0.6) responded to the questionnaire. Perceived climate was measured using the German LASSO scales (Von Saldern & Littig, 1987), namely teacher care, classmate cooperativeness, and satisfaction with teaching. To assess sport-specific achievement motives (Hope of Success, HS; Fear of Failure, FF), we used a validated German scale from Elbe, Wenhold, and Müller (2005). Multilevel analysis revealed a link between perceived climate and change in students' motivation in PE. The investigation also identified factors in the classroom environment and in teachers that determine motivation decline. Moreover, results showed significant gender effects on both motives and a significant impact of individual teacher care on HS. This was also found for individual and aggregated satisfaction with teaching; the latter was significant for FF on both levels. Interestingly, teacher care showed inhibitory effects on both achievement motives. These findings suggest that students in PE may show unique behaviour that requires a different teaching approach than in a normal classroom, reflecting the specific learning environment of PE classes. Results are discussed based on students' unique needs and gender effects.
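
    Analyses of this type nest students within classes. As a rough sketch (not the authors' model; variable names and data are invented), a random-intercept multilevel model in Python:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented long-format data: one row per student, students nested in classes.
df = pd.DataFrame({
    "hope_of_success": [3.2, 2.8, 3.9, 3.1, 2.5, 3.4, 3.0, 3.7, 2.6, 3.3, 2.9, 3.5],
    "teacher_care":    [4.0, 3.5, 4.5, 3.8, 2.9, 4.1, 3.6, 4.4, 3.0, 4.0, 3.3, 4.2],
    "female":          [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "class_id":        [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3],
})

# Random-intercept model: level-1 students within level-2 classes, mirroring
# the climate -> achievement-motive analysis (hope of success as outcome).
model = smf.mixedlm("hope_of_success ~ teacher_care + female",
                    df, groups=df["class_id"])
print(model.fit().summary())
```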

  10. Gene networks associated with conditional fear in mice identified using a systems genetics approach

    Directory of Open Access Journals (Sweden)

    Eskin Eleazar

    2011-03-01

    Full Text Available Abstract Background Our understanding of the genetic basis of learning and memory remains shrouded in mystery. To explore the genetic networks governing the biology of conditional fear, we used a systems genetics approach to analyze a hybrid mouse diversity panel (HMDP) with high mapping resolution. Results A total of 27 behavioral quantitative trait loci were mapped with a false discovery rate of 5%. By integrating fear phenotypes, transcript profiling data from hippocampus and striatum, and genotype information, two gene co-expression networks correlated with context-dependent immobility were identified. We prioritized the key markers and genes in these pathways using intramodular connectivity measures and structural equation modeling. Highly connected genes in the context fear modules included Psmd6, Ube2a and Usp33, suggesting an important role for ubiquitination in learning and memory. In addition, we surveyed the architecture of brain transcript regulation and demonstrated preservation of gene co-expression modules in hippocampus and striatum, while also highlighting important differences. Rps15a, Kif3a, Stard7, 6330503K22RIK, and Plvap were among the individual genes whose transcript abundance was strongly associated with fear phenotypes. Conclusion Application of our multi-faceted mapping strategy permits an increasingly detailed characterization of the genetic networks underlying behavior.
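
    Hub genes such as those named above are typically identified by intramodular connectivity, the column sum of a soft-thresholded co-expression adjacency matrix. A minimal sketch with synthetic data (the soft-threshold power beta is an assumed value):

```python
import numpy as np

rng = np.random.default_rng(2)
expr = rng.normal(size=(60, 8))          # 60 mice x 8 module genes (synthetic)

# WGCNA-style adjacency: soft-thresholded absolute gene-gene correlation
beta = 6
adj = np.abs(np.corrcoef(expr, rowvar=False)) ** beta
np.fill_diagonal(adj, 0.0)

# Intramodular connectivity: total adjacency of each gene within the module
kwithin = adj.sum(axis=0)
hubs = np.argsort(kwithin)[::-1]
print("hub genes (most connected first):", hubs)
```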

  11. Identifying contamination with advanced visualization and analysis practices: metagenomic approaches for eukaryotic genome assemblies.

    Science.gov (United States)

    Delmont, Tom O; Eren, A Murat

    2016-01-01

    High-throughput sequencing provides a fast and cost-effective means to recover genomes of organisms from all domains of life. However, adequate curation of the assembly results against potential contamination by non-target organisms requires advanced bioinformatics approaches and practices. Here, we re-analyzed the sequencing data generated for the tardigrade Hypsibius dujardini and created a holistic display of the eukaryotic genome assembly using DNA data originating from two groups and eleven sequencing libraries. By using bacterial single-copy genes, k-mer frequencies, and coverage values of scaffolds, we could identify and characterize multiple near-complete bacterial genomes from the raw assembly and curate a 182 Mbp draft genome for H. dujardini supported by RNA-Seq data. Our results indicate that most contaminant scaffolds were assembled from Moleculo long-read libraries, and that most of these contaminants differed between library preparations. Our re-analysis shows that visualization and curation of eukaryotic genome assemblies can benefit from tools designed to address the needs of today's microbiologists, who are constantly challenged by the difficulties associated with the identification of distinct microbial genomes in complex environmental metagenomes.

  12. Pharmacy patronage: identifying key factors in the decision making process using the determinant attribute approach.

    Science.gov (United States)

    Franic, Duska M; Haddock, Sarah M; Tucker, Leslie Tootle; Wooten, Nathan

    2008-01-01

    To use the determinant attribute approach, a research method commonly used in marketing to identify the wants of various consumer groups, to evaluate consumer pharmacy choice when having a prescription order filled in different pharmacy settings. Cross-sectional. Community independent, grocery store, community chain, and discount store pharmacies in Georgia between April 2005 and April 2006. Convenience sample of adult pharmacy consumers (n = 175). Survey measuring consumer preferences on 26 attributes encompassing general pharmacy site features (16 items), pharmacist characteristics (5 items), and pharmacy staff characteristics (5 items). 26 potential determinant attributes for pharmacy selection. 175 consumers were surveyed at community independent (n = 81), grocery store (n = 44), community chain (n = 27), or discount store (n = 23) pharmacy settings. The attributes of pharmacists and staff at all four pharmacy settings were shown to affect pharmacy patronage motives, although consumers frequenting non-community independent pharmacies were also motivated by secondary convenience factors, e.g., hours of operation and prescription coverage. Most consumers do not perceive pharmacies as merely prescription-distribution centers that vary only by convenience. Prescriptions are not just another economic good. Pharmacy personnel influence pharmacy selection; therefore, optimal staff selection and training is likely the greatest asset and most important investment for ensuring pharmacy success.
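
    In the determinant attribute approach, each attribute is scored by combining its rated importance with the perceived difference between alternatives; attributes high on both are "determinant" for patronage. A toy sketch with invented attribute names and ratings:

```python
import numpy as np

# Invented 5-point ratings: rows are consumers, columns are attributes.
attributes = ["pharmacist courtesy", "hours of operation", "wait time"]
importance = np.array([[5, 4, 3], [5, 5, 4], [4, 4, 3]])   # how important?
difference = np.array([[4, 2, 3], [5, 1, 4], [4, 2, 3]])   # how much do pharmacies differ?

# Determinance score: mean importance x mean perceived difference
determinance = importance.mean(axis=0) * difference.mean(axis=0)
for name, score in sorted(zip(attributes, determinance), key=lambda t: -t[1]):
    print(f"{name:22s} {score:.1f}")
```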

  13. Combined metagenomic and phenomic approaches identify a novel salt tolerance gene from the human gut microbiome.

    Science.gov (United States)

    Culligan, Eamonn P; Marchesi, Julian R; Hill, Colin; Sleator, Roy D

    2014-01-01

    In the current study, a number of salt-tolerant clones previously isolated from a human gut metagenomic library were screened using Phenotype MicroArray (PM) technology to assess their functional capacity. PMs can be used to study gene function, pathogenicity, and metabolic capacity and to identify drug targets using a series of specialized microtitre plate assays, where each well of the microtitre plate contains a different set of conditions and tests a different phenotype. Cellular respiration is monitored colorimetrically by the reduction of a tetrazolium dye. One clone, SMG 9, was found to be positive for utilization/transport of L-carnitine (a well-characterized osmoprotectant) in the presence of 6% w/v sodium chloride (NaCl). Subsequent experiments revealed a significant growth advantage in minimal media containing NaCl and L-carnitine. Fosmid sequencing revealed putative candidate genes responsible for the phenotype. Subsequent cloning of two genes did not replicate the L-carnitine-associated phenotype, although one of the genes, a σ(54)-dependent transcriptional regulator, did confer salt tolerance to Escherichia coli when expressed in isolation. The original clone, SMG 9, was subsequently found to have lost the originally observed phenotype upon further investigation. Nevertheless, this study demonstrates the usefulness of a phenomic approach to assign a functional role to metagenome-derived clones.

  14. A new experimental approach for studying bacterial genomic island evolution identifies island genes with bacterial host-specific expression patterns

    Directory of Open Access Journals (Sweden)

    Nickerson Cheryl A

    2006-01-01

    Full Text Available Abstract Background Genomic islands are regions of bacterial genomes that have been acquired by horizontal transfer and often contain blocks of genes that function together for specific processes. Recently, it has become clear that the impact of genomic islands on the evolution of different bacterial species is significant and represents a major force in establishing bacterial genomic variation. However, the study of genomic island evolution has mostly been performed at the sequence level, using computer software or hybridization analysis to compare different bacterial genomic sequences. We describe here a novel experimental approach to study the evolution of species-specific bacterial genomic islands that identifies island genes that have evolved in such a way that they are differentially expressed depending on the bacterial host background into which they are transferred. Results We demonstrate this approach by using a "test" genomic island that we have cloned from the Salmonella typhimurium genome (island 4305) and transferred to a range of Gram-negative bacterial hosts of differing evolutionary relationships to S. typhimurium. Systematic analysis of the expression of the island genes in the different hosts, compared to proper controls, allowed identification of genes with genera-specific expression patterns. The data from the analysis can be arranged in a matrix to give an expression "array" of the island genes in the different bacterial backgrounds. A conserved 19-bp DNA site was found upstream of at least two of the differentially expressed island genes. To our knowledge, this is the first systematic analysis of horizontally transferred genomic island gene expression in a broad range of Gram-negative hosts. We also present evidence in this study that the IS200 element found in island 4305 in S. typhimurium strain LT2 was inserted after the island had already been acquired by the S. typhimurium lineage and that this element is likely not

  15. Identifying and establishing consensus on the most important safety features of GP computer systems: e-Delphi study

    Directory of Open Access Journals (Sweden)

    Anthony Avery

    2005-03-01

    Full Text Available Our objective was to identify and establish consensus on the most important safety features of GP computer systems, with a particular emphasis on medicines management. We used a two-round electronic Delphi survey, completed by a 21-member multidisciplinary expert panel, all from the UK. The main outcome measure was percentage agreement of the panel members on the importance of the presence of a number of different safety features (presented as clinical statements) on GP computer systems. We found 90% or greater agreement on the importance of 32 (58%) statements. These statements, indicating issues considered to be of considerable importance (rated as important or very important), related to: computerised alerts; the need to avoid spurious alerts; making it difficult to override critical alerts; having audit trails of such overrides; support for safe repeat prescribing; an effective computer-user interface; the importance of call and recall management; and the need to be able to run safety reports. The high level of agreement among the expert panel members indicates clear themes and priorities that need to be addressed in any further improvement of safety features in primary care computing systems.

  16. A modeling approach to identify the effective forcing exerted by wind on a prealpine lake surrounded by a complex topography

    Science.gov (United States)

    Valerio, G.; Cantelli, A.; Monti, P.; Leuzzi, G.

    2017-05-01

    The representation of spatial wind distribution is recognized as a serious difficulty when modeling the hydrodynamics of lakes surrounded by a complex topography. To address this issue, we propose to force a 3-D lake model with the wind field simulated by a high-resolution atmospheric model, considering as a case study a 61 km2 prealpine lake surrounded by mountain ranges that reach 1800 m above the lake's surface, where a comprehensive data set was available in the stratified season. The improved distributed description of the wind stress over the lake surface led to a significant enhancement in the representation of the main basin-scale internal wave motions, and hence provided a reference solution to test the use of simplified approaches. Moreover, the analysis of the power exerted by the computed wind field enabled us to identify measuring stations that provide suitable wind data to be applied uniformly on the lake surface in long-term simulations. Accordingly, the proposed methodology can contribute to reducing the uncertainties associated with the definition of wind forcing for modeling purposes and can provide a rational criterion for installing representative measurement locations in prealpine lakes.
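
    The power exerted by the wind on the surface can be approximated from a bulk stress formula, tau = rho_air * Cd * U^2, so the rate of working scales with U^3. A minimal sketch, with hypothetical station wind speeds and typical constant values (not taken from the study):

```python
import numpy as np

def wind_power_input(u10, rho_air=1.2, cd=1.3e-3):
    """Rate of working of the wind on the water surface (W m^-2):
    P ~ tau * U = rho_air * Cd * U^3, with a constant bulk drag coefficient."""
    u10 = np.asarray(u10, dtype=float)
    return rho_air * cd * np.abs(u10) ** 3

# Hypothetical 10-m wind speeds (m/s) near two candidate measuring stations
station_a = [2.1, 3.4, 5.0, 4.2]
station_b = [1.0, 1.2, 0.8, 1.1]
for name, u in [("A", station_a), ("B", station_b)]:
    print(f"station {name}: mean wind power input "
          f"{wind_power_input(u).mean() * 1e3:.2f} mW m^-2")
```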

  17. A uniform approach for programming distributed heterogeneous computing systems.

    Science.gov (United States)

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  18. Cognitive control in majority search: A computational modeling approach

    Directory of Open Access Journals (Sweden)

    Hongbin Wang

    2011-02-01

    Full Text Available Despite the importance of cognitive control in many cognitive tasks involving uncertainty, the computational mechanisms of cognitive control in response to uncertainty remain unclear. In this study, we develop biologically realistic neural network models to investigate the instantiation of cognitive control in a majority function task, where one determines the category to which the majority of items in a group belong. Two models are constructed, both of which include the same set of modules representing task-relevant brain functions and share the same model structure. However, with a critical change of a model parameter setting, the two models implement two different underlying algorithms: one for grouping search (where a subgroup of items are sampled and re-sampled until a congruent sample is found) and the other for self-terminating search (where the items are scanned and counted one-by-one until the majority is decided). The two algorithms hold distinct implications for the involvement of cognitive control. The modeling results show that while both models are able to perform the task, the grouping search model fit the human data better than the self-terminating search model. An examination of the dynamics underlying model performance reveals how cognitive control might be instantiated in the brain via the V4-ACC-LPFC-IPS loop for computing the majority function.

  19. One-loop kink mass shifts: a computational approach

    CERN Document Server

    Alonso-Izquierdo, Alberto

    2011-01-01

    In this paper we develop a procedure to compute the one-loop quantum correction to the kink masses in generic (1+1)-dimensional one-component scalar field theory models. The procedure uses the generalized zeta function regularization method, aided by the Gilkey-de Witt asymptotic expansion of the heat kernel via the Mellin transform. We find a formula for the one-loop kink mass shift that depends only on the part of the energy density with no field derivatives, evaluated by means of a symbolic software algorithm that automates the computation. The improved algorithm, with respect to earlier work on this subject, has been tested in the sine-Gordon and $\lambda(\phi^4)_2$ models. The quantum corrections of the sG-soliton and $\lambda(\phi^4)_2$-kink masses have been estimated with relative errors of 0.00006% and 0.00007%, respectively. Thereafter, the algorithm is applied to other models. In particular, an interesting one-parametric family of double sine-Gordon models interpolating between the ordinary sine-...
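
    Schematically (the notation here is assumed, not quoted from the paper), the method trades the divergent sum over fluctuation frequencies for generalized zeta functions of the kink (K) and vacuum (V) fluctuation operators:

```latex
% Schematic only; symbols assumed. One-loop shift as a regularised mode sum:
\Delta M = \frac{\hbar}{2}\sum_n \bigl(\omega_n - \omega_n^{(0)}\bigr)
\;\longrightarrow\;
\frac{\hbar\mu}{2}\Bigl[\zeta_K\!\bigl(-\tfrac{1}{2}\bigr)
                      - \zeta_V\!\bigl(-\tfrac{1}{2}\bigr)\Bigr]
+ \text{counterterm},
\qquad
\zeta_A(s) = \operatorname{Tr} A^{-s} = \sum_n \bigl(\omega_n^2\bigr)^{-s}
% with the zeta functions evaluated via the Gilkey-de Witt heat-kernel
% expansion and a Mellin transform.
```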

  20. Cognitive control in majority search: a computational modeling approach.

    Science.gov (United States)

    Wang, Hongbin; Liu, Xun; Fan, Jin

    2011-01-01

    Despite the importance of cognitive control in many cognitive tasks involving uncertainty, the computational mechanisms of cognitive control in response to uncertainty remain unclear. In this study, we develop biologically realistic neural network models to investigate the instantiation of cognitive control in a majority function task, where one determines the category to which the majority of items in a group belong. Two models are constructed, both of which include the same set of modules representing task-relevant brain functions and share the same model structure. However, with a critical change of a model parameter setting, the two models implement two different underlying algorithms: one for grouping search (where a subgroup of items are sampled and re-sampled until a congruent sample is found) and the other for self-terminating search (where the items are scanned and counted one-by-one until the majority is decided). The two algorithms hold distinct implications for the involvement of cognitive control. The modeling results show that while both models are able to perform the task, the grouping search model fit the human data better than the self-terminating search model. An examination of the dynamics underlying model performance reveals how cognitive control might be instantiated in the brain for computing the majority function.
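
    A toy sketch of the two algorithms contrasted above (a simplified reading of the task, not the neural network models themselves):

```python
import random

def self_terminating_search(items):
    """Scan and count items one-by-one; stop as soon as one category
    is guaranteed to be the majority."""
    counts, threshold = {}, len(items) // 2 + 1
    for x in items:
        counts[x] = counts.get(x, 0) + 1
        if counts[x] >= threshold:
            return x
    return max(counts, key=counts.get)

def grouping_search(items, group_size=3, rng=random):
    """Repeatedly sample a small subgroup until all its members agree
    (a congruent sample), then report that category."""
    while True:
        sample = rng.sample(items, group_size)
        if len(set(sample)) == 1:
            return sample[0]

trial = ["left"] * 3 + ["right"] * 2
print(self_terminating_search(trial), grouping_search(trial))
```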