WorldWideScience

Sample records for based computational approach

  1. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses has rendered signature-based computer virus detection techniques ineffective. Many novel computer virus detection approaches have been proposed to cope with this ineffectiveness, mainly classified into three categories: static, dynamic and heuristic techniques. Owing to the natural similarities between the biological immune system (BIS) and the computer security system (CSS), the artificial immune system (AIS) was developed as a new prototype in the anti-virus research community. The immune mechanisms in the BIS provide the opportunity to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune-based computer virus detection approaches are discussed in detail. Promising experimental results suggest that immune-based approaches are able to detect new variants and unseen viruses at lower false positive rates, which has paved a new way for anti-virus research.

  2. A polyhedral approach to computing border bases

    CERN Document Server

    Braun, Gábor

    2009-01-01

    Border bases can be considered the natural extension of Gröbner bases, and they have several advantages. Unfortunately, to date the classical border basis algorithm relies on (degree-compatible) term orderings and implicitly on reduced Gröbner bases. We adapt the classical border basis algorithm to allow for calculating border bases for arbitrary degree-compatible order ideals, which is independent of term orderings. Moreover, the algorithm also supports calculating degree-compatible order ideals with preference on contained elements, even though finding a preferred order ideal is NP-hard. Effectively we retain degree-compatibility only to successively extend our computation degree-by-degree. The adaptation is based on our polyhedral characterization: order ideals that support a border basis correspond one-to-one to integral points of the order ideal polytope. This establishes a crucial connection between the ideal and the combinatorial structure of the associated factor spaces.

  3. A Computationally Based Approach to Homogenizing Advanced Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We discuss this approach as applied both to Ni-based superalloys and to the computationally more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and the subsequent verification with real castings are presented.
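
    To make the homogenization-kinetics idea concrete, here is a minimal sketch using the classical textbook estimate for the decay of a sinusoidal interdendritic segregation profile during an isothermal hold; the formula and all numbers are illustrative assumptions, not the Scheil/DICTRA calculation reported above.

```python
import math

# Textbook estimate (an assumption here, not the DICTRA model itself):
#   delta(t) = delta0 * exp(-4 * pi**2 * D * t / L**2),
# where D is the solute diffusivity and L the dendrite arm spacing.
def residual_segregation(delta0, diffusivity_m2_s, hold_time_s, arm_spacing_m):
    exponent = -4.0 * math.pi ** 2 * diffusivity_m2_s * hold_time_s / arm_spacing_m ** 2
    return delta0 * math.exp(exponent)

# e.g., D = 1e-14 m^2/s (slow substitutional diffusion), 50 um spacing, 24 h hold
print(residual_segregation(1.0, 1e-14, 24 * 3600, 50e-6))
```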

  4. [Computer work and De Quervain's tenosynovitis: an evidence based approach].

    Science.gov (United States)

    Gigante, M R; Martinotti, I; Cirla, P E

    2012-01-01

    The debate on the role of personal computer work as a cause of De Quervain's tenosynovitis has developed only partially, without considering the available multidisciplinary data. A systematic review of the literature, using an evidence-based approach, was performed. Among disorders associated with the use of VDUs, we must distinguish those of the upper limbs and, among them, those related to overload. Experimental studies on the occurrence of De Quervain's tenosynovitis are quite limited, and clinically it is quite difficult to prove a professional etiology, considering the interference of other activities of daily living and of biological susceptibility (i.e. anatomical variability, sex, age, exercise). At present there is no evidence of any connection between De Quervain syndrome and time spent using a personal computer or keyboard; limited evidence of a correlation is found with time using a mouse. No data are available regarding exclusive or predominant use of laptops or mobile smartphones.

  5. Computational approaches to substrate-based cell motility

    Science.gov (United States)

    Ziebert, Falko; Aranson, Igor S.

    2016-07-01

    Substrate-based crawling motility of eukaryotic cells is essential for many biological functions, both in developing and mature organisms. Motility dysfunctions are involved in several life-threatening pathologies such as cancer and metastasis. Motile cells are also a natural realisation of active, self-propelled 'particles', a popular research topic in nonequilibrium physics. Finally, from the materials perspective, assemblies of motile cells and evolving tissues constitute a class of adaptive self-healing materials that respond to the topography, elasticity and surface chemistry of the environment and react to external stimuli. Although a comprehensive understanding of substrate-based cell motility remains elusive, progress has been achieved recently in its modelling on the whole-cell level. Here we survey the most recent advances in computational approaches to cell movement and demonstrate how these models improve our understanding of complex self-organised systems such as living cells.

  6. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced methods.

  7. A spline-based approach for computing spatial impulse responses.

    Science.gov (United States)

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.
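
    The abstract describes the response as a sum of polynomial functions, which makes evaluation straightforward; a minimal sketch of evaluating such a representation follows, with placeholder coefficients (the actual DELFI polynomials are not given here).

```python
import numpy as np

# Evaluate a response expressed as a sum of polynomial terms via np.polyval
# (Horner's rule). The coefficients below are placeholders, not paper values.
coefficient_sets = [
    np.array([0.0, 1.0, -0.5]),   # x - 0.5
    np.array([0.2, 0.0, 0.1]),    # 0.2*x**2 + 0.1
]

def response(x):
    # Sum the polynomial contributions at each evaluation point.
    return sum(np.polyval(c, x) for c in coefficient_sets)

print(response(np.linspace(0.0, 1.0, 5)))
```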

  8. Computational challenges of structure-based approaches applied to HIV.

    Science.gov (United States)

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  9. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate.
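
    For reference, the two gates named above in their standard textbook matrix form, applied to the state |00> (this is the conventional linear-algebra view, not the event-based machinery of the paper):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # controlled-NOT gate

state = np.array([1.0, 0.0, 0.0, 0.0])            # |00>
state = np.kron(H, np.eye(2)) @ state             # Hadamard on the first qubit
state = CNOT @ state                              # entangle: Bell state
print(state)                                      # [0.707, 0, 0, 0.707]
```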

  10. Distance Based Asynchronous Recovery Approach In Mobile Computing Environment

    Directory of Open Access Journals (Sweden)

    Yogita Khatri,

    2012-06-01

    Full Text Available A mobile computing system is a distributed system in which at least one of the processes is mobile. Such systems are constrained by a lack of stable storage, low network bandwidth, mobility, frequent disconnection and limited battery life. Checkpointing is one of the techniques commonly used to provide fault tolerance in mobile computing environments. To suit the mobile environment, a distance-based recovery scheme is proposed which is based on checkpointing and message logging. After the system recovers from failures, only the failed processes roll back and restart from their respective recent checkpoints, independently of the others. The salient feature of this scheme is that it reduces transfer and recovery cost: while the mobile host moves within a specific range, recovery information is not moved, and it is transferred to a nearby station only when the mobile host moves out of that range.

  11. An engineering based approach for hydraulic computations in river flows

    Science.gov (United States)

    Di Francesco, S.; Biscarini, C.; Pierleoni, A.; Manciola, P.

    2016-06-01

    This paper presents an engineering-based approach to hydraulic risk evaluation. The aim of the research is to identify criteria for choosing the simplest model appropriate to different scenarios, as the characteristics of the main river channel vary. The complete flow field, generally expressed in terms of pressure, velocities and accelerations, can be described through a three-dimensional approach that considers the flow properties varying in all directions. In many practical applications of river flow studies, however, the greatest changes occur only in two dimensions or even in one. In these cases the use of simplified approaches can lead to accurate results, with simulations that are easier to build and faster to run. The study has been conducted taking into account a dimensionless channel parameter, the ratio of the curvature radius to the channel width (R/B).
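
    A minimal sketch of how such a criterion might drive model selection; the R/B thresholds below are illustrative assumptions, not the calibrated values from the study.

```python
# Use the curvature-to-width ratio R/B to choose the simplest adequate model.
# Threshold values are illustrative only.
def choose_model(curvature_radius_m: float, channel_width_m: float) -> str:
    """Return the simplest model class suggested by the R/B ratio."""
    ratio = curvature_radius_m / channel_width_m
    if ratio > 10.0:      # nearly straight reach: 1D is usually adequate
        return "1D"
    elif ratio > 2.0:     # moderate curvature: depth-averaged 2D
        return "2D"
    return "3D"           # strong curvature: fully 3D flow field

print(choose_model(curvature_radius_m=500.0, channel_width_m=40.0))  # -> 1D
```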

  12. Integrating structure-based and ligand-based approaches for computational drug design.

    Science.gov (United States)

    Wilson, Gregory L; Lill, Markus A

    2011-04-01

    Methods utilized in computer-aided drug design can be classified into two major categories: structure based and ligand based, using information on the structure of the protein or on the biological and physicochemical properties of bound ligands, respectively. In recent years there has been a trend towards integrating these two methods in order to enhance the reliability and efficiency of computer-aided drug-design approaches by combining information from both the ligand and the protein. This trend resulted in a variety of methods that include: pseudoreceptor methods, pharmacophore methods, fingerprint methods and approaches integrating docking with similarity-based methods. In this article, we will describe the concepts behind each method and selected applications.

  13. A Hybrid Approach Towards Intrusion Detection Based on Artificial Immune System and Soft Computing

    CERN Document Server

    Sanyal, Sugata

    2012-01-01

    A number of works in the field of intrusion detection have been based on Artificial Immune Systems and Soft Computing. Artificial Immune System based approaches attempt to leverage the adaptability, error tolerance, self-monitoring and distributed nature of human immune systems, whereas Soft Computing based approaches are instrumental in developing fuzzy rule based systems for detecting intrusions. The latter are computationally intensive and apply machine learning (both supervised and unsupervised) techniques to detect intrusions in a given system. A combination of these two approaches could provide significant advantages for intrusion detection. In this paper we attempt to leverage the adaptability of Artificial Immune Systems and the computation-intensive nature of Soft Computing to develop a system that can effectively detect intrusions in a given network.

  14. Engineering interrelated electricity markets. An agent-based computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Weidlich, Anke [Mannheim Univ. (Germany). Dieter Schwarz Chair of Business Administration and Information Systems

    2008-07-01

    Due to the characteristics of electricity, power markets rank among the most complex markets operated at present. The requirements of an environmentally sustainable, economically efficient, and secure energy supply have resulted in the emergence of several interrelated markets that have to be carefully engineered in order to ensure efficient market outcomes. This book presents an agent-based simulation model that facilitates electricity market research. Simulation outcomes from this model are validated against price data from German power markets. The results significantly contribute to existing research in agent-based simulation and electricity market modeling, and provide insights into the impact of the market structure and market design on electricity prices. The book addresses researchers, lecturers and students who are interested in applying agent-based simulation to power markets. It provides a thorough discussion of the methodology and helpful details for model implementation. (orig.)

  15. Access Control for Agent-based Computing: A Distributed Approach.

    Science.gov (United States)

    Antonopoulos, Nick; Koukoumpetsos, Kyriakos; Shafarenko, Alex

    2001-01-01

    Discusses the mobile software agent paradigm that provides a foundation for the development of high performance distributed applications and presents a simple, distributed access control architecture based on the concept of distributed, active authorization entities (lock cells), any combination of which can be referenced by an agent to provide…

  16. Novel Approach to Content Based Image Retrieval Using Evolutionary Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    2014-08-01

    Full Text Available Content Based Image Retrieval (CBIR) is an active research area in the multimedia domain in this era of information technology. One of the challenges of CBIR is to bridge the gap between low-level features and high-level semantics. In this study we investigate Particle Swarm Optimization (PSO), a stochastic algorithm, and the Genetic Algorithm (GA) for CBIR to overcome this drawback. We propose a new CBIR system based on PSO and GA coupled with a Support Vector Machine (SVM). GA and PSO are both evolutionary algorithms and are used here to increase the number of relevant images; the SVM performs the final classification. To check the performance of the proposed technique, extensive experiments were performed using the Corel dataset. The proposed technique achieves higher accuracy compared to previously introduced techniques (FEI, FIRM, SIMPLIcity, simple HIST and WH).
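
    A minimal sketch of the final SVM classification stage only, using random vectors as stand-ins for the extracted image features (the PSO/GA relevance-feedback loop is omitted):

```python
import numpy as np
from sklearn.svm import SVC

# Random feature vectors stand in for the colour/texture features a real
# CBIR front end would extract; labels are a toy relevance assignment.
rng = np.random.default_rng(4)
X = rng.standard_normal((60, 16))            # 60 images, 16-dim feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # toy relevance labels

clf = SVC(kernel="rbf").fit(X[:40], y[:40])  # train on 40 images
print("held-out accuracy:", clf.score(X[40:], y[40:]))
```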

  17. A computationally efficient approach for template matching-based image registration

    Indian Academy of Sciences (India)

    Vilas H Gaidhane; Yogesh V Hote; Vijander Singh

    2014-04-01

    Image registration using template matching is an important step in image processing. In this paper, a simple, robust and computationally efficient approach is presented. The proposed approach is based on the properties of a normalized covariance matrix. Its main advantage is that image matching can be achieved without calculating the eigenvalues and eigenvectors of a covariance matrix, which reduces the computational complexity. The experimental results show that the proposed approach performs better in the presence of various noises and rigid geometric transformations.
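
    A minimal sketch in the spirit of the approach: a normalized covariance (correlation) score between a template and image patches, computed without any eigen-decomposition. The exact statistic used in the paper may differ.

```python
import numpy as np

def match_score(patch, template):
    # Normalized covariance between patch and template;
    # no eigenvalues or eigenvectors are needed.
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def register(image, template):
    # Exhaustive search for the best-matching template location.
    h, w = template.shape
    best, best_pos = -np.inf, (0, 0)
    for i in range(image.shape[0] - h + 1):
        for j in range(image.shape[1] - w + 1):
            s = match_score(image[i:i + h, j:j + w], template)
            if s > best:
                best, best_pos = s, (i, j)
    return best_pos, best

rng = np.random.default_rng(5)
image = rng.random((64, 64))
template = image[20:28, 30:38].copy()
print(register(image, template))   # -> ((20, 30), ~1.0)
```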

  18. Effects of a Peer Assessment System Based on a Grid-Based Knowledge Classification Approach on Computer Skills Training

    Science.gov (United States)

    Hsu, Ting-Chia

    2016-01-01

    In this study, a peer assessment system using the grid-based knowledge classification approach was developed to improve students' performance during computer skills training. To evaluate the effectiveness of the proposed approach, an experiment was conducted in a computer skills certification course. The participants were divided into three…

  19. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters: one devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, and one on techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research in algorithmic and computer-based approaches to medical imaging applications and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer Aided Diagnosis Techniques.

  20. The Effects of Computer Supported Problem Based Learning on Students' Approaches to Learning

    Science.gov (United States)

    Ak, Serife

    2011-01-01

    The purpose of this paper is to investigate the effects of computer-supported problem based learning on students' approaches to learning. The research was conducted as a one-group pre-test/post-test design. The experimental process of the study lasted 5 weeks and was carried out on 78 university…

  1. A computationally efficient approach for hidden-Markov model-augmented fingerprint-based positioning

    Science.gov (United States)

    Roth, John; Tummala, Murali; McEachen, John

    2016-09-01

    This paper presents a computationally efficient approach for mobile subscriber position estimation in wireless networks. A method of data scaling assisted by timing adjust is introduced in fingerprint-based location estimation under a framework which allows for minimising computational cost. The proposed method maintains a level of accuracy comparable to the traditional case where no data scaling is used, and is evaluated in a simulated environment under varying channel conditions. The proposed scheme is studied when it is augmented by a hidden-Markov model to match the internal parameters to the prevailing channel conditions, thus minimising computational cost while maximising accuracy. Furthermore, the timing adjust quantity, available in modern wireless signalling messages, is shown to further reduce computational cost and increase accuracy when available. The results may be seen as a significant step towards integrating advanced position-based modelling with power-sensitive mobile devices.
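
    A minimal sketch of the underlying fingerprint step: nearest-neighbour matching of a measured signal-strength vector against a survey database (synthetic data; the paper's data scaling, timing adjust and hidden-Markov augmentation are omitted).

```python
import numpy as np

# Synthetic RSS fingerprints from 4 base stations at 50 surveyed positions.
rng = np.random.default_rng(3)
db_positions = rng.uniform(0, 100, (50, 2))        # known (x, y) survey points
db_fingerprints = rng.normal(-70, 5, (50, 4))      # RSS values in dBm

measured = db_fingerprints[17] + rng.normal(0, 1, 4)  # noisy observation
d = np.linalg.norm(db_fingerprints - measured, axis=1)
print(db_positions[np.argmin(d)])                  # estimated position
```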

  2. A new approach based on PSO algorithm to find good computational encoding sequences

    Institute of Scientific and Technical Information of China (English)

    Cui Guangzhao; Niu Yunyun; Wang Yanfeng; Zhang Xuncai; Pan Linqiang

    2007-01-01

    Computational encoding DNA sequence design is one of the most important steps in molecular computation, and much research has been done to design reliable sequence libraries. A revised method based on the support system developed by Tanaka et al. is proposed here, with different criteria for constructing the fitness function. We then adapt the particle swarm optimization (PSO) algorithm to our encoding problem. Using the new algorithm, a set of sequences with good quality is generated. The results also show that our PSO-based approach converges rapidly to the minimum level for an output of the simulation model. The speed of the algorithm fits our requirements.
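
    A minimal sketch of a generic PSO loop of the kind adapted here; the sphere objective is a stand-in for the paper's sequence-quality fitness function, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Stand-in objective; a real fitness would combine sequence-quality
    # criteria (e.g., continuity, GC content, cross-hybridisation measures).
    return np.sum(x * x)

n_particles, dim, iters = 30, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights
x = rng.uniform(-1, 1, (n_particles, dim))     # positions
v = np.zeros_like(x)                           # velocities
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best fitness:", pbest_f.min())
```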

  3. The Effect of Computer Assisted Instruction Based Constructivist Learning Approach on Students’ Attitudes and Achievements

    Directory of Open Access Journals (Sweden)

    Şeyda Gül

    2011-06-01

    Full Text Available The aim of this study is to determine the effect of computer-assisted instruction based on the constructivist learning approach on students' attitudes towards computers and towards the science and technology lesson, and on their achievement in the science and technology lesson. The study group consisted of 56 fourth-grade students at a public primary school in Körfez (Kocaeli), selected via convenience sampling. The data were collected by means of an Attitude Scale towards the Science and Technology Lesson, an Attitude Scale towards Computers and an Achievement Test. A quasi-experimental design with pre-test/post-test control group was employed. The subjects were taught using the constructivist learning method of the current syllabus in the control group, and using computer-assisted instruction based on the constructivist learning approach in the experimental group. The findings showed a statistically significant difference between the groups' post-test attitudes towards computers and post-test achievement scores in favour of the experimental group (p<0.05), whereas no significant difference was found in attitudes towards the science and technology lesson (p>0.05), despite a positive increase in the experimental group's attitudes.

  4. Design-based approach to ethics in computer-aided diagnosis

    Science.gov (United States)

    Collmann, Jeff R.; Lin, Jyh-Shyan; Freedman, Matthew T.; Wu, Chris Y.; Hayes, Wendelin S.; Mun, Seong K.

    1996-04-01

    A design-based approach to ethical analysis examines how computer scientists, physicians and patients make and justify choices in designing, using and reacting to computer-aided diagnosis (CADx) systems. The basic hypothesis of this research is that values are embedded in CADx systems during all phases of their development, not just retrospectively imposed on them. This paper concentrates on the work of computer scientists and physicians as they attempt to resolve central technical questions in designing clinically functional CADx systems for lung cancer and breast cancer diagnosis. The work of Lo, Chan, Freedman, Lin, Wu and their colleagues provides the initial data on which this study is based. As these researchers seek to increase the rate of true positive classifications of detected abnormalities in chest radiographs and mammograms, they explore dimensions of the fundamental ethical principle of beneficence. The training of CADx systems demonstrates the key ethical dilemmas inherent in their current design.

  5. Lightweight Tactical Client: A Capability-Based Approach to Command Post Computing

    Science.gov (United States)

    2015-12-01

    From an operational standpoint, a requirement of a Command Post Client (ref. 1) is the capability to operate for an extended period of time (48+ hr), to sustain operations during disconnected, intermittent, and latent states including fully disconnected operations, and to be able to operate on a physically light client.

  6. COMPUTER EVALUATION OF SKILLS FORMATION QUALITY IN THE IMPLEMENTATION OF COMPETENCE-BASED APPROACH TO LEARNING

    Directory of Open Access Journals (Sweden)

    Vitalia A. Zhuravleva

    2014-01-01

    Full Text Available The article deals with the problem of effectively organizing skills formation as an important part of the competence-based approach in education, implemented via educational standards of the new generation. The solution suggests using computer tools to assess the quality of skills formation, based on the proposed model of the problem. This paper proposes an approach to creating a model for assessing the level of skills formation in knowledge management systems, based on mathematical modeling methods. Attention is paid to the evaluation strategy and assessment technology, which is based on the rules of fuzzy mathematics. An algorithmic implementation of the proposed model for evaluating the quality of skills development is shown as well.

  7. A Hybrid Autonomic Computing-Based Approach to Distributed Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Abhishek Bhatia

    2015-03-01

    Full Text Available Distributed constraint satisfaction problems (DisCSPs) are among the most widely studied problems in agent-based simulation. Fernandez et al. formulated the sensor and mobile tracking problem as a DisCSP, known as SensorDCSP. In this paper, we adopt a customized ERE (environment, reactive rules and entities) algorithm for SensorDCSP, which is otherwise a computationally intractable problem. An amalgamation of the autonomy-oriented computing (AOC) based algorithm (ERE) and a genetic algorithm (GA) provides an early solution of the modeled DisCSP. Incorporating the GA into ERE facilitates auto-tuning of the simulation parameters, thereby leading to an early solution of constraint satisfaction. This study further contributes a model, built in the NetLogo simulation environment, to infer the efficacy of the proposed approach.

  8. Parallel LDPC Decoding on GPUs Using a Stream-Based Computing Approach

    Institute of Scientific and Technical Information of China (English)

    Gabriel Falcão; Shinichi Yamagiwa; Vitor Silva; Leonel Sousa

    2009-01-01

    Low-Density Parity-Check (LDPC) codes are powerful error correcting codes adopted by recent communication standards. LDPC decoders are based on belief propagation algorithms, which make use of a Tanner graph and very intensive message-passing computation, and usually require hardware-based dedicated solutions. With the exponential increase of the computational power of commodity graphics processing units (GPUs), new opportunities have arisen to develop general purpose processing on GPUs. This paper proposes the use of GPUs for implementing flexible and programmable LDPC decoders. A new stream-based approach is proposed, based on compact data structures to represent the Tanner graph. It is shown that such a challenging application for stream-based computing, because of irregular memory access patterns, memory bandwidth and recursive flow control constraints, can be efficiently implemented on GPUs. The proposal was experimentally evaluated by programming LDPC decoders on GPUs using the Caravela platform, a generic interface tool for managing the kernels' execution regardless of the GPU manufacturer and operating system. Moreover, to assess the obtained results in relative terms, we have also implemented LDPC decoders on general purpose processors with Streaming Single Instruction Multiple Data (SIMD) Extensions. Experimental results show that the solution proposed here efficiently decodes several codewords simultaneously, reducing the processing time by one order of magnitude.
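
    For illustration, a toy hard-decision bit-flipping decoder on a small parity-check matrix; the paper's GPU kernels implement full belief propagation, so this shows only the Tanner-graph idea in miniature.

```python
import numpy as np

# Toy parity-check matrix (4 checks x 6 bits); real LDPC codes are far larger.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]], dtype=np.uint8)

def decode(received, max_iters=50):
    word = received.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2
        if not syndrome.any():          # all parity checks satisfied
            return word
        # flip the bit participating in the most failed checks
        fails = H.T @ syndrome
        word[np.argmax(fails)] ^= 1
    return word

print(decode(np.array([1, 0, 1, 0, 1, 1], dtype=np.uint8)))
```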

  9. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    Directory of Open Access Journals (Sweden)

    Merler Stefano

    2010-06-01

    Full Text Available Background: In recent years, large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increasing frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially structured metapopulation models. One major issue thus concerns the extent to which the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods: We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on its socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale; the model also considers age structure data for Italy. GLEaM and the agent-based model are synchronized in their initial conditions by using the same disease parameterization and by defining the same importation of infected cases from international travel. Results: The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing of the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the intra-population contact structure of the approaches.
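
    As a toy illustration of how the basic reproductive ratio R0 drives epidemic size in any such simulation (a deterministic SIR sketch, not either of the paper's models; all parameters are illustrative):

```python
# Deterministic SIR with daily Euler steps; recovery rate gamma = 1/3 per day.
def sir_final_size(r0, days=300, gamma=1 / 3):
    beta = r0 * gamma
    s, i, r = 1 - 1e-6, 1e-6, 0.0
    for _ in range(days):
        new_inf = beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
    return r

for r0 in (1.2, 1.5, 2.0):
    print(r0, round(sir_final_size(r0), 3))   # final size grows sharply with R0
```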

  10. Charge transport in carbon nanotubes based materials: a Kubo-Greenwood computational approach

    Science.gov (United States)

    Ishii, Hiroyuki; Triozon, François; Kobayashi, Nobuhiko; Hirose, Kenji; Roche, Stephan

    2009-05-01

    In this contribution, we present a numerical study of quantum transport in carbon-nanotube-based materials. After a brief presentation of the computational approach used to investigate the transport coefficients (Kubo method), the scaling properties of the quantum conductance in the ballistic as well as the diffusive regime are illustrated. The impacts of elastic disorder (impurities) and dynamical disorder (phonon vibrations) are analyzed separately, with the extraction of the main transport length scales (mean free path and localization length) as well as the temperature dependence of the nanotube resistance. The results are found to be in very good agreement with both analytical results and experimental data, demonstrating the predictive efficiency of our computational strategy. To cite this article: H. Ishii et al., C. R. Physique 10 (2009).
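
    The real-space Kubo relations commonly used in this class of calculations are, in a notation assumed here rather than taken from the article:

```latex
% Mean-square displacement of wave packets at energy E, its long-time
% diffusion coefficient, and the resulting Kubo-Greenwood conductivity:
\Delta X^{2}(E,t) =
  \frac{\operatorname{Tr}\!\left[\delta(E-\hat H)\,\bigl(\hat X(t)-\hat X(0)\bigr)^{2}\right]}
       {\operatorname{Tr}\!\left[\delta(E-\hat H)\right]},
\qquad
D(E) = \lim_{t\to\infty} \frac{\Delta X^{2}(E,t)}{t},
\qquad
\sigma(E) = e^{2}\,\rho(E)\,D(E).
```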

  11. A cell-based computational modeling approach for developing site-directed molecular probes.

    Directory of Open Access Journals (Sweden)

    Jing-Yu Yu

    Full Text Available Modeling the local absorption and retention patterns of membrane-permeant small molecules in a cellular context could facilitate development of site-directed chemical agents for bioimaging or therapeutic applications. Here, we present an integrative approach to this problem, combining in silico computational models, in vitro cell based assays and in vivo biodistribution studies. To target small molecule probes to the epithelial cells of the upper airways, a multiscale computational model of the lung was first used as a screening tool, in silico. Following virtual screening, cell monolayers differentiated on microfabricated pore arrays and multilayer cultures of primary human bronchial epithelial cells differentiated in an air-liquid interface were used to test the local absorption and intracellular retention patterns of selected probes, in vitro. Lastly, experiments involving visualization of bioimaging probe distribution in the lungs after local and systemic administration were used to test the relevance of computational models and cell-based assays, in vivo. The results of in vivo experiments were consistent with the results of in silico simulations, indicating that mitochondrial accumulation of membrane permeant, hydrophilic cations can be used to maximize local exposure and retention, specifically in the upper airways after intratracheal administration.

  12. A New Approach of Error Compensation on NC Machining Based on Memetic Computation

    Directory of Open Access Journals (Sweden)

    Huanglin Zeng

    2013-04-01

    Full Text Available This paper studies the application of memetic computation, integrating and coordinating intelligent algorithms, to solve the error compensation problem for a high-precision numerical control machining system. The primary focus is the development of an integrated intelligent computation approach to building an error compensation system for a numerical control machine tool based on a dynamic feedback neural network. Optimization of the error measurement points of a numerical control machine tool is realized by applying error-variable attribute reduction from rough set theory. A principal component analysis is used for data compression and feature extraction to reduce the input dimension of the dynamic feedback neural network. The dynamic feedback neural network is trained with an ant colony algorithm so that the network converges to a global optimum. Compensation of positioning error caused by thermal deformation was tested using industry-standard equipment and procedures. The results obtained show that this approach can effectively improve the precision and real-time performance of error compensation on machine tools.

  13. Integrative computational approach for genome-based study of microbial lipid-degrading enzymes.

    Science.gov (United States)

    Vorapreeda, Tayvich; Thammarongtham, Chinae; Laoteng, Kobkul

    2016-07-01

    Lipid-degrading or lipolytic enzymes have gained enormous attention in academic and industrial sectors. Several efforts are underway to discover new lipase enzymes from a variety of microorganisms with particular catalytic properties to be used for extensive applications. In addition, various tools and strategies have been implemented to unravel the functional relevance of the versatile lipid-degrading enzymes for special purposes. This review highlights the study of microbial lipid-degrading enzymes through an integrative computational approach. The identification of putative lipase genes from microbial genomes and metagenomic libraries using homology-based mining is discussed, with an emphasis on sequence analysis of conserved motifs and enzyme topology. Molecular modelling of three-dimensional structure on the basis of sequence similarity is shown to be a potential approach for exploring the structural and functional relationships of candidate lipase enzymes. The perspectives on a discriminative framework of cutting-edge tools and technologies, including bioinformatics, computational biology, functional genomics and functional proteomics, intended to facilitate rapid progress in understanding lipolysis mechanism and to discover novel lipid-degrading enzymes of microorganisms are discussed.
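
    A minimal sketch of the motif-scanning flavour of such mining: searching candidate protein sequences for the canonical lipase G-X-S-X-G catalytic motif. A single regex is a deliberate simplification (real pipelines use richer profiles and homology models), and the sequences below are hypothetical.

```python
import re

# The canonical serine-hydrolase/lipase motif G-X-S-X-G as a regex,
# where "." matches any residue.
motif = re.compile(r"G.S.G")

sequences = {
    "cand1": "MKTLLGHSAGAAGVSLG",   # hypothetical candidate sequences
    "cand2": "MASNNPQRWWKE",
}
for name, seq in sequences.items():
    m = motif.search(seq)
    print(name, "hit at", m.start() if m else None)
```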

  14. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  15. Computer-aided diagnosis of prostate cancer with emphasis on ultrasound-based approaches: a review.

    Science.gov (United States)

    Moradi, Mehdi; Mousavi, Parvin; Abolmaesumi, Purang

    2007-07-01

    This paper reviews the state of the art in computer-aided diagnosis of prostate cancer and focuses, in particular, on ultrasound-based techniques for detection of cancer in prostate tissue. The current standard procedure for diagnosis of prostate cancer, i.e., ultrasound-guided biopsy followed by histopathological analysis of tissue samples, is invasive and produces a high rate of false negatives resulting in the need for repeated trials. It is against these backdrops that the search for new methods to diagnose prostate cancer continues. Image-based approaches (such as MRI, ultrasound and elastography) represent a major research trend for diagnosis of prostate cancer. Due to the integration of ultrasound imaging in the current clinical procedure for detection of prostate cancer, we specifically provide a more detailed review of methodologies that use ultrasound RF-spectrum parameters, B-scan texture features and Doppler measures for prostate tissue characterization. We present current and future directions of research aimed at computer-aided detection of prostate cancer and conclude that ultrasound is likely to play an important role in the field.

  16. Remembered or Forgotten?—An EEG-Based Computational Prediction Approach

    Science.gov (United States)

    Sun, Xuyun; Qian, Cunle; Chen, Zhongqin; Wu, Zhaohui; Luo, Benyan; Pan, Gang

    2016-01-01

    Prediction of memory performance (remembered or forgotten) has various potential applications not only for knowledge learning but also for disease diagnosis. Recently, subsequent memory effects (SMEs)—the statistical differences in electroencephalography (EEG) signals before or during learning between subsequently remembered and forgotten events—have been found. This finding indicates that EEG signals convey the information relevant to memory performance. In this paper, based on SMEs we propose a computational approach to predict memory performance of an event from EEG signals. We devise a convolutional neural network for EEG, called ConvEEGNN, to predict subsequently remembered and forgotten events from EEG recorded during memory process. With the ConvEEGNN, prediction of memory performance can be achieved by integrating two main stages: feature extraction and classification. To verify the proposed approach, we employ an auditory memory task to collect EEG signals from scalp electrodes. For ConvEEGNN, the average prediction accuracy was 72.07% by using EEG data from pre-stimulus and during-stimulus periods, outperforming other approaches. It was observed that signals from pre-stimulus period and those from during-stimulus period had comparable contributions to memory performance. Furthermore, the connection weights of ConvEEGNN network can reveal prominent channels, which are consistent with the distribution of SME studied previously. PMID:27973531

  17. Remembered or Forgotten?-An EEG-Based Computational Prediction Approach.

    Science.gov (United States)

    Sun, Xuyun; Qian, Cunle; Chen, Zhongqin; Wu, Zhaohui; Luo, Benyan; Pan, Gang

    2016-01-01

    Prediction of memory performance (remembered or forgotten) has various potential applications not only for knowledge learning but also for disease diagnosis. Recently, subsequent memory effects (SMEs)-the statistical differences in electroencephalography (EEG) signals before or during learning between subsequently remembered and forgotten events-have been found. This finding indicates that EEG signals convey the information relevant to memory performance. In this paper, based on SMEs we propose a computational approach to predict memory performance of an event from EEG signals. We devise a convolutional neural network for EEG, called ConvEEGNN, to predict subsequently remembered and forgotten events from EEG recorded during memory process. With the ConvEEGNN, prediction of memory performance can be achieved by integrating two main stages: feature extraction and classification. To verify the proposed approach, we employ an auditory memory task to collect EEG signals from scalp electrodes. For ConvEEGNN, the average prediction accuracy was 72.07% by using EEG data from pre-stimulus and during-stimulus periods, outperforming other approaches. It was observed that signals from pre-stimulus period and those from during-stimulus period had comparable contributions to memory performance. Furthermore, the connection weights of ConvEEGNN network can reveal prominent channels, which are consistent with the distribution of SME studied previously.
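
    A minimal PyTorch-style sketch of a two-stage convolutional classifier of this kind; the layer sizes, channel counts and input dimensions are assumptions, not the published ConvEEGNN architecture.

```python
import torch
import torch.nn as nn

class ConvEEGNNSketch(nn.Module):
    def __init__(self, n_channels=32, n_samples=512):
        super().__init__()
        self.features = nn.Sequential(          # stage 1: feature extraction
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(        # stage 2: classification
            nn.Flatten(),
            nn.Linear(32 * (n_samples // 16), 2),  # remembered vs forgotten
        )

    def forward(self, x):                       # x: (batch, channels, time)
        return self.classifier(self.features(x))

logits = ConvEEGNNSketch()(torch.randn(8, 32, 512))  # 8 trials of synthetic EEG
print(logits.shape)                                  # torch.Size([8, 2])
```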

  18. Hyperspectral Aquatic Radiative Transfer Modeling Using a High-Performance Cluster Computing Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Fillippi, Anthony [Texas A&M University; Bhaduri, Budhendra L [ORNL; Naughton, III, Thomas J [ORNL; King, Amy L [ORNL; Scott, Stephen L [ORNL; Guneralp, Inci [Texas A&M University

    2012-01-01

    For aquatic studies, radiative transfer (RT) modeling can be used to compute hyperspectral above-surface remote sensing reflectance that can be utilized for inverse model development. Inverse models can provide bathymetry and inherent- and bottom-optical property estimation. Because measured oceanic field/organic datasets are often spatio-temporally sparse, synthetic data generation is useful in yielding sufficiently large datasets for inversion model development; however, these forward-modeled data are computationally expensive and time-consuming to generate. This study establishes the magnitude of wall-clock-time savings achieved for performing large, aquatic RT batch-runs using parallel computing versus a sequential approach. Given 2,600 simulations and identical compute-node characteristics, sequential architecture required ~100 hours until termination, whereas a parallel approach required only ~2.5 hours (42 compute nodes) - a 40x speed-up. Tools developed for this parallel execution are discussed.

  19. Hyperspectral Aquatic Radiative Transfer Modeling Using a High-Performance Cluster Computing-Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Filippi, Anthony M [ORNL; Bhaduri, Budhendra L [ORNL; Naughton, III, Thomas J [ORNL; King, Amy L [ORNL; Scott, Stephen L [ORNL; Guneralp, Inci [Texas A&M University

    2012-01-01

    For aquatic studies, radiative transfer (RT) modeling can be used to compute hyperspectral above-surface remote sensing reflectance that can be utilized for inverse model development. Inverse models can provide bathymetry and inherent- and bottom-optical property estimation. Because measured oceanic field/organic datasets are often spatio-temporally sparse, synthetic data generation is useful in yielding sufficiently large datasets for inversion model development; however, these forward-modeled data are computationally expensive and time-consuming to generate. This study establishes the magnitude of wall-clock-time savings achieved for performing large, aquatic RT batch-runs using parallel computing versus a sequential approach. Given 2,600 simulations and identical compute-node characteristics, sequential architecture required ~100 hours until termination, whereas a parallel approach required only ~2.5 hours (42 compute nodes), a 40x speed-up. Tools developed for this parallel execution are discussed.
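
    The batch-run parallelism described above maps naturally onto a worker pool; a minimal sketch follows, where run_simulation is a hypothetical stand-in for one forward radiative-transfer run.

```python
from multiprocessing import Pool
import math

def run_simulation(params):
    # Placeholder workload; a real run would invoke the RT code.
    return sum(math.sin(i * params) for i in range(10_000))

if __name__ == "__main__":
    jobs = [i / 2600 for i in range(2600)]    # 2,600 simulations, as in the study
    with Pool(processes=42) as pool:          # 42 workers, mirroring the 42 nodes
        results = pool.map(run_simulation, jobs)
    print(len(results))
```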

  20. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    … method reduces the search space in a systematic manner and the general blend design problem is decomposed into two stages. The first stage investigates the mixture stability, where all unstable mixtures are eliminated and the stable blend candidates are retained for further testing (note that all blends … attributes (properties). The systematic computer-aided technique first establishes the search space, and then narrows it down in subsequent steps until a small number of feasible and promising candidates remain. At this point, experimental work may be conducted to verify if any or all the candidates satisfy … and their compositions and a set of desired target properties of the blended product as design constraints. This blend design problem is solved using a decomposition approach, which eliminates infeasible and/or redundant candidates gradually through a hierarchy of (property) model-based constraints. This decomposition …

  1. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    Science.gov (United States)

    Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
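
    A minimal sketch of the graph view described above: threshold a toy similarity matrix into a sparse adjacency matrix and group candidate events by connected components. The matrix values and threshold are illustrative.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

# Toy pairwise waveform similarities among 4 candidate events.
sim = np.array([[1.0, 0.8, 0.0, 0.0],
                [0.8, 1.0, 0.1, 0.0],
                [0.0, 0.1, 1.0, 0.7],
                [0.0, 0.0, 0.7, 1.0]])

adj = csr_matrix(sim >= 0.5)                      # keep only confident edges
n_groups, labels = connected_components(adj, directed=False)
print(n_groups, labels)                           # 2 [0 0 1 1]
```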

  2. A new approach to computer-aided spine surgery: fluoroscopy-based surgical navigation

    OpenAIRE

    Nolte, L.-P.; Slomczykowski, M. A.; Berlemann, U.; Strauss, M. J.; Hofstetter, R; Schlenzka, D.; Laine, T.; Lund, T

    2000-01-01

    A new computer-based navigation system for spinal surgery has been designed. This was achieved by combining intraoperative fluoroscopy-based imaging using conventional C-arm technology with freehand surgical navigation principles. Modules were developed to automate digital X-ray image registration. This is in contrast to existing computed tomography (CT) based spinal navigation systems, which require a vertebra-based registration procedure. Cross-referencing of the image intensifier with the...

  3. A learning-based approach for automated quality assessment of computer-rendered images

    Science.gov (United States)

    Zhang, Xi; Agam, Gady

    2012-01-01

    Computer generated images are common in numerous computer graphics applications such as games, modeling, and simulation. There is normally a tradeoff between the time allocated to the generation of each image frame and the quality of the image, where better quality images require more processing time. Specifically, in the rendering of 3D objects, the surfaces of objects may be manipulated by subdividing them into smaller triangular patches and/or smoothing them so as to produce better looking renderings. Since unnecessary subdivision results in increased rendering time and unnecessary smoothing results in reduced detail, there is a need to automatically determine the amount of processing necessary for producing good quality rendered images. In this paper we propose a novel supervised learning based methodology for automatically predicting the quality of rendered images of 3D objects. To perform the prediction we train on a data set which is labeled by human observers for quality. We are then able to predict the quality of renderings (not used in the training) with an average prediction error of roughly 20%. The proposed approach is compared to known techniques and is shown to produce better results.

  4. A Biologically-Based Computational Approach to Drug Repurposing for Anthrax Infection

    Directory of Open Access Journals (Sweden)

    Jane P. F. Bai

    2017-03-01

    Full Text Available Developing drugs to treat the toxic effects of lethal toxin (LT) and edema toxin (ET) produced by B. anthracis is of global interest. We utilized a computational approach to score 474 drugs/compounds for their ability to reverse the toxic effects of anthrax toxins. For each toxin or drug/compound, we constructed an activity network by using its differentially expressed genes, molecular targets, and protein interactions. Gene expression profiles of drugs were obtained from the Connectivity Map and those of anthrax toxins in human alveolar macrophages were obtained from the Gene Expression Omnibus. Drug rankings were based on the ability of a drug/compound's mode of action, in the form of a signaling network, to reverse the effects of anthrax toxins; literature reports were used to verify the top 10 and bottom 10 drugs/compounds identified. Simvastatin and bepridil, with reported in vitro potency for protecting cells from LT and ET toxicities, were computationally ranked fourth and eighth. The other top 10 drugs were fenofibrate, dihydroergotamine, cotinine, amantadine, mephenytoin, sotalol, ifosfamide, and mefloquine; literature mining revealed their potential protective effects from LT and ET toxicities. These drugs are worthy of investigation for their therapeutic benefits and might be used in combination with antibiotics for treating B. anthracis infection.
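
    A highly simplified, correlation-based stand-in for the paper's network-based reversal scoring (all signatures below are random placeholders): drugs whose expression signature anti-correlates with the toxin signature rank highest.

```python
import numpy as np

rng = np.random.default_rng(1)
genes = 500
toxin_sig = rng.standard_normal(genes)                      # placeholder toxin signature
drug_sigs = {f"drug_{i}": rng.standard_normal(genes) for i in range(10)}

def reversal_score(drug_sig, toxin_sig):
    r = np.corrcoef(drug_sig, toxin_sig)[0, 1]
    return -r                                  # stronger anti-correlation -> higher score

ranked = sorted(drug_sigs, key=lambda d: reversal_score(drug_sigs[d], toxin_sig),
                reverse=True)
print(ranked[:3])
```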

  5. Computing gene expression data with a knowledge-based gene clustering approach.

    Science.gov (United States)

    Rosa, Bruce A; Oh, Sookyung; Montgomery, Beronda L; Chen, Jin; Qin, Wensheng

    2010-01-01

    Computational analysis methods for gene expression data gathered in microarray experiments can be used to identify the functions of previously unstudied genes. While obtaining the expression data is not a difficult task, interpreting and extracting the information from the datasets is challenging. In this study, a knowledge-based approach which identifies and saves important functional genes before filtering based on variability and fold change differences was utilized to study light regulation. Two clustering methods were used to cluster the filtered datasets, and clusters containing a key light regulatory gene were located. The common genes to both of these clusters were identified, and the genes in the common cluster were ranked based on their coexpression to the key gene. This process was repeated for 11 key genes in 3 treatment combinations. The initial filtering method reduced the dataset size from 22,814 probes to an average of 1134 genes, and the resulting common cluster lists contained an average of only 14 genes. These common cluster lists scored higher gene enrichment scores than two individual clustering methods. In addition, the filtering method increased the proportion of light responsive genes in the dataset from 1.8% to 15.2%, and the cluster lists increased this proportion to 18.4%. The relatively short length of these common cluster lists compared to gene groups generated through typical clustering methods or coexpression networks narrows the search for novel functional genes while increasing the likelihood that they are biologically relevant.
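
    A minimal sketch of the filter-then-rank idea on synthetic data (cutoffs are illustrative and the clustering step is omitted): keep variable genes, then rank survivors by coexpression with a key gene.

```python
import numpy as np

rng = np.random.default_rng(2)
expr = rng.standard_normal((1000, 6))            # genes x conditions
keep = expr.std(axis=1) > 0.8                    # variability filter
filtered = expr[keep]

key = filtered[0]                                # stand-in key regulatory gene
corr = np.array([np.corrcoef(g, key)[0, 1] for g in filtered])
top = np.argsort(-corr)[:15]                     # short candidate list
print(top)
```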

  6. Use of a Computed Tomography Based Approach to Validate Noninvasive Devices to Measure Rotational Knee Laxity.

    Science.gov (United States)

    Neumann, Simon; Maas, Stefan; Waldmann, Danièle; Ricci, Pierre-Louis; Zürbes, Arno; Arnoux, Pierre-Jean; Walter, Frédéric; Kelm, Jens

    2015-01-01

    The purpose of this study is to validate a noninvasive rotational knee laxity measuring device called the "Rotameter P2" against an approach based on Computed Tomography (CT). This CT approach uses X-rays and is hence invasive, but it can be regarded as a precise reference method that may also be applied to similar devices. An error due to imperfect femur fixation was observed but can be neglected for small torques. The most significant estimation error is due to the unavoidable rotation, and hence flexibility, of the soft tissues in the measurement chain; this error increases with the applied torque. The assessment showed that the rotational knee angle measured with the Rotameter is overestimated because of thigh and femur displacement, soft tissue deformation, and measurement artefacts, adding up to a maximum error of 285% at +15 Nm for the internal rotation of female volunteers. It may therefore be questioned whether such noninvasive devices for measuring Tibia-Femoral Rotation (TFR) can help in diagnosing knee pathologies and investigating ligament reconstructive surgery.

  7. Molecule-based approach for computing chemical-reaction rates in upper atmosphere hypersonic flows.

    Energy Technology Data Exchange (ETDEWEB)

    Gallis, Michail A.; Bond, Ryan Bomar; Torczynski, John Robert

    2009-08-01

    This report summarizes the work completed during FY2009 for the LDRD project 09-1332 'Molecule-Based Approach for Computing Chemical-Reaction Rates in Upper-Atmosphere Hypersonic Flows'. The goal of this project was to apply a recently proposed approach for the Direct Simulation Monte Carlo (DSMC) method to calculate chemical-reaction rates for high-temperature atmospheric species. The new DSMC model reproduces measured equilibrium reaction rates without using any macroscopic reaction-rate information. Since it uses only molecular properties, the new model is inherently able to predict reaction rates for arbitrary nonequilibrium conditions. DSMC non-equilibrium reaction rates are compared to Park's phenomenological non-equilibrium reaction-rate model, the predominant model for hypersonic-flow-field calculations. For near-equilibrium conditions, Park's model is in good agreement with the DSMC-calculated reaction rates. For far-from-equilibrium conditions, corresponding to a typical shock layer, the difference between the two models can exceed 10 orders of magnitude. The DSMC predictions are also found to be in very good agreement with measured and calculated non-equilibrium reaction rates. Extensions of the model to reactions typically found in combustion flows and ionizing reactions are also found to be in very good agreement with available measurements, offering strong evidence that this is a viable and reliable technique to predict chemical reaction rates.

  8. Designing optimal transportation networks: a knowledge-based computer-aided multicriteria approach

    Energy Technology Data Exchange (ETDEWEB)

    Tung, S.I.

    1986-01-01

    The dissertation investigates the applicability of a knowledge-based expert systems (KBES) approach to solve the single-mode (automobile), fixed-demand, discrete, multicriteria, equilibrium transportation-network-design problem. Previous work on this problem found that mathematical programming methods perform well on small networks with only one objective. What is needed is a solution technique that can be used on large networks having multiple, conflicting criteria with different relative importance weights. The KBES approach developed in this dissertation represents a new way to solve network design problems. The development of an expert system involves three major tasks: knowledge acquisition, knowledge representation, and testing. For knowledge acquisition, a computer-aided network design/evaluation model (UFOS) was developed to explore the design space. This study is limited to the problem of designing an optimal transportation network by adding and deleting capacity increments to/from any link in the network. Three weighted criteria were adopted for use in evaluating each design alternative: cost, average V/C ratio, and average travel time.
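
    The weighted three-criteria evaluation described above can be sketched as follows; the min-max normalization, the weights, and the candidate designs are illustrative assumptions, not the UFOS model itself.

```python
def score_designs(alternatives, weights):
    """Rank network design alternatives by a weighted sum of normalized
    criteria; lower cost, V/C ratio, and travel time are all better."""
    criteria = ["cost", "avg_vc_ratio", "avg_travel_time"]
    lo = {c: min(a[c] for a in alternatives.values()) for c in criteria}
    hi = {c: max(a[c] for a in alternatives.values()) for c in criteria}

    def norm(c, v):  # min-max normalize so 0 is best, 1 is worst
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])

    scores = {name: sum(weights[c] * norm(c, a[c]) for c in criteria)
              for name, a in alternatives.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])  # lowest score wins

designs = {
    "add_link_3": {"cost": 1.2e6, "avg_vc_ratio": 0.85, "avg_travel_time": 14.2},
    "add_link_7": {"cost": 0.9e6, "avg_vc_ratio": 0.95, "avg_travel_time": 15.0},
    "del_link_2": {"cost": 0.1e6, "avg_vc_ratio": 1.10, "avg_travel_time": 16.8},
}
weights = {"cost": 0.4, "avg_vc_ratio": 0.35, "avg_travel_time": 0.25}
print(score_designs(designs, weights))
```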

  9. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  10. Computational Approaches to Interface Design

    Science.gov (United States)

    Corker; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    Tools which make use of computational processes - mathematical, algorithmic and/or knowledge-based - to perform portions of the design, evaluation and/or construction of interfaces have become increasingly available and powerful. Nevertheless, there is little agreement as to the appropriate role for a computational tool to play in the interface design process. Current tools fall into broad classes depending on which portions, and how much, of the design process they automate. The purpose of this panel is to review and generalize about computational approaches developed to date, discuss the tasks for which they are suited, and suggest methods to enhance their utility and acceptance. Panel participants represent a wide diversity of application domains and methodologies. This should provide for lively discussion about implementation approaches, accuracy of design decisions, acceptability of representational tradeoffs and the optimal role for a computational tool to play in the interface design process.

  11. A Hybrid Approach for Scheduling and Replication based on Multi-criteria Decision Method in Grid Computing

    Directory of Open Access Journals (Sweden)

    Nadia Hadi

    2012-09-01

    Full Text Available Grid computing environments have emerged following the demand of scientists for very high computing power and storage capacity. One of the challenges in using these environments is performance. To improve performance, scheduling and replication techniques are used. In this paper we propose an approach to task scheduling combined with data replication decisions based on a multi-criteria principle, so as to improve performance by reducing task response time and system load. This hybrid approach is based on a non-hierarchical model that allows scalability.

  12. An Improved Constraint Based Resource Scheduling Approach Using Job Grouping Strategy in Grid Computing

    Directory of Open Access Journals (Sweden)

    Payal Singhal

    2013-01-01

    Full Text Available Grid computing is a collection of distributed resources interconnected by networks to provide a unified virtual computing resource view to the user. One important responsibility of grid computing is resource management, with techniques that allow the user to minimize job completion time and achieve good throughput. Designing and implementing an efficient scheduler is therefore a significant challenge. In this paper, a constraint-based job and resource scheduling algorithm is proposed. Four constraints are taken into account for grouping the jobs: resource memory, job memory, job MI, and L2 cache. Our implementation reduces the processing time by adding the fourth constraint, the L2 cache of the resource, when jobs are allocated to resources for parallel computing. The L2 cache is part of the computer's processor; it is a smaller, extremely fast memory that increases the performance of the computer. Using more constraints on the resource and jobs can increase the efficiency further. The work has been done in MATLAB using the Parallel Computing Toolbox. All the constraints are calculated using different functions in MATLAB, and jobs are allocated to resources based on them. The resource memory, cache, job memory size and job MI are the key factors for grouping the jobs according to the available capability of the selected resource. The processing time is taken into account to analyze the feasibility of the algorithm.
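
    A rough sketch of the grouping idea, in Python rather than the study's MATLAB: jobs are packed into a group until adding another job would exceed the selected resource's capability on any constraint. The field names, the per-interval MI capacity, and the treatment of the L2-cache bound are assumptions for illustration.

```python
def group_jobs(jobs, resource):
    """Greedily pack jobs into groups subject to the resource's capacity:
    total job MI bounded by what the resource can process per scheduling
    interval, total job memory bounded by resource memory, and a simple
    per-group byte limit standing in for the L2-cache constraint."""
    mi_cap = resource["mips"] * resource["interval_s"]  # MI per interval
    mem_cap, l2_cap = resource["mem_mb"], resource["l2_cache_kb"]
    groups, cur, mi, mem, l2 = [], [], 0.0, 0.0, 0.0
    for job in jobs:
        if cur and (mi + job["mi"] > mi_cap or
                    mem + job["mem_mb"] > mem_cap or
                    l2 + job["hot_data_kb"] > l2_cap):
            groups.append(cur)                 # close the current group
            cur, mi, mem, l2 = [], 0.0, 0.0, 0.0
        cur.append(job["name"])
        mi += job["mi"]
        mem += job["mem_mb"]
        l2 += job["hot_data_kb"]
    if cur:
        groups.append(cur)
    return groups

resource = {"mips": 500.0, "interval_s": 4.0, "mem_mb": 2048.0, "l2_cache_kb": 1024.0}
jobs = [{"name": f"j{i}", "mi": 400.0 + 50 * i, "mem_mb": 256.0, "hot_data_kb": 300.0}
        for i in range(8)]
print(group_jobs(jobs, resource))
```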

  13. Integrated Teaching of Structure-Based Drug Design and Biopharmaceutics: A Computer-Based Approach

    Science.gov (United States)

    Sutch, Brian T.; Romero, Rebecca M.; Neamati, Nouri; Haworth, Ian S.

    2012-01-01

    Rational drug design requires expertise in structural biology, medicinal chemistry, physiology, and related fields. In teaching structure-based drug design, it is important to develop an understanding of the need for early recognition of molecules with "drug-like" properties as a key component. That is, it is not merely sufficient to teach…

  14. ConMap: Investigating new computer-based approaches to assessing conceptual knowledge structure in physics

    Science.gov (United States)

    Beatty, Ian D.

    2000-06-01

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap (``Conceptual Mapping'') project, described in this dissertation, proposed and investigated some novel methods for assessing the conceptual knowledge structure of physics students. A set of brief computer-administered tasks for eliciting students' conceptual associations was designed. The basic approach of the tasks was to elicit spontaneous term associations from subjects by presenting them with a prompt term, or problem, or topic area, and having them type a set of response terms. Each response was recorded along with the time spent thinking of and typing it. Several studies were conducted in which data was collected on introductory physics students' performance on the tasks. A detailed statistical description of the data was compiled. Phenomenological characterization of the data (description and statistical summary of observed patterns) provided insight into the way students respond to the tasks, and discovered some notable features to guide modeling efforts. Possible correlations were investigated, some among different aspects of the ConMap data, others between aspects of the data and students' in-course exam scores. Several correlations were found which suggest that the ConMap tasks can successfully reveal information about students' knowledge structuring and level of expertise. Similarity was observed between data from one of the tasks and results from a traditional concept map task. Two rudimentary quantitative models for the temporal aspects of student performance on one of the tasks were constructed, one based on random probability distributions and the other on a detailed deterministic representation of conceptual knowledge structure. Both models were

  15. AFFECTIVE AND EMOTIONAL ASPECTS OF HUMAN-COMPUTER INTERACTION: Game-Based and Innovative Learning Approaches

    OpenAIRE

    A. Askim GULUMBAY, Anadolu University, TURKEY

    2006-01-01

    This book was edited by Maja Pivec, an educator at the University of Applied Sciences, and published by IOS Press in 2006. The learning process can be seen as an emotional and personal experience that is addictive and leads learners to proactive behavior. New research methods in this field are related to affective and emotional approaches to computer-supported learning and human-computer interactions. Bringing together scientists and research aspects from psychology, educational sciences, cogni...

  16. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    Science.gov (United States)

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  17. A Self-Instructional Approach To the Teaching of Enzymology Involving Computer-Based Sequence Analysis and Molecular Modelling.

    Science.gov (United States)

    Attwood, Paul V.

    1997-01-01

    Describes a self-instructional assignment approach to the teaching of advanced enzymology. Presents an assignment that offers a means of teaching enzymology to students that exposes them to modern computer-based techniques of analyzing protein structure and relates structure to enzyme function. (JRH)

  18. Adaptive build-up and breakdown of trust : An agent based computational approach

    NARCIS (Netherlands)

    Gorobets, A.; Nooteboom, B.

    2005-01-01

    This article employs Agent-Based Computational Economics (ACE) to investigate whether, and under what conditions, trust is viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents develop trust in a partner as a function of observed loy

  19. Computational evaluation of intraventricular pressure gradients based on a fluid-structure approach.

    Science.gov (United States)

    Redaelli, A; Montevecchi, F M

    1996-11-01

    The dynamics of intraventricular blood flow, i.e. its rapid evolution, implies the rise of intraventricular pressure gradients (IPGs) characteristic of the inertia-driven events as experimentally observed by Pasipoularides (1987, 1990) and by Falsetti et al. (1986). The IPG time course is determined by the wall contraction which, in turn, depends on the load applied, namely the intraventricular pressure which is the sum of the aortic pressure (i.e., the systemic net response) and the IPG. Hence the IPGs account, at least in part, for the wall movement. These considerations suggest the necessity of a comprehensive analysis of the ventricular mechanics involving both ventricular wall mechanics and intraventricular fluid dynamics as each domain determines the boundary conditions of the other. This paper presents a computational approach to ventricular ejection mechanics based on a fluid-structure interaction calculation for the evaluation of the IPG time course. An axisymmetric model of the left ventricle is utilized. The intraventricular fluid is assumed to be Newtonian. The ventricle wall is thin and is composed of two sets of counter-rotating fibres which behave according to the modified version of Wong's sarcomere model proposed by Montevecchi and Pietrabissa and Pietrabissa et al. (1987, 1991). The full Navier-Stokes equations describing the fluid domain are solved using Galerkin's weighted residual approach in conjunction with finite element approximation (FIDAP). The wall displacement is solved using the multiplane quasi-Newton method proposed by Buzzi Ferraris and Tronconi (1985). The interaction procedure is performed by means of an external macro which compares the flow fields and the wall displacement and appropriately modifies the boundary conditions to reach the simultaneous and congruous convergence of the two problems. The results refer to a simulation of the ventricular ejection with a heart rate of 72 bpm. In this phase the ventricle ejects 61 cm3

  20. Computing the laser beam path in optical cavities: a geometric Newton's method based approach

    CERN Document Server

    Cuccato, Davide; Ortolan, Antonello; Beghi, Alessandro

    2015-01-01

    In the last decade, increasing attention has been drawn to high-precision optical experiments, which push the resolution and accuracy of the measured quantities beyond their current limits. This challenge requires placing optical elements (e.g. mirrors, lenses, etc.) and steering light beams with sub-nanometer precision. Existing methods for beam direction computing in resonators, e.g. iterative ray tracing or generalized ray transfer matrices, are either computationally expensive or rely on overparametrized models of optical elements. By exploiting Fermat's principle, we develop a novel method to compute the steady-state beam configurations in resonant optical cavities formed by spherical mirrors, as a function of mirror positions and curvature radii. The proposed procedure is based on the geometric Newton method on matrix manifolds, a tool with second-order convergence rate that relies on a second-order model of the cavity optical length. As we avoid coordinates to parametrize the beam position on mirror surfac...
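
    The record's abstract is cut off above. As a toy illustration of the Fermat's-principle idea (not the authors' matrix-manifold formulation), the sketch below applies a scalar Newton iteration to find the reflection point on a flat mirror that makes the optical path length stationary, showing the second-order convergence such methods exploit.

```python
import math

# Fermat's principle for a flat mirror along the x-axis: the beam from
# A = (0, 1) to B = (2, 1) reflects at (x, 0), where the optical path
# length L(x) is stationary. Newton's method on L'(x) converges
# quadratically, analogous to the second-order model used by the
# geometric Newton method on matrix manifolds.
A, B = (0.0, 1.0), (2.0, 1.0)

def dL(x):   # first derivative of the path length L(x)
    return (x - A[0]) / math.hypot(x - A[0], A[1]) \
         - (B[0] - x) / math.hypot(B[0] - x, B[1])

def d2L(x):  # second derivative (strictly positive here)
    return A[1]**2 / math.hypot(x - A[0], A[1])**3 \
         + B[1]**2 / math.hypot(B[0] - x, B[1])**3

x = 0.3                      # initial guess for the reflection point
for it in range(6):
    x -= dL(x) / d2L(x)      # Newton step
    print(f"iter {it}: x = {x:.12f}")
# Converges to x = 1.0, where angle of incidence equals angle of reflection.
```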

  1. AFFECTIVE AND EMOTIONAL ASPECTS OF HUMAN-COMPUTER INTERACTION: Game-Based and Innovative Learning Approaches

    Directory of Open Access Journals (Sweden)

    A. Askim GULUMBAY, Anadolu University, TURKEY

    2006-07-01

    Full Text Available This book was edited by Maja Pivec, an educator at the University of Applied Sciences, and published by IOS Press in 2006. The learning process can be seen as an emotional and personal experience that is addictive and leads learners to proactive behavior. New research methods in this field are related to affective and emotional approaches to computer-supported learning and human-computer interactions. Bringing together scientists and research aspects from psychology, educational sciences, cognitive sciences, various aspects of communication and human computer interaction, interface design and computer science on one hand, and educators and the game industry on the other, should open gates to evolutionary changes of the learning industry. The major topics discussed are emotions, motivation, games and game-experience.

  2. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

    In this work, the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues is examined. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimal human intervention, for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo-matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach for the CNN detector initialization, and the use of the modified census transformation for stereo matching along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry, along with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. What is promising is that the computer vision workflow described in this work can be transferred, with adaptations of course, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
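
    A minimal sketch of the plain census transform underlying the stereo-matching step (the paper uses a modified variant): each pixel is encoded by intensity comparisons with its neighbors, which makes the matching cost robust to the radiometric differences mentioned above. The 3x3 window and toy images are assumptions.

```python
import numpy as np

def census_transform(img):
    """Classic 3x3 census transform: encode each interior pixel as an
    8-bit string of center-vs-neighbor comparisons. Only the ordering of
    intensities matters, so the descriptor tolerates radiometric shifts."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for bit, (dy, dx) in enumerate(offsets):
        shifted = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        center = img[1:h - 1, 1:w - 1]
        out[1:h - 1, 1:w - 1] |= (shifted < center).astype(np.uint8) << bit
    return out

def hamming_cost(c1, c2):
    """Per-pixel Hamming distance between two census images."""
    x = np.bitwise_xor(c1, c2)
    return np.unpackbits(x[..., None], axis=-1).sum(axis=-1)

img_left = np.random.randint(0, 256, (6, 8)).astype(np.uint8)
img_right = np.roll(img_left, 1, axis=1)   # fake one-pixel disparity
print(hamming_cost(census_transform(img_left), census_transform(img_right)))
```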

  3. Tailor-made Design of Chemical Blends using Decomposition-based Computer-aided Approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Manan, Zainuddin Abd.; Gernaey, Krist

    methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a nonlinear programming (NLP) model where the objective is to find the optimal blended gasoline or diesel product subject to blend chemicals and their compositions, a set...... to a specified priority. Finally, a short list of candidates, ordered in terms of specified performance criteria, is produced for final testing and selection. This systematic and computer-aided approach is illustrated through a case study involving the design of blends of gasoline with oxygenates from biomass...

  4. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a Mixed Integer Nonlinear Programming (MINLP) model where the objective is to find the optimal blended gasoline or diesel product subject to types of chemicals....... The application of this systematic and computer-aided approach is illustrated through a case study involving the design of blends of gasoline with oxygenated compounds resulting from degradation and fermentation of biomass for use in internal combustion engines. Emphasis is given here on the concepts used...

  5. Computational approaches for drug discovery.

    Science.gov (United States)

    Hung, Che-Lun; Chen, Chi-Chun

    2014-09-01

    Cellular proteins are the mediators of multiple organism functions, being involved in physiological mechanisms and disease. By discovering lead compounds that affect the function of target proteins, the target diseases or physiological mechanisms can be modulated. Based on knowledge of the ligand-receptor interaction, the chemical structures of leads can be modified to improve efficacy and selectivity and to reduce side effects. One rational drug design technology, which enables drug discovery based on knowledge of target structures, functional properties and mechanisms, is computer-aided drug design (CADD). The application of CADD can be cost-effective, using experiments to compare predicted and actual drug activity, the results from which can be used iteratively to improve compound properties. The two major CADD-based approaches are structure-based drug design, where protein structures are required, and ligand-based drug design, where known ligands and their activities can be used to design compounds interacting with the protein structure. Approaches in structure-based drug design include docking, de novo design, fragment-based drug discovery and structure-based pharmacophore modeling. Approaches in ligand-based drug design include quantitative structure-affinity relationships and pharmacophore modeling based on ligand properties. Based on whether the structure of the receptor and its interaction with the ligand are known, different design strategies can be selected. After lead compounds are generated, the rule of five can be used to assess whether these have drug-like properties. Several quality validation methods, such as cost function analysis, Fisher's cross-validation analysis and the goodness of hit test, can be used to estimate the metrics of different drug design strategies. To further improve CADD performance, multi-computers and graphics processing units may be applied to reduce costs.
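
    The rule-of-five filter mentioned above is simple enough to sketch directly: a compound is commonly considered drug-like if it violates at most one of Lipinski's four thresholds. The descriptor values below are entered by hand, whereas in practice they would come from a cheminformatics toolkit.

```python
def lipinski_violations(mw, logp, h_donors, h_acceptors):
    """Count violations of Lipinski's rule of five:
    MW <= 500 Da, logP <= 5, H-bond donors <= 5, H-bond acceptors <= 10."""
    return sum([mw > 500.0, logp > 5.0, h_donors > 5, h_acceptors > 10])

def is_drug_like(mw, logp, h_donors, h_acceptors, max_violations=1):
    """A compound is commonly considered drug-like with at most one violation."""
    return lipinski_violations(mw, logp, h_donors, h_acceptors) <= max_violations

# Aspirin: MW 180.2, logP ~1.2, 1 donor, 4 acceptors -> drug-like
print(is_drug_like(180.2, 1.2, 1, 4))   # True
# A large, lipophilic lead with two violations -> filtered out
print(is_drug_like(720.0, 6.3, 2, 8))   # False
```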

  6. Multiscale approach for bone remodeling simulation based on finite element and neural network computation

    CERN Document Server

    Hambli, Ridha

    2011-01-01

    The aim of this paper is to develop a multiscale hierarchical hybrid model based on finite element analysis and neural network computation to link the mesoscopic scale (trabecular network level) and the macroscopic scale (whole bone level) to simulate the bone remodelling process. Because whole bone simulation considering the 3D trabecular level is time consuming, the finite element calculation is performed at the macroscopic level and a trained neural network is employed as a numerical device substituting for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological organization at the mesoscopic scale computed by the trained neural network. The digital image-based modelling technique using μCT and a voxel finite element mesh is used to capture 2 mm³ Representative Volume Elements at the mesoscale level in a femur head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions and the applied str...

  7. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  9. THE EFFECT OF WEB-BASED COMPUTER-AIDED ART EDUCATION TO ACADEMIC SUCCESS BY CONSTRUCTIVIST APPROACH

    Directory of Open Access Journals (Sweden)

    Adnan TEPECIK

    2014-11-01

    Full Text Available The general objective of this study is to examine the effect of the "Web Based Computer Aided Teaching" activity, prepared according to the constructivist approach, on the academic success of students who study art at the graduate level. For this study, interactive educational material was prepared for web based computer aided teaching, and the teaching activity was carried out with two separate student groups. One group was taught with the traditional teaching method, and the other with web based computer aided teaching. The success and the learning persistency of the students after the application were examined. A multiple-subject, one-factor experimental design was used, with a pretest-posttest control group. A significant increase was seen in the academic success of the students who participated in the web based computer aided teaching applications when compared to the academic success of the students taught with the traditional method. The activities in the test group proved more effective than those in the control group. Based on the obtained results, it can be suggested that new studies be carried out applying web based computer aided teaching to different lessons in the field of art education.

  10. Dopamine selectively remediates 'model-based' reward learning: a computational approach.

    Science.gov (United States)

    Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna

    2016-02-01

    Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment.
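
    A compact sketch of the computational distinction under test: a model-free agent caches action values by temporal-difference updates, while a model-based agent re-plans first-stage values from a learned transition model. This is a generic illustration of the two-step task logic with made-up parameters, not the authors' analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
ALPHA, W = 0.2, 0.5               # learning rate; hypothetical model-based weight
T = np.array([[0.7, 0.3],         # P(second-stage state | first-stage action):
              [0.3, 0.7]])        # the common/rare transition structure

q_mf = np.zeros(2)                # cached ("habitual") first-stage action values
v_stage2 = np.zeros(2)            # learned values of the two second-stage states

for trial in range(2000):
    q_mb = T @ v_stage2                                  # re-planned each trial
    q = W * q_mb + (1 - W) * q_mf                        # hybrid valuation
    p1 = 1.0 / (1.0 + np.exp(-5.0 * (q[1] - q[0])))      # softmax over 2 actions
    a = int(rng.random() < p1)
    s2 = int(rng.random() < T[a, 1])                     # sample second-stage state
    r = float(rng.random() < (0.8 if s2 == 0 else 0.4))  # state-dependent reward
    v_stage2[s2] += ALPHA * (r - v_stage2[s2])           # stage-2 TD update
    q_mf[a] += ALPHA * (r - q_mf[a])                     # model-free stamping-in

print("model-free Q:", q_mf)
print("model-based Q:", T @ v_stage2)
```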

  11. Evolutionary Game Analysis of Competitive Information Dissemination on Social Networks: An Agent-Based Computational Approach

    Directory of Open Access Journals (Sweden)

    Qing Sun

    2015-01-01

    Full Text Available Social networks are formed by individuals, in which personalities, utility functions, and interaction rules are made as close to reality as possible. Taking competitive product-related information as a case, we propose a game-theoretic model for competitive information dissemination in social networks. The model explains how human factors impact competitive information dissemination, which is described as the dynamic of a coordination game, with players' payoffs defined by a utility function. We then design a computational system that integrates the agent, the evolutionary game, and the social network. The approach can help to visualize the evolution of the percentage of competitive information adoption and diffusion, grasp the dynamic evolution features of the information adoption game over time, and explore microlevel interactions among users in different network structures under various scenarios. We discuss several scenarios to analyze the influence of several factors on the dissemination of competitive information, ranging from the personality of individuals to the structure of networks.

  12. Design of dimerization inhibitors of HIV-1 aspartic proteinase: A computer-based combinatorial approach

    Science.gov (United States)

    Caflisch, Amedeo; Schramm, Hans J.; Karplus, Martin

    2000-02-01

    Inhibition of dimerization to the active form of the HIV-1 aspartic proteinase (HIV-1 PR) may be a way to decrease the probability of escape mutations for this viral protein. The Multiple Copy Simultaneous Search (MCSS) methodology was used to generate functionality maps for the dimerization interface of HIV-1 PR. The positions of the MCSS minima of 19 organic fragments, once postprocessed to take into account solvation effects, are in good agreement with experimental data on peptides that bind to the interface. The MCSS minima combined with an approach for computational combinatorial ligand design yielded a set of modified HIV-1 PR C-terminal peptides that are similar to known nanomolar inhibitors of HIV-1 PR dimerization. A number of N-substituted 2,5-diketopiperazines are predicted to be potential dimerization inhibitors of HIV-1 PR.

  13. Computational and experimental approaches for investigating nanoparticle-based drug delivery systems.

    Science.gov (United States)

    Ramezanpour, M; Leung, S S W; Delgado-Magnero, K H; Bashe, B Y M; Thewalt, J; Tieleman, D P

    2016-07-01

    Most therapeutic agents suffer from poor solubility, rapid clearance from the blood stream, a lack of targeting, and often poor translocation ability across cell membranes. Drug/gene delivery systems (DDSs) are capable of overcoming some of these barriers to enhance delivery of drugs to their right place of action, e.g. inside cancer cells. In this review, we focus on nanoparticles as DDSs. Complementary experimental and computational studies have enhanced our understanding of the mechanism of action of nanocarriers and their underlying interactions with drugs, biomembranes and other biological molecules. We review key biophysical aspects of DDSs and discuss how computer modeling can assist in rational design of DDSs with improved and optimized properties. We summarize commonly used experimental techniques for the study of DDSs. Then we review computational studies for several major categories of nanocarriers, including dendrimers and dendrons, polymer-, peptide-, nucleic acid-, lipid-, and carbon-based DDSs, and gold nanoparticles. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov.

  14. Vision therapy and computer orthoptics: evidence-based approach to use in your practice.

    Science.gov (United States)

    Lambert, Jennifer

    2013-01-01

    Convergence insufficiency is a commonly seen disorder of the vergence system. Its clinical characteristics and symptoms have been well described by Duane and von Graefe. Laboratory studies have clarified the vergence pathway, which includes a bi-phasic response. Several recent randomized controlled trials show the effectiveness of common treatment modalities, including pencil pushups, computer orthoptics, and office-based therapy. More studies are needed to investigate the possibility that other treatments may treat convergence insufficiency in a more profound way by acting on other parts of the vergence system.

  15. Discovery of new Mtb proteasome inhibitors using a knowledge-based computational screening approach.

    Science.gov (United States)

    Mehra, Rukmankesh; Chib, Reena; Munagala, Gurunadham; Yempalla, Kushalava Reddy; Khan, Inshad Ali; Singh, Parvinder Pal; Khan, Farrah Gul; Nargotra, Amit

    2015-11-01

    Mycobacterium tuberculosis bacteria cause deadly infections in patients. The rise of multidrug resistance associated with tuberculosis makes the situation worse for treating the disease. The M. tuberculosis proteasome is necessary for the pathogenesis of the bacterium and has been validated as an anti-tubercular target, making it an attractive enzyme for designing Mtb inhibitors. In this study, a computational screening approach was applied to identify new proteasome inhibitor candidates from a library of 50,000 compounds. This chemical library was procured from the ChemBridge (20,000 compounds) and the ChemDiv (30,000 compounds) databases. After a detailed analysis of the computational screening results, 50 in silico hits were retrieved and tested in vitro, finding 15 compounds with IC50 values ranging from 35.32 to 64.15 µM on lysate. A structural analysis of these hits revealed that 14 of these compounds probably have a non-covalent mode of binding to the target and have not been reported for anti-tubercular or anti-proteasome activity. The binding interactions of all 14 protein-inhibitor complexes were analyzed using molecular docking studies. Further, molecular dynamics simulations of the protein in complex with the two most promising hits were carried out so as to identify the key interactions and validate the structural stability.

  16. Development of an Effective Educational Computer Game Based on a Mission Synchronization-Based Peer-Assistance Approach

    Science.gov (United States)

    Chang, Shao-Chen; Hwang, Gwo-Jen

    2017-01-01

    In this study, a mission synchronization-based peer-assistance approach is proposed to improve students' learning performance in digital game-based learning activities. To evaluate the effectiveness of the proposed approach, an experiment has been conducted in an elementary school natural science course to examine the participants' learning…

  17. Computation of the Isotropic Hyperfine Coupling Constant: Efficiency and Insights from a New Approach Based on Wave Function Theory.

    Science.gov (United States)

    Giner, Emmanuel; Tenti, Lorenzo; Angeli, Celestino; Ferré, Nicolas

    2017-02-14

    The present paper reports an original computational strategy for the computation of isotropic hyperfine coupling constants (hcc). The algorithm proposed here is based on an approach recently introduced by some of the authors, namely, the first-order breathing orbital self-consistent field (FOBO-SCF). The approach is an almost parameter-free wave function method capable of accurately treating spin delocalization together with spin polarization effects while staying in a restricted formalism and avoiding spin contamination. The efficiency of the method is tested on a series of small radicals, among which four nitroxide radicals, and the comparison with high-level ab initio methods shows very encouraging results. On the basis of these results, the method is then applied to compute the hcc of a challenging system, namely, the DEPMPO-OOH radical in various conformations. The reference values obtained on such a large system allow us to validate a cheap computational method based on density functional theory (DFT). Another interesting feature of the model applied here is that it allows for the rationalization of the results according to a relatively simple scheme based on a two-step mechanism. More precisely, the results are analyzed in terms of two separate contributions: first the spin delocalization and then the spin polarization.

  18. An assessment of environmental and toxicological risk to pesticide exposure based on a case-based approach to computing

    Science.gov (United States)

    Coelho, Cristina; Vicente, Henrique; Rosário Martins, M.; Lima, Nelson; Neves, Mariana; Neves, José

    2017-01-01

    Pesticide environmental fate and toxicity depend on the pesticide's physical and chemical features, the soil composition, soil adsorption, as well as residues that may be found in different soil slots. Indeed, pesticide degradation in soil may be influenced by either biotic or abiotic factors. In addition, the toxicity of pesticides for living organisms depends on their adsorption, distribution, biotransformation, dissemination of metabolites together with interaction with cellular macromolecules, and excretion. Biotransformation may result in the formation of less toxic and/or more toxic metabolites, while other processes determine the balance between a toxic and a nontoxic outcome. Aggregate exposure and risk assessment involve multiple pathways and routes, including the potential for pesticide residues in food and drinking water, in addition to residues from pesticide use in residential and non-occupational environments. Therefore, this work focuses on the development of a decision support system to assess the environmental and toxicological risk to pesticide exposure, built on top of a Logic Programming approach to Knowledge Representation and Reasoning, complemented with a Case Based approach to computing. The proposed solution is unique in itself, since it caters for the explicit treatment of incomplete, unknown, or even self-contradictory information, in either a qualitative or quantitative setting.

  19. Computer-Aided Diagnosis in Mammography Using Content-Based Image Retrieval Approaches: Current Status and Future Perspectives

    Directory of Open Access Journals (Sweden)

    Bin Zheng

    2009-06-01

    Full Text Available With the rapid advance of digital imaging technologies, content-based image retrieval (CBIR) has become one of the most active research areas in computer vision. In the last several years, developing computer-aided detection and/or diagnosis (CAD) schemes that use CBIR to search for clinically relevant and visually similar medical images (or regions) depicting suspicious lesions has also been attracting research interest. CBIR-based CAD schemes have the potential to provide radiologists with “visual aid” and increase their confidence in accepting CAD-cued results in decision making. CAD performance and reliability depend on a number of factors, including the optimization of lesion segmentation, feature selection, reference database size, computational efficiency, and the relationship between the clinical relevance and visual similarity of the CAD results. By presenting and comparing a number of approaches commonly used in previous studies, this article identifies and discusses the optimal approaches in developing CBIR-based CAD schemes and assessing their performance. Although preliminary studies have suggested that using CBIR-based CAD schemes might improve radiologists’ performance and/or increase their confidence in decision making, this technology is still in the early development stage. Much research work is needed before CBIR-based CAD schemes can be accepted in clinical practice.
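
    At its core, a CBIR-based CAD scheme retrieves the k reference lesions whose feature vectors are most similar to the queried region. The sketch below uses z-score normalization and Euclidean distance; the feature names and values are placeholders for whatever the segmentation stage would extract.

```python
import numpy as np

def retrieve_similar(query, reference_db, k=3):
    """Return indices and distances of the k reference lesions whose
    feature vectors are closest (Euclidean) to the query lesion."""
    mean = reference_db.mean(axis=0)
    scale = reference_db.std(axis=0) + 1e-12          # z-score normalization
    feats = (reference_db - mean) / scale
    q = (query - mean) / scale
    d = np.linalg.norm(feats - q, axis=1)
    order = np.argsort(d)[:k]
    return order, d[order]

# Placeholder features: [size_mm, contrast, margin_spiculation, density]
reference_db = np.array([[12.0, 0.8, 0.3, 1.1],
                         [30.0, 0.5, 0.9, 2.0],
                         [11.0, 0.7, 0.2, 1.0],
                         [25.0, 0.6, 0.8, 1.9]])
query = np.array([13.0, 0.75, 0.25, 1.05])
idx, dist = retrieve_similar(query, reference_db, k=2)
print("most similar reference lesions:", idx, "distances:", dist)
```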

  20. A generalized computationally efficient inverse characterization approach combining direct inversion solution initialization with gradient-based optimization

    Science.gov (United States)

    Wang, Mengyu; Brigham, John C.

    2017-03-01

    A computationally efficient gradient-based optimization approach for inverse material characterization from incomplete system response measurements that can utilize a generally applicable parameterization (e.g., finite element-type parameterization) is presented and evaluated. The key to this inverse characterization algorithm is the use of a direct inversion strategy with Gappy proper orthogonal decomposition (POD) response field estimation to initialize the inverse solution estimate prior to gradient-based optimization. Gappy POD is used to estimate the complete (i.e., all components over the entire spatial domain) system response field from incomplete (e.g., partial spatial distribution) measurements obtained from some type of system testing along with some amount of a priori information regarding the potential distribution of the unknown material property. The estimated complete system response is used within a physics-based direct inversion procedure with a finite element-type parameterization to estimate the spatial distribution of the desired unknown material property with minimal computational expense. Then, this estimated spatial distribution of the unknown material property is used to initialize a gradient-based optimization approach, which uses the adjoint method for computationally efficient gradient calculations, to produce the final estimate of the material property distribution. The three-step [(1) Gappy POD, (2) direct inversion, and (3) gradient-based optimization] inverse characterization approach is evaluated through simulated test problems based on the characterization of elastic modulus distributions with localized variations (e.g., inclusions) within simple structures. Overall, this inverse characterization approach is shown to efficiently and consistently provide accurate inverse characterization estimates for material property distributions from incomplete response field measurements. Moreover, the solution procedure is shown to be capable
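
    The Gappy POD step can be sketched compactly: given POD basis vectors and measurements at only a few locations, the complete field is estimated by least-squares fitting of the basis coefficients on the observed entries. The snapshot data below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 100)

# Synthetic snapshots: random mixtures of five sine modes with decaying
# energy (stand-ins for precomputed response fields of the structure).
modes = np.column_stack([np.sin((i + 1) * np.pi * x) for i in range(5)])
weights = rng.normal(1.0, 0.2, size=(5, 20)) / np.arange(1, 6)[:, None]
U, s, _ = np.linalg.svd(modes @ weights, full_matrices=False)
basis = U[:, :5]                                  # retained POD modes

# The "true" response field is measured at only 15 of 100 locations.
truth = np.sin(np.pi * x) + 0.5 * np.sin(2.0 * np.pi * x)
observed = rng.choice(100, size=15, replace=False)

# Gappy POD: fit the POD coefficients using only the observed entries,
# then reconstruct the complete field from the full basis.
coeffs, *_ = np.linalg.lstsq(basis[observed], truth[observed], rcond=None)
estimate = basis @ coeffs

print("max reconstruction error:", float(np.abs(estimate - truth).max()))
```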

  1. A Programming Language Approach to Internet-Based Virtual Computing Environment

    Institute of Scientific and Technical Information of China (English)

    Ji Wang; Rui Shen; Huai-Min Wang

    2011-01-01

    There is an increasing need to build scalable distributed systems over the Internet infrastructure. However, the development of distributed scalable applications suffers from the lack of a widely accepted virtual computing environment. Users have to take great pains to manage and share the involved resources over the Internet, whose characteristics are intrinsic growth, autonomy and diversity. To deal with this challenge, the Internet-based Virtual Computing Environment (iVCE) is proposed and developed to serve as a platform for distributed scalable applications over the open infrastructure; its kernel mechanisms are on-demand aggregation and autonomic collaboration of resources. In this paper, we present a programming language for iVCE named Owlet. Owlet conforms with the conceptual model of iVCE and exposes the iVCE to application developers. As an interaction language based on a peer-to-peer, content-based publish/subscribe scheme, Owlet abstracts the Internet as an environment for roles to interact, and uses roles to build a relatively stable view of resources for on-demand resource aggregation. It provides language constructs to use (1) distributed event-driven rules to describe interaction protocols among different roles, (2) conversations to correlate events and rules into a common context, and (3) resource pooling for fault tolerance and load balancing among networked nodes. We have implemented an Owlet compiler and its runtime environment according to the architecture of iVCE, and built several Owlet applications, including a peer-to-peer file sharing application. Experimental results show that, with iVCE, the separation of resource aggregation logic and business logic significantly eases the process of building scalable distributed applications.

  2. Efficiently computing pathway free energies: New approaches based on chain-of-replica and Non-Boltzmann Bennett reweighting schemes.

    Science.gov (United States)

    Hudson, Phillip S; White, Justin K; Kearns, Fiona L; Hodoscek, Milan; Boresch, Stefan; Lee Woodcock, H

    2015-05-01

    Accurately modeling condensed phase processes is one of computation's most difficult challenges. If one includes the possibility that conformational dynamics may be coupled to chemical reactions, where multiscale (i.e., QM/MM) methods are needed, the task becomes even more daunting; addressing it requires free energy simulations (i.e., molecular dynamics), multiscale modeling, and reweighting schemes. Herein, we present two new approaches for mitigating the aforementioned challenges. The first is a new chain-of-replica method (off-path simulations, OPS) for computing potentials of mean force (PMFs) along an easily defined reaction coordinate. This development is coupled with a new distributed, highly parallel replica framework (REPDstr) within the CHARMM package. Validation of these new schemes is carried out on two processes that undergo conformational changes: the simple torsional rotation of butane, and a much more challenging glycosidic rotation (in vacuo and solvated). Additionally, a new approach that greatly improves (possibly by an order of magnitude) the efficiency of computing QM/MM PMFs is introduced and compared to standard schemes. Our efforts are grounded in the recently developed method for efficiently computing QM-based free energies (QM-Non-Boltzmann Bennett, QM-NBB). Again, we validate this new technique by computing the QM/MM PMF of butane's torsional rotation. The OPS-REPDstr method is a promising new approach that overcomes many limitations of standard pathway simulations in CHARMM. The combination of QM-NBB with pathway techniques is very promising, as it offers significant advantages over current procedures. Efficiently computing potentials of mean force is a major, unresolved area of interest. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014. Published by Elsevier B.V.

  3. Contextual Computing: A Bluetooth based approach for tracking healthcare providers in the emergency room.

    Science.gov (United States)

    Frisby, Joshua; Smith, Vernon; Traub, Stephen; Patel, Vimla L

    2017-01-01

    Hospital Emergency Departments (EDs) frequently experience crowding. One of the factors that contributes to this crowding is the "door to doctor time", the time from a patient's registration to when the patient is first seen by a physician. This is also one of the Meaningful Use (MU) performance measures that emergency departments report to the Center for Medicare and Medicaid Services (CMS). Current documentation methods for this measure are inaccurate due to the imprecision of manual data collection. We describe a method for automatically (in real time) and more accurately documenting the door to physician time. Using sensor-based technology, the distance between the physician and the computer is calculated by single-board computers installed in patient rooms, which log each time a Bluetooth signal is seen from a device that the physicians carry. This distance is compared automatically with the accepted room radius to determine whether the physician was present in the room at the logged time, providing greater precision. The logged times, accurate to the second, were compared with physicians' handwritten times, showing the automatic recordings to be more precise. This real-time automatic method frees the physician from the extra cognitive load of manually recording data. This method for evaluation of performance is generic and can be used in any setting outside the ED, and for purposes other than measuring physician time.
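
    The abstract does not state how distance is obtained from the Bluetooth sightings; a common choice, assumed in the sketch below, is the log-distance path-loss model applied to RSSI readings, after which presence reduces to a comparison against the accepted room radius.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Log-distance path-loss model (an assumed model, not necessarily the
    paper's): estimated distance in meters from a received signal strength
    reading, where tx_power_dbm is the expected RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def physician_in_room(rssi_dbm, room_radius_m=3.0):
    """Presence test: the badge is 'in the room' if the estimated distance
    from the room's single-board computer is within the room radius."""
    return rssi_to_distance(rssi_dbm) <= room_radius_m

# Example sighting log for one badge: (timestamp_s, rssi_dbm) pairs.
log = [(0, -52.0), (30, -61.0), (60, -75.0)]
for t, rssi in log:
    d = rssi_to_distance(rssi)
    print(f"t={t:3d}s rssi={rssi:6.1f} dBm -> d~{d:5.2f} m, "
          f"in room: {physician_in_room(rssi)}")
```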

  4. A High Performance Computing Study of a Scalable FISST-Based Approach to Multi-Target, Multi-Sensor Tracking

    Science.gov (United States)

    Hussein, I.; Wilkins, M.; Roscoe, C.; Faber, W.; Chakravorty, S.; Schumacher, P.

    2016-09-01

    Finite Set Statistics (FISST) is a rigorous Bayesian multi-hypothesis management tool for the joint detection, classification and tracking of multi-sensor, multi-object systems. Implicit within the approach are solutions to the data association and target label-tracking problems. The full FISST filtering equations, however, are intractable. While FISST-based methods such as the PHD and CPHD filters are tractable, they require heavy moment approximations to the full FISST equations that result in a significant loss of information contained in the collected data. In this paper, we review Smart Sampling Markov Chain Monte Carlo (SSMCMC), which enables FISST to be tractable while avoiding moment approximations. We study the effect of tuning key SSMCMC parameters on tracking quality and computation time. The study is performed on a representative space object catalog with varying numbers of RSOs. The solution is implemented in the Scala computing language at the Maui High Performance Computing Center (MHPCC) facility.

  5. Multiscale approach including microfibril scale to assess elastic constants of cortical bone based on neural network computation and homogenization method

    CERN Document Server

    Barkaoui, Abdelwahed; Tarek, Merzouki; Hambli, Ridha; Ali, Mkaddem

    2014-01-01

    The complexity and heterogeneity of bone tissue require a multiscale modelling to understand its mechanical behaviour and its remodelling mechanisms. In this paper, a novel multiscale hierarchical approach including microfibril scale based on hybrid neural network computation and homogenisation equations was developed to link nanoscopic and macroscopic scales to estimate the elastic properties of human cortical bone. The multiscale model is divided into three main phases: (i) in step 0, the elastic constants of collagen-water and mineral-water composites are calculated by averaging the upper and lower Hill bounds; (ii) in step 1, the elastic properties of the collagen microfibril are computed using a trained neural network simulation. Finite element (FE) calculation is performed at nanoscopic levels to provide a database to train an in-house neural network program; (iii) in steps 2 to 10 from fibril to continuum cortical bone tissue, homogenisation equations are used to perform the computation at the higher s...
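
    Step (i) above has a standard closed form: the Voigt (iso-strain, upper) and Reuss (iso-stress, lower) bounds for a two-phase mixture, with the Hill estimate taken as their average. The moduli below are placeholder values, not those of the paper.

```python
def voigt_reuss_hill(f1, m1, m2):
    """Hill estimate for the effective modulus of a two-phase composite:
    the average of the Voigt (iso-strain, upper) and Reuss (iso-stress,
    lower) bounds. f1 is the volume fraction of phase 1."""
    f2 = 1.0 - f1
    voigt = f1 * m1 + f2 * m2               # upper bound
    reuss = 1.0 / (f1 / m1 + f2 / m2)       # lower bound
    return 0.5 * (voigt + reuss), voigt, reuss

# Placeholder moduli (GPa): a stiff mineral phase vs. a water-like phase.
hill, voigt, reuss = voigt_reuss_hill(f1=0.6, m1=114.0, m2=2.3)
print(f"Voigt={voigt:.1f} GPa  Reuss={reuss:.1f} GPa  Hill={hill:.1f} GPa")
```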

  6. P300-based brain-computer interface for environmental control: an asynchronous approach

    Science.gov (United States)

    Aloise, F.; Schettini, F.; Aricò, P.; Leotta, F.; Salinari, S.; Mattia, D.; Babiloni, F.; Cincotti, F.

    2011-04-01

    Brain-computer interface (BCI) systems allow people with severe motor disabilities to communicate and interact with the external world. The P300 potential is one of the most used control signals for EEG-based BCIs. Classic P300-based BCIs work in a synchronous mode; synchronous control assumes that the user is constantly attending to the stimulation, and the number of stimulation sequences is fixed a priori. This is an obstacle to the use of these systems in everyday life: users would be engaged in a continuous control state, their distractions would cause misclassifications, and the speed of selection would not take into account the user's current psychophysical condition. An efficient BCI system should instead be able to understand the user's intentions from the ongoing EEG. It also has to refrain from making a selection when the user is engaged in a different activity, and it should increase or decrease its speed of selection depending on the user's current state. We addressed these issues by introducing an asynchronous BCI and tested its capabilities for effective environmental monitoring, involving 11 volunteers in three recording sessions. Results show that this BCI system can increase the bit rate during control periods while proving very efficient at avoiding false negatives when the users are engaged in other tasks.

  7. A flexible, extendable, modular and computationally efficient approach to scattering-integral-based seismic full waveform inversion

    Science.gov (United States)

    Schumacher, F.; Friederich, W.; Lamara, S.

    2016-02-01

    We present a new conceptual approach to scattering-integral-based seismic full waveform inversion (FWI) that allows a flexible, extendable, modular and both computationally and storage-efficient numerical implementation. To achieve maximum modularity and extendability, interactions between the three fundamental steps carried out sequentially in each iteration of the inversion procedure, namely, solving the forward problem, computing waveform sensitivity kernels and deriving a model update, are kept at an absolute minimum and are implemented by dedicated interfaces. To realize storage efficiency and maximum flexibility, the spatial discretization of the inverted earth model is allowed to be completely independent of the spatial discretization employed by the forward solver. For computational efficiency reasons, the inversion is done in the frequency domain. The benefits of our approach are as follows: (1) Each of the three stages of an iteration is realized by a stand-alone software program. In this way, we avoid the monolithic, unflexible and hard-to-modify codes that have often been written for solving inverse problems. (2) The solution of the forward problem, required for kernel computation, can be obtained by any wave propagation modelling code giving users maximum flexibility in choosing the forward modelling method. Both time-domain and frequency-domain approaches can be used. (3) Forward solvers typically demand spatial discretizations that are significantly denser than actually desired for the inverted model. Exploiting this fact by pre-integrating the kernels allows a dramatic reduction of disk space and makes kernel storage feasible. No assumptions are made on the spatial discretization scheme employed by the forward solver. (4) In addition, working in the frequency domain effectively reduces the amount of data, the number of kernels to be computed and the number of equations to be solved. (5) Updating the model by solving a large equation system can be

  8. Ensemble-based computational approach discriminates functional activity of p53 cancer and rescue mutants.

    Directory of Open Access Journals (Sweden)

    Özlem Demir

    2011-10-01

    Full Text Available The tumor suppressor protein p53 can lose its function upon single-point missense mutations in the core DNA-binding domain ("cancer mutants"). Activity can be restored by second-site suppressor mutations ("rescue mutants"). This paper relates the functional activity of p53 cancer and rescue mutants to their overall molecular dynamics (MD), without focusing on local structural details. A novel global measure of protein flexibility for the p53 core DNA-binding domain, the number of clusters at a certain RMSD cutoff, was computed by clustering over 0.7 µs of explicitly solvated all-atom MD simulations. For wild-type p53 and a sample of p53 cancer or rescue mutants, the number of clusters was a good predictor of in vivo p53 functional activity in cell-based assays. This number-of-clusters (NOC) metric was strongly correlated (r² = 0.77) with reported values of experimentally measured ΔΔG protein thermodynamic stability. Interpreting the number of clusters as a measure of protein flexibility: (i) p53 cancer mutants were more flexible than wild-type protein, (ii) second-site rescue mutations decreased the flexibility of cancer mutants, and (iii) negative controls of non-rescue second-site mutants did not. This new method reflects the overall stability of the p53 core domain and can discriminate which second-site mutations restore activity to p53 cancer mutants.
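
    The number-of-clusters metric can be imitated with a simple leader-style clustering: scan the trajectory frames and open a new cluster whenever a frame lies farther than the cutoff from every existing cluster representative. Real usage would compute RMSD after optimal superposition of structures (e.g., with an MD analysis toolkit); plain per-coordinate RMSD on toy conformation vectors stands in here.

```python
import numpy as np

def count_clusters(frames, cutoff):
    """Leader algorithm: number of clusters at a given RMSD cutoff.
    `frames` is (n_frames, n_coords); a frame joins the first cluster whose
    representative lies within `cutoff`, otherwise it founds a new one."""
    reps = []
    for f in frames:
        rmsd_to_reps = [np.sqrt(np.mean((f - r) ** 2)) for r in reps]
        if not reps or min(rmsd_to_reps) > cutoff:
            reps.append(f)               # new cluster representative
    return len(reps)

rng = np.random.default_rng(2)
# Toy trajectories: a "rigid" protein jitters around one conformation,
# a "flexible" mutant hops among three distinct conformations.
rigid = rng.normal(0.0, 0.1, size=(300, 30))
flexible = np.concatenate([rng.normal(c, 0.1, size=(100, 30))
                           for c in (0.0, 1.0, 2.0)])
print("rigid   :", count_clusters(rigid, cutoff=0.5), "cluster(s)")
print("flexible:", count_clusters(flexible, cutoff=0.5), "cluster(s)")
```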

  9. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    Science.gov (United States)

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and heavy memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by means of the developed feature reduction software, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking to local solutions is also a problem, which is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
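
    Of the GA components named above, roulette wheel selection is the easiest to show in a few lines: each candidate is selected with probability proportional to its fitness. The reduct names and fitness values are arbitrary examples.

```python
import random

def roulette_wheel_select(population, fitnesses, rng=random.Random(0)):
    """Select one individual with probability proportional to fitness."""
    total = sum(fitnesses)
    pick = rng.uniform(0.0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]          # guard against floating-point round-off

population = ["reduct_A", "reduct_B", "reduct_C", "reduct_D"]
fitnesses = [0.91, 0.75, 0.40, 0.10]  # e.g., classification accuracy of each reduct
counts = {p: 0 for p in population}
for _ in range(10_000):
    counts[roulette_wheel_select(population, fitnesses)] += 1
print(counts)   # selection frequency tracks fitness
```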

  11. A computational approach to content-based retrieval of folk song melodies

    NARCIS (Netherlands)

    van Kranenburg, P.

    2010-01-01

    In order to develop a Music Information Retrieval system for folksong melodies, one needs to design an adequate computational model of melodic similarity, which is the subject of this Ph.D. thesis. Since understanding of both the properties of the melodies and computational methods is necessary, this…

  12. A Project-Based Learning Approach to Programmable Logic Design and Computer Architecture

    Science.gov (United States)

    Kellett, C. M.

    2012-01-01

    This paper describes a course in programmable logic design and computer architecture as it is taught at the University of Newcastle, Australia. The course is designed around a major design project and has two supplemental assessment tasks that are also described. The context of the Computer Engineering degree program within which the course is…

  13. Models@Home: distributed computing in bioinformatics using a screensaver based approach.

    NARCIS (Netherlands)

    Krieger, E.; Vriend, G.

    2002-01-01

    MOTIVATION: Due to the steadily growing computational demands in bioinformatics and related scientific disciplines, one is forced to make optimal use of the available resources. A straightforward solution is to build a network of idle computers and let each of them work on a small piece of a…

  14. The Impact of Computational Experiment and Formative Assessment in Inquiry-Based Teaching and Learning Approach in STEM Education

    Science.gov (United States)

    Psycharis, Sarantos

    2016-04-01

    In this study, an instructional design model based on the computational experiment approach was employed to explore the effects of formative assessment strategies and scientific abilities rubrics on students' engagement in the development of an inquiry-based pedagogical scenario. Rubrics were used during model development, based on prompts provided to students as they developed their models. Our results indicate that modelling is a process that needs sequencing and instructional support, in the form of rubrics, focused on the scientific abilities needed for the inquiry process. Eighty (80) prospective primary school teachers participated in this research, and the results indicate that the development of inquiry-based scenarios is strongly affected by the scientific abilities rubrics.

  15. A wavelet-based time frequency analysis approach for classification of motor imagery for brain computer interface applications

    Science.gov (United States)

    Qin, Lei; He, Bin

    2005-12-01

    Electroencephalogram (EEG) recordings during motor imagery tasks are often used as input signals for brain-computer interfaces (BCIs). The translation of these EEG signals to control signals of a device is based on a good classification of various kinds of imagination. We have developed a wavelet-based time-frequency analysis approach for classifying motor imagery tasks. Time-frequency distributions (TFDs) were constructed based on wavelet decomposition, and event-related (de)synchronization patterns were extracted from symmetric electrode pairs. The weighted energy difference of the electrode pairs was then compared to classify the imaginary movement. The present method has been tested in nine human subjects and reached an average classification rate of 78%. The simplicity of the present technique suggests that it may provide an alternative method for EEG-based BCI applications.
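
    The weighted-energy-difference rule over a symmetric electrode pair can be prototyped in a few lines. The Python sketch below uses the PyWavelets package for the wavelet decomposition; the db4 wavelet, the C3/C4 pair, the zero threshold and the simulated signals are illustrative assumptions, not the paper's exact settings.

    import numpy as np
    import pywt  # PyWavelets

    def detail_energy(signal, wavelet="db4", level=4):
        # Discrete wavelet decomposition; total energy of the detail
        # coefficients serves as a crude time-frequency band-power estimate.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return sum(float((c ** 2).sum()) for c in coeffs[1:])

    def classify_left_right(c3, c4):
        # Imagined right-hand movement typically desynchronises (attenuates)
        # the contralateral C3 rhythm, so compare energies across the pair.
        return "right" if detail_energy(c3) < detail_energy(c4) else "left"

    rng = np.random.default_rng(2)
    c3 = rng.normal(scale=0.7, size=1024)   # attenuated: simulated ERD on C3
    c4 = rng.normal(scale=1.0, size=1024)
    print(classify_left_right(c3, c4))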

  16. Changes in student approaches to learning with the introduction of computer-supported problem-based learning.

    Science.gov (United States)

    Strømsø, Helge I; Grøttum, Per; Hofgaard Lycke, Kirsten

    2004-04-01

    To study changes in student approaches to learning following the introduction of computer-supported, problem-based learning. Medical students at the University of Oslo undertake a 12-week period of clinical placement during their 10th term. In this period they continue to undertake problem-based learning (PBL) in the form of distributed problem-based learning (DPBL) in a computer-supported learning environment. A questionnaire focusing on learning styles, PBL, and information and communication technology (ICT) was distributed before and after the DPBL period. All students in their 10th term at the University of Oslo (n = 61). The introduction of DPBL did not seem to affect the participants' use of regulating strategies or their mental models of learning. After the DPBL period, group discussion and tutor input were reported to have less influence on students' self-study, while the students perceived themselves as being less active in groups and as expecting less from tutors. There was a relationship between perceived tutor influence and students' familiarity with ICT. The DPBL period seemed to increase students' task-related web accesses and use of experts, and to decrease their task-related use of textbooks and discussions with students outside the group. Students' general approaches to learning were not affected by the introduction of DPBL. However, there was a decrease in students' expectations concerning activity in the group and the importance of the tutor. These changes were related to students' familiarity with the use of computers. Web-based resources and experts became more important resources to the students during the DPBL period.

  17. Development of antibacterial conjugates using sulfamethoxazole with monocyclic terpenes: A systematic medicinal chemistry based computational approach.

    Science.gov (United States)

    Swain, Shasank S; Paidesetty, Sudhir K; Padhy, Rabindra N

    2017-03-01

    …S. aureus) and -10.19 (conjugate-2 against S. pneumoniae). Conjugates-2 and -5 were the most effective antibacterials based on the Lipinski rule of five, with lethal doses of 3471 and 3500 mg/kg, respectively, and their toxicity class levels. Conjugate-2 and conjugate-5 were more effective than the individual monoterpenes and SMZ against pathogenic bacteria. Based on the recorded computational trial, synthesis, characterization and an in vitro antibacterial study of conjugate-5, with acute toxicity testing in a Wistar rat model, could succeed, and the conjugate could be promoted for synthesis in the control of MDR bacteria. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. This study, however, focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a method combining the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which are characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-based approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and within the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined, template-set-centric approach proposed in this study.

  19. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing and…

  20. Using a Diagnosis-Based Approach to Individualize Instructional Explanations in Computer-Mediated Communication

    Science.gov (United States)

    Wittwer, Jorg; Nuckles, Matthias; Renkl, Alexander

    2010-01-01

    To maximize the effectiveness of instructional explanations, they should be tailored to an individual learner. However, instructors are often not able to collect diagnostically relevant information about a learner to individualize their explanations. This is particularly true in computer-mediated settings where it is more difficult to thoroughly…

  1. Computer-aided approach for customized cell-based defect reconstruction.

    Science.gov (United States)

    Meyer, Ulrich; Neunzehn, Jörg; Wiesmann, Hans Peter

    2012-01-01

    Computer-aided technologies such as computer-aided design (CAD) and computer-aided manufacturing (CAM), along with methods such as the finite element method (FEM), have recently been employed in medical applications such as extracorporeal bone tissue engineering strategies. The aim of this pilot experimental study was to test whether autologous osteoblast-like cells cultured in vitro on individualized scaffolds can be used to support bone regeneration in a clinical environment. Mandibular bone defects were surgically introduced into the mandibles of Göttinger minipigs and the scaffold for the defect site was modelled by CAD/CAM techniques. Autologous bone cells harvested from the porcine calvaria of the minipigs were cultivated in bioreactors. The cultured osteoblast-like cells were seeded on polylactic acid/polyglycolic acid (PLA/PGA) copolymer scaffolds generated by rapid prototyping, and the bone defects were then reconstructed by implanting these tissue constructs. The postoperative computed tomographic scans, as well as the intraoperative sites, demonstrated an accurate fit in the defect sites. The individually created, implanted scaffold constructs enriched with the porcine osteoblast-like cells were well tolerated and appeared to support bone formation, as revealed by immunohistochemical and histological analyses. The results of this investigation indicate that in vitro expanded osteoblast-like cells spread on a resorbable, individualized, computer-aided fabricated scaffold are capable of promoting the repair of bone tissue defects in vivo. These results warrant further attempts to combine computer modelling and tissue engineering for use in bone reconstructive surgery.

  2. Computational approaches to energy materials

    CERN Document Server

    Catlow, Richard; Walsh, Aron

    2013-01-01

    The development of materials for clean and efficient energy generation and storage is one of the most rapidly developing, multi-disciplinary areas of contemporary science, driven primarily by concerns over global warming, diminishing fossil-fuel reserves, the need for energy security, and increasing consumer demand for portable electronics. Computational methods are now an integral and indispensable part of the materials characterisation and development process. Computational Approaches to Energy Materials presents a detailed survey of current computational techniques for the…

  3. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available Modeling the response of structures under seismic loads is an important factor in civil engineering as it crucially affects the design and management of structures, especially for high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include least square support vector machine (LSSVM), wavelet neural networks (WNN), and adaptive neurofuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to the other techniques in estimating the behavior of smart structures.

  4. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  5. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    Full Text Available In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and productivity has increased drastically. Despite such improvements in manufacturing technology, however, the development of an automatic system for fabricating curved hull plates remains at an early stage, since the hardware and software for automating the curved hull fabrication process must be developed differently depending on the dimensions of the plates and the forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, consisting of four components and related software. In particular, the software module for computing fabrication information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  6. A soft computing-based approach to optimise queuing-inventory control problem

    Science.gov (United States)

    Alaghebandha, Mohammad; Hajipour, Vahid

    2015-04-01

    In this paper, a multi-product continuous review inventory control problem within a batch arrival queuing approach (MQr/M/1) is developed to find the optimal quantities of maximum inventory. The objective function is to minimise the sum of ordering, holding and shortage costs under warehouse space, service level and expected lost-sales shortage cost constraints, from the retailer and warehouse viewpoints. Since the proposed model is NP-hard, an efficient imperialist competitive algorithm (ICA) is proposed to solve it. To benchmark the proposed ICA, both a genetic algorithm and a simulated annealing algorithm are utilised. In order to determine the algorithm parameter values that yield the best solutions, a fine-tuning procedure is executed. Finally, the performance of the proposed ICA is analysed using some numerical illustrations.

  7. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    Science.gov (United States)

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  8. A computer based approach for Material, Manpower and Equipment management in the Construction Projects

    Science.gov (United States)

    Sasidhar, Jaladanki; Muthu, D.; Venkatasubramanian, C.; Ramakrishnan, K.

    2017-07-01

    The success of any construction project depends on managing resources efficiently so that the project is completed within a reasonable budget and time without compromising quality. Inefficient and untimely procurement of material, inadequate deployment of labor, and late mobilization of machinery all cause delay, reduce quality, and ultimately increase project cost. It is a known fact that project cost can be controlled by taking corrective actions on the mobilization of resources at the right time. This research focuses on integrating management systems with the computer to generate a model that uses an OOM data structure and includes automatic commodity code generation, automatic takeoff execution, intelligent purchase order generation, and components of design and schedule integration to overcome the problem of stock-outs. To overcome problems in equipment management, an inventory management module is suggested; the data set of equipment registration number, equipment number, description, date of purchase, manufacturer, equipment price, market value, life of equipment, and production data of the equipment (equipment number, date, name of the job, hourly rate, insurance, depreciation cost of the equipment, taxes, storage cost, interest, and oil, grease and fuel consumption, etc.) is analyzed, and decision support systems are generated to overcome the problems arising out of improper management. The problem of labor is managed using scheduling and strategic management of human resources. With the generated decision support tools, resources are mobilized at the right time, helping the project manager to finish the project on time, thereby avoiding abnormal project cost, and providing the percentage that can be improved. The research also focuses on determining the percentage of delays caused by lack of management of materials, manpower and machinery in different types of projects.

  9. Statistical learning of peptide retention behavior in chromatographic separations: a new kernel-based approach for computational proteomics

    Directory of Open Access Journals (Sweden)

    Huber Christian G

    2007-11-01

    Full Text Available Background: High-throughput peptide and protein identification technologies have benefited tremendously from strategies based on tandem mass spectrometry (MS/MS) in combination with database searching algorithms. A major problem with existing methods lies within the significant number of false positive and false negative annotations. So far, standard algorithms for protein identification do not use the information gained from separation processes usually involved in peptide analysis, such as retention time information, which is readily available from chromatographic separation of the sample. Identification can thus be improved by comparing measured retention times to predicted retention times. Current prediction models are derived from a set of measured test analytes, but they usually require large amounts of training data. Results: We introduce a new kernel function which can be applied in combination with support vector machines to a wide range of computational proteomics problems. We show the performance of this new approach by applying it to the prediction of peptide adsorption/elution behavior in strong anion-exchange solid-phase extraction (SAX-SPE) and ion-pair reversed-phase high-performance liquid chromatography (IP-RP-HPLC). Furthermore, the predicted retention times are used to improve spectrum identifications by a p-value-based filtering approach. The approach was tested on a number of different datasets and shows excellent performance while requiring only very small training sets (about 40 peptides) instead of thousands. Using the retention time predictor in our retention time filter improves the fraction of correctly identified peptide mass spectra significantly. Conclusion: The proposed kernel function is well-suited for the prediction of chromatographic separation in computational proteomics and requires only a limited amount of training data. The performance of this new method is demonstrated by applying it to peptide…
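
    The retention-time filtering idea can be illustrated independently of the paper's custom kernel. The Python sketch below substitutes a generic RBF-kernel support vector regressor from scikit-learn and random descriptors for real peptide features; the small 40-peptide training set mirrors the reported requirement, while the deviation threshold and all data are illustrative.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)
    X_train = rng.random((40, 10))                 # hypothetical peptide descriptors
    t_train = X_train @ rng.random(10) + rng.normal(0, 0.05, 40)

    # Train the retention-time predictor on the small labeled set.
    model = SVR(kernel="rbf", C=10.0).fit(X_train, t_train)

    # Filter candidate identifications whose observed retention time deviates
    # too far from the prediction (a stand-in for the p-value-based filter).
    X_cand = rng.random((5, 10))
    t_pred = model.predict(X_cand)
    t_obs = t_pred + np.array([0.00, 0.02, 0.50, -0.01, 0.40])
    keep = np.abs(t_obs - t_pred) < 0.10
    print("retained identifications:", np.flatnonzero(keep))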

  10. An Efficient Approach for Computing Silhouette Coefficients

    Directory of Open Access Journals (Sweden)

    Moh'd B. Al-Zoubi

    2008-01-01

    Full Text Available One popular approach for finding the best number of clusters (K) in a data set is to compute the silhouette coefficients. The silhouette coefficients for different values of K are first found, and the value of K with the maximum coefficient is then chosen. However, computing the silhouette coefficient for different Ks is a very time-consuming process, due to the amount of CPU time spent on distance calculations. An approach to compute the silhouette coefficients quickly is presented, based on decreasing the number of addition operations when computing distances. The results were efficient, and more than 50% of the CPU time was saved when the approach was applied to different data sets.
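
    For reference, the standard (unaccelerated) silhouette procedure the paper speeds up looks like the following Python sketch using scikit-learn; the synthetic three-cluster data and the K range are illustrative.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    # Three well-separated 2-D clusters as toy data.
    rng = np.random.default_rng(4)
    X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0, 3, 6)])

    # Score each candidate K and keep the one with the maximum coefficient;
    # the bulk of the cost is the pairwise distance work inside the scoring.
    scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                            random_state=0).fit_predict(X))
              for k in range(2, 8)}
    best_k = max(scores, key=scores.get)
    print("best K:", best_k, "silhouette:", round(scores[best_k], 3))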

  11. A Computational Agent-Based Modeling Approach for Competitive Wireless Service Market

    KAUST Repository

    Douglas, C C

    2011-04-01

    Using an agent-based modeling method, we study market dynamism with regard to wireless cellular services that are in competition for a greater market share and profit. In the proposed model, service providers and consumers are described as agents who interact with each other and actively participate in an economically well-defined marketplace. Parameters of the model are optimized using the Levenberg-Marquardt method. The quantitative prediction capabilities of the proposed model are examined through data reproducibility using past data from the U.S. and Korean wireless service markets. Finally, we investigate a disruptive market event, namely the introduction of the iPhone into the U.S. in 2007 and the resulting changes in the modeling parameters. We predict and analyze the impacts of the introduction of the iPhone into the Korean wireless service market assuming a release date of 2Q09 based on earlier data. © 2011 IEEE.

  12. Developing computer-based participatory approaches to mapping landscape values for landscape and resource management

    Science.gov (United States)

    Steve Carver; Alan Watson; Tim Waters; Roian Matt; Kari Gunderson; Brett Davis

    2009-01-01

    The last 50 years or so have seen a steady increase in the rate of destructive wildfires across the world, partly as a result of climate change and partly as a result of encroachment of human settlement on fire-based ecosystems (Russell et al. 2004; Westerling et al. 2006). Years of active fire suppression in such areas has inevitably led to the build-up of hazardous...

  13. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, the object-based approach, and a summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor, the CAL-TSS System, the MIT PDP-1 Timesharing System, and the Chicago Magic Number Machine are discussed. The book then describes the Plessey System 250…

  14. Trust Based Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LI Shiqun; Shane Balfe; ZHOU Jianying; CHEN Kefei

    2006-01-01

    Pervasive computing environments are distributed and mobile spaces. Trust relationships must be established and ensured between devices and systems in the pervasive computing environment. The trusted computing (TC) technology introduced by the Trusted Computing Group is a distributed-system-wide approach to the provision of integrity protection for resources. The TC notion of trust and security can be described as conformed system behaviors of a platform environment such that the conformation can be attested to a remote challenger. In this paper the trust requirements in a pervasive/ubiquitous environment are analyzed, and security schemes for pervasive computing are proposed using primitives offered by TC technology.

  15. Capacitated vehicle routing problem for PSS uses based on ubiquitous computing: An emerging markets approach

    Directory of Open Access Journals (Sweden)

    Alberto Ochoa-Ortíz

    2015-01-01

    Full Text Available The capacitated vehicle routing problem, based on ubiquitous computing and approached from a PSS (Product-Service Systems) perspective for developing urban freight transport configurations, is addressed. This work considers the specifics of urban logistics in an emerging-markets context, which involves: (i) low logistics competences of decision makers; (ii) limited data availability; and (iii) restricted access to high-performance technology for computing optimal transport routes. Therefore, the use of free software providing low-cost solutions (in time and resources) is proposed. The article shows the application of the results of a software tool based on graph theory, used to analyze and solve a CVRP (Capacitated Vehicle Routing Problem). The case of a local food distribution company located in a large Mexican city was used, based on a fleet of small vehicles, all with the same technical specifications and a comparable load capacity.

  16. A computational model of the lexical-semantic system based on a grounded cognition approach.

    Science.gov (United States)

    Ursino, Mauro; Cuppini, Cristiano; Magosso, Elisa

    2010-01-01

    This work presents a connectionist model of the semantic-lexical system based on grounded cognition. The model assumes that the lexical and semantic aspects of language are memorized in two distinct stores. The semantic properties of objects are represented as a collection of features, whose number may vary among objects. Features are described as activation of neural oscillators in different sensory-motor areas (one area for each feature) topographically organized to implement a similarity principle. Lexical items are represented as activation of neural groups in a different layer. Lexical and semantic aspects are then linked together on the basis of previous experience, using physiological learning mechanisms. After training, features which frequently occurred together, and the corresponding word-forms, become linked via reciprocal excitatory synapses. The model also includes some inhibitory synapses: features in the semantic network tend to inhibit words not associated with them during the previous learning phase. Simulations show that after learning, presentation of a cue can evoke the overall object and the corresponding word in the lexical area. Moreover, different objects and the corresponding words can be simultaneously retrieved and segmented via a time division in the gamma-band. Word presentation, in turn, activates the corresponding features in the sensory-motor areas, recreating the same conditions occurring during learning. The model simulates the formation of categories, assuming that objects belong to the same category if they share some features. Simple examples are shown to illustrate how words representing a category can be distinguished from words representing individual members. Finally, the model can be used to simulate patients with focalized lesions, assuming an impairment of synaptic strength in specific feature areas.
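
    The core learning step, associating co-occurring features with a word-form, can be reduced to a toy Hebbian sketch in Python. The feature codes, the winner-take-all readout, and the omission of the model's oscillators and inhibitory synapses are all simplifications for illustration.

    import numpy as np

    # Hypothetical binary feature codes (semantic layer) and word units (lexical layer).
    features = {"dog": np.array([1, 1, 0, 1, 0]),
                "cat": np.array([1, 0, 1, 0, 1])}
    words = {"dog": np.array([1, 0]), "cat": np.array([0, 1])}

    # Hebbian learning: strengthen links between co-active word and feature units.
    W = np.zeros((2, 5))
    for name in features:
        W += np.outer(words[name], features[name])

    # A partial feature cue retrieves the word (winner-take-all readout) ...
    cue = np.array([1, 1, 0, 0, 0])                  # two "dog" features
    word_idx = int(np.argmax(W @ cue))
    print("retrieved word:", ["dog", "cat"][word_idx])

    # ... and the word re-activates its full feature set, as in the model.
    word_vec = np.eye(2, dtype=int)[word_idx]
    print("reconstructed features:", (W.T @ word_vec > 0).astype(int))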

  17. REVIEW: Affective and Emotional Aspects of Human-Computer Interaction: Game-Based and Innovative Learning Approaches

    OpenAIRE

    GULUMBAY, Reviewed By Dr. A. Askim

    2006-01-01

    This book was edited by Maja Pivec, an educator at the University of Applied Sciences, and published by IOS Press in 2006. The learning process can be seen as an emotional and personal experience that is addictive and leads learners to proactive behavior. New research methods in this field are related to affective and emotional approaches to computer-supported learning and human-computer interactions. Bringing together scientists and research aspects from psychology, educational sciences, cog...

  18. Computational Approaches to Vestibular Research

    Science.gov (United States)

    Ross, Muriel D.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    The Biocomputation Center at NASA Ames Research Center is dedicated to a union between computational, experimental and theoretical approaches to the study of neuroscience and of life sciences in general. The current emphasis is on computer reconstruction and visualization of vestibular macular architecture in three dimensions (3-D), and on mathematical modeling and computer simulation of neural activity in the functioning system. Our methods are being used to interpret the influence of spaceflight on mammalian vestibular maculas in a model system, that of the adult Sprague-Dawley rat. More than twenty 3-D reconstructions of type I and type II hair cells and their afferents have been completed by digitization of contours traced from serial sections photographed in a transmission electron microscope. This labor-intensive method has now been replaced by a semiautomated method developed in the Biocomputation Center in which conventional photography is eliminated. All viewing, storage and manipulation of original data is done using Silicon Graphics workstations. Recent improvements to the software include a new mesh generation method for connecting contours. This method will permit the investigator to describe any surface, regardless of complexity, including highly branched structures such as are routinely found in neurons. This same mesh can be used for 3-D, finite volume simulation of synapse activation and voltage spread on neuronal surfaces visualized via the reconstruction process. These simulations help the investigator interpret the relationship between neuroarchitecture and physiology, and are of assistance in determining which experiments will best test theoretical interpretations. Data are also used to develop abstract, 3-D models that dynamically display neuronal activity ongoing in the system. Finally, the same data can be used to visualize the neural tissue in a virtual environment. Our exhibit will depict capabilities of our computational approaches and…

  19. Computer Networks A Systems Approach

    CERN Document Server

    Peterson, Larry L

    2011-01-01

    This best-selling and classic book teaches you the key principles of computer networks with examples drawn from the real world of network and protocol design. Using the Internet as the primary example, the authors explain various protocols and networking technologies. Their systems-oriented approach encourages you to think about how individual network components fit into a larger, complex system of interactions. Whatever your perspective, whether it be that of an application developer, network administrator, or a designer of network equipment or protocols, you will come away with a "big picture"…

  20. A semi-supervised support vector machine approach for parameter setting in motor imagery-based brain computer interfaces

    Science.gov (United States)

    Long, Jinyi; Yu, Zhuliang

    2010-01-01

    Parameter setting plays an important role in improving the performance of a brain computer interface (BCI). Currently, parameters (e.g. channels and frequency band) are often selected manually. This is time-consuming, and it is not easy to obtain an optimal combination of parameters for a BCI. In this paper, motor imagery-based BCIs are considered, in which channels and frequency band are key parameters. First, a semi-supervised support vector machine algorithm is proposed for automatically selecting a set of channels with a given frequency band. Next, this algorithm is extended for joint channel-frequency selection. In this approach, both training data with labels and test data without labels are used for training a classifier, hence it can be used when the training data set is small. Finally, our algorithms are applied to a BCI competition data set. Our data analysis results show that these algorithms are effective for selection of frequency band and channels when the training data set is small. PMID:21886673
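
    The semi-supervised ingredient, using unlabeled trials alongside a small labeled set, can be sketched with scikit-learn's self-training wrapper around an SVM. The Gaussian toy features, label split and confidence threshold below are illustrative, and self-training is only one member of the semi-supervised SVM family the paper draws on.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.semi_supervised import SelfTrainingClassifier

    # Two-class toy features; in a BCI these would be band-power features
    # from the selected channels and frequency band.
    rng = np.random.default_rng(5)
    X = np.vstack([rng.normal(-1, 1, (60, 8)), rng.normal(1, 1, (60, 8))])
    y_true = np.array([0] * 60 + [1] * 60)

    # Hide most labels: -1 marks unlabeled trials to be pseudo-labeled.
    y_semi = y_true.copy()
    y_semi[rng.choice(120, size=100, replace=False)] = -1

    clf = SelfTrainingClassifier(SVC(probability=True), threshold=0.8)
    clf.fit(X, y_semi)
    print("training-set accuracy:", (clf.predict(X) == y_true).mean())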

  1. GRID COMPUTING AND CHECKPOINT APPROACH

    Directory of Open Access Journals (Sweden)

    Pankaj Gupta

    2011-05-01

    Full Text Available Grid computing is a means of allocating the computational power of a large number of computers to complex, difficult computations or problems. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large-scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real-time distributed systems. The main focus is on the types of fault occurring in the system, fault detection techniques and the recovery techniques used. A fault that occurs due to link failure, resource failure or any other reason must be tolerated for the system to keep working smoothly and accurately. These faults can be detected and recovered by many techniques used accordingly. An appropriate fault detector can avoid loss due to system crash, and a reliable fault tolerance technique can save the system from failure. This paper shows how these methods are applied to detect and tolerate faults in various real-time distributed systems. The advantages of utilizing the checkpointing functionality are obvious; however, so far the Grid community has not developed a widely accepted standard that would allow the Grid environment to consciously utilize low-level checkpointing packages. Therefore, such a standard, named Grid Checkpointing Architecture, is being designed. The fault tolerance mechanism used here sets the job checkpoints based on the resource failure rate. If a resource failure occurs, the job is restarted from its last successful state using a checkpoint file from another grid resource. A critical aspect for an automatic recovery is the availability of checkpoint files; a strategy to increase the availability of checkpoints is replication. A grid is a form of distributed computing used mainly to virtualize and utilize geographically distributed idle resources; it is a distributed computational and storage environment often composed of…
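
    The checkpoint-and-restart mechanism described above reduces, in a single process, to periodically persisting job state and resuming from the last saved state after a failure. The Python sketch below is a minimal local illustration; in a grid setting the checkpoint file would additionally be replicated to other resources, and all names here are hypothetical.

    import os
    import pickle

    CKPT = "job.ckpt"

    def run_job(n_steps=1000):
        # Resume from the last successful checkpoint if one exists.
        state = {"step": 0, "acc": 0.0}
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                state = pickle.load(f)
        for step in range(state["step"], n_steps):
            state["acc"] += step * 1e-6          # stand-in for real work
            state["step"] = step + 1
            if state["step"] % 100 == 0:         # periodic checkpoint
                with open(CKPT, "wb") as f:
                    pickle.dump(state, f)        # replicate off-node in a real grid
        return state

    print(run_job())                             # rerun after a crash to resume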

  3. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
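
    To make the "computational rather than symbolic" treatment concrete, one simple scheme represents each fuzzy observation by a center and a spread, fits the centers by ordinary least squares, and propagates the spreads through the absolute coefficients. The Python sketch below illustrates that generic idea only; it is not claimed to be the paper's formulation.

    import numpy as np

    rng = np.random.default_rng(6)
    xc = rng.uniform(0, 10, 30)            # input centers
    xs = rng.uniform(0.1, 0.5, 30)         # input half-spreads (fuzziness)
    yc = 2.0 * xc + 1.0 + rng.normal(0, 0.3, 30)

    # Conventional least squares on the centers, as in crisp regression.
    A = np.column_stack([xc, np.ones_like(xc)])
    coef, *_ = np.linalg.lstsq(A, yc, rcond=None)
    b1, b0 = coef

    # Interval-arithmetic propagation of the input spreads to the output.
    y_spread = abs(b1) * xs
    print(f"y = {b1:.2f}x + {b0:.2f}, mean output spread = {y_spread.mean():.2f}")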

  4. Computational approach to Riemann surfaces

    CERN Document Server

    Klein, Christian

    2011-01-01

    This volume offers a well-structured overview of existing computational approaches to Riemann surfaces and those currently in development. The authors of the contributions represent the groups providing publicly available numerical codes in this field. Thus this volume illustrates which software tools are available and how they can be used in practice. In addition, examples of solutions to partial differential equations and in surface theory are presented. The intended audience of this book is twofold. It can be used as a textbook for a graduate course in numerics of Riemann surfaces, in which case the standard undergraduate background, i.e., calculus and linear algebra, is required. In particular, no knowledge of the theory of Riemann surfaces is expected; the necessary background in this theory is contained in the Introduction chapter. At the same time, this book is also intended for specialists in geometry and mathematical physics applying the theory of Riemann surfaces in their research. It is the first...

  5. Computationally efficient and flexible modular modelling approach for river and urban drainage systems based on surrogate conceptual models

    Science.gov (United States)

    Wolfs, Vincent; Willems, Patrick

    2015-04-01

    Water managers rely increasingly on mathematical simulation models that represent individual parts of the water system, such as the river, the sewer system or the waste water treatment plant. The current evolution towards integral water management requires the integration of these distinct components, leading to an increased model scale and scope. Besides this growing model complexity, certain applications have gained interest and importance, such as uncertainty and sensitivity analyses, auto-calibration of models and real-time control. All these applications share the need for models with a very limited calculation time, either for performing a large number of simulations, or for a long-term simulation followed by statistical post-processing of the results. The commonly applied detailed models that solve (part of) the de Saint-Venant equations are infeasible for these applications and for such integrated modelling, chiefly because of their long simulation times and the inability to couple submodels built in different software environments. Instead, practitioners must use simplified models for these purposes. Such models are characterized by empirical relationships and sacrifice model detail and accuracy for increased computational efficiency. The presented research discusses the development of a flexible integral modelling platform that complies with three key requirements: (1) include a modelling approach for water quantity predictions for rivers, floodplains, sewer systems and rainfall runoff routing that requires a minimal calculation time; (2) allow fast and semi-automatic model configuration, thereby making maximum use of data from existing detailed models and measurements; (3) have a calculation scheme based on open source code to allow for future extensions or coupling with other models. First, a novel and flexible modular modelling approach based on the storage cell concept was developed. This approach divides each…
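
    The storage cell concept itself is easy to prototype: each cell behaves as a conceptual reservoir whose outflow feeds the next cell, replacing full de Saint-Venant routing. The Python sketch below uses two linear reservoirs with illustrative recession constants; real surrogate models would calibrate such relationships against a detailed model.

    import numpy as np

    def route_storage_cells(inflow, k=(0.3, 0.2), dt=1.0):
        # Linear-reservoir cells in series: dS/dt = inflow - k * S,
        # with each cell's outflow (k * S) feeding the next cell.
        storage = np.zeros(len(k))
        outflow = []
        for q in inflow:
            for i, ki in enumerate(k):
                storage[i] += dt * (q - ki * storage[i])
                q = ki * storage[i]
            outflow.append(q)
        return np.array(outflow)

    # Triangular storm hydrograph; the routed peak is attenuated and delayed.
    storm = np.concatenate([np.linspace(0, 10, 20), np.linspace(10, 0, 40)])
    routed = route_storage_cells(storm)
    print("inflow peak:", storm.max(), "routed peak:", round(routed.max(), 2))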

  6. Monte Carlo standardless approach for laser induced breakdown spectroscopy based on massive parallel graphic processing unit computing

    Science.gov (United States)

    Demidov, A.; Eschlböck-Fuchs, S.; Kazakov, A. Ya.; Gornushkin, I. B.; Kolmhofer, P. J.; Pedarnig, J. D.; Huber, N.; Heitz, J.; Schmid, T.; Rössler, R.; Panne, U.

    2016-11-01

    An improved Monte Carlo (MC) method for standardless analysis in laser induced breakdown spectroscopy (LIBS) is presented. Concentrations in MC LIBS are found by fitting model-generated synthetic spectra to experimental spectra. The current version of MC LIBS is based on graphic processing unit (GPU) computation and reduces the analysis time to several seconds per spectrum/sample, whereas the previous version, based on central processing unit (CPU) computation, required unacceptably long analysis times of tens of minutes per spectrum/sample. The reduction of the computational time is achieved through massively parallel computing on the GPU, which embeds thousands of co-processors. It is shown that the number of iterations on the GPU exceeds that on the CPU by a factor > 1000 for the 5-dimensional parameter space, yet requires a > 10-fold shorter computational time. The improved GPU-MC LIBS outperforms the CPU-MC LIBS in terms of accuracy, precision, and analysis time. The performance is tested on LIBS spectra obtained from pelletized powders of metal oxides consisting of CaO, Fe2O3, MgO, and TiO2 that simulate by-products of the steel industry, steel slags. It is demonstrated that GPU-based MC LIBS is capable of rapid multi-element analysis with relative errors between 1 and tens of percent, which is sufficient for industrial applications (e.g. steel slag analysis). The results of the improved GPU-based MC LIBS compare positively with those of the CPU-based MC LIBS as well as with the results of standard calibration-free (CF) LIBS based on the Boltzmann plot method.
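
    The fit-synthetic-to-measured idea can be mimicked on a toy scale. In the Python sketch below, synthetic spectra are two Gaussian lines whose amplitudes scale with trial concentrations, and thousands of trial points are scored in one vectorized step, a CPU stand-in for the massively parallel GPU evaluation; line positions, widths and the random search are all illustrative.

    import numpy as np

    rng = np.random.default_rng(7)
    wl = np.linspace(390, 400, 500)                 # wavelength grid, nm
    centers = np.array([392.0, 396.5])              # hypothetical line centers

    def synth(conc):
        # conc: (n_trials, 2) -> (n_trials, n_wavelengths) synthetic spectra.
        lines = np.exp(-(wl - centers[:, None]) ** 2 / 0.05)
        return conc @ lines

    # "Measured" spectrum from hidden true concentrations plus noise.
    true_conc = np.array([0.7, 0.3])
    measured = synth(true_conc[None, :])[0] + rng.normal(0, 0.01, wl.size)

    # Score 20 000 random concentration candidates at once and keep the best.
    trials = rng.dirichlet((1.0, 1.0), size=20000)
    errors = ((synth(trials) - measured) ** 2).sum(axis=1)
    print("best-fit concentrations:", np.round(trials[np.argmin(errors)], 3))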

  7. Developing a Computer Program for Detailed Study of Planing Hull’s Spray Based on Morabito’s Approach

    Institute of Scientific and Technical Information of China (English)

    Parviz Ghadimi; Sasan Tavakoli; Abbas Dashtimanesh; Arya Pirooz

    2014-01-01

    Recently, Morabito (2010) studied the water spray phenomena in planing hulls and presented new analytical equations. However, these equations have not been used for detailed parametric studies of the water spray around planing hulls. In this paper, a straightforward analysis is conducted to apply these analytical equations to finding the spray geometry profile, by developing a computer program based on the presented computational process. The results of the developed computer program are compared against existing data in the literature, and favorable accuracy is achieved. Parametric studies have been conducted for different physical parameters. Positions of the spray apex are computed and three-dimensional profiles of the spray are examined. It is concluded that spray height increases with an increase in the speed coefficient or the deadrise angle. Finally, a computational process is added to Savitsky's method and variations of the spray apex are computed for different velocities. It is shown that the vertical, lateral, and longitudinal positions of the spray increase as the craft speed increases. In addition, two new angles are defined in top view, and it is concluded that they are directly related to the trim angle but inversely related to the deadrise angle.

  8. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    …affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project-related documentation derived from internal reports… that secures validity and quality assurance with a simulationist while sustaining autonomous control of building design with the building designer. Consequence based design is defined by the specific use of integrated dynamic models. These models include the parametric capabilities of a visual programming tool… relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five…

  9. Computational approaches for systems metabolomics.

    Science.gov (United States)

    Krumsiek, Jan; Bartel, Jörg; Theis, Fabian J

    2016-06-01

    Systems genetics is defined as the simultaneous assessment and analysis of multi-omics datasets. In the past few years, metabolomics has been established as a robust tool describing an important functional layer in this approach. The metabolome of a biological system represents an integrated state of genetic and environmental factors and has been referred to as a 'link between genotype and phenotype'. In this review, we summarize recent progress in statistical analysis methods for metabolomics data in combination with other omics layers. We put a special focus on complex, multivariate statistical approaches as well as pathway-based and network-based analysis methods. Moreover, we outline current challenges and pitfalls of metabolomics-focused multi-omics analyses and discuss future steps for the field.

  10. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
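
    An exploratory agent-based model of this kind can be prototyped in a few dozen lines. In the Python sketch below, each agent is a networked machine that follows (or occasionally ignores) a company-defined power-down policy, and the aggregate energy use is converted to a carbon figure; the wattages, compliance rate, working hours and emission factor are all illustrative assumptions.

    import random

    class Computer:
        # Minimal agent: a networked machine with active and idle power draws.
        def __init__(self, active_watts=120.0, idle_watts=40.0):
            self.active_watts = active_watts
            self.idle_watts = idle_watts

        def step(self, hour, off_hours):
            # Company policy: power down outside working hours; about 10% of
            # agents ignore the policy and keep idling.
            if hour not in off_hours:
                return self.active_watts / 1000.0        # kWh used this hour
            return self.idle_watts / 1000.0 if random.random() < 0.1 else 0.0

    random.seed(8)
    fleet = [Computer() for _ in range(500)]
    off_hours = set(range(0, 8)) | set(range(18, 24))
    kwh = sum(pc.step(h, off_hours) for h in range(24) for pc in fleet)
    print(f"daily consumption: {kwh:.0f} kWh, "
          f"CO2 at 0.4 kg/kWh: {0.4 * kwh:.0f} kg")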

  12. Computer based satellite design

    Science.gov (United States)

    Lashbrook, David D.

    1992-06-01

    A computer program to design geosynchronous spacecraft has been developed. The program consists of four separate but interrelated executable computer programs, compiled to run on a DOS-based personal computer. The source code is written in the DoD-mandated Ada programming language. The thesis presents the design technique and design equations used in the program. Detailed analysis is performed in the following areas for both dual-spin and three-axis stabilized spacecraft configurations: (1) mass propellant budget and mass summary; (2) battery cell and solar cell requirements for a payload power requirement; and (3) passive thermal control requirements. A user's manual is included as Appendix A, and the source code for the computer programs as Appendix B.

  13. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  14. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 2, Appendixes: Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  15. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags far behind that of DNA origami. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  16. Computer Architecture: A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change. Updated to cover the mobile computing revolution. Emphasizes the two most im...

  17. A multilevel modeling approach to examining individual differences in skill acquisition for a computer-based task.

    Science.gov (United States)

    Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph

    2007-06-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.

  18. Computational approaches to analogical reasoning: current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it was studied early in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  19. Near-infrared spectroscopy (NIRS) - electroencephalography (EEG) based brain-state dependent electrotherapy (BSDE): A computational approach based on excitation-inhibition balance hypothesis

    Directory of Open Access Journals (Sweden)

    Snigdha Dagar

    2016-08-01

    Stroke is the leading cause of severe chronic disability and the second cause of death worldwide, with 15 million new cases and 50 million stroke survivors. Post-stroke chronic disability may be ameliorated with early neurorehabilitation, where non-invasive brain stimulation (NIBS) techniques can be used as an adjuvant treatment to hasten the effects. However, the heterogeneity in the lesioned brain will require individualized NIBS intervention, where the innovative neuroimaging technologies of portable electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) can be leveraged for Brain State Dependent Electrotherapy (BSDE). In this hypothesis and theory article, we propose a computational approach based on the excitation-inhibition (E-I) balance hypothesis to objectively quantify the post-stroke individual brain state using online fNIRS-EEG joint imaging. One of the key events that occurs following stroke is an imbalance in local excitation-inhibition (that is, the ratio of glutamate/GABA), which may be targeted with NIBS using a computational pipeline that includes individual forward models to predict current flow patterns through the lesioned brain or brain target region. The current flow will polarize the neurons, which can be captured with excitation-inhibition based brain models. Furthermore, the E-I balance hypothesis can be used to find the consequences of cellular polarization on neuronal information processing, which can then be implicated in changes in function. We first review evidence that shows how this local imbalance between excitation and inhibition leading to functional dysfunction can be restored in targeted sites with NIBS (motor cortex, somatosensory cortex), resulting in large-scale plastic reorganization over the cortex and probably facilitating recovery of functions. Secondly, we show evidence for how BSDE based on the excitation-inhibition balance hypothesis may target a specific brain site or network as an adjuvant treatment

  20. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers; little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveals to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  1. Computer Algebra, Instrumentation and the Anthropological Approach

    Science.gov (United States)

    Monaghan, John

    2007-01-01

    This article considers research and scholarship on the use of computer algebra in mathematics education following the instrumentation and the anthropological approaches. It outlines what these approaches are, positions them with regard to other approaches, examines tensions between the two approaches and makes suggestions for how work in this…

  2. Computational approaches for urban environments

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J; Leitner, M

    2015-01-01

    This book aims to promote the synergistic usage of advanced computational methodologies in close relationship to geospatial information across cities of different scales. A rich collection of chapters subsumes current research frontiers originating from disciplines such as geography, urban planning,

  3. What is computation: An epistemic approach

    NARCIS (Netherlands)

    Wiedermann, Jiří; van Leeuwen, Jan

    2015-01-01

    Traditionally, computations are seen as processes that transform information. Definitions of computation subsequently concentrate on a description of the mechanisms that lead to such processes. The bottleneck of this approach is twofold. First, it leads to a definition of computation that is too

  5. A computational approach for the annotation of hydrogen-bonded base interactions in crystallographic structures of the ribozymes

    Energy Technology Data Exchange (ETDEWEB)

    Hamdani, Hazrina Yusof, E-mail: hazrina@mfrlab.org [School of Biosciences and Biotechnology, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, 43600 UKM Bangi (Malaysia); Advanced Medical and Dental Institute, Universiti Sains Malaysia, Bertam, Kepala Batas (Malaysia); Artymiuk, Peter J., E-mail: p.artymiuk@sheffield.ac.uk [Dept. of Molecular Biology and Biotechnology, Firth Court, University of Sheffield, S10 T2N Sheffield (United Kingdom); Firdaus-Raih, Mohd, E-mail: firdaus@mfrlab.org [School of Biosciences and Biotechnology, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, 43600 UKM Bangi (Malaysia)

    2015-09-25

    A fundamental understanding of the atomic-level interactions in ribonucleic acid (RNA), and of how they contribute towards RNA architecture, is an important knowledge platform to develop through the discovery of motifs ranging from simple arrangements of base pairs to more complex arrangements such as triples and larger patterns involving non-standard interactions. The network of hydrogen bond interactions is important in connecting bases to form potential tertiary motifs. Therefore, there is an urgent need for the development of automated methods for annotating RNA 3D structures based on hydrogen bond interactions. COnnection tables Graphs for Nucleic ACids (COGNAC) is an automated annotation system using graph theoretical approaches that has been developed for the identification of RNA 3D motifs. The program searches for patterns in the unbroken networks of hydrogen bonds in RNA structures and is capable of annotating base pairs and higher-order base interactions, ranging from triples to sextuples. COGNAC was able to discover 22 out of 32 quadruple occurrences in the Haloarcula marismortui large ribosomal subunit (PDB ID: 1FFK) and two out of three occurrences of quintuple interactions reported by the non-canonical interactions in RNA (NCIR) database. These and several other interactions of interest are discussed in this paper. These examples demonstrate that the COGNAC program can serve as an automated annotation system for conserved base-base interactions, and its output could be added as additional information to established RNA secondary structure prediction methods.
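
    As a rough illustration of the graph-based idea behind this kind of annotation (a sketch, not the COGNAC implementation), hydrogen bonds can be treated as edges of a graph whose nodes are bases, so that pairs, triples and higher-order interactions fall out as connected components of the unbroken H-bond network; the bond list below is hypothetical.

        import networkx as nx

        # Hypothetical hydrogen-bond list: (base_i, base_j) pairs judged to be H-bonded.
        hydrogen_bonds = [
            ("A12", "U45"), ("U45", "G78"),   # chained bonds forming a base triple
            ("C3", "G90"),                    # an isolated canonical pair
        ]

        g = nx.Graph(hydrogen_bonds)

        # Each connected component is a candidate pair/triple/quadruple/... interaction.
        for component in nx.connected_components(g):
            n = len(component)
            label = {2: "pair", 3: "triple", 4: "quadruple"}.get(n, f"{n}-tuple")
            print(label, sorted(component))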

  6. An equation-free computational approach for extracting population-level behavior from individual-based models of biological dispersal

    CERN Document Server

    Erban, Radek; Kevrekidis, Ioannis G.; Othmer, Hans G.

    2005-01-01

    The movement of many organisms can be described as a random walk at either or both the individual and population level. The rules for this random walk are based on complex biological processes and it may be difficult to develop a tractable, quantitatively-accurate, individual-level model. However, important problems in areas ranging from ecology to medicine involve large collections of individuals, and a further intellectual challenge is to model population-level behavior based on a detailed individual-level model. Because of the large number of interacting individuals and because the individual-level model is complex, classical direct Monte Carlo simulations can be very slow, and often of little practical use. In this case, an equation-free approach may provide effective methods for the analysis and simulation of individual-based models. In this paper we analyze equation-free coarse projective integration. For analytical purposes, we start with known partial differential equations describing biological rando...
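
    A minimal sketch of equation-free coarse projective integration in the spirit described above, under strong simplifying assumptions: the individual-based model is a one-dimensional biased random walk, the coarse variable is the population mean, and the burst length and projection horizon are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(0)

        def micro_step(x, dt=0.01, drift=0.5):
            # Stand-in individual-based model: biased random walk of all walkers.
            return x + drift * dt + np.sqrt(dt) * rng.normal(size=x.size)

        def lift(mean, n=1000):
            # Build a microscopic population consistent with the coarse variable.
            return mean + rng.normal(scale=0.1, size=n)

        dt, burst, horizon = 0.01, 20, 1.0
        mean = 0.0
        for _ in range(10):
            x = lift(mean)
            m0 = x.mean()
            for _ in range(burst):              # short burst of micro-simulation
                x = micro_step(x, dt)
            m1 = x.mean()
            dmdt = (m1 - m0) / (burst * dt)     # estimated coarse time derivative
            mean = m1 + horizon * dmdt          # projective (large) Euler step
        print("coarse mean after 10 projective steps:", round(mean, 3))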

  7. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L.; Batistoni, P.; Migliori, S. [Associazione EURATOM ENEA sulla Fusione, Frascati (Roma) (Italy); Chen, Y.; Fischer, U.; Pereslavtsev, P. [Association FZK-EURATOM Forschungszentrum Karlsruhe (Germany); Loughlin, M. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX (United Kingdom); Secco, A. [Nice Srl Via Serra 33 Camerano Casasco AT (Italy)

    2003-07-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas, neutrons are produced, causing activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shutdown dose rates. This requires a suitable system of codes which is capable of simulating both the neutron-induced material activation during operation and the decay gamma radiation transport after shutdown in the proper 3-D geometry. Two methodologies to calculate the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system, in which MCNP is coupled to the FISPACT inventory code with automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1-step method (D1S). Neutron and decay gamma transport are handled in one single MCNP run, using an ad hoc cross section library. The intention was to tightly couple the neutron-induced production of a radio-isotope and the emission of its decay gammas, for an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate at five positions of the JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. Its scope is to supply the designers with the most reliable tool and data to calculate the dose rate on fusion machines. Results showed that there is good agreement: the differences range between 5 and 35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and

  8. Antenna arrays: a computational approach

    CERN Document Server

    Haupt, Randy L

    2010-01-01

    This book covers a wide range of antenna array topics that are becoming increasingly important in wireless applications, particularly in design and computer modeling. Signal processing and numerical modeling algorithms are explored, and MATLAB computer codes are provided for many of the design examples. Pictures of antenna arrays and components provided by industry and government sources are presented with explanations of how they work. Antenna Arrays is a valuable reference for practicing engineers and scientists in wireless communications, radar, and remote sensing, and an excellent textbook for advanced antenna courses.

  9. Computer Forensics Education - the Open Source Approach

    Science.gov (United States)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process, as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs, the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic fieldwork, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  10. Computational dynamics for robotics systems using a non-strict computational approach

    Science.gov (United States)

    Orin, David E.; Wong, Ho-Cheung; Sadayappan, P.

    1989-01-01

    A Non-Strict computational approach for real-time robotics control computations is proposed. In contrast to the traditional approach to scheduling such computations, based strictly on task dependence relations, the proposed approach relaxes precedence constraints and scheduling is guided instead by the relative sensitivity of the outputs with respect to the various paths in the task graph. An example of the computation of the Inverse Dynamics of a simple inverted pendulum is used to demonstrate the reduction in effective computational latency through use of the Non-Strict approach. A speedup of 5 has been obtained when the processes of the task graph are scheduled to reduce the latency along the crucial path of the computation. While error is introduced by the relaxation of precedence constraints, the Non-Strict approach has a smaller error than the conventional Strict approach for a wide range of input conditions.

  11. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness of the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thinking. We present two main theses on which the subject is based, and we present the included knowledge areas and didactical design principles. Finally, we summarize the status and future plans for the subject and related development projects.

  12. A DFN-based High Performance Computing Approach to the Simulation of Radionuclide Transport in Mineralogically Heterogeneous Fractured Rocks

    Science.gov (United States)

    Gylling, B.; Trinchero, P.; Molinero, J.; Deissmann, G.; Svensson, U.; Ebrahimi, H.; Hammond, G. E.; Bosbach, D.; Puigdomenech, I.

    2016-12-01

    Geological repositories for nuclear waste are based on multi-barrier concepts using engineered and natural barriers. In fractured crystalline rocks, the efficiency of the host rock as a transport barrier is related to three processes: advection along fractures, diffusion into the rock matrix and retention onto the available sorption sites. Anomalous matrix penetration profiles were observed in experiments (i.e., REPRO, carried out by Posiva at the ONKALO underground facility in Finland, and the Long Term Sorption Diffusion Experiment, LTDE-SD, carried out by SKB at the Äspö Hard Rock Laboratory in Sweden). The textural and mineralogical heterogeneity of the rock matrix was offered as a plausible explanation for these anomalous penetration profiles. The heterogeneous structure of the rock matrix was characterised at the grain scale using a micron-scale Discrete Fracture Network (DFN), which is then represented on a micron-scale structured grid. Matrix fracture free volumes are identified as reactive biotite-bearing grains, whereas the rest of the matrix domain constitutes the inter-granular regions. The reactive transport problem mimics the ingress of cesium along a single transmissive fracture. Part of the injected mass diffuses into the matrix, where it might eventually sorb onto the surface of reactive grains. The reactive transport calculations are carried out using iDP (interface between DarcyTools and PFLOTRAN). The generation of the DFN is done by DarcyTools, which also takes care of solving the groundwater flow problem. Computed Darcy velocities are extracted and used as input for PFLOTRAN. All the simulation runs are carried out on the supercomputer JUQUEEN at the Jülich Supercomputing Centre. The results are compared with those derived with an alternative model, where biotite abundance is averaged over the whole matrix volume. The analysis of the cesium breakthrough computed at the fracture outlet shows that the averaged model provides later first-arrival time

  13. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach.

    Science.gov (United States)

    Díaz-Zuccarini, Vanessa; Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and has become a burden on clinical services, as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that new technologies for healthcare using computer simulations can, in the future, be developed into in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use are presented; these are currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at University College Hospital, London.

  14. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba

    2016-07-20

    The high-dimension, low-sample-size nature of gene expression data makes the classification task challenging; therefore, feature (gene) selection becomes an apparent need. Selecting meaningful and relevant genes for a classifier not only decreases the computational time and cost, but also improves the classification performance. However, most existing feature selection approaches suffer from several problems, such as lack of robustness and validation issues. Here, we present a new feature selection technique that takes advantage of clustering both samples and genes. Materials and methods: We used the leukemia gene expression dataset [1]. The effectiveness of the selected features was evaluated by four different classification methods: support vector machines, k-nearest neighbor, random forest, and linear discriminant analysis. The method evaluates the importance and relevance of each gene cluster by summing the expression levels of the genes belonging to that cluster. A gene cluster is considered important if it satisfies conditions depending on thresholds and percentages; otherwise it is eliminated. Results: Initial analysis identified 7120 differentially expressed genes of leukemia (Fig. 15a); after applying our feature selection methodology we ended up with 1117 genes discriminating the two classes of leukemia (Fig. 15b). Further applying the same method with a more stringent higher positive and lower negative threshold condition, the number was reduced to 58 genes, which were tested to evaluate the effectiveness of the method (Fig. 15c). The results of the four classification methods are summarized in Table 11. Conclusions: The feature selection method gave good results with minimum classification error. Our heat-map result shows a distinct pattern of refined genes discriminating between the two classes of leukemia.
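
    The following sketch illustrates the cluster-then-score selection idea on synthetic data; it is not the authors' code, and the clustering algorithm, the 60th-percentile cutoff and the matrix sizes are arbitrary assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        X = rng.normal(size=(72, 500))          # samples x genes, synthetic stand-in

        k = 20
        clusters = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X.T)

        # Score each gene cluster by the summed expression of its member genes,
        # then keep the clusters whose score clears a threshold.
        scores = np.array([X[:, clusters == c].sum() for c in range(k)])
        keep = scores > np.percentile(scores, 60)
        selected_genes = np.where(keep[clusters])[0]
        print(len(selected_genes), "genes retained for the classifiers")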

  15. A Radical Collaborative Approach: Developing a Model for Learning Theory, Human-Based Computation and Participant Motivation in a Rock-Art Heritage Application

    Science.gov (United States)

    Haubt, R.

    2016-06-01

    This paper explores a Radical Collaborative Approach in the global and centralized Rock-Art Database project to find new ways to look at rock-art by making information more accessible and more visible through public contributions. It looks at rock-art through the Key Performance Indicator (KPI), identified with the latest Australian State of the Environment Reports to help develop a better understanding of rock-art within a broader Cultural and Indigenous Heritage context. Using a practice-led approach the project develops a conceptual collaborative model that is deployed within the RADB Management System. Exploring learning theory, human-based computation and participant motivation the paper develops a procedure for deploying collaborative functions within the interface design of the RADB Management System. The paper presents the results of the collaborative model implementation and discusses considerations for the next iteration of the RADB Universe within an Agile Development Approach.

  16. Q-P Wave traveltime computation by an iterative approach

    KAUST Repository

    Ma, Xuxin

    2013-01-01

    In this work, we present a new approach to computing anisotropic traveltimes based on successively solving elliptical isotropic traveltimes. The method shows good accuracy and is very simple to implement.

  17. Learning and geometry: computational approaches

    CERN Document Server

    Smith, Carl

    1996-01-01

    The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...

  18. High Performance Computation of Big Data: Performance Optimization Approach towards a Parallel Frequent Item Set Mining Algorithm for Transaction Data based on Hadoop MapReduce Framework

    Directory of Open Access Journals (Sweden)

    Guru Prasad M S

    2017-01-01

    A huge amount of Big Data is constantly arriving with the rapid development of business organizations, which are interested in extracting knowledgeable information from the collected data. Frequent item mining of Big Data helps with business decisions and with providing high-quality service. Traditional frequent item set mining algorithms applied to Big Data are not effective, as they lead to high computation times. Apache Hadoop MapReduce is the most popular data-intensive distributed computing framework for large-scale data applications such as data mining. In this paper, the author identifies the factors affecting the performance of frequent item mining algorithms based on the Hadoop MapReduce technology and proposes an approach for optimizing the performance of large-scale frequent item set mining. The experimental results show the potential of the proposed approach: performance is significantly optimized for large-scale data mining with the MapReduce technique. The author believes that it makes a valuable contribution to the high-performance computing of Big Data.
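
    A toy sketch of the map/reduce counting pattern that underlies parallel frequent-item-set mining; the two in-memory "partitions" stand in for Hadoop map tasks, and the transactions and support threshold are invented.

        from collections import Counter
        from itertools import combinations

        transactions = [{"milk", "bread"}, {"milk", "beer"},
                        {"milk", "bread", "beer"}, {"bread", "beer"}]
        min_support = 2

        def mapper(partition):
            # Emit local counts for single items and item pairs on one partition.
            counts = Counter()
            for t in partition:
                counts.update(t)
                counts.update(combinations(sorted(t), 2))
            return counts

        def reducer(partials):
            # Sum partial counts and keep item sets meeting the minimum support.
            total = Counter()
            for c in partials:
                total.update(c)
            return {k: v for k, v in total.items() if v >= min_support}

        partitions = [transactions[:2], transactions[2:]]   # two "map tasks"
        print(reducer(mapper(p) for p in partitions))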

  19. A unified probabilistic approach to improve spelling in an event-related potential-based brain-computer interface.

    Science.gov (United States)

    Kindermans, Pieter-Jan; Verschore, Hannes; Schrauwen, Benjamin

    2013-10-01

    In recent years, in an attempt to maximize performance, machine learning approaches for event-related potential (ERP) spelling have become more and more complex. In this paper, we have taken a step back as we wanted to improve the performance without building an overly complex model, that cannot be used by the community. Our research resulted in a unified probabilistic model for ERP spelling, which is based on only three assumptions and incorporates language information. On top of that, the probabilistic nature of our classifier yields a natural dynamic stopping strategy. Furthermore, our method uses the same parameters across 25 subjects from three different datasets. We show that our classifier, when enhanced with language models and dynamic stopping, improves the spelling speed and accuracy drastically. Additionally, we would like to point out that as our model is entirely probabilistic, it can easily be used as the foundation for complex systems in future work. All our experiments are executed on publicly available datasets to allow for future comparison with similar techniques.
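
    A minimal sketch of this kind of unified probabilistic update (an illustration, not the authors' model): a posterior over candidate letters combines a language-model prior with per-flash likelihoods, and stimulation stops once the posterior is confident. The alphabet, prior and likelihood values are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        letters = list("ABCDE")
        prior = np.array([0.4, 0.3, 0.1, 0.1, 0.1])   # language-model prior
        posterior = prior.copy()
        true_letter = 0                               # simulated attended target
        p_t, p_n = 0.8, 0.2               # P(ERP detected | target / non-target)

        for flash in range(200):
            flashed = rng.integers(len(letters))
            y = rng.random() < (p_t if flashed == true_letter else p_n)
            # Likelihood of y under each hypothesis "letter h is the target".
            like = np.where(np.arange(len(letters)) == flashed,
                            p_t if y else 1 - p_t,
                            p_n if y else 1 - p_n)
            posterior = posterior * like
            posterior /= posterior.sum()
            if posterior.max() > 0.99:                # dynamic stopping rule
                break

        print("decision:", letters[posterior.argmax()], "after", flash + 1, "flashes")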

  20. Combined Computational Approach Based on Density Functional Theory and Artificial Neural Networks for Predicting The Solubility Parameters of Fullerenes.

    Science.gov (United States)

    Perea, J Darío; Langner, Stefan; Salvador, Michael; Kontos, Janos; Jarvas, Gabor; Winkler, Florian; Machui, Florian; Görling, Andreas; Dallos, Andras; Ameri, Tayebeh; Brabec, Christoph J

    2016-05-19

    The solubility of organic semiconductors in environmentally benign solvents is an important prerequisite for the widespread adoption of organic electronic appliances. Solubility can be determined by considering the cohesive forces in a liquid via Hansen solubility parameters (HSP). We report a numerical approach to determine the HSP of fullerenes using a mathematical tool based on artificial neural networks (ANN). ANN transforms the molecular surface charge density distribution (σ-profile) as determined by density functional theory (DFT) calculations within the framework of a continuum solvation model into solubility parameters. We validate our model with experimentally determined HSP of the fullerenes C60, PC61BM, bisPC61BM, ICMA, ICBA, and PC71BM and through comparison with previously reported molecular dynamics calculations. Most excitingly, the ANN is able to correctly predict the dispersive contributions to the solubility parameters of the fullerenes although no explicit information on the van der Waals forces is present in the σ-profile. The presented theoretical DFT calculation in combination with the ANN mathematical tool can be easily extended to other π-conjugated, electronic material classes and offers a fast and reliable toolbox for future pathways that may include the design of green ink formulations for solution-processed optoelectronic devices.
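
    As an illustration of the regression step only: a small multilayer perceptron standing in for the authors' network, with random placeholders for the DFT-derived sigma-profiles and the HSP targets (the network size is likewise an arbitrary assumption).

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        sigma_profiles = rng.random((40, 61))   # per-molecule charge-density histograms
        hsp_targets = rng.random((40, 3)) * 20  # [dispersive, polar, H-bonding], MPa^0.5

        model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
        model.fit(sigma_profiles, hsp_targets)
        print("predicted (dD, dP, dH):", model.predict(sigma_profiles[:1]))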

  1. An Approach to Dynamic Provisioning of Social and Computational Services

    NARCIS (Netherlands)

    Bonino da Silva Santos, Luiz Olavo; Sorathia, Vikram; Ferreira Pires, Luis; van Sinderen, Marten

    2010-01-01

    Service-Oriented Computing (SOC) builds upon the intuitive notion of service already known and used in our society for a long time. SOC-related approaches are based on computer-executable functional units that often represent automation of services that exist at the social level, i.e., services at t

  2. Inversion based on computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
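
    A toy sketch of the adjoint idea on a linear forward model d = G m: the gradient of the misfit 0.5*||G m - d_obs||^2 with respect to all parameters comes from one forward run plus one adjoint (transpose) run, independent of the number of parameters. The operator G, the step size and the iteration count are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(4)
        G = rng.normal(size=(50, 10))        # forward operator (the "simulation")
        m_true = rng.normal(size=10)
        d_obs = G @ m_true                   # synthetic observed data

        m = np.zeros(10)
        step = 1e-3
        for _ in range(2000):
            r = G @ m - d_obs                # forward run: residual
            grad = G.T @ r                   # adjoint run: gradient w.r.t. all parameters
            m -= step * grad

        print("parameter error:", np.linalg.norm(m - m_true))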

  4. Toward exascale computing through neuromorphic approaches.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.

    2010-09-01

    While individual neurons function at relatively low firing rates, naturally-occurring nervous systems not only surpass manmade systems in computing power, but accomplish this feat using relatively little energy. It is asserted that the next major breakthrough in computing power will be achieved through application of neuromorphic approaches that mimic the mechanisms by which neural systems integrate and store massive quantities of data for real-time decision making. The proposed LDRD provides a conceptual foundation for SNL to make unique advances toward exascale computing. First, a team consisting of experts from the HPC, MESA, cognitive and biological sciences and nanotechnology domains will be coordinated to conduct an exercise with the outcome being a concept for applying neuromorphic computing to achieve exascale computing. It is anticipated that this concept will involve innovative extension and integration of SNL capabilities in MicroFab, material sciences, high-performance computing, and modeling and simulation of neural processes/systems.

  5. Catabolite regulation analysis of Escherichia coli for acetate overflow mechanism and co-consumption of multiple sugars based on systems biology approach using computer simulation.

    Science.gov (United States)

    Matsuoka, Yu; Shimizu, Kazuyuki

    2013-10-20

    It is quite important to understand the basic principles embedded in the main metabolism for the interpretation of fermentation data. For this, it may be useful to understand the regulation mechanisms based on a systems biology approach. In the present study, we considered perturbation analysis together with computer simulation based on models that include the effects of global regulators on the pathway activation for the main metabolism of Escherichia coli. The main focus is the acetate overflow metabolism and the co-fermentation of multiple carbon sources. The perturbation analysis was first made to understand the nature of the feed-forward loop formed by the activation of Pyk by FDP (F1,6BP) and the feed-back loop formed by the inhibition of Pfk by PEP in the glycolysis. Those, together with the effect of the transcription factor Cra caused by the FDP level, affected the glycolysis activity. The PTS (phosphotransferase system) acts as a feed-back system by repressing the glucose uptake rate when the glucose uptake rate increases. It was also shown that an increased PTS flux (or glucose consumption rate) causes the PEP/PYR ratio to decrease, along with EIIA-P, Cya and cAMP-Crp, where the lowered cAMP-Crp in turn represses the TCA cycle and more acetate is formed. This was further verified by detailed computer simulation. In the case of multiple carbon sources such as glucose and xylose, sequential utilization of the carbon sources was observed for the wild type, while co-consumption of multiple carbon sources with slow consumption rates was observed for the ptsG mutant by computer simulation, and this was verified by experiments. Moreover, the effect of a specific gene knockout such as Δpyk on the metabolic characteristics was also investigated based on the computer simulation.

  6. QPSO-based adaptive DNA computing algorithm.

    Science.gov (United States)

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model based on DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This new approach aims to perform the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions provided by the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are simultaneously tuned in an adaptive process; (2) the adaptive algorithm is performed using the QPSO algorithm for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented for system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate its ability to provide effective optimization, considerable convergence speed, and high accuracy according to the DNA computing algorithm.
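
    A minimal sketch of the QPSO update itself, minimizing a simple sphere function rather than tuning the DNA computing parameters; the swarm size, contraction-expansion coefficient beta and iteration count are arbitrary.

        import numpy as np

        rng = np.random.default_rng(5)

        def f(x):                                   # objective: sphere function
            return np.sum(x * x, axis=1)

        n, dim, iters, beta = 30, 5, 200, 0.75
        x = rng.uniform(-5, 5, (n, dim))
        pbest = x.copy()
        gbest = pbest[f(pbest).argmin()].copy()

        for _ in range(iters):
            mbest = pbest.mean(axis=0)              # mean of the personal bests
            phi = rng.random((n, dim))
            p = phi * pbest + (1 - phi) * gbest     # per-particle local attractor
            u = 1.0 - rng.random((n, dim))          # uniform in (0, 1]
            sign = np.where(rng.random((n, dim)) < 0.5, 1.0, -1.0)
            x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
            better = f(x) < f(pbest)
            pbest[better] = x[better]
            gbest = pbest[f(pbest).argmin()].copy()

        print("best objective value:", f(gbest[None, :])[0])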

  8. A computational approach to negative priming

    Science.gov (United States)

    Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael

    2007-09-01

    Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming, the opposite effect, is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995), and its dependence on subtle parameter changes (such as the response-stimulus interval) usually varies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
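
    The sketch below shows a single adaptive-threshold accumulator of the kind such models rely on: residual activation from the prime trial speeds (positive priming) or slows (negative priming) the threshold crossing on the probe trial. This is an illustration, not the CISAM code, and all constants are invented.

        def reaction_time(start_act, gain=1.0, thr0=1.0, adapt=0.05, dt=0.01):
            # Activation accumulates toward an adaptive threshold; the response
            # is emitted at the first crossing.
            act, thr, t = start_act, thr0, 0.0
            while act < thr:
                act += gain * dt
                thr += adapt * (act - thr) * dt  # threshold slowly tracks activation
                t += dt
            return round(t, 3)

        print("control RT:         ", reaction_time(0.0))
        print("positive-priming RT:", reaction_time(+0.3))  # residual activation
        print("negative-priming RT:", reaction_time(-0.3))  # residual inhibition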

  9. SU-E-J-177: A Computational Approach for Determination of Anisotropic PTV Margins Based On Statistical Shape Analysis for Prostate Cancer Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Shibayama, Y; Arimura, H; Nakamura, K; Honda, H; Toyofuku, F [Kyushu University, Fukuoka, JP (Japan); Hirose, T; Umezu, Y; Nakamura, Y [Kyushu University Hospital, Fukuoka, JP (Japan)

    2015-06-15

    Purpose: The aim of this study was to propose a computational approach for the determination of anisotropic planning target volume (PTV) margins based on statistical shape analysis, taking into account time variations of clinical target volume (CTV) shapes in prostate cancer radiation treatment planning (RTP). Methods: Systematic and random setup errors were measured using orthogonal projection and cone beam computed tomography (CBCT) images for data of 20 patients who underwent intensity modulated radiation therapy for prostate cancer. The low-risk, intermediate-risk, and high-risk CTVs were defined as the prostate only, the prostate plus the proximal 1 cm of seminal vesicles, and the prostate plus the proximal 2 cm of seminal vesicles, respectively. All CTV regions were registered with a reference CTV region with a median volume to remove the effect of the setup errors, and converted to point distribution models. The systematic and random errors for translations of the CTV regions were automatically evaluated by analyzing the movements of the centroids of the CTV regions. The random and systematic errors for shape variations of the CTV regions were obtained from covariance matrices based on point distributions for the CTV contours on CBCT images of 72 fractions of 10 patients. Anisotropic PTV margins for six directions (right, left, anterior, posterior, superior and inferior) were derived by using Yoda's PTV margin model. Results: PTV margins with and without shape variations were 5.75 to 8.03 mm and 5.23 to 7.67 mm for the low-risk group, 5.87 to 8.33 mm and 5.23 to 7.67 mm for the intermediate-risk group, and 5.88 to 8.25 mm and 5.29 to 7.82 mm for the high-risk group, respectively. Conclusion: The proposed computational approach could be feasible for the determination of anisotropic PTV margins taking into account CTV shape variations for the RTP.
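
    For illustration of how such displacement data turn into a margin, the sketch below uses the widely quoted van Herk recipe (margin = 2.5*Sigma + 0.7*sigma) in place of Yoda's model from the abstract; the displacement data are synthetic.

        import numpy as np

        rng = np.random.default_rng(6)
        # Synthetic CTV-centroid displacements (mm) along one axis:
        # 10 patients x 30 fractions, each patient with its own systematic offset.
        offsets = rng.normal(0.0, 1.5, (10, 1))
        d = rng.normal(loc=offsets, scale=1.0, size=(10, 30))

        patient_means = d.mean(axis=1)
        Sigma = patient_means.std(ddof=1)      # systematic error: SD of patient means
        sigma = d.std(axis=1, ddof=1).mean()   # random error: mean of per-patient SDs
        margin = 2.5 * Sigma + 0.7 * sigma
        print(f"Sigma={Sigma:.2f} mm, sigma={sigma:.2f} mm, margin={margin:.2f} mm")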

  10. Teacher Conceptions and Approaches Associated with an Immersive Instructional Implementation of Computer-Based Models and Assessment in a Secondary Chemistry Classroom

    Science.gov (United States)

    Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.; Smith, Erica; Park, Mihwa

    2014-02-01

    This paper reports on a case study of an immersive and integrated multi-instructional approach (namely computer-based model introduction and connection with content; facilitation of individual student exploration guided by exploratory worksheet; use of associated differentiated labs and use of model-based assessments) in the implementation of coupled computer-based models and assessment in a high-school chemistry classroom. Data collection included in-depth teacher interviews, classroom observations, student interviews and researcher notes. Teacher conceptions highlighted the role of models as tools; the benefits of abstract portrayal via visualizations; appropriate enactment of model implementation; concerns with student learning and issues with time. The case study revealed numerous challenges reconciling macro, submicro and symbolic phenomena with the NetLogo model. Nonetheless, the effort exhibited by the teacher provided a platform to support the evolution of practice over time. Students' reactions reflected a continuum of confusion and benefits which were directly related to their background knowledge and experiences with instructional modes. The findings have implications for the role of teacher knowledge of models, the modeling process and pedagogical content knowledge; the continuum of student knowledge as novice users and the role of visual literacy in model decoding, comprehension and translation.

  11. Blueprinting Approach in Support of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2012-03-01

    Current cloud service offerings, i.e., Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings, are often provided as monolithic, one-size-fits-all solutions and give little or no room for customization. This limits the ability of Service-based Application (SBA) developers to configure and syndicate offerings from multiple SaaS, PaaS, and IaaS providers to address their application requirements. Furthermore, combining different independent cloud services necessitates a uniform description format that facilitates the design, customization, and composition. Cloud Blueprinting is a novel approach that allows SBA developers to easily design, configure and deploy virtual SBA payloads on virtual machines and resource pools on the cloud. We propose the Blueprint concept as a uniform abstract description for cloud service offerings that may cross different cloud computing layers, i.e., SaaS, PaaS and IaaS. To support developers with SBA design and development in the cloud, this paper introduces a formal Blueprint Template for unambiguously describing a blueprint, as well as a Blueprint Lifecycle that guides developers through the manipulation, composition and deployment of different blueprints for an SBA. Finally, the empirical evaluation of the blueprinting approach within an EC’s FP7 project is reported and an associated blueprint prototype implementation is presented.

  12. A Community-Based Approach to Monitor Resources for Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    Qi Xin; Li Zhen

    2012-01-01

    Cloud computing is an emerging commercial computing model, and resource performance and load monitoring is one of its important research points. This paper analyzes the resource monitoring strategies of traditional distributed computing and, for the cloud computing environment, introduces a community model to design hierarchical community-based monitoring; it proposes a monitoring approach based on sensitivity factors to solve the data redundancy and invalid-data problems that global monitoring may bring. Simulation results show that the model and strategy are reasonable in theory, and that their efficiency is somewhat improved over traditional monitoring systems.

  13. Analysis of the relay-based valley coil system of the K-130 Cyclotron and an approach to a computer-controlled system

    Energy Technology Data Exchange (ETDEWEB)

    Shoor, B.

    2016-09-11

    To overcome the first-harmonic field imperfection in a sector-focused cyclotron, a set of coils placed in the valleys is used to produce an opposite first-harmonic effect. Usually, at the time of beam tuning, the phase of the first harmonic is varied using a relay system. It can be shown analytically that the magnitude changes simultaneously when the phase is changed, which is not desirable during beam tuning. Moreover, the phase changes in large steps, which hampers the accuracy of beam tuning. To overcome this, a computer-controlled system is suggested in which the amplitude remains constant while the phase is changed. Moreover, the phase can be changed continuously, which gives better tuning accuracy.
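
    One way to realize the constant-amplitude requirement, sketched under the assumption that two orthogonal valley-coil sets produce cosine- and sine-phased first-harmonic components (the coil gains are hypothetical): driving them with currents proportional to cos(phi) and sin(phi) lets the phase vary continuously while the first-harmonic amplitude stays fixed.

        import numpy as np

        def coil_currents(amplitude, phi_deg, gain_a=1.0, gain_b=1.0):
            # Split the requested first-harmonic vector into the two coil sets.
            phi = np.radians(phi_deg)
            i_a = amplitude * np.cos(phi) / gain_a   # cosine-phased coil set
            i_b = amplitude * np.sin(phi) / gain_b   # sine-phased coil set
            return i_a, i_b

        for phi in (0, 30, 60, 90):
            print(phi, coil_currents(10.0, phi))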

  14. A Big Data Approach to Computational Creativity

    CERN Document Server

    Varshney, Lav R; Varshney, Kush R; Bhattacharjya, Debarun; Schoergendorfer, Angela; Chee, Yi-Min

    2013-01-01

    Computational creativity is an emerging branch of artificial intelligence that places computers in the center of the creative process. Broadly, creativity involves a generative step to produce many ideas and a selective step to determine the ones that are the best. Many previous attempts at computational creativity, however, have not been able to achieve a valid selective step. This work shows how bringing data sources from the creative domain and from hedonic psychophysics together with big data analytics techniques can overcome this shortcoming to yield a system that can produce novel and high-quality creative artifacts. Our data-driven approach is demonstrated through a computational creativity system for culinary recipes and menus we developed and deployed, which can operate either autonomously or semi-autonomously with human interaction. We also comment on the volume, velocity, variety, and veracity of data in computational creativity.

  15. Towards Lagrangian approach to quantum computations

    CERN Document Server

    Vlasov, A Yu

    2003-01-01

    In this work, the possibility and actuality of a Lagrangian approach to quantum computations is discussed. The finite-dimensional Hilbert spaces used in this area provide some challenge for such a consideration. The model discussed here can be considered as an analogue of the Weyl quantization of field theory via the path integral in L. D. Faddeev's approach. Weyl quantization can also be used in the finite-dimensional case, and some formulas may be simply rewritten by changing integrals to finite sums. On the other hand, there are specific difficulties relevant to the finite case. This work has some allusions to the phase-space models of quantum computations developed recently by different authors.
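
    For the finite-dimensional case alluded to above, the discrete Weyl pair can be written down directly; the sketch below checks the discrete Weyl commutation relation Z X = omega X Z with omega = exp(2*pi*i/N).

        import numpy as np

        N = 3
        omega = np.exp(2j * np.pi / N)
        Z = np.diag(omega ** np.arange(N))           # clock operator
        X = np.roll(np.eye(N), 1, axis=0)            # shift: |j> -> |j+1 mod N>
        print(np.allclose(Z @ X, omega * (X @ Z)))   # True: Weyl relation holds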

  16. Computational Paradigm to Elucidate the Effects of Arts-Based Approaches and Interventions: Individual and Collective Emerging Behaviors in Artwork Construction.

    Directory of Open Access Journals (Sweden)

    Billie Sandak

    Art therapy, as well as other arts-based therapies and interventions, is used to reduce pain, stress, depression, breathlessness and other symptoms in a wide variety of serious and chronic diseases, such as cancer, Alzheimer's disease and schizophrenia. Arts-based approaches are also known to contribute to one's well-being and quality of life. However, much research is required, since the mechanisms by which these non-pharmacological treatments exert their therapeutic and psychosocial effects are not adequately understood. A typical clinical setting utilizing the arts consists of the creation work itself, such as the artwork, as well as the therapist and the patient, all of which constitute a rich and dynamic environment of occurrences. The underlying complex, simultaneous and interwoven processes of this setting are often considered intractable to human observers, and as a consequence are usually interpreted subjectively and described verbally, which affects their subsequent analysis and understanding. We introduce a computational research method for elucidating and analyzing emergent expressive and social behaviors, aiming to understand how arts-based approaches operate. Our methodology, which centers on the visual language of Statecharts and tools for its execution, enables rigorous qualitative and quantitative tracking, analysis and documentation of the underlying creation and interaction processes. It also enables one to carry out exploratory, hypothesis-generating and knowledge-discovery investigations, which are empirically based. Furthermore, we illustrate our method's use in a proof-of-principle study, applying it to a real-world artwork investigation with human participants. We explore individual and collective emergent behaviors impacted by diverse drawing tasks, yielding significant gender and age hypotheses, which may account for variation factors in response to art use. We also discuss how to gear our research method to systematic and

  17. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    Science.gov (United States)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
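
    A degree-based mean-field SIS sketch of the degree-dependence claim (a generic illustration, not the authors' model; the power-law exponent and the infection and cure rates are invented): rho_k, the infected fraction among degree-k nodes, grows with k at the endemic state.

        import numpy as np

        kmax, gamma, lam, delta = 50, 2.5, 0.1, 0.2
        k = np.arange(1, kmax + 1)
        pk = k.astype(float) ** -gamma               # scale-free degree distribution
        pk /= pk.sum()
        mean_k = (k * pk).sum()

        rho = np.full(kmax, 0.01)
        for _ in range(5000):
            theta = (k * pk * rho).sum() / mean_k    # prob. a link reaches an infected node
            rho += 0.01 * (lam * k * (1 - rho) * theta - delta * rho)
        print("infected fraction at k=1 vs k=50:", round(rho[0], 3), round(rho[-1], 3))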

  18. Targeting YAP/TAZ-TEAD protein-protein interactions using fragment-based and computational modeling approaches.

    Science.gov (United States)

    Kaan, Hung Yi Kristal; Sim, Adelene Y L; Tan, Siew Kim Joyce; Verma, Chandra; Song, Haiwei

    2017-01-01

    The Hippo signaling pathway, which is implicated in the regulation of organ size, has emerged as a potential target for the development of cancer therapeutics. YAP, TAZ (transcription co-activators) and TEAD (transcription factor) are the downstream transcriptional machinery and effectors of the pathway. Formation of the YAP/TAZ-TEAD complex leads to transcription of growth-promoting genes. Conversely, disrupting the interactions of the complex decreases cell proliferation. Herein, we screened a 1000-member fragment library using Thermal Shift Assay and identified a hit fragment. We confirmed its binding at the YAP/TAZ-TEAD interface by X-ray crystallography, and showed that it occupies the same hydrophobic pocket as a conserved phenylalanine of YAP/TAZ. This hit fragment serves as a scaffold for the development of compounds that have the potential to disrupt YAP/TAZ-TEAD interactions. Structure-activity relationship studies and computational modeling were also carried out to identify more potent compounds that may bind at this validated druggable binding site.

  19. Computer networking a top-down approach

    CERN Document Server

    Kurose, James

    2017-01-01

    Unique among computer networking texts, the Seventh Edition of the popular Computer Networking: A Top Down Approach builds on the author’s long tradition of teaching this complex subject through a layered approach in a “top-down manner.” The text works its way from the application layer down toward the physical layer, motivating readers by exposing them to important concepts early in their study of networking. Focusing on the Internet and the fundamentally important issues of networking, this text provides an excellent foundation for readers interested in computer science and electrical engineering, without requiring extensive knowledge of programming or mathematics. The Seventh Edition has been updated to reflect the most important and exciting recent advances in networking.

  20. Snapshot Based Virtualization Mechanism for Cloud Computing

    Directory of Open Access Journals (Sweden)

    A.Rupa

    2012-09-01

    Full Text Available Virtualization in cloud computing has been the latest evolutionary technology in current applications, and various industries and IT firms are adopting cloud technology. The concept of cloud computing was introduced long ago, and since its inception there have been numerous innovations implemented by different experts and researchers. Virtualization is a very effective approach for gaining different operational advantages in cloud computing. In this paper we propose a virtualization concept using a snapshot-based mechanism, and discuss both memory virtualization and storage virtualization.

  1. Hybrid soft computing approaches research and applications

    CERN Document Server

    Dutta, Paramartha; Chakraborty, Susanta

    2016-01-01

    The book provides a platform for dealing with the flaws and failings of the soft computing paradigm through different manifestations. The different chapters highlight the necessity of the hybrid soft computing methodology in general with emphasis on several application perspectives in particular. Typical examples include (a) Study of Economic Load Dispatch by Various Hybrid Optimization Techniques, (b) An Application of Color Magnetic Resonance Brain Image Segmentation by ParaOptiMUSIG activation Function, (c) Hybrid Rough-PSO Approach in Remote Sensing Imagery Analysis,  (d) A Study and Analysis of Hybrid Intelligent Techniques for Breast Cancer Detection using Breast Thermograms, and (e) Hybridization of 2D-3D Images for Human Face Recognition. The elaborate findings of the chapters enhance the exhibition of the hybrid soft computing paradigm in the field of intelligent computing.

  2. Atomistic understanding of the C·T mismatched DNA base pair tautomerization via the DPT: QM and QTAIM computational approaches.

    Science.gov (United States)

    Brovarets', Ol'ha O; Hovorun, Dmytro M

    2013-11-15

    It was established that the cytosine·thymine (C·T) mismatched DNA base pair with cis-oriented N1H glycosidic bonds has a propeller-like structure (|N3C4C4N3| = 38.4°), which is stabilized by three specific intermolecular interactions: two antiparallel N4H…O4 (5.19 kcal mol⁻¹) and N3H…N3 (6.33 kcal mol⁻¹) H-bonds and a van der Waals (vdW) contact O2…O2 (0.32 kcal mol⁻¹). The C·T base mispair is a thermodynamically stable structure (ΔGint = −1.54 kcal mol⁻¹) and even slightly more stable than the A·T Watson-Crick DNA base pair (ΔGint = −1.43 kcal mol⁻¹) at room temperature. It was shown that the C·T ↔ C*·T* tautomerization via the double proton transfer (DPT) is assisted by the O2…O2 vdW contact along the entire range of the intrinsic reaction coordinate (IRC). The positive values of the Grunenberg's compliance constants (31.186, 30.265, and 22.166 Å/mdyn for the C·T, C*·T*, and TS(C·T ↔ C*·T*), respectively) prove that the O2…O2 vdW contact is a stabilizing interaction. Based on the sweeps of the H-bond energies, it was found that the N4H…O4/O4H…N4 and N3H…N3 H-bonds in the C·T and C*·T* base pairs are anticooperative and weaken each other, whereas the middle N3H…N3 H-bond and the O2…O2 vdW contact are cooperative and mutually reinforce each other. It was found that the tautomerization of the C·T base mispair through the DPT is a concerted and asynchronous reaction that proceeds via the TS(C·T ↔ C*·T*) stabilized by the loosened N4-H-O4 covalent bridge, the N3H…N3 H-bond (9.67 kcal mol⁻¹) and the O2…O2 vdW contact (0.41 kcal mol⁻¹). The nine key points describing the evolution of the C·T ↔ C*·T* tautomerization via the DPT were detected and completely investigated along the IRC. The C*·T* mispair was revealed to be a dynamically unstable structure with a lifetime of 2.13 × 10⁻¹³ s. In this case, as for the A·T Watson-Crick DNA base pair, this activates the mechanism of the quantum protection of the C

  3. Following Roman waterways from a computer screen: GIS-based approaches to the analysis of Barcino’s aqueducts

    OpenAIRE

    Orengo, Hèctor A.; Miró, Carme

    2011-01-01

    From the 1950s until today, the Roman colony of Barcino (modern Barcelona) has been believed to possess two aqueducts: one transporting water from the Montcada mountains and the other from the Collserola range. In this article, GIS-based least-cost route analysis (LCR), in combination with more traditional archaeological techniques, is applied to analyse these aqueducts' routes. The results obtained suggest Barcino had only one aqueduct: the one carrying water from Montcada. The aqueduct...

  4. Agent Based Computing Machine

    Science.gov (United States)

    2005-12-09

    Due to the speed and support of MPI for C/C++ on Beowulf clusters, these languages could be used in Phase 2 to accomplish further enhancements. Options for hardware implementation are explored, including emulation with a high-performance cluster, a high-performance silicon chip, and the... [Report contents include: ABC Machine Formal Definition; Computational Analysis; Programming Concepts; Cluster Mapping; Phase 1 Results.]

  5. What is intrinsic motivation? A typology of computational approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Yves Oudeyer

    2009-11-01

    Full Text Available Intrinsic motivation, the causal mechanism for spontaneous exploration and curiosity, is a central concept in developmental psychology. It has been argued to be a crucial mechanism for open-ended cognitive development in humans, and as such has gathered growing interest from developmental roboticists in recent years. The goal of this paper is threefold. First, it provides a synthesis of the different approaches to intrinsic motivation in psychology. Second, by interpreting these approaches in a computational reinforcement learning framework, we argue that they are not operational and even sometimes inconsistent. Third, we set the ground for a systematic operational study of intrinsic motivation by presenting a formal typology of possible computational approaches. This typology is partly based on existing computational models, but also presents new ways of conceptualizing intrinsic motivation. We argue that this kind of computational typology might be useful for opening new avenues for research both in psychology and developmental robotics.
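
    As a concrete illustration of one branch of such a typology, the sketch below adds a knowledge-based intrinsic reward (a count-based novelty bonus) to tabular Q-learning on a toy chain environment. The environment, the bonus form 1/sqrt(N(s)) and all constants are illustrative assumptions, not a model taken from the paper.

    ```python
    # A minimal sketch of one family of computational intrinsic motivation:
    # a count-based "novelty" bonus added to tabular Q-learning. The chain
    # environment and all constants are illustrative choices.
    import numpy as np

    n_states, n_actions, rng = 20, 2, np.random.default_rng(1)
    Q = np.zeros((n_states, n_actions))
    N = np.zeros(n_states)                      # state visit counts
    alpha, gamma = 0.1, 0.95

    s = 0
    for step in range(5000):
        a = Q[s].argmax() if rng.random() > 0.1 else rng.integers(n_actions)
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        N[s2] += 1
        r_ext = 1.0 if s2 == n_states - 1 else 0.0     # sparse extrinsic reward
        r_int = 1.0 / np.sqrt(N[s2])                   # intrinsic novelty bonus
        target = r_ext + r_int + gamma * Q[s2].max()
        Q[s, a] += alpha * (target - Q[s, a])
        s = 0 if s2 == n_states - 1 else s2            # restart at the goal

    print("states visited:", int((N > 0).sum()), "of", n_states)
    ```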

  6. Computer networks ISE a systems approach

    CERN Document Server

    Peterson, Larry L

    2007-01-01

    Computer Networks, 4E is the only introductory computer networking book written by authors who have had first-hand experience with many of the protocols discussed in the book, who have actually designed some of them as well, and who are still actively designing computer networks today. This newly revised edition continues to provide an enduring, practical understanding of networks and their building blocks through rich, example-based instruction. The authors' focus is on the why of network design, not just the specifications comprising today's systems but how key technologies and p

  7. Handbook of computational approaches to counterterrorism

    CERN Document Server

    Subrahmanian, VS

    2012-01-01

    Terrorist groups throughout the world have been studied primarily through the use of social science methods. However, major advances in IT during the past decade have led to significant new ways of studying terrorist groups, making forecasts, learning models of their behaviour, and shaping policies about their behaviour. Handbook of Computational Approaches to Counterterrorism provides the first in-depth look at how advanced mathematics and modern computing technology are shaping the study of terrorist groups. This book includes contributions from world experts in the field, and presents extens

  8. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with the automation of rules development for turn-based strategy computer games. Rules are the basic elements of these games. This paper proposes a new approach to automation that includes visual formal rules model creation, model verification and model-based code generation.

  9. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  10. Moment matrices, border bases and radical computation

    OpenAIRE

    Mourrain, B.; J. B. Lasserre; Laurent, Monique; Rostalski, P.; Trebuchet, Philippe

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorporation of additional polynomials, ...

  11. Novel computational approaches characterizing knee physiotherapy

    OpenAIRE

    Wangdo Kim; Veloso, Antonio P; Duarte Araujo; Kohles, Sean S.

    2014-01-01

    A knee joint’s longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describe knee physi...

  12. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  13. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  14. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
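
    The following toy sketch illustrates the general idea under stated assumptions: a liquidation schedule is parameterized (here by a softmax over per-period weights), execution costs are simulated under an assumed random-walk price with linear temporary impact, and a sample-average objective of expected cost plus a CVaR penalty is minimized. It is not the authors' formulation; all dynamics and constants are illustrative.

    ```python
    # A toy sketch in the spirit of the approach (not the authors' model):
    # optimize a parametric liquidation schedule by sample-average
    # approximation, penalizing the CVaR of the simulated execution cost.
    import numpy as np
    from scipy.optimize import minimize

    T, X0, n_paths, lam, alpha = 10, 1.0, 4000, 1.0, 0.95
    rng = np.random.default_rng(2)
    noise = rng.normal(0.0, 0.01, size=(n_paths, T))     # common random numbers

    def cost(theta):
        w = np.exp(theta - theta.max()); w /= w.sum()    # schedule fractions, sum = 1
        trades = X0 * w
        price = 1.0 + np.cumsum(noise, axis=1)           # random-walk price paths
        exec_price = price - 0.05 * trades               # linear temporary impact
        c = X0 * 1.0 - (exec_price * trades).sum(axis=1) # shortfall per path
        var = np.quantile(c, alpha)
        cvar = c[c >= var].mean()                        # empirical CVaR
        return c.mean() + lam * cvar

    res = minimize(cost, x0=np.zeros(T), method="Nelder-Mead",
                   options={"maxiter": 2000, "xatol": 1e-4})
    w_opt = np.exp(res.x - res.x.max()); w_opt /= w_opt.sum()
    print("optimal schedule fractions:", np.round(w_opt, 3))
    ```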

  15. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    Science.gov (United States)

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-01

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier. PMID:28124985
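
    A compact sketch of the overall scheme is given below: bitmask chromosomes select feature subsets, and the cross-validated accuracy of a small neural network serves as the GA fitness. Synthetic data stands in for the EEG feature vectors, and the GA settings are illustrative assumptions rather than those of the paper.

    ```python
    # A sketch of GA-based feature selection with a neural-network classifier
    # as the fitness function; synthetic data stands in for EEG features.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    X, y = make_classification(n_samples=300, n_features=40, n_informative=6,
                               random_state=0)

    def fitness(mask):
        if not mask.any():
            return 0.0
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    pop = rng.random((16, X.shape[1])) < 0.5           # population of bitmasks
    for gen in range(10):
        scores = np.array([fitness(m) for m in pop])
        elite = pop[scores.argsort()[::-1]][:8]        # truncation selection
        kids = []
        for _ in range(8):                             # uniform crossover + mutation
            p1, p2 = elite[rng.integers(8)], elite[rng.integers(8)]
            child = np.where(rng.random(X.shape[1]) < 0.5, p1, p2)
            child ^= rng.random(X.shape[1]) < 0.02     # bit-flip mutation
            kids.append(child)
        pop = np.vstack([elite, kids])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print(f"selected {best.sum()} of {X.shape[1]} features, CV acc {fitness(best):.3f}")
    ```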

  16. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Charles Yaacoub

    2017-01-01

    Full Text Available Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.

  17. A Unified Computational Approach to Oxide Aging Processes

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, D.J.; Fleetwood, D.M.; Hjalmarson, H.P.; Schultz, P.A.

    1999-01-27

    In this paper we describe a unified, hierarchical computational approach to aging and reliability problems caused by materials changes in the oxide layers of Si-based microelectronic devices. We apply this method to a particular low-dose-rate radiation effects problem.

  18. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  19. A Two-Tier Test-Based Approach to Improving Students' Computer-Programming Skills in a Web-Based Learning Environment

    Science.gov (United States)

    Yang, Tzu-Chi; Hwang, Gwo-Jen; Yang, Stephen J. H.; Hwang, Gwo-Haur

    2015-01-01

    Computer programming is an important skill for engineering and computer science students. However, teaching and learning programming concepts and skills has been recognized as a great challenge to both teachers and students. Therefore, the development of effective learning strategies and environments for programming courses has become an important…

  20. Computer science approach to quantum control

    Energy Technology Data Exchange (ETDEWEB)

    Janzing, D.

    2006-07-01

    definitions of complexity in computer science must be based upon a notion of elementary computation steps that correspond to not too complex real physical processes. This book tries to shed light on both aspects of this unification. (orig.)

  1. An approach to computing direction relations between separated object groups

    Science.gov (United States)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and a quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. Psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and that the results are correct from the point of view of spatial cognition.
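
    A rough sketch of the central computation, under simplifying assumptions, is shown below: the Voronoi diagram of the combined groups is built, ridges whose generating points belong to different groups are kept, and their normals (here aggregated unweighted, a simplification) yield a qualitative direction. The point sets and the eight-sector mapping are illustrative.

    ```python
    # A rough sketch of the core idea (not the paper's full method): Voronoi
    # ridges separating the two groups contribute their normals, which are
    # aggregated (unweighted here) into a direction from group A to group B.
    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(4)
    A = rng.normal([0, 0], 0.5, size=(15, 2))       # group A
    B = rng.normal([4, 2], 0.5, size=(15, 2))       # group B, up and to the right
    pts = np.vstack([A, B])
    vor = Voronoi(pts)

    normal = np.zeros(2)
    for i, j in vor.ridge_points:                   # generator pair of each ridge
        if (i < len(A)) != (j < len(A)):            # ridge separates the groups
            a, b = (i, j) if i < len(A) else (j, i)
            normal += pts[b] - pts[a]               # ridge normal points A -> B

    angle = np.degrees(np.arctan2(normal[1], normal[0])) % 360
    dirs = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
    print("qualitative direction of B from A:", dirs[int(((angle + 22.5) % 360) // 45)])
    ```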

  2. A tale of three bio-inspired computational approaches

    Science.gov (United States)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation (EC) implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, and some lessons I have gleaned from three decades of working with EC will be covered. Next come neural networks, computational approaches that have long been studied as possible ways to make "thinking machines," an old dream of humankind, based upon the only known existing example of intelligence. Then I give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages, the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.

  3. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  4. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  6. An Approach to Ad hoc Cloud Computing

    CERN Document Server

    Kirby, Graham; Macdonald, Angus; Fernandes, Alvaro

    2010-01-01

    We consider how underused computing resources within an enterprise may be harnessed to improve utilization and create an elastic computing infrastructure. Most current cloud provision involves a data center model, in which clusters of machines are dedicated to running cloud infrastructure software. We propose an additional model, the ad hoc cloud, in which infrastructure software is distributed over resources harvested from machines already in existence within an enterprise. In contrast to the data center cloud model, resource levels are not established a priori, nor are resources dedicated exclusively to the cloud while in use. A participating machine is not dedicated to the cloud, but has some other primary purpose such as running interactive processes for a particular user. We outline the major implementation challenges and one approach to tackling them.

  7. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  8. Agent-Based Cloud Computing

    OpenAIRE

    Sim, Kwang Mong

    2012-01-01

    Agent-based cloud computing is concerned with the design and development of software agents for bolstering cloud service discovery, service negotiation, and service composition. The significance of this work is introducing an agent-based paradigm for constructing software tools and testbeds for cloud resource management. The novel contributions of this work include: 1) developing Cloudle: an agent-based search engine for cloud service discovery, 2) showing that agent-based negotiatio...

  9. A Systematic Approach for Computing Zero-Point Energy, Quantum Partition Function, and Tunneling Effect Based on Kleinert's Variational Perturbation Theory.

    Science.gov (United States)

    Wong, Kin-Yiu; Gao, Jiali

    2008-09-09

    In this paper, we describe an automated integration-free path-integral (AIF-PI) method, based on Kleinert's variational perturbation (KP) theory, to treat internuclear quantum-statistical effects in molecular systems. We have developed an analytical method to obtain the centroid potential as a function of the variational parameter in the KP theory, which avoids numerical difficulties in path-integral Monte Carlo or molecular dynamics simulations, especially in the limit of zero temperature. Consequently, variational calculations using the KP theory can be efficiently carried out beyond the first order, i.e., beyond the Giachetti-Tognetti-Feynman-Kleinert variational approach, for realistic chemical applications. By making use of the approximation of independent instantaneous normal modes (INM), the AIF-PI method can readily be applied to many-body systems. Previously, we have shown that in the INM approximation, the AIF-PI method is accurate for computing the quantum partition function of a water molecule (3 degrees of freedom) and the quantum correction factor for the collinear H₃ reaction rate (2 degrees of freedom). In this work, the accuracy and properties of the KP theory are further investigated by using the first three orders of perturbation on an asymmetric double-well potential, the bond vibrations of H₂, HF, and HCl represented by the Morse potential, and a proton-transfer barrier modeled by the Eckart potential. The zero-point energy, quantum partition function, and tunneling factor for these systems have been determined and are found to be in excellent agreement with the exact quantum results. Using our new analytical results at the zero-temperature limit, we show that the minimum value of the computed centroid potential in the KP theory is in excellent agreement with the ground-state energy (zero-point energy) and that the position of the centroid potential minimum is the expectation value of the particle position in wave mechanics. The fast convergent property

  10. Hypercomputation based on quantum computing

    CERN Document Server

    Sicard, Andrés; Vélez, Mario; Ospina, Juan

    2004-01-01

    We present a quantum algorithm for a (classically) incomputable decision problem: the Hilbert's tenth problem; namely, we present a hypercomputation model based on quantum computation. The model is inspired by the one proposed by Tien D. Kieu. Our model exploits the quantum adiabatic process and the characteristics of the representation of the dynamical algebra su(1,1) associated to the infinite square well. Furthermore, it is demonstrated that the model proposed is a universal quantum computation model.

  11. Computer Mechatronics: A Radical Approach to Mechatronics Education

    OpenAIRE

    Nilsson, Martin

    2005-01-01

    This paper describes some distinguishing features of a course on mechatronics based on computer science. We propose a teaching approach called Controlled Problem-Based Learning (CPBL). We have applied this method to three generations (2003-2005) of mainly fourth-year undergraduate students at Lund University (LTH). Although students found the course difficult, there were no dropouts, and all students attended the examination in 2005.

  12. Demystifying the GMAT: Computer-Based Testing Terms

    Science.gov (United States)

    Rudner, Lawrence M.

    2012-01-01

    Computer-based testing can be a powerful means to make all aspects of test administration not only faster and more efficient, but also more accurate and more secure. While the Graduate Management Admission Test (GMAT) exam is a computer adaptive test, there are other approaches. This installment presents a primer of computer-based testing terms.

  13. Measurement-Based and Universal Blind Quantum Computation

    Science.gov (United States)

    Broadbent, Anne; Fitzsimons, Joseph; Kashefi, Elham

    Measurement-based quantum computation (MBQC) is a novel approach to quantum computation where the notion of measurement is the main driving force of computation. This is in contrast with the more traditional circuit model, which is based on unitary operations. We review here the mathematical model underlying MBQC and the first quantum cryptographic protocol designed using the unique features of MBQC.
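
    The one-qubit teleportation identity at the heart of MBQC can be checked numerically: entangling |psi>|+> with a CZ gate and measuring the first qubit in a rotated basis leaves the second qubit in X^s H Rz(-theta)|psi>, where s is the measurement outcome. The sketch below verifies this standard identity with plain numpy; it is a minimal demonstration, not the protocol from the review.

    ```python
    # A minimal numpy check of the basic MBQC primitive: entangle |psi>|+>
    # with CZ, measure qubit 1 in the basis (|0> +/- e^{i*theta}|1>)/sqrt(2);
    # qubit 2 is left in X^s H Rz(-theta)|psi>, up to a global phase.
    import numpy as np

    rng = np.random.default_rng(5)
    theta = 0.7
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)                            # random input qubit

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    X = np.array([[0, 1], [1, 0]])
    Rz = np.diag([1, np.exp(-1j * theta)])                # Rz(-theta)

    state = np.kron(psi, np.array([1, 1]) / np.sqrt(2))   # |psi> (x) |+>
    state = np.diag([1, 1, 1, -1]) @ state                # CZ entangling gate

    for s in (0, 1):
        m = np.array([1, (-1) ** s * np.exp(1j * theta)]) / np.sqrt(2)
        out = m.conj() @ state.reshape(2, 2)              # project qubit 1 onto <m|
        out /= np.linalg.norm(out)
        expected = np.linalg.matrix_power(X, s) @ H @ Rz @ psi
        overlap = abs(np.vdot(expected, out))             # 1 up to global phase
        print(f"outcome s={s}: |<expected|out>| = {overlap:.6f}")
    ```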

  14. Brain emotional learning based Brain Computer Interface

    Directory of Open Access Journals (Sweden)

    Abdolreza Asadi Ghanbari

    2012-09-01

    Full Text Available A brain-computer interface (BCI) enables direct communication between a brain and a computer, translating brain activity into computer commands using preprocessing, feature extraction and classification operations. Classification is crucial, as it has a substantial effect on BCI speed and bit rate. Recent developments of brain-computer interfaces (BCIs) bring forward some challenging problems to the machine learning community, of which the classification of time-varying electrophysiological signals is a crucial one. Constructing adaptive classifiers is a promising approach to deal with this problem. In this paper, we introduce adaptive classifiers to classify electroencephalogram (EEG) signals. The adaptive classifier is a brain emotional learning based adaptive classifier (BELBAC), which is based on the emotional learning process. The main purpose of this research is to use a structural model based on the limbic system of the mammalian brain for decision making and control engineering applications. We have adopted a network model developed by Moren and Balkenius as a computational model that mimics the amygdala, orbitofrontal cortex, thalamus, sensory input cortex and, generally, those parts of the brain thought responsible for processing emotions. The developed method was compared with other methods used for EEG signal classification (a support vector machine (SVM) and two different neural network types (MLP, PNN)). The result analysis demonstrated the efficiency of the proposed approach.
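
    The sketch below gives a heavily simplified, assumption-laden rendering of Moren-Balkenius-style brain emotional learning as it is commonly reproduced in the BELBIC literature: amygdala weights only ever grow, while orbitofrontal weights provide corrective inhibition. Its use as a binary classifier on synthetic features is an illustrative stand-in for the paper's EEG pipeline, not the paper's exact formulation.

    ```python
    # A heavily simplified sketch of Moren-Balkenius-style brain emotional
    # learning (as commonly reproduced in the BELBIC literature): the amygdala
    # part is excitatory-only, the orbitofrontal part corrects the output.
    # Synthetic non-negative stimuli stand in for EEG features.
    import numpy as np

    rng = np.random.default_rng(6)
    X = rng.random((400, 8))                         # non-negative stimuli
    y = (X[:, 0] + X[:, 1] > 1.0).astype(float)      # synthetic labels in {0, 1}

    V = np.zeros(8)                                  # amygdala weights
    W = np.zeros(8)                                  # orbitofrontal weights
    alpha, beta = 0.05, 0.02

    for epoch in range(20):
        for s, r in zip(X, y):                       # label r acts as the reward
            A = s @ V                                # amygdala response
            O = s @ W                                # orbitofrontal inhibition
            E = A - O                                # emotional output
            V += alpha * s * max(0.0, r - A)         # amygdala never unlearns
            W += beta * s * (E - r)                  # orbitofrontal correction

    pred = ((X @ V - X @ W) > 0.5).astype(float)
    print(f"training accuracy of the BEL-style classifier: {(pred == y).mean():.3f}")
    ```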

  15. Elucidating Drug-Enzyme Interactions and Their Structural Basis for Improving the Affinity and Potency of Isoniazid and Its Derivatives Based on Computer Modeling Approaches

    Directory of Open Access Journals (Sweden)

    Auradee Punkvang

    2010-04-01

    Full Text Available The enoyl-ACP reductase enzyme (InhA) from M. tuberculosis is recognized as the primary target of isoniazid (INH), a first-line antibiotic for tuberculosis treatment. To identify the specific interactions of the INH-NAD adduct and its derivative adducts in the InhA binding pocket, molecular docking calculations and quantum chemical calculations were performed on a set of INH derivative adducts. Reliable binding modes of INH derivative adducts in the InhA pocket were established using the Autodock 3.05 program, which shows a good ability to reproduce the X-ray bound conformation with an rmsd of less than 1.0 Å. The interaction energies of the INH-NAD adduct and its derivative adducts with individual amino acids in the InhA binding pocket were computed based on quantum chemical calculations at the MP2/6-31G(d) level. The molecular docking and quantum chemical calculation results reveal that hydrogen bond interactions are the main interactions for adduct binding. To clearly delineate the linear relationship between structure and activity of these adducts, CoMFA and CoMSIA models were set up based on the molecular docking alignment. The resulting CoMFA and CoMSIA models have the best statistical qualities, with r²cv of 0.67 and 0.74, respectively. Structural requirements of isoniazid derivatives that can be incorporated into the isoniazid framework to improve the activity have been identified through CoMFA and CoMSIA steric and electrostatic contour maps. The integrated results from structure-based and ligand-based design approaches and quantum chemical calculations provide useful structural information facilitating the design of new and potentially more effective antitubercular agents, as follows: the R substituents of isoniazid derivatives should contain a large plane, and both sides of the plane should contain an electropositive group. Moreover, the steric and electrostatic fields of the 4-pyridyl ring are optimal for greater potency.

  16. Computational Approach To Understanding Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Włodzisław Duch

    2012-01-01

    Full Text Available Every year the prevalence of Autism Spectrum Disorders (ASD) is rising. Is there a unifying mechanism of various ASD cases at the genetic, molecular, cellular or systems level? The hypothesis advanced in this paper is focused on neural dysfunctions that lead to problems with attention in autistic people. Simulations of attractor neural networks performing cognitive functions help to assess long-term system neurodynamics. The Fuzzy Symbolic Dynamics (FSD) technique is used for the visualization of attractors in the semantic layer of the neural model of reading. Large-scale simulations of brain structures characterized by a high order of complexity require enormous computational power, especially if biologically motivated neuron models are used to investigate the influence of cellular structure dysfunctions on the network dynamics. Such simulations have to be implemented on computer clusters in grid-based architectures.

  17. Combining risk-management and computational approaches for trustworthiness evaluation of socio-technical systems

    OpenAIRE

    Gol Mohammadi, N.; Bandyszak, T.; Goldsteen, A.; Kalogiros, C.; Weyer, T.; Moffie, M.; Nasser, B.; Surridge, M

    2015-01-01

    The analysis of existing software evaluation techniques reveals the need for evidence-based evaluation of systems' trustworthiness. This paper aims at evaluating the trustworthiness of socio-technical systems during design-time. Our approach combines two existing evaluation techniques: a computational approach and a risk management approach. The risk-based approach identifies threats to trustworthiness on an abstract level. Computational approaches are applied to evaluate the expected end-to-end...

  18. Computation-based virtual screening for designing novel antimalarial drugs by targeting falcipain-III: a structure-based drug designing approach.

    Science.gov (United States)

    Kesharwani, Rajesh Kumar; Singh, Durg Vijay; Misra, Krishna

    2013-01-01

    Cysteine proteases (falcipains), a papain family of enzymes of Plasmodium falciparum, are responsible for haemoglobin degradation and thus necessary for its survival during the asexual life-cycle phase inside human red blood cells, while remaining non-functional for the human body. Therefore, they can act as potential targets for designing antimalarial drugs. The P. falciparum cysteine proteases falcipain-II and falcipain-III are the enzymes which initiate haemoglobin degradation and have therefore been selected as targets. In the present study, we designed new leupeptin analogues and subjected them to virtual screening using Glide at the active-site cavity of falcipain-II and falcipain-III, to select the best docked analogues on the basis of Glide score and to compare with the results of AutoDock. The proposed analogues can be synthesized and tested in vivo as future potent antimalarial drugs. The proteins falcipain-II and falcipain-III, together with the bound inhibitors epoxysuccinate E64 (E64) and leupeptin, respectively, were retrieved from the Protein Data Bank (PDB), and the latter, leupeptin, was used as a lead molecule to design new analogues using the Ligbuilder software, refining the molecules on the basis of the Lipinski rule of five and fitness-score parameters. All the designed leupeptin analogues were screened via docking simulation at the active-site cavity of falcipain-II and falcipain-III using the Glide software and AutoDock. The 104 new leupeptin-based antimalarial ligands were designed using a structure-based drug-designing approach with the help of Ligbuilder and subjected to virtual screening via docking simulation against the falcipain-II and falcipain-III receptor proteins. The Glide docking results suggest that the ligand named result_037 shows good binding, and two others, result_044 and result_042, show nearly similar binding to the naturally occurring PDB-bound ligand E64 against falcipain-II; in the case of falcipain-III, 15 designed leupeptin analogues having

  19. Computation-based virtual screening for designing novel antimalarial drugs by targeting falcipain-III: A structure-based drug designing approach

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar Kesharwani

    2013-04-01

    Full Text Available Background & objectives: Cysteine proteases (falcipains), a papain family of enzymes of Plasmodium falciparum, are responsible for haemoglobin degradation and thus necessary for its survival during the asexual life-cycle phase inside human red blood cells, while remaining non-functional for the human body. Therefore, they can act as potential targets for designing antimalarial drugs. The P. falciparum cysteine proteases falcipain-II and falcipain-III are the enzymes which initiate haemoglobin degradation and have therefore been selected as targets. In the present study, we designed new leupeptin analogues and subjected them to virtual screening using Glide at the active-site cavity of falcipain-II and falcipain-III, to select the best docked analogues on the basis of Glide score and to compare with the results of AutoDock. The proposed analogues can be synthesized and tested in vivo as future potent antimalarial drugs. Methods: The proteins falcipain-II and falcipain-III, together with the bound inhibitors epoxysuccinate E64 (E64) and leupeptin, respectively, were retrieved from the Protein Data Bank (PDB), and the latter, leupeptin, was used as a lead molecule to design new analogues using the Ligbuilder software, refining the molecules on the basis of the Lipinski rule of five and fitness-score parameters. All the designed leupeptin analogues were screened via docking simulation at the active-site cavity of falcipain-II and falcipain-III using the Glide software and AutoDock. Results: The 104 new leupeptin-based antimalarial ligands were designed using a structure-based drug-designing approach with the help of Ligbuilder and subjected to virtual screening via docking simulation against the falcipain-II and falcipain-III receptor proteins. The Glide docking results suggest that the ligand named result_037 shows good binding, and two others, result_044 and result_042, show nearly similar binding to the naturally occurring PDB-bound ligand E64 against falcipain-II, and in

  20. Computer Forensics for Graduate Accountants: A Motivational Curriculum Design Approach

    Directory of Open Access Journals (Sweden)

    Grover Kearns

    2010-06-01

    Full Text Available Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft of or destruction of intellectual property, and fraud. Education of accountants in the use of forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using the students' perceptions of the success of the methodology and of their acquisition of forensics knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

  1. Efficient Approach for Load Balancing in Virtual Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Harvinder singh

    2014-10-01

    Full Text Available Cloud computing technology is changing the focus of the IT world and is becoming famous because of its great characteristics. Load balancing is one of the main challenges in cloud computing; it is required for distributing workloads across multiple computers or a computer cluster, network links, central processing units, disk drives, or other resources. Successful load balancing optimizes resource use, maximizes throughput, minimizes response time, and avoids overload. The objective of this paper is to propose an approach for scheduling algorithms that can maintain load balancing and provide improved strategies through efficient job scheduling and modified resource-allocation techniques. The results discussed in this paper are based on the existing round robin, least connection, throttled load balance, and fastest response time scheduling algorithms, and a newly proposed algorithm, fastest with least connection. With this new algorithm, the overall response time and data-centre processing time are improved, and cost is reduced, in comparison to the existing scheduling parameters.
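
    To make the comparison concrete, the toy simulation below runs the same synthetic workload through round-robin and least-connection assignment and reports mean response time. The VM count, arrival process and service-time distribution are assumptions for illustration, not the paper's cloud testbed.

    ```python
    # A toy comparison of two of the scheduling policies named above on the
    # same synthetic workload; all workload parameters are illustrative.
    import heapq
    import numpy as np

    def simulate(policy, n_vms=4, n_jobs=2000, seed=7):
        rng = np.random.default_rng(seed)
        arrivals = np.cumsum(rng.exponential(0.25, n_jobs))  # Poisson arrivals
        service = rng.exponential(1.0, n_jobs)                # job service times
        free_at = np.zeros(n_vms)                             # VM next-free time
        active = [[] for _ in range(n_vms)]                   # finish-time heaps
        total_resp, rr = 0.0, 0
        for t, s in zip(arrivals, service):
            for q in active:                                  # drop finished jobs
                while q and q[0] <= t:
                    heapq.heappop(q)
            if policy == "round_robin":
                v = rr; rr = (rr + 1) % n_vms
            else:                                             # least_connection
                v = min(range(n_vms), key=lambda i: len(active[i]))
            start = max(t, free_at[v])                        # FIFO queue per VM
            free_at[v] = start + s
            heapq.heappush(active[v], free_at[v])
            total_resp += free_at[v] - t                      # response time
        return total_resp / n_jobs

    for policy in ("round_robin", "least_connection"):
        print(f"{policy}: mean response time = {simulate(policy):.3f}")
    ```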

  3. Novel computational approaches characterizing knee physiotherapy

    Directory of Open Access Journals (Sweden)

    Wangdo Kim

    2014-01-01

    Full Text Available A knee joint's longevity depends on the proper integration of structural components in an axial alignment. If just one of the components is abnormally off-axis, the biomechanical system fails, resulting in arthritis. The complexity of various failures in the knee joint has led orthopedic surgeons to select total knee replacement as a primary treatment. In many cases, this means sacrificing much of an otherwise normal joint. Here, we review novel computational approaches to describing knee physiotherapy, introducing foot loading as a new dimension in knee-axis alignment that produces an improved functional status for the patient. New physiotherapeutic applications are then possible by aligning foot loading with the functional axis of the knee joint during the treatment of patients with osteoarthritis.

  4. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...

  5. Transaction based approach

    Science.gov (United States)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    Transaction-based approaches are utilized in some business process modeling methodologies. Essential parts of these transactions are human beings; the notion of an agent or actor role is usually used for them. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology having its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.

  6. Processing of laser Doppler flowmetry signals from healthy subjects and patients with varicose veins: Information categorisation approach based on intrinsic mode functions and entropy computation.

    Science.gov (United States)

    Humeau-Heurtier, Anne; Klonizakis, Markos

    2015-06-01

    The diagnosis of pathologies from signal processing approaches has been shown to be of importance, as it can provide noninvasive information at the earliest stage. In this work, the problem of categorising, in a quantifiable manner, the information content of microvascular blood flow signals recorded in healthy participants and patients with varicose veins is addressed. For this purpose, laser Doppler flowmetry (LDF) signals, which reflect microvascular blood flow, recorded both at rest and after acetylcholine (ACh) stimulation (an endothelial-dependent vasodilator), are analyzed. Each signal is processed with the empirical mode decomposition (EMD) to obtain its intrinsic mode functions (IMFs). An entropy measure of each IMF is then computed. The results show that IMFs of LDF signals have different complexity for different physiological/pathological states. This is true both at rest and after ACh stimulation. Thus, the proposed framework (EMD + entropy computation) may be used to gain a noninvasive understanding of LDF signals in patients with microvascular dysfunctions.
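
    A minimal sketch of the pipeline follows, assuming the PyEMD package ("pip install EMD-signal") for the decomposition and using sample entropy per IMF; sample entropy with m = 2 and r = 0.2·std is one common parameter choice, and the paper's exact entropy measure may differ. A synthetic signal stands in for an LDF recording.

    ```python
    # A minimal sketch of the pipeline: EMD, then an entropy measure per IMF.
    # Assumes the PyEMD package; the synthetic signal stands in for an LDF trace.
    import numpy as np
    from PyEMD import EMD

    def sample_entropy(x, m=2, r_factor=0.2):
        x = np.asarray(x); r = r_factor * x.std()
        def count(mm):
            templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
            return (d <= r).sum() - len(templ)          # exclude self-matches
        B, A = count(m), count(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    rng = np.random.default_rng(8)
    t = np.linspace(0, 10, 1000)
    signal = (np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)
              + 0.3 * rng.normal(size=t.size))          # stand-in for an LDF trace

    imfs = EMD().emd(signal)                            # intrinsic mode functions
    for k, imf in enumerate(imfs):
        print(f"IMF {k}: sample entropy = {sample_entropy(imf):.3f}")
    ```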

  7. Global computational algebraic topology approach for diffusion

    Science.gov (United States)

    Auclair-Fortier, Marie-Flavie; Ziou, Djemel; Allili, Madjid

    2004-05-01

    One physical process involved in many computer vision problems is heat diffusion. The corresponding partial differential equations (PDEs) are continuous and have to be discretized by some technique, typically a mathematical process such as finite differences or finite elements, in which the continuous domain is subdivided into sub-domains each carrying a single value. The diffusion equation derives from energy conservation, and is therefore valid on a whole domain. We use this global equation directly instead of discretizing the PDE obtained from it by a limit process. To encode these global physical values over pixels of different dimensions, we use a computational algebraic topology (CAT)-based image model. This model has been proposed by Ziou and Allili and used for the deformation of curves and optical flow. It represents the image support as a decomposition in terms of points, edges, surfaces, volumes, etc., so images of any dimension can be handled. After decomposing the physical principles of heat transfer into basic laws, we recall the CAT-based image model and use it to encode these basic laws. We then present experimental results for nonlinear gray-level diffusion for denoising, ensuring thin-feature preservation.

  8. A common geometric data-base approach for computer-aided manufacturing of wind-tunnel models and theoretical aerodynamic analysis

    Science.gov (United States)

    See, M. J.; Cozzolongo, J. V.

    1983-01-01

    A more automated process to produce wind tunnel models using existing facilities is discussed. A process was sought to more rapidly determine the aerodynamic characteristics of advanced aircraft configurations. Such aerodynamic characteristics are determined from theoretical analyses and wind tunnel tests of the configurations. Computers are used to perform the theoretical analyses, and a computer aided manufacturing system is used to fabricate the wind tunnel models. In the past a separate set of input data describing the aircraft geometry had to be generated for each process. This process establishes a common data base by enabling the computer aided manufacturing system to use, via a software interface, the geometric input data generated for the theoretical analysis. Thus, only one set of geometric data needs to be generated. Tests reveal that the process can reduce by several weeks the time needed to produce a wind tunnel model component. In addition, this process increases the similarity of the wind tunnel model to the mathematical model used by the theoretical aerodynamic analysis programs. Specifically, the wind tunnel model can be machined to within 0.008 in. of the original mathematical model. However, the software interface is highly complex and cumbersome to operate, making it unsuitable for routine use. The procurement of an independent computer aided design/computer aided manufacturing system with the capability to support both the theoretical analysis and the manufacturing tasks was recommended.

  9. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure.

    Science.gov (United States)

    Chen, Wen Hao; Yang, Sam Y S; Xiao, Ti Qiao; Mayo, Sherry C; Wang, Yu Dan; Wang, Hai Peng

    2014-05-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials-characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. Compared with conventional techniques, the approach dramatically improves the spatial resolution and reveals finer details within a region of interest of a sample larger than the field of view. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach, and the optimal experimental parameters are pre-analyzed. The quantitative results demonstrate that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of minerals during coal processing. The method is generic and can be applied to the three-dimensional compositional characterization of other materials.

  10. The Metacognitive Approach to Computer Education: Making Explicit the Learning Journey

    Science.gov (United States)

    Phelps, Renata

    2007-01-01

    This paper presents a theoretical and practical exploration of a metacognitive approach to computer education, developed through a three-year action research project. It is argued that the approach contrasts significantly with often-employed directive and competency-based approaches to computer education and is more appropriate in addressing the…

  11. An Evaluation of Training Interventions and Computed Scoring Techniques for Grading a Level Turn Task and a Straight In Landing Approach on a PC-Based Flight Simulator

    Science.gov (United States)

    Heath, Bruce E.

    2007-01-01

    One result of the relatively recent advances in computing technology has been the decreasing cost of computers and increasing computational power. This has allowed high-fidelity airplane simulations to be run on personal computers (PC). Thus, simulators are now used routinely by pilots to substitute simulated flight hours for real flight hours when training for an aircraft type rating, thereby reducing the cost of flight training. However, FAA regulations require that such substitution training must be supervised by Certified Flight Instructors (CFI). If the CFI presence could be reduced or eliminated for certain tasks, this would mean a further cost saving to the pilot. This would require that the flight simulator have a certain level of 'intelligence' in order to provide feedback on pilot performance similar to that of a CFI. The 'intelligent' flight simulator would have at least the capability to use data gathered from the flight to create a measure of the student pilot's performance. Also, to fully utilize the advances in computational power, the simulator would be capable of interacting with the student pilot using the best possible training interventions. This thesis reports on two studies conducted at Tuskegee University investigating the effects of interventions on the learning of two flight maneuvers on a flight simulator, and the robustness and accuracy of calculated performance indices as compared to CFI evaluations of performance. The intent of these studies is to take a step in the direction of creating an 'intelligent' flight simulator. The first study deals with comparisons of novice pilot performance trained at different levels of above-real-time simulation to execute a level S-turn. The second study examined the effect of out-of-the-window (OTW) visual cues in the form of hoops on the performance of novice pilots learning to fly a landing approach on the flight simulator. The reliability/robustness of the computed performance metrics was

  12. Computational approaches to homogeneous gold catalysis.

    Science.gov (United States)

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has been expanding at an outstanding pace for the last decade. The best described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to fully develop all these new possibilities, the use of computational tools to understand, at an atomistic level of detail, the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options to study homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  13. Computer-aided diagnosis of Parkinson’s disease based on [123I]FP-CIT SPECT binding potential images, using the voxels-as-features approach and support vector machines

    Science.gov (United States)

    Oliveira, Francisco P. M.; Castelo-Branco, Miguel

    2015-04-01

    Objective. The aim of the present study was to develop a fully automated computational solution for computer-aided diagnosis in Parkinson syndrome based on [123I]FP-CIT single photon emission computed tomography (SPECT) images. Approach. A dataset of 654 [123I]FP-CIT SPECT brain images from the Parkinson’s Progression Markers Initiative was used. Of these, 445 images were of patients with Parkinson’s disease at an early stage and the remainder formed a control group. The images were pre-processed using automated template-based registration followed by the computation of the binding potential at a voxel level. Then, the binding potential images were used for classification, based on the voxel-as-feature approach and using the support vector machines paradigm. Main results. The obtained estimated classification accuracy was 97.86%, the sensitivity was 97.75% and the specificity 98.09%. Significance. The achieved classification accuracy was very high and, in fact, higher than accuracies found in previous studies reported in the literature. In addition, results were obtained on a large dataset of early Parkinson’s disease subjects. In summary, the information provided by the developed computational solution potentially supports clinical decision-making in nuclear medicine, using important additional information beyond the commonly used uptake ratios and respective statistical comparisons. (ClinicalTrials.gov Identifier: NCT01141023)
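
    The classification step described above lends itself to a compact illustration. The sketch below uses synthetic stand-in data rather than the [123I]FP-CIT dataset, and shows only the voxels-as-features idea: each voxel of a registered volume becomes one feature of a vector passed to a linear support vector machine.

```python
# Voxels-as-features sketch with synthetic data (not the paper's code).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_voxels = 200, 16 * 16 * 8        # tiny "volumes" for illustration
X = rng.normal(size=(n_subjects, n_voxels))    # one voxel = one feature
y = rng.integers(0, 2, size=n_subjects)        # 1 = patient, 0 = control
X[y == 1, :100] -= 1.0   # pretend disease lowers signal in the first 100 voxels

clf = SVC(kernel="linear", C=1.0)              # linear SVM classifier
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```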

  14. Hunter disease eClinic: interactive, computer-assisted, problem-based approach to independent learning about a rare genetic disease

    Directory of Open Access Journals (Sweden)

    Moldovan Laura

    2010-10-01

    Full Text Available Abstract Background Computer-based teaching (CBT) is a well-known educational device, but it has never been applied systematically to the teaching of a complex, rare, genetic disease, such as Hunter disease (MPS II). Aim To develop interactive teaching software functioning as a virtual clinic for the management of MPS II. Implementation and Results The Hunter disease eClinic, a self-training, user-friendly educational software program, available at the Lysosomal Storage Research Group (http://www.lysosomalstorageresearch.ca), was developed using the Adobe Flash multimedia platform. It was designed both to provide a realistic, interactive virtual clinic and to give instantaneous access to supporting literature on Hunter disease. The Hunter disease eClinic consists of an eBook and an eClinic. The eClinic is the interactive virtual clinic component of the software. Within an environment resembling a real clinic, the trainee is instructed to take a medical history, to examine the patient, and to order appropriate investigations. The program provides clinical data derived from the management of actual patients with Hunter disease. The eBook provides instantaneous, electronic access to a vast collection of reference information giving detailed background in clinical and basic science, including relevant biochemistry, physiology, and genetics. In the eClinic, the trainee is presented with quizzes designed to provide immediate feedback on both trainee effectiveness and efficiency. User feedback on the merits of the program was collected at several seminars and formal clinical rounds at several medical centres, primarily in Canada. In addition, online usage statistics were documented for a 2-year period. Feedback was consistently positive and confirmed the practical benefit of the program. The online English-language version is accessed daily by users from all over the world; a Japanese translation of the program is also available. Conclusions The…

  15. The fundamentals of computational intelligence: system approach

    CERN Document Server

    Zgurovsky, Mikhail Z

    2017-01-01

    This monograph is dedicated to the systematic presentation of the main trends, technologies and methods of computational intelligence (CI). The book pays particular attention to an important novel CI technology: fuzzy logic (FL) systems and fuzzy neural networks (FNN). Different FNNs, including a new class of FNN, cascade neo-fuzzy neural networks, are considered, and their training algorithms are described and analyzed. The applications of FNN to forecasting in macroeconomics and at stock markets are examined. The book presents the problem of portfolio optimization under uncertainty, a novel theory of fuzzy portfolio optimization free of the drawbacks of the classical Markowitz model, as well as an application to portfolio optimization at Ukrainian, Russian and American stock exchanges. The book also presents the problem of forecasting corporate bankruptcy risk under incomplete and fuzzy information, as well as new methods based on fuzzy sets theory and fuzzy neural networks and results of their application for bankruptcy risk…

  16. A machine-learning approach for computation of fractional flow reserve from coronary computed tomography.

    Science.gov (United States)

    Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin

    2016-07-01

    Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P < 0.001). Against invasively measured FFR, functionally significant lesions were detected by the machine-learning algorithm with a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation with invasive measurements was 0.729 (P < 0.001) for the machine-learning model, run on a workstation with a 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
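
    The surrogate idea in this abstract, training a fast statistical model to reproduce a costly physics-based computation, can be sketched in a few lines. Everything below (the three geometric features and the toy "physics" function) is an invented placeholder, not the paper's model.

```python
# Train a regressor to mimic an expensive physics-based FFR computation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 5000
# hypothetical per-lesion features: stenosis degree, lesion length, radius
X = rng.uniform([0.1, 1.0, 1.0], [0.9, 30.0, 4.0], size=(n, 3))

def physics_ffr(x):                 # stand-in for the physics-based solver
    stenosis, length, radius = x.T
    return 1.0 - 0.8 * stenosis**2 * np.log1p(length) / radius

y = physics_ffr(X)                                   # "ground truth" targets
model = GradientBoostingRegressor().fit(X[:4000], y[:4000])
pred = model.predict(X[4000:])
print("correlation with physics model:", np.corrcoef(pred, y[4000:])[0, 1])
```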

  17. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
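
    A toy version of the correlation-matrix anomaly detection mentioned above, assuming synthetic traffic features and an illustrative threshold:

```python
# Compare a traffic window's correlation matrix against a clean baseline.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.normal(size=(500, 4))                    # 4 traffic features
C_base = np.corrcoef(baseline, rowvar=False)

window = rng.normal(size=(100, 4))
window[:, 0] = 5 * window[:, 1] + rng.normal(size=100)  # injected anomaly
C_win = np.corrcoef(window, rowvar=False)

score = np.linalg.norm(C_win - C_base)                  # deviation score
print("anomalous" if score > 1.0 else "normal", f"(score={score:.2f})")
```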

  18. A computational psychiatry approach identifies how alpha-2A noradrenergic agonist Guanfacine affects feature-based reinforcement learning in the macaque

    Science.gov (United States)

    Hassani, S. A.; Oemisch, M.; Balcarras, M.; Westendorff, S.; Ardid, S.; van der Meer, M. A.; Tiesinga, P.; Womelsdorf, T.

    2017-01-01

    Noradrenaline is believed to support cognitive flexibility through the alpha 2A noradrenergic receptor (a2A-NAR) acting in prefrontal cortex. Enhanced flexibility has been inferred from improved working memory with the a2A-NA agonist Guanfacine. But it has been unclear whether Guanfacine improves specific attention and learning mechanisms beyond working memory, and whether the drug effects can be formalized computationally to allow single-subject predictions. We tested and confirmed these suggestions in a case study with a healthy nonhuman primate performing a feature-based reversal learning task, evaluating performance using Bayesian and reinforcement learning models. In an initial dose-testing phase we found a Guanfacine dose that increased performance accuracy, decreased distractibility and improved learning. In a second experimental phase using only that dose, we examined the faster feature-based reversal learning with Guanfacine using single-subject computational modeling. Parameter estimation suggested that improved learning is not accounted for by varying a single reinforcement learning mechanism, but by changing the set of parameter values to higher learning rates and stronger suppression of non-chosen over chosen feature information. These findings provide an important starting point for developing nonhuman primate models to discern the synaptic mechanisms of attention and learning functions within the context of a computational neuropsychiatry framework. PMID:28091572

  19. A computational psychiatry approach identifies how alpha-2A noradrenergic agonist Guanfacine affects feature-based reinforcement learning in the macaque.

    Science.gov (United States)

    Hassani, S A; Oemisch, M; Balcarras, M; Westendorff, S; Ardid, S; van der Meer, M A; Tiesinga, P; Womelsdorf, T

    2017-01-16

    Noradrenaline is believed to support cognitive flexibility through the alpha 2A noradrenergic receptor (a2A-NAR) acting in prefrontal cortex. Enhanced flexibility has been inferred from improved working memory with the a2A-NA agonist Guanfacine. But it has been unclear whether Guanfacine improves specific attention and learning mechanisms beyond working memory, and whether the drug effects can be formalized computationally to allow single-subject predictions. We tested and confirmed these suggestions in a case study with a healthy nonhuman primate performing a feature-based reversal learning task, evaluating performance using Bayesian and reinforcement learning models. In an initial dose-testing phase we found a Guanfacine dose that increased performance accuracy, decreased distractibility and improved learning. In a second experimental phase using only that dose, we examined the faster feature-based reversal learning with Guanfacine using single-subject computational modeling. Parameter estimation suggested that improved learning is not accounted for by varying a single reinforcement learning mechanism, but by changing the set of parameter values to higher learning rates and stronger suppression of non-chosen over chosen feature information. These findings provide an important starting point for developing nonhuman primate models to discern the synaptic mechanisms of attention and learning functions within the context of a computational neuropsychiatry framework.

  20. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as well as…

  1. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    Science.gov (United States)

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  2. Computation of Difference Gröbner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    Full Text Available This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials, we adapt the involutive algorithm based on Janet-like division to difference polynomial rings. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra), and we describe the main features of the package. Its applications are illustrated by the generation of finite difference approximations to linear partial differential equations and by the reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, then it constructs a Gröbner basis of the ideal.
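
    The LDA package itself works in difference algebra inside Maple; the underlying notion can still be made concrete with an ordinary (commutative) Gröbner basis computation in SymPy, a deliberately simpler stand-in:

```python
# Ordinary polynomial Groebner basis, shown only to illustrate the notion;
# SymPy does not implement the difference-algebra case handled by LDA.
from sympy import groebner, symbols

x, y = symbols("x y")
G = groebner([x**2 + y, x*y - 1], x, y, order="lex")
print(G)   # a basis of the same ideal with normalized leading terms
```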

  3. Computational Approach to Dendritic Spine Taxonomy and Shape Transition Analysis

    Science.gov (United States)

    Bokota, Grzegorz; Magnowska, Marta; Kuśmierczyk, Tomasz; Łukasik, Michał; Roszkowska, Matylda; Plewczynski, Dariusz

    2016-01-01

    The common approach in morphological analysis of dendritic spines of mammalian neuronal cells is to categorize spines into subpopulations based on whether they are stubby, mushroom, thin, or filopodia shaped. The corresponding cellular models of synaptic plasticity, long-term potentiation, and long-term depression associate the synaptic strength with either spine enlargement or spine shrinkage. Although a variety of automatic spine segmentation and feature extraction methods were developed recently, no approaches allowing for an automatic and unbiased distinction between dendritic spine subpopulations and detailed computational models of spine behavior exist. We propose an automatic and statistically based method for the unsupervised construction of spine shape taxonomy based on arbitrary features. The taxonomy is then utilized in the newly introduced computational model of behavior, which relies on transitions between shapes. Models of different populations are compared using supplied bootstrap-based statistical tests. We compared two populations of spines at two time points. The first population was stimulated with long-term potentiation, and the other in the resting state was used as a control. The comparison of shape transition characteristics allowed us to identify the differences between population behaviors. Although some extreme changes were observed in the stimulated population, statistically significant differences were found only when whole models were compared. The source code of our software is freely available for non-commercial use. Contact: d.plewczynski@cent.uw.edu.pl. PMID:28066226
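
    The two core steps, building an unsupervised shape taxonomy and estimating transitions between shapes, can be caricatured as follows; the features are random stand-ins and KMeans is used merely as a generic clustering choice, not the paper's statistical procedure.

```python
# Cluster spine-shape features, then estimate a shape-transition matrix.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
t0 = rng.normal(size=(300, 5))             # shape features at time 0
t1 = t0 + 0.3 * rng.normal(size=(300, 5))  # the same spines, later

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(np.vstack([t0, t1]))
c0, c1 = km.predict(t0), km.predict(t1)

T = np.zeros((4, 4))
for a, b in zip(c0, c1):                   # count shape-class transitions
    T[a, b] += 1
T /= np.maximum(T.sum(axis=1, keepdims=True), 1)   # row-stochastic matrix
print(np.round(T, 2))
```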

  4. Computational algebraic topology-based video restoration

    Science.gov (United States)

    Rochel, Alban; Ziou, Djemel; Auclair-Fortier, Marie-Flavie

    2005-03-01

    This paper presents a scheme for video denoising by diffusion of gray levels, based on the Computational Algebraic Topology (CAT) image model. The diffusion approach is similar to the one used to denoise static images. Rather than using the heat transfer partial differential equation, discretizing it and solving it by a purely mathematical process, the CAT approach considers the global expression of the heat transfer and decomposes it into elementary physical laws. Some of these laws describe conservative relations, leading to error-free expressions, whereas others depend on metric quantities and require approximation. This scheme allows for a physical interpretation of each step of the resolution process. We propose nonlinear and anisotropic diffusion algorithms based on the extension to video of an existing 2D algorithm, made possible by the flexibility of the topological support. Finally, the scheme is validated with experimental results.
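
    For readers unfamiliar with diffusion-based denoising, the computation being reorganized by the CAT decomposition looks roughly like this explicit heat-equation iteration (a plain linear version with periodic boundaries, not the paper's nonlinear or anisotropic schemes):

```python
# Explicit heat-diffusion denoising of one frame via a discrete Laplacian.
import numpy as np

rng = np.random.default_rng(4)
frame = np.zeros((64, 64)); frame[16:48, 16:48] = 1.0   # clean test frame
noisy = frame + 0.2 * rng.normal(size=frame.shape)

u = noisy.copy()
for _ in range(20):                        # explicit time steps, dt = 0.2
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    u += 0.2 * lap                         # stable for dt <= 0.25 in 2D

print("noise std before/after:", (noisy - frame).std(), (u - frame).std())
```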

  5. A computational language approach to modeling prose recall in schizophrenia.

    Science.gov (United States)

    Rosenstein, Mark; Diaz-Asper, Catherine; Foltz, Peter W; Elvevåg, Brita

    2014-06-01

    Many cortical disorders are associated with memory problems. In schizophrenia, verbal memory deficits are a hallmark feature. However, the exact nature of this deficit remains elusive. Modeling aspects of language features used in memory recall has the potential to provide means for measuring these verbal processes. We employ computational language approaches to assess time-varying semantic and sequential properties of prose recall at various retrieval intervals (immediate, 30 min and 24 h later) in patients with schizophrenia, unaffected siblings and healthy unrelated control participants. First, we model the recall data to quantify the degradation of performance with increasing retrieval interval and the effect of diagnosis (i.e., group membership) on performance. Next we model the human scoring of recall performance using an n-gram language sequence technique, and then with a semantic feature based on Latent Semantic Analysis. These models show that automated analyses of the recalls can produce scores that accurately mimic human scoring. The final analysis addresses the validity of this approach by ascertaining the ability to predict group membership from models built on the two classes of language features. Taken individually, the semantic feature is most predictive, while a model combining the features improves accuracy of group membership prediction slightly above the semantic feature alone as well as over the human rating approach. We discuss the implications for cognitive neuroscience of such a computational approach in exploring the mechanisms of prose recall.
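
    The LSA-based scoring idea can be sketched as follows, with a toy corpus standing in for the training corpus and cosine similarity in the latent space serving as the recall score:

```python
# LSA-style semantic similarity between a story and a recall attempt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = ["the boy went to the store", "she walked home in the rain",
          "the store was closed on sunday", "he bought bread and milk"]
story  = "the boy went to the store and bought bread"
recall = "a boy bought some bread at the store"

vec = TfidfVectorizer().fit(corpus + [story, recall])
svd = TruncatedSVD(n_components=2).fit(vec.transform(corpus))
s, r = svd.transform(vec.transform([story, recall]))   # latent-space vectors
print("semantic recall score:", cosine_similarity([s], [r])[0, 0])
```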

  6. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
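
    A heavily simplified stand-in for the pipeline above: a cheap surrogate replaces the forward model inside a plain Metropolis sampler (the paper uses sparse-grid interpolation, a weighted likelihood, and the DRAM sampler; the scalar "damage parameter" here is purely illustrative).

```python
# Metropolis sampling of a damage parameter with a surrogate forward model.
import numpy as np

rng = np.random.default_rng(5)
data = 2.0 + 0.1 * rng.normal(size=20)       # noisy "strain" readings

def surrogate(theta):                        # cheap forward-model stand-in
    return theta

def log_post(theta):                         # flat prior + Gaussian noise
    return -0.5 * np.sum((data - surrogate(theta))**2) / 0.1**2

theta, samples = 0.0, []
for _ in range(5000):                        # random-walk Metropolis
    prop = theta + 0.1 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
print("posterior mean of damage parameter:", np.mean(samples[1000:]))
```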

  7. Aluminium in Biological Environments: A Computational Approach

    Science.gov (United States)

    Mujika, Jon I; Rezabal, Elixabete; Mercero, Jose M; Ruipérez, Fernando; Costa, Dominique; Ugalde, Jesus M; Lopez, Xabier

    2014-01-01

    The increased availability of aluminium in biological environments, due to human intervention in the last century, raises concerns on the effects that this so far “excluded from biology” metal might have on living organisms. Consequently, the bioinorganic chemistry of aluminium has emerged as a very active field of research. This review will focus on our contributions to this field, based on computational studies that can yield an understanding of the aluminium biochemistry at a molecular level. Aluminium can interact and be stabilized in biological environments by complexing with both low molecular mass chelants and high molecular mass peptides. The speciation of the metal is, nonetheless, dictated by the hydrolytic species dominant in each case and which vary according to the pH condition of the medium. In blood, citrate and serum transferrin are identified as the main low molecular mass and high molecular mass molecules interacting with aluminium. The complexation of aluminium to citrate and the subsequent changes exerted on the deprotonation pathways of its titratable groups will be discussed along with the mechanisms for the intake and release of aluminium in serum transferrin at two pH conditions, physiological neutral and endosomatic acidic. Aluminium can substitute other metals, in particular magnesium, in protein buried sites and trigger conformational disorder and alteration of the protonation states of the protein's side chains. A detailed account of the interaction of aluminium with protein side chains will be given. Finally, it will be described how aluminium can exert oxidative stress by stabilizing superoxide radicals either as mononuclear aluminium or clustered in boehmite. The possibility of promotion of the Fenton reaction, and the production of hydroxyl radicals, will also be discussed. PMID:24757505

  8. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  9. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
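
    The workhorse step translates directly into a few lines of OpenCV; the synthetic frame and all parameter values below are illustrative, with survey cutouts replacing the fake image in practice:

```python
# Circle Hough transform on a synthetic "ring" image with OpenCV.
import numpy as np
import cv2

img = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(img, (100, 100), 40, 255, 2)        # a fake Einstein ring
img = cv2.GaussianBlur(img, (5, 5), 0)

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=50, param2=30, minRadius=10, maxRadius=80)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"ring candidate at ({x}, {y}), radius {r}")
```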

  10. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
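
    The parallelization pattern, dynamic assignment of independent tracks to workers plus a vectorized inner loop, can be mimicked in miniature; the "transport sweep" below is a made-up placeholder for the actual MOC kernel.

```python
# Task-based parallelism over independent tracks with a vectorized inner loop.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(6)
tracks = [rng.random(int(rng.integers(100, 1000))) for _ in range(64)]

def sweep(segments):
    # vectorized attenuation along one characteristic track (toy physics)
    return np.sum(np.exp(-np.cumsum(segments)))

with ThreadPoolExecutor(max_workers=8) as pool:
    fluxes = list(pool.map(sweep, tracks))  # tracks claimed as workers free up
print("total flux proxy:", sum(fluxes))
```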

  11. A combined experimental atomic force microscopy-based nanoindentation and computational modeling approach to unravel the key contributors to the time-dependent mechanical behavior of single cells.

    Science.gov (United States)

    Florea, Cristina; Tanska, Petri; Mononen, Mika E; Qu, Chengjuan; Lammi, Mikko J; Laasanen, Mikko S; Korhonen, Rami K

    2017-02-01

    Cellular responses to mechanical stimuli are influenced by the mechanical properties of cells and the surrounding tissue matrix. Cells exhibit viscoelastic behavior in response to an applied stress. This has been attributed to fluid flow-dependent and flow-independent mechanisms. However, the particular mechanism that controls the local time-dependent behavior of cells is unknown. Here, a combined approach of experimental AFM nanoindentation with computational modeling is proposed, taking into account complex material behavior. Three constitutive models (porohyperelastic, viscohyperelastic, poroviscohyperelastic) in tandem with optimization algorithms were employed to capture the experimental stress relaxation data of chondrocytes at 5 % strain. The poroviscohyperelastic models with and without fluid flow allowed through the cell membrane provided excellent description of the experimental time-dependent cell responses (normalized mean squared error (NMSE) of 0.003 between the model and experiments). The viscohyperelastic model without fluid could not follow the entire experimental data that well (NMSE = 0.005), while the porohyperelastic model could not capture it at all (NMSE = 0.383). We also show by parametric analysis that the fluid flow has a small, but essential effect on the loading phase and short-term cell relaxation response, while the solid viscoelasticity controls the longer-term responses. We suggest that the local time-dependent cell mechanical response is determined by the combined effects of intrinsic viscoelasticity of the cytoskeleton and fluid flow redistribution in the cells, although the contribution of fluid flow is smaller when using a nanosized probe and moderate indentation rate. The present approach provides new insights into viscoelastic responses of chondrocytes, important for further understanding cell mechanobiological mechanisms in health and disease.
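
    The fitting step behind those NMSE values can be sketched generically: minimize the normalized mean squared error between a measured relaxation curve and a parametric model. The single-exponential model below is a deliberately crude placeholder for the porohyperelastic and viscohyperelastic constitutive models.

```python
# Fit a toy stress-relaxation model by minimizing the NMSE.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
t = np.linspace(0, 10, 100)
measured = 0.4 + 0.6 * np.exp(-t / 2.0) + 0.01 * rng.normal(size=100)

def model(p, t):                  # p = (equilibrium, amplitude, tau)
    return p[0] + p[1] * np.exp(-t / p[2])

def nmse(p):
    r = measured - model(p, t)
    return np.mean(r**2) / np.mean(measured**2)

fit = minimize(nmse, x0=[0.5, 0.5, 1.0], method="Nelder-Mead")
print("fitted parameters:", fit.x, " NMSE:", fit.fun)
```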

  12. An Approach for Location privacy in Pervasive Computing Environment

    Directory of Open Access Journals (Sweden)

    Sudheer Kumar Singh

    2010-05-01

    Full Text Available This paper focuses on location privacy in location-based services. Location privacy is a particular type of information privacy that can be defined as the ability to prevent others from learning one's current or past location. Many systems, such as GPS, implicitly and automatically give their users location privacy. Once a user sends his or her current location to the application server, the application server stores the current locations of users in its database. The user cannot delete or modify his or her location data after it has been sent to the application server. Addressing this problem, in this paper we give a theoretical concept for protecting location privacy in a pervasive computing environment. The approach is based on user-anonymity-based location privacy. We first go through a basic user-anonymity location-privacy approach that uses a trusted proxy. Through analysis of this approach, we propose an improvement over it using dummy locations of users and also dummies of the services requested by users from the application server. The improvement reduces the user's overhead in extracting the necessary information from the reply message coming from the application server. In this approach, the user sends a message containing (current location, ID, and requested service) to the trusted proxy, and the trusted proxy generates dummy locations related to the current location and also generates a temporary pseudonym corresponding to the real ID of the user. After analysis of this approach, we found a problem with the requested service. Addressing this problem, we improve the method by using dummies of the requested service generated by the trusted proxy; the trusted proxy generates dummies (false positions) by dummy-location algorithms.
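
    A toy version of the dummy-location idea, assuming the proxy simply jitters coordinates and issues a fresh pseudonym per request (all names and parameters invented for illustration):

```python
# Proxy-side request builder: hide the real location among k-1 dummies.
import random
import uuid

def make_request(real_loc, service, k=4, jitter=0.01):
    dummies = [(real_loc[0] + random.uniform(-jitter, jitter),
                real_loc[1] + random.uniform(-jitter, jitter))
               for _ in range(k - 1)]
    locations = dummies + [real_loc]
    random.shuffle(locations)              # server cannot tell which is real
    return {"pseudonym": uuid.uuid4().hex, # temporary ID issued by the proxy
            "locations": locations,
            "service": service}

print(make_request((52.37, 4.89), "nearest-pharmacy"))
```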

  13. A programmable approach to revising knowledge bases

    Institute of Scientific and Technical Information of China (English)

    LUAN Shangmin; DAI Guozhong; LI Wei

    2005-01-01

    This paper presents a programmable approach to revising knowledge bases consisting of clauses. Some theorems and lemmas are shown in order to give procedures for generating maximally consistent subsets. Then a complete procedure and an incomplete procedure for generating the maximal consistent subsets are presented, and the correctness of the procedures is also shown. Furthermore, a way to implement knowledge base revision is presented, and a prototype system is introduced. Compared with related works, the main characteristic of our approach is that it can be implemented by a computer program.
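
    The central object, a maximal consistent subset of a clause base, can be computed by brute force on small examples; this sketch uses SymPy's SAT interface rather than the procedures given in the paper:

```python
# Enumerate maximal consistent subsets of a small, inconsistent clause base.
from itertools import combinations
from sympy import symbols
from sympy.logic.boolalg import And, Or, Not
from sympy.logic.inference import satisfiable

p, q, r = symbols("p q r")
base = [Or(p, q), Not(p), Or(Not(q), r), Not(r)]     # jointly inconsistent

maximal = []
for k in range(len(base), 0, -1):        # try the largest subsets first
    for subset in combinations(base, k):
        if satisfiable(And(*subset)) and not any(
                set(subset) <= set(m) for m in maximal):
            maximal.append(subset)
print(maximal)
```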

  14. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
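
    The detector idea, a cheap learned model of what each stencil point "should" be, with residual outliers flagged as faults, can be caricatured in one dimension; the model form and threshold are illustrative choices, not SORREL's.

```python
# Linear-regression error detector for a 1D stencil computation.
import numpy as np

rng = np.random.default_rng(8)
u = np.cumsum(rng.normal(size=1000)) / 10      # smooth 1D field
X = np.stack([u[:-2], u[2:]], axis=1)          # left/right neighbors
y = u[1:-1]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # train the cheap model
resid = np.abs(y - X @ coef)
threshold = 5 * resid.std()                    # trades detection vs. false alarms

y_faulty = y.copy()
y_faulty[500] += 1.0                           # inject a bit-flip-like error
flags = np.abs(y_faulty - X @ coef) > threshold
print("flagged cells:", np.flatnonzero(flags))
```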

  15. Uncertainty in biology: a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows one to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: (1) model establishment under uncertainty; (2) model selection and parameter fitting; (3) sensitivity analysis and model adaptation; and (4) model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude…

  16. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation of how different spreadsheet systems handle certain computational issues implied by moving cells, copy-paste operations, or recursion.

  17. Heterogeneous Computing in Economics: A Simplified Approach

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.; Grassi, Stefano

    This paper shows the potential of heterogeneous computing in solving dynamic equilibrium models in economics. We illustrate the power and simplicity of the C++ Accelerated Massive Parallelism recently introduced by Microsoft. Starting from the same exercise as Aldrich et al. (2011) we document a ...

  18. Molecular electromagnetism: a computational chemistry approach

    CERN Document Server

    Sauer, Stephan P A

    2011-01-01

    A textbook for a one-semester course for students in chemistry, physics and nanotechnology, this book examines the interaction of molecules with electric and magnetic fields, as, for example, in light. The book provides the necessary background knowledge for simulating these interactions on computers with modern quantum chemical software.

  19. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    Science.gov (United States)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed using desktops, laptops or computing clusters to effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 'Applied Cyberinfrastructure Concepts', a project-based learning course (ISTA 420/520) at the University of Arizona, focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m-2 yr-1) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course comprised 25 students with varying levels of computational skills and no prior domain background in the geosciences, who collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work Queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over…

  20. Toward a rationale for the PTC124 (Ataluren) promoted readthrough of premature stop codons: a computational approach and GFP-reporter cell-based assay.

    Science.gov (United States)

    Lentini, Laura; Melfi, Raffaella; Di Leonardo, Aldo; Spinello, Angelo; Barone, Giampaolo; Pace, Andrea; Palumbo Piccionello, Antonio; Pibiri, Ivana

    2014-03-03

    The presence in the mRNA of premature stop codons (PTCs) results in protein truncation responsible for several inherited (genetic) diseases. A well-known example of these diseases is cystic fibrosis (CF), where approximately 10% (worldwide) of patients have nonsense mutations in the CF transmembrane regulator (CFTR) gene. PTC124 (3-(5-(2-fluorophenyl)-1,2,4-oxadiazol-3-yl)-benzoic acid), also known as Ataluren, is a small molecule that has been suggested to allow PTC readthrough even though its target has yet to be identified. In the absence of a general consensus about its mechanism of action, we experimentally tested the ability of PTC124 to promote the readthrough of premature termination codons by using a new reporter. The reporter vector was based on a plasmid harboring the H2B histone coding sequence fused in frame with the green fluorescent protein (GFP) cDNA, and a TGA stop codon was introduced in the H2B-GFP gene by site-directed mutagenesis. Additionally, an unprecedented computational study on the putative supramolecular interaction between PTC124 and an 11-codon (33-nucleotide) sequence corresponding to a CFTR mRNA fragment containing a central UGA nonsense mutation showed a specific interaction between PTC124 and the UGA codon. Altogether, the H2B-GFP-opal based assay and the molecular dynamics (MD) simulation support the hypothesis that PTC124 is able to promote the specific readthrough of internal TGA premature stop codons.

  1. Fighting obesity with a sugar-based library: discovery of novel MCH-1R antagonists by a new computational-VAST approach for exploration of GPCR binding sites.

    Science.gov (United States)

    Heifetz, Alexander; Barker, Oliver; Verquin, Geraldine; Wimmer, Norbert; Meutermans, Wim; Pal, Sandeep; Law, Richard J; Whittaker, Mark

    2013-05-24

    Obesity is an increasingly common disease. While antagonism of the melanin-concentrating hormone-1 receptor (MCH-1R) has been widely reported as a promising therapeutic avenue for obesity treatment, no MCH-1R antagonists have reached the market. Discovery and optimization of new chemical matter targeting MCH-1R is hindered by reduced HTS success rates and a lack of structural information about the MCH-1R binding site. X-ray crystallography and NMR, the major experimental sources of structural information, are very slow processes for membrane proteins and are not currently feasible for every GPCR or GPCR-ligand complex. This situation significantly limits the ability of these methods to impact the drug discovery process for GPCR targets in "real-time", and hence, there is an urgent need for other practical and cost-efficient alternatives. We present here a conceptually pioneering approach that integrates GPCR modeling with design, synthesis, and screening of a diverse library of sugar-based compounds from the VAST technology (versatile assembly on stable templates) to provide structural insights on the MCH-1R binding site. This approach creates a cost-efficient new avenue for structure-based drug discovery (SBDD) against GPCR targets. In our work, a primary VAST hit was used to construct a high-quality MCH-1R model. Following model validation, a structure-based virtual screen yielded a 14% hit rate and 10 novel chemotypes of potent MCH-1R antagonists, including EOAI3367472 (IC50 = 131 nM) and EOAI3367474 (IC50 = 213 nM).

  2. Computational approaches to understand cardiac electrophysiology and arrhythmias

    Science.gov (United States)

    Roberts, Byron N.; Yang, Pei-Chi; Behrens, Steven B.; Moreno, Jonathan D.

    2012-01-01

    Cardiac rhythms arise from electrical activity generated by precisely timed opening and closing of ion channels in individual cardiac myocytes. These impulses spread throughout the cardiac muscle to manifest as electrical waves in the whole heart. Regularity of electrical waves is critically important since they signal the heart muscle to contract, driving the primary function of the heart to act as a pump and deliver blood to the brain and vital organs. When electrical activity goes awry during a cardiac arrhythmia, the pump does not function, the brain does not receive oxygenated blood, and death ensues. For more than 50 years, mathematically based models of cardiac electrical activity have been used to improve understanding of basic mechanisms of normal and abnormal cardiac electrical function. Computer-based modeling approaches to understand cardiac activity are uniquely helpful because they allow for distillation of complex emergent behaviors into the key contributing components underlying them. Here we review the latest advances and novel concepts in the field as they relate to understanding the complex interplay between electrical, mechanical, structural, and genetic mechanisms during arrhythmia development at the level of ion channels, cells, and tissues. We also discuss the latest computational approaches to guiding arrhythmia therapy. PMID:22886409
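
    As a minimal taste of the modeling tradition reviewed here, the FitzHugh-Nagumo equations, a two-variable reduction of excitable-membrane dynamics, can be integrated in a few lines (parameters are textbook values, not tied to any specific model in the review):

```python
# Integrate the FitzHugh-Nagumo model of an excitable cardiac-like cell.
import numpy as np
from scipy.integrate import solve_ivp

def fhn(t, y, I=0.5, a=0.7, b=0.8, tau=12.5):
    v, w = y                                  # v: fast voltage, w: recovery
    return [v - v**3 / 3 - w + I, (v + a - b * w) / tau]

sol = solve_ivp(fhn, (0, 200), [-1.0, 1.0], max_step=0.5)
print("membrane potential range:", sol.y[0].min(), sol.y[0].max())
```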

  3. Computational anatomy based on whole body imaging: basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with …

  4. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  5. PRINCEPS: A Computer-Based Approach to the Structural Description and Recognition of Trends within Structural Databases, and Its Application to the Ce-Ni-Si System

    Directory of Open Access Journals (Sweden)

    Yiming Guo

    2016-04-01

    Full Text Available Intermetallic crystal structures offer an enormous structural diversity, with an endless array of structural motifs whose connection to stability and physical properties is often mysterious. Making sense of the often complex crystal structures that arise here, developing a clear structural description, and identifying connections to other phases can be laborious and require an encyclopedic knowledge of structure types. In this Article, we present PRINCEPS, an algorithm based on a new coordination environment projection scheme that facilitates the structural analysis and comparison of such crystal structures. We demonstrate the potential of this approach by applying it to the complex Ce-Ni-Si ternary system, whose 17 binary and 21 ternary phases would present a daunting challenge to one seeking to understand the system by manual inspection (but which has nonetheless been well-described through the heroic efforts of previous researchers). With the help of PRINCEPS, most of the ternary phases in this system can be rationalized as intergrowths of simple structural fragments and grouped into a handful of structural series (with some outliers). These results illustrate how the PRINCEPS approach can be used to organize a vast collection of crystal structures into structurally meaningful families, and guide the description of complex atomic arrangements.

  6. Soft computing approach to pattern classification and object recognition: a unified concept

    CERN Document Server

    Ray, Kumar S

    2012-01-01

    Soft Computing Approach to Pattern Classification and Object Recognition establishes an innovative, unified approach to supervised pattern classification and model-based occluded object recognition. The book also surveys various soft computing tools, such as fuzzy relational calculus (FRC), genetic algorithms (GA) and multilayer perceptrons (MLP), to provide a strong foundation for the reader. The supervised approach to pattern classification and the model-based approach to occluded object recognition are treated in one framework, one based on either a conventional interpretation or a new interpretation of…

  7. Unified QSAR & network-based computational chemistry approach to antimicrobials. II. Multiple distance and triadic census analysis of antiparasitic drugs complex networks.

    Science.gov (United States)

    Prado-Prado, Francisco J; Ubeira, Florencio M; Borges, Fernanda; González-Díaz, Humberto

    2010-01-15

    In previous work, we reported a multitarget Quantitative Structure-Activity Relationship (mt-QSAR) model to predict drug activity against different fungal species. This mt-QSAR allowed us to construct a drug-drug multispecies Complex Network (msCN) to investigate drug-drug similarity (González-Díaz and Prado-Prado, J Comput Chem 2008, 29, 656). However, important methodological points remained unclear, such as the following: (1) the accuracy of the methods when applied to other problems; (2) the effect of the distance type used to construct the msCN; (3) how to perform the inverse procedure to study species-species similarity with multidrug resistance CNs (mdrCN); and (4) the implications and necessary steps to perform a substructural Triadic Census Analysis (TCA) of the msCN. To continue the present series with another important problem, we developed here an mt-QSAR model for more than 700 drugs tested in the literature against different parasites (predicting antiparasitic drugs). The data were processed by Linear Discriminant Analysis (LDA), and the model correctly classifies 93.62% (1160 out of 1239 cases) in training. The model validation was carried out by means of external prediction series; the model classified 573 out of 607, that is, 94.4% of cases. Next, we carried out the first comparative study of the topology of six different drug-drug msCNs based on six different distances, such as Euclidean, Chebyshev, Manhattan, etc. Furthermore, we compared the selected drug-drug msCN and species-species mdsCN with random networks. We also introduced here the inverse methodology to construct species-species msCN based on a mt-QSAR model. Last, we reported the first substructural analysis of drug-drug msCN using the Triadic Census Analysis (TCA) algorithm. Copyright 2009 Wiley Periodicals, Inc.
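
    Two of the steps above, thresholding a distance matrix into a drug-drug network and running a triadic census, are easy to sketch with random stand-in descriptors (NetworkX's census requires a directed graph, and the 10th-percentile threshold is arbitrary):

```python
# Build a similarity network from a distance matrix, then count triads.
import numpy as np
import networkx as nx
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(9)
features = rng.normal(size=(30, 8))            # 30 "drugs", 8 descriptors
D = squareform(pdist(features, metric="euclidean"))

G = nx.DiGraph()
G.add_nodes_from(range(30))
thr = np.percentile(D[D > 0], 10)              # connect the closest pairs
for i in range(30):
    for j in range(30):
        if i != j and D[i, j] < thr:
            G.add_edge(i, j)

print(nx.triadic_census(G))                    # counts of the 16 triad types
```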

  8. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  9. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    An important research topic in artificial intelligence is automatic sensing and inferencing of contextual information, which is used to build computer models of the user’s activity. One approach to build such activity-aware systems is the notion of activity-based computing (ABC). ABC is a computing paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity context spanning heterogeneous devices, multiple applications, services, and information sources. In this article, we present ABC as an approach to contextualize information, and present our research into designing activity-centric computing technologies.

  10. Aesthetic Approaches to Human-Computer Interaction

    DEFF Research Database (Denmark)

    This volume consists of revised papers from the First International Workshop on Activity Theory Based Practical Methods for IT Design. The workshop took place in Copenhagen, Denmark, September 2-3, 2004. The particular focus of the workshop was the development of methods based on activity theory for practical development of IT-based systems.

  11. Acoustic gravity waves: A computational approach

    Science.gov (United States)

    Hariharan, S. I.; Dutt, P. K.

    1987-01-01

    This paper discusses numerical solutions of a hyperbolic initial boundary value problem that arises from acoustic wave propagation in the atmosphere. Field equations are derived from the atmospheric fluid flow governed by the Euler equations. The resulting original problem is nonlinear. A first-order linearized version of the problem is used for computational purposes. The main difficulty in the problem, as with any open boundary problem, is in obtaining stable boundary conditions. Approximate boundary conditions are derived and shown to be stable. Numerical results are presented to verify the effectiveness of these boundary conditions.

  12. Noise-based communication and computing

    CERN Document Server

    Kish, Laszlo B

    2008-01-01

    We discuss the speed-error-heat triangle and related problems with rapidly increasing energy dissipation and error rate during miniaturization. These, and the independently growing need for unconditional data security, have provoked non-conventional approaches in the physics of informatics. Noise-based informatics is a potentially promising possibility, and it is the way biological brains process information. Recently, it has been shown that thermal noise and its electronically enhanced versions (Johnson-like noises) can be utilized as an information carrier with peculiar properties. Relevant examples are zero-power (stealth) communication, unconditionally secure communication with Johnson(-like) noise and a Kirchhoff loop, and noise-driven computing. The zero-power communication utilizes the equilibrium background noise in the channel to transfer information. The unconditionally secure communication is based on the properties of Johnson(-like) noise and those of a simple Kirchhoff's loop. The scheme utilizes on…

  13. A complex network approach to cloud computing

    CERN Document Server

    Travieso, Gonzalo; Bruno, Odemir Martinez; Costa, Luciano da Fontoura

    2015-01-01

    Cloud computing has become an important means to speed up computing. One problem that heavily influences the performance of such systems is the choice of nodes as servers responsible for executing the users' tasks. In this article we report how complex networks can be used to model such a problem. More specifically, we investigate the performance of processing in cloud systems underlain by Erdos-Renyi (ER) and Barabasi-Albert (BA) topologies containing two servers. Cloud networks involving two communities, not necessarily of the same size, are also considered in our analysis. The performance of each configuration is quantified in terms of two indices: the cost of communication between the user and the nearest server, and the balance of the distribution of tasks between the two servers. Regarding the latter index, the ER topology provides better performance than the BA case for smaller average degrees, and the opposite behavior occurs for larger average degrees. With respect to the cost, smaller values are found in the BA …
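
    The two indices can be reproduced in miniature with NetworkX; the graph sizes, edge probability, and choice of servers below are invented for illustration:

```python
# Compare ER and BA topologies on hop cost to the nearest of two servers
# and on the balance of users assigned to each server.
import networkx as nx

def evaluate(G):
    G = G.subgraph(max(nx.connected_components(G), key=len))
    nodes = list(G.nodes)
    servers = nodes[:2]                        # two nodes act as servers
    cost, load = 0, {s: 0 for s in servers}
    for u in G.nodes:
        d, s = min((nx.shortest_path_length(G, u, s), s) for s in servers)
        cost += d
        load[s] += 1
    return cost / len(nodes), min(load.values()) / max(load.values())

er = nx.erdos_renyi_graph(200, 0.03, seed=7)
ba = nx.barabasi_albert_graph(200, 3, seed=7)
print("ER (mean cost, balance):", evaluate(er))
print("BA (mean cost, balance):", evaluate(ba))
```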

  16. A new approach in CHP steam turbines thermodynamic cycles computations

    Directory of Open Access Journals (Sweden)

    Grković Vojin R.

    2012-01-01

    This paper presents a new approach to the mathematical modeling of thermodynamic cycles and electric power of utility district-heating and cogeneration steam turbines. The approach is based on the application of dimensionless mass flows, which describe the thermodynamic cycle of a combined heat and power steam turbine. The mass flows are calculated relative to the mass flow to the low pressure turbine. The procedure introduces the extraction mass flow load parameter νh, which clearly indicates the energy transformation process and the cogeneration turbine design features, as well as the turbine's fitness for the requirements of the electrical energy system. The presented approach allows fast computations as well as direct calculation of the selected energy efficiency indicators. The approach is exemplified with calculated results for the district heat power to electric power ratio, as well as the cycle efficiency, versus νh. The influence of νh on the conformity of a combined heat and power turbine to the grid requirements is also analyzed and discussed. [Project of the Ministry of Science of the Republic of Serbia, no. 33049: Development of CHP demonstration plant with gasification of biomass]

  17. Progress in silicon-based quantum computing.

    Science.gov (United States)

    Clark, R G; Brenner, R; Buehler, T M; Chan, V; Curson, N J; Dzurak, A S; Gauja, E; Goan, H S; Greentree, A D; Hallam, T; Hamilton, A R; Hollenberg, L C L; Jamieson, D N; McCallum, J C; Milburn, G J; O'Brien, J L; Oberbeck, L; Pakes, C I; Prawer, S D; Reilly, D J; Ruess, F J; Schofield, S R; Simmons, M Y; Stanley, F E; Starrett, R P; Wellard, C; Yang, C

    2003-07-15

    We review progress at the Australian Centre for Quantum Computer Technology towards the fabrication and demonstration of spin qubits and charge qubits based on phosphorus donor atoms embedded in intrinsic silicon. Fabrication is being pursued via two complementary pathways: a 'top-down' approach for near-term production of few-qubit demonstration devices and a 'bottom-up' approach for large-scale qubit arrays with sub-nanometre precision. The 'top-down' approach employs a low-energy (keV) ion beam to implant the phosphorus atoms. Single-atom control during implantation is achieved by monitoring on-chip detector electrodes, integrated within the device structure. In contrast, the 'bottom-up' approach uses scanning tunnelling microscope lithography and epitaxial silicon overgrowth to construct devices at an atomic scale. In both cases, surface electrodes control the qubit using voltage pulses, and dual single-electron transistors operating near the quantum limit provide fast read-out with spurious-signal rejection.

  18. Genetic braid optimization: A heuristic approach to compute quasiparticle braids

    Science.gov (United States)

    McDonald, Ross B.; Katzgraber, Helmut G.

    2013-02-01

    In topologically protected quantum computation, quantum gates can be carried out by adiabatically braiding two-dimensional quasiparticles, reminiscent of entangled world lines. Bonesteel et al. [Phys. Rev. Lett. 95, 140503 (2005)], as well as Leijnse and Flensberg [Phys. Rev. B 86, 104511 (2012)], recently provided schemes for computing quantum gates from quasiparticle braids. Mathematically, the problem of executing a gate becomes that of finding a product from a set of generator matrices that best approximates the gate, up to an error. To date, efficient methods to compute these gates only strive to optimize for accuracy. We explore the possibility of using a generic approach applicable to a variety of braiding problems based on evolutionary (genetic) algorithms. The method efficiently finds optimal braids while allowing the user to optimize for the relative utilities of accuracy and/or length. Furthermore, when optimizing for error only, the method can quickly produce efficient braids.
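
    A hedged miniature of the genetic-algorithm idea described above: evolve words over a fixed set of generator matrices so their product approximates a target gate, with a fitness that trades off accuracy against braid length. The 2x2 rotation generators and all GA parameters are illustrative placeholders, not actual quasiparticle braid matrices.

        import numpy as np

        rng = np.random.default_rng(0)

        def rot(t):
            return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

        gens = [rot(0.4), rot(-0.4), rot(1.1), rot(-1.1)]   # placeholder generators
        target = rot(np.pi / 3)

        def product(word):
            M = np.eye(2)
            for g in word:
                M = M @ gens[g]
            return M

        def fitness(word, w_len=0.01):
            err = np.linalg.norm(product(word) - target)
            return -(err + w_len * len(word))      # reward accuracy AND brevity

        pop = [list(rng.integers(0, 4, int(rng.integers(3, 20)))) for _ in range(60)]
        for _ in range(200):
            pop.sort(key=fitness, reverse=True)
            elite, children = pop[:20], []
            for _ in range(40):
                child = list(elite[int(rng.integers(0, 20))])
                i = int(rng.integers(0, len(child)))
                op = int(rng.integers(0, 3))
                if op == 0:
                    child[i] = int(rng.integers(0, 4))        # point mutation
                elif op == 1:
                    child.insert(i, int(rng.integers(0, 4)))  # grow the braid
                elif len(child) > 1:
                    del child[i]                              # shrink the braid
                children.append(child)
            pop = elite + children
        best = max(pop, key=fitness)
        print("braid length:", len(best),
              "gate error:", float(np.linalg.norm(product(best) - target)))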

  19. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institut, Moskau (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing a software package based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  20. An Ontology Concept-Based Cluster Partition Approach for Computing the Semantic Distance between Concepts

    Institute of Scientific and Technical Information of China (English)

    彭志平; 李晓明; 柯文德

    2011-01-01

    The computation of semantic similarity between concepts is an important component of natural language processing and related fields, and semantic-distance-based computation of concept similarity is currently the dominant technique. In this paper, an ontology-based cluster partition approach for computing the semantic distance between concepts is proposed, based on an analysis of the shortcomings of existing algorithms. Rules for computing the semantic distance between concepts are given for the situation of multiple concept clusters, and approaches for computing semantic distance both within a single cluster and across clusters are put forward. In the proposed approach, the asymmetry of semantic similarity in pairs of hyponymy-related concepts is handled by introducing a forward semantic distance and a reverse semantic distance, while other binary relationships between non-hyponymy concept pairs are dealt with by dynamically allocating relation weights according to the locations of the concept nodes. Experimental results show that the proposed approach is effective and compares favorably with other typical approaches of its kind.
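
    A hedged sketch of the asymmetric-distance idea on a tiny hand-made taxonomy (the paper's cluster-partition rules are richer): moving up to a hypernym (reverse) is weighted more heavily than moving down to a hyponym (forward), so sim(a, b) differs from sim(b, a) for is-a pairs. Concepts and weights are illustrative.

        parent = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
                  "bird": "animal", "animal": None}
        W_DOWN, W_UP = 0.8, 1.2      # illustrative forward/reverse edge weights

        def path_to_root(c):
            path = []
            while c is not None:
                path.append(c)
                c = parent[c]
            return path

        def distance(a, b):
            pa, pb = path_to_root(a), path_to_root(b)
            common = next(c for c in pa if c in pb)    # lowest common ancestor
            up = pa.index(common)                      # upward steps from a
            down = pb.index(common)                    # downward steps toward b
            return W_UP * up + W_DOWN * down

        def similarity(a, b):
            return 1.0 / (1.0 + distance(a, b))

        print(similarity("dog", "mammal"), similarity("mammal", "dog"))  # asymmetric
        print(similarity("dog", "bird"))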

  2. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  3. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2012-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao’s garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce...

  4. A New Approach to Practical Active-Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Buus; Nordholt, Peter Sebastian; Orlandi, Claudio

    2011-01-01

    We propose a new approach to practical two-party computation secure against an active adversary. All prior practical protocols were based on Yao's garbled circuits. We use an OT-based approach and get efficiency via OT extension in the random oracle model. To get a practical protocol we introduce...

  5. Biologically motivated computationally intensive approaches to image pattern recognition

    NARCIS (Netherlands)

    Petkov, Nikolay

    1995-01-01

    This paper presents some of the research activities of the research group in vision as a grand challenge problem whose solution is estimated to need the power of Tflop/s computers and for which computational methods have yet to be developed. The concerned approaches are biologically motivated, in th

  6. Soft Computing Approach for Software Cost Estimation

    OpenAIRE

    Iman Attarzadeh; Siew Hock Ow

    2010-01-01

    Software metrics and estimation are based on measuring software attributes which are typically related to the product, the process and the resources of software development. Predicting the development effort of a software system from such metrics has been one of the greatest challenges for software developers over the last decades. Project managers require the ability to give good estimates of software development effort. Most of the traditional techniques such as function points, re...

  7. Web-Based Computing Resource Agent Publishing

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Web-based computing resource publishing is an efficient way to provide additional computing capacity for users who need more computing resources than they themselves can afford, by making use of idle computing resources in the Web. Extensibility and reliability are crucial for agent publishing. The parent-child agent framework and primary-slave agent framework were proposed respectively and discussed in detail.

  8. General approaches in ensemble quantum computing

    Indian Academy of Sciences (India)

    V Vimalan; N Chandrakumar

    2008-01-01

    We have developed methodology for NMR quantum computing focusing on enhancing the efficiency of initialization, of logic gate implementation and of readout. Our general strategy involves the application of rotating frame pulse sequences to prepare pseudopure states and to perform logic operations. We demonstrate our methodology experimentally for both homonuclear and heteronuclear spin ensembles. On model two-spin systems, the initialization time of one of our sequences is three-fourths (in the heteronuclear case) or one-fourth (in the homonuclear case) of that of the typical pulsed free precession sequences, attaining the same initialization efficiency. We have implemented the logical SWAP operation in homonuclear AMX spin systems using selective isotropic mixing, reducing the duration taken to a third compared to the standard re-focused INEPT-type sequence. We introduce the 1D version for readout of the rotating frame SWAP operation, in an attempt to reduce readout time. We further demonstrate the Hadamard mode of 1D SWAP, which offers a 2N-fold reduction in experiment time for a system with N working bits, attaining the same sensitivity as the standard 1D version.

  9. Delay Computation Using Fuzzy Logic Approach

    Directory of Open Access Journals (Sweden)

    Ramasesh G. R.

    2012-10-01

    The paper presents a practical application of fuzzy sets and system theory in predicting, with reasonable accuracy, delays arising from a wide range of factors pertaining to construction projects. In this paper we use fuzzy logic to predict delays on account of delayed supplies and labor shortage. It is observed that project scheduling software uses either deterministic or probabilistic methods for computation of schedule durations, delays, lags and other parameters. In other words, these methods use only quantitative inputs, leaving out the qualitative aspects associated with individual activities of work. A qualitative aspect, viz., the expertise of the mason or the lack of experience, can have a significant impact on the assessed duration. Such qualitative aspects do not find adequate representation in project scheduling software. A realistic project is considered, for which a PERT chart has been prepared showing all the major activities in reasonable detail. This project was periodically updated until its completion. It is observed that some of the activities were delayed due to extraneous factors, resulting in the overall delay of the project. The software has the capability to calculate the overall delay through CPM (Critical Path Method) when each of the activity delays is reported. We demonstrate that by using fuzzy logic, these delays could have been predicted well in advance.
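
    A hedged Mamdani-style sketch of the idea (the membership functions, rules, and scales are illustrative, not the paper's): qualitative inputs such as supply delay and labour shortage drive fuzzy rules whose aggregated output is defuzzified into a predicted activity delay.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with corners a < b < c."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        supply, labour = 4.5, 3.0          # crisp inputs on an illustrative 0-10 scale
        y = np.linspace(0.0, 30.0, 301)    # output universe: delay in days

        # Rule 1: supply delay low AND labour shortage low -> delay small
        low = min(tri(supply, 0, 2, 5), tri(labour, 0, 2, 5))
        # Rule 2: supply delay high OR labour shortage high -> delay large
        high = max(tri(supply, 4, 8, 12), tri(labour, 2, 6, 10))

        # Clip each output set by its rule strength, aggregate with max:
        agg = np.maximum(np.minimum(low, tri(y, 0, 5, 12)),
                         np.minimum(high, tri(y, 10, 20, 30)))
        delay = (y * agg).sum() / agg.sum()          # centroid defuzzification
        print("predicted delay: %.1f days" % delay)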

  10. Linearized Aeroelastic Computations in the Frequency Domain Based on Computational Fluid Dynamics

    CERN Document Server

    Amsallem, David; Choi, Youngsoo; Farhat, Charbel

    2015-01-01

    An iterative, CFD-based approach for aeroelastic computations in the frequency domain is presented. The method relies on a linearized formulation of the aeroelastic problem and a fixed-point iteration approach and enables the computation of the eigenproperties of each of the wet aeroelastic eigenmodes. Numerical experiments on the aeroelastic analysis and design optimization of two wing configurations illustrate the capability of the method for the fast and accurate aeroelastic analysis of aircraft configurations and its advantage over classical time-domain approaches.
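
    A hedged scalar-level sketch of the fixed-point idea (the matrices are toy placeholders, not a CFD-based aerodynamic operator): the "wet" eigenvalue problem (K + A(lam)) x = lam x depends on lam through the linearized aerodynamic term A, so each aeroelastic eigenvalue can be found by iterating lam to the nearest eigenvalue of K + A(lam).

        import numpy as np

        K = np.diag([1.0, 4.0, 9.0])                   # "dry" structural modes
        def A(lam):
            # toy frequency-dependent aerodynamic coupling
            return 0.05 * np.exp(-lam / 10.0) * np.ones((3, 3))

        lam = 1.0                                      # start from a dry eigenvalue
        for it in range(1, 31):
            w = np.linalg.eigvals(K + A(lam)).real
            new = w[np.argmin(np.abs(w - lam))]        # track the nearest eigenvalue
            if abs(new - lam) < 1e-12:
                break
            lam = new
        print("wet eigenvalue %.6f after %d fixed-point iterations" % (lam, it))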

  11. An Automatic Approach to Detect Software Anomalies in Cloud Computing Using Pragmatic Bayes Approach

    Directory of Open Access Journals (Sweden)

    Nethaji V

    2014-06-01

    Software detection of anomalies is a vital element of operations in data centers and service clouds. Statistical Process Control (SPC) cloud charts sense routine anomalies, and their root causes are identified based on a differential profiling strategy. By automating these tasks, most of the manual overhead incurred in detecting software anomalies and the analysis time are reduced to a large extent, but detailed analysis of profiling data is not performed in most cases. On the other hand, the cloud scheduler weighs both the requirements of the user and the available infrastructure to match their requirements. The OpenStack prototype works on cloud trust management, which provides the scheduler, but complexity arises when hosting the cloud system. At the same time, the Trusted Computing Base (TCB) of a computing node does not achieve the scalability measure. This unique paradigm brings about many software anomalies, which have not been well studied. This work, a Pragmatic Bayes (PB) approach, studies the problem of detecting software anomalies and ensures scalability by comparing information at the current time to historical data. In particular, the PB approach uses a two-component Gaussian mixture to model deviations at the current time in the cloud environment. The introduction of the Gaussian mixture in the PB approach achieves a higher scalability measure, which involves supervising a massive number of cells, and is fast enough to be potentially useful in many streaming scenarios. Whereas previous work on scheduling often lacks scalability, this paper shows the superiority of the method using a Bayes per-section error rate procedure through simulation, and provides detailed analysis of profiling data in the marginal distributions using the Amazon EC2 dataset. Extensive performance analysis shows that the PB approach is highly efficient in terms of runtime, scalability, software anomaly detection ratio, CPU utilization, density rate, and computational
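
    A hedged sketch of the core statistical step on synthetic data: fit a two-component Gaussian mixture to deviations of current metrics from their historical profile and treat points assigned to the rare component as software anomalies. Thresholds and data are illustrative, not the paper's pipeline.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        routine = rng.normal(0.0, 1.0, (980, 1))      # routine deviations
        anomalous = rng.normal(8.0, 0.5, (20, 1))     # injected anomalies
        X = np.vstack([routine, anomalous])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
        rare = int(np.argmin(gmm.weights_))           # the low-weight component
        flags = gmm.predict(X) == rare
        print("flagged:", int(flags.sum()),
              "true anomalies caught:", int(flags[-20:].sum()))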

  12. COMPUTER BASED HEART PULSES MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ali N. Hamoodi

    2013-05-01

    In this work the measurement and display of blood oxygen saturation and pulse rate are investigated practically using a computer. The analysis involves the variation in blood oxygen saturation ratio and pulse rate. The results obtained are compared with the Kontron pulse oximeter 7840 device. The value obtained for the same person's pulse rate is approximately equal to that obtained by the Kontron pulse oximeter 7840 device. The sensor used in this work is the finger clip. The advantage of using a computer over the Kontron pulse oximeter 7840 device is that the patient's data can be saved in the computer for many years and displayed at any time, so that the doctor gets a file containing all data for each patient.
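
    A hedged sketch of the standard "ratio of ratios" SpO2 estimate and a zero-crossing pulse count on synthetic photoplethysmogram (PPG) data; the linear calibration constants are typical textbook values, not those of this paper or of the Kontron 7840.

        import numpy as np

        def spo2_and_pulse(red, ir, fs):
            """red, ir: PPG sample arrays; fs: sampling rate in Hz."""
            r = ((red.max() - red.min()) / red.mean()) / \
                ((ir.max() - ir.min()) / ir.mean())      # (AC/DC)_red / (AC/DC)_ir
            spo2 = 110.0 - 25.0 * r                      # empirical calibration
            ac = ir - ir.mean()
            beats = np.sum((ac[:-1] < 0) & (ac[1:] >= 0))  # rising zero crossings
            bpm = beats / (len(ir) / fs) * 60.0
            return spo2, bpm

        fs = 100
        t = np.arange(0, 10, 1.0 / fs)                   # 10 s of samples
        ir = 1.0 + 0.020 * np.sin(2 * np.pi * 1.2 * t - 1.0)   # 72 bpm heartbeat
        red = 1.0 + 0.012 * np.sin(2 * np.pi * 1.2 * t - 1.0)
        print("SpO2 %.1f%%, pulse %.0f bpm" % spo2_and_pulse(red, ir, fs))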

  13. The Swarm Computing Approach to Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-07-01

    We propose to use some features of swarm behaviours in modelling business processes. Due to these features we deal with a propagation of business processes in all accessible directions. This propagation is incorporated into our formalization instead of communicating sequential processes. As a result, we have constructed a business process diagram language based on swarm behaviour, and an extension of that language in the form of a reflexive management language.

  14. Mobile Cloud Computing: A Review on Smartphone Augmentation Approaches

    CERN Document Server

    Abolfazli, Saeid; Gani, Abdullah

    2012-01-01

    Smartphones have recently gained significant popularity in heavy mobile processing while users are increasing their expectations toward a rich computing experience. However, resource limitations and current mobile computing advancements hinder this vision. Therefore, resource-intensive application execution remains a challenging task in mobile computing that necessitates device augmentation. In this article, smartphone augmentation approaches are reviewed and classified into two main groups, namely hardware and software. Generating high-end hardware is a subset of hardware augmentation approaches, whereas conserving local resources and reducing resource requirements are grouped under software augmentation methods. Our study advocates that conserving smartphones' native resources, which is mainly done via task offloading, is more appropriate for already-developed applications than for new ones, due to the costly re-development process. Cloud computing has recently gained momentum as one of the major co...

  15. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
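
    A hedged toy illustration of the two indices on a greedy random search (the paper's model covers GA/ACO/PSO, and its formal definitions differ): here the variation rate is taken as the fraction of solutions replaced in a step, and the progress rate as the relative improvement of the best objective value.

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda X: (X ** 2).sum(axis=1)           # objective: minimise ||x||^2
        pop = rng.normal(0.0, 3.0, (30, 2))
        for step in range(1, 6):
            trial = pop + rng.normal(0.0, 0.5, pop.shape)
            accept = f(trial) < f(pop)               # keep improving moves only
            new = np.where(accept[:, None], trial, pop)
            variation = accept.mean()                # fraction of solutions changed
            progress = (f(pop).min() - f(new).min()) / f(pop).min()
            pop = new
            print("step %d: variation %.2f, progress %.3f"
                  % (step, variation, progress))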

  16. Local-basis-function approach to computed tomography

    Science.gov (United States)

    Hanson, K. M.; Wecksung, G. W.

    1985-12-01

    In the local basis-function approach, a reconstruction is represented as a linear expansion of basis functions, which are arranged on a rectangular grid and possess a local region of support. The basis functions considered here are positive and may overlap. It is found that basis functions based on cubic B-splines offer significant improvements in the calculational accuracy that can be achieved with iterative tomographic reconstruction algorithms. By employing repetitive basis functions, the computational effort involved in these algorithms can be minimized through the use of tabulated values for the line or strip integrals over a single basis function. The local nature of the basis functions reduces the difficulties associated with applying local constraints on reconstruction values, such as upper and lower limits. Since a reconstruction is specified everywhere by a set of coefficients, display of a coarsely represented image does not require an arbitrary choice of an interpolation function.
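
    A hedged 1D sketch of the idea (the paper works in 2D on a rectangular grid): a profile represented as a linear expansion of shifted cubic B-splines, which are positive, overlapping basis functions with local support, so each coefficient influences only nearby samples.

        import numpy as np

        def cubic_bspline(t):
            """Cubic B-spline: positive, supported on |t| < 2."""
            t = np.abs(np.asarray(t, dtype=float))
            out = np.zeros_like(t)
            near, far = t < 1, (t >= 1) & (t < 2)
            out[near] = (4 - 6 * t[near] ** 2 + 3 * t[near] ** 3) / 6
            out[far] = (2 - t[far]) ** 3 / 6
            return out

        x = np.linspace(0.0, 10.0, 501)
        knots = np.arange(-1, 12)                        # basis centres on a grid
        coeffs = np.maximum(0, 5 - np.abs(knots - 5))    # coefficients of a "tent"
        f = sum(c * cubic_bspline(x - k) for c, k in zip(coeffs, knots))
        print("smooth reconstruction at x = 5: %.3f" % f[250])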

  17. Leaching from Heterogeneous Heck Catalysts: A Computational Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The possibility of carrying out a purely heterogeneous Heck reaction in practice without Pd leaching has been considered previously by a number of research groups, but no general consensus has yet been reached. Here, the reaction was, for the first time, evaluated by a simple computational approach. Modelling experiments were performed on one of the initial catalytic steps: phenyl halide attachment at the (111)-to-(100) and (111)-to-(111) ridges of a Pd crystal. Three surface structures of the resulting [PhPdX] were identified as possible reactive intermediates. Following potential energy minimisation calculations based on a universal force field, the relative stabilities of these surface species were then determined. Results showed the most stable species to be one in which a Pd ridge atom is removed from the Pd crystal structure, suggesting that Pd leaching induced by phenyl halides is energetically favourable.

  18. Computing Accurate Grammatical Feedback in a Virtual Writing Conference for German-Speaking Elementary-School Children: An Approach Based on Natural Language Generation

    Science.gov (United States)

    Harbusch, Karin; Itsova, Gergana; Koch, Ulrich; Kuhner, Christine

    2009-01-01

    We built a natural language processing (NLP) system implementing a "virtual writing conference" for elementary-school children, with German as the target language. Currently, state-of-the-art computer support for writing tasks is restricted to multiple-choice questions or quizzes because automatic parsing of the often ambiguous and fragmentary…

  20. A novel algorithm for computer based assessment

    OpenAIRE

    2012-01-01

    Student learning outcomes have been evaluated through graded assignments and tests by most paper-based assessment systems, but computer-based assessment has the opportunity to improve the efficiency of the assessment process. The use of the internet is also made possible ...

  1. An Integrated Computer-Aided Approach for Environmental Studies

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Chen, Fei; Jaksland, Cecilia;

    1997-01-01

    A general framework for an integrated computer-aided approach to solve process design, control, and environmental problems simultaneously is presented. Physicochemical properties and their relationships to the molecular structure play an important role in the proposed integrated approach. The scope...... and applicability of the integrated approach is highlighted through examples involving estimation of properties and environmental pollution prevention. The importance of mixture effects on some environmentally important properties is also demonstrated....

  2. Distributed measurement-based quantum computation

    CERN Document Server

    Danos, V; Kashefi, E; Panangaden, P; Danos, Vincent; Hondt, Ellie D'; Kashefi, Elham; Panangaden, Prakash

    2005-01-01

    We develop a formal model for distributed measurement-based quantum computations, adopting an agent-based view, such that computations are described locally where possible. Because the network quantum state is in general entangled, we need to model it as a global structure, reminiscent of global memory in classical agent systems. Local quantum computations are described as measurement patterns. Since measurement-based quantum computation is inherently distributed, this allows us to extend naturally several concepts of the measurement calculus, a formal model for such computations. Our goal is to define an assembly language, i.e. we assume that computations are well-defined and we do not concern ourselves with verification techniques. The operational semantics for systems of agents is given by a probabilistic transition system, and we define operational equivalence in a way that it corresponds to the notion of bisimilarity. With this in place, we prove that teleportation is bisimilar to a direct quantum channe...

  3. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, and therefore functional relationships between an ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  4. Towards scalable quantum communication and computation: Novel approaches and realizations

    Science.gov (United States)

    Jiang, Liang

    Quantum information science involves exploration of fundamental laws of quantum mechanics for information processing tasks. This thesis presents several new approaches towards scalable quantum information processing. First, we consider a hybrid approach to scalable quantum computation, based on an optically connected network of few-qubit quantum registers. Specifically, we develop a novel scheme for scalable quantum computation that is robust against various imperfections. To justify that nitrogen-vacancy (NV) color centers in diamond can be a promising realization of the few-qubit quantum register, we show how to isolate a few proximal nuclear spins from the rest of the environment and use them for the quantum register. We also demonstrate experimentally that the nuclear spin coherence is only weakly perturbed under optical illumination, which allows us to implement quantum logical operations that use the nuclear spins to assist the repetitive-readout of the electronic spin. Using this technique, we demonstrate more than two-fold improvement in signal-to-noise ratio. Apart from direct application to enhance the sensitivity of the NV-based nano-magnetometer, this experiment represents an important step towards the realization of robust quantum information processors using electronic and nuclear spin qubits. We then study realizations of quantum repeaters for long distance quantum communication. Specifically, we develop an efficient scheme for quantum repeaters based on atomic ensembles. We use dynamic programming to optimize various quantum repeater protocols. In addition, we propose a new protocol of quantum repeater with encoding, which efficiently uses local resources (about 100 qubits) to identify and correct errors, to achieve fast one-way quantum communication over long distances. Finally, we explore quantum systems with topological order. Such systems can exhibit remarkable phenomena such as quasiparticles with anyonic statistics and have been proposed as

  5. Establishing performance requirements of computer based systems subject to uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D.

    1997-02-01

    An organized systems design approach is dictated by the increasing complexity of computer based systems. Computer based systems are unique in many respects but share many of the same problems that have plagued design engineers for decades. The design of complex systems is difficult at best, but as a design becomes intensively dependent on the computer processing of external and internal information, the design process quickly borders on chaos. This situation is exacerbated by the requirement that these systems operate with a minimal quantity of information, generally corrupted by noise, regarding the current state of the system. Establishing performance requirements for such systems is particularly difficult. This paper briefly sketches a general systems design approach with emphasis on the design of computer based decision processing systems subject to parameter and environmental variation. The approach will be demonstrated with application to an on-board diagnostic (OBD) system for automotive emissions systems now mandated by the state of California and the Federal Clean Air Act. The emphasis is on an approach for establishing probabilistically based performance requirements for computer based systems.
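
    A hedged sketch of what a probabilistically based requirement can look like, echoing the report's OBD example: parameter and environmental variation are propagated through a toy system model by Monte Carlo, and the requirement is stated as a probability of meeting a performance target. The model, distributions, and the 2.0 s target are all illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        sensor_noise = rng.normal(1.0, 0.15, n)       # parameter variation
        temperature = rng.uniform(-20.0, 40.0, n)     # environmental variation
        # toy model of detection latency as a function of the variations:
        latency = 0.8 + 0.6 * sensor_noise + 0.005 * np.abs(temperature)

        p_meet = np.mean(latency <= 2.0)
        print("P(detection latency <= 2.0 s) = %.3f" % p_meet)
        # A requirement might then read: P(latency <= 2 s) >= 0.99 over the
        # specified parameter and environment distributions.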

  6. Computational physical oceanography -- A comprehensive approach based on generalized CFD/grid techniques for planetary scale simulations of oceanic flows. Final report, September 1, 1995--August 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Beddhu, M.; Jiang, M.Y.; Whitfield, D.L.; Taylor, L.K.; Arabshahi, A.

    1997-02-20

    The original intention for this work was to impart the technology that was developed in the field of computational aeronautics to the field of computational physical oceanography. This technology transfer involved grid generation techniques and solution procedures to solve the governing equations over the grids thus generated. Specifically, boundary fitting non-orthogonal grids would be generated over a sphere taking into account the topography of the ocean floor and the topography of the continents. The solution methodology to be employed involved the application of an upwind, finite volume discretization procedure that uses higher order numerical fluxes at the cell faces to discretize the governing equations and an implicit Newton relaxation technique to solve the discretized equations. This report summarizes the efforts put forth during the past three years to achieve these goals and indicates the future direction of this work as it is still an ongoing effort.

  7. The Validation of Computer-based Models in Engineering: Some Lessons from Computing Science

    Directory of Open Access Journals (Sweden)

    D. J. Murray-Smith

    2001-01-01

    Questions of the quality of computer-based models and the formal processes of model testing, involving internal verification and external validation, are usually given only passing attention in engineering reports and in technical publications. However, such models frequently provide a basis for analysis methods, design calculations or real-time decision-making in complex engineering systems. This paper reviews techniques used for external validation of computer-based models and contrasts the somewhat casual approach which is usually adopted in this field with the more formal approaches to software testing and documentation recommended for large software projects. Both activities require intimate knowledge of the intended application, a systematic approach and considerable expertise and ingenuity in the design of tests. It is concluded that engineering degree courses dealing with modelling techniques and computer simulation should put more emphasis on model limitations, testing and validation.

  8. A Two Layer Approach to the Computability and Complexity of Real Functions

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2003-01-01

    We present a new model for computability and complexity of real functions, together with an implementation that is based on it. The model uses a two-layer approach in which low-type basic objects perform the computation of a real function, but, whenever needed, can be complemented with higher type...... in computable analysis, while the efficiency of the implementation is not compromised by the need to create and maintain higher-type objects....

  9. Activity-based computing for medical work in hospitals

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2009-01-01

    Studies have revealed that people organize and think of their work in terms of activities that are carried out in pursuit of some overall objective, often in collaboration with others. Nevertheless, modern computer systems are typically single-user oriented, that is, designed to support individual...... tasks such as word processing while sitting at a desk. This article presents the concept of Activity-Based Computing (ABC), which seeks to create computational support for human activities. The ABC approach has been designed to address activity-based computing support for clinical work in hospitals....... In a hospital, the challenges arising from the management of parallel activities and interruptions are amplified because multitasking is now combined with a high degree of mobility, collaboration, and urgency. The article presents the empirical and theoretical background for activity-based computing, its...

  10. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  11. Object Based Middleware for Grid Computing

    Directory of Open Access Journals (Sweden)

    S. Muruganantham

    2010-01-01

    Problem statement: "Grid" computing has emerged as an important new field, distinguished from conventional distributed computing by its focus on large-scale resource sharing, innovative applications and, in some cases, high-performance orientation. The role of middleware is to ease the task of designing, programming and managing distributed applications by providing a simple, consistent and integrated distributed programming environment. Essentially, middleware is a distributed software layer which abstracts over the complexity and heterogeneity of the underlying distributed environment, with its multitude of network technologies, machine architectures, operating systems and programming languages. Approach: This study showed that the development of supportive middleware to manage resources and distributed workloads across multiple administrative boundaries is of central importance to Grid computing. Active middleware services that perform look-up, scheduling and staging are being developed that allow users to identify and utilize appropriate resources providing sustainable system- and user-level qualities of service. Results: Different middleware platforms support different programming models. Perhaps the most popular model is object-based middleware, in which applications are structured into objects that interact via location-transparent method invocation. Conclusion: The Object Management Group's CORBA platform offers an Interface Definition Language (IDL), which is used to abstract over the fact that objects can be implemented in any suitable programming language; an object request broker, which is responsible for transparently directing method invocations to the appropriate target object; and a set of services such as naming, time, transactions, and replication, which further enhance the programming environment.

  12. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  13. Nutraceuticals based computational medicinal chemistry

    OpenAIRE

    2013-01-01

    In recent years, edible biomedicinal products called nutraceuticals have been becoming more popular among pharmaceutical industries and consumers. In the process of developing nutraceuticals, in silico approaches play an important role in structural elucidation, receptor-ligand interactions, drug design, etc., critically helping laboratory experiments to avoid biological and financial risk. In this thesis, three nutraceuticals possessing antimicrobial and anticancer activi...

  14. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that the gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate, and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41, with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next-generation anti-HIV drugs.

  15. Reversible Data Hiding Based on DNA Computing

    Directory of Open Access Journals (Sweden)

    Bin Wang

    2017-01-01

    Biocomputing, and DNA computing in particular, has developed greatly and is widely used in information security. In this paper, a novel algorithm for reversible data hiding based on DNA computing is proposed. Inspired by histogram modification, a classical algorithm for reversible data hiding, we combine it with DNA computing to realize the algorithm based on biological technology. Compared with previous results, our experimental results significantly improve the ER (Embedding Rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the scheme is suitable for protecting the copyright of the cover image in DNA-based information security.
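
    A hedged sketch of classical histogram-modification embedding with a toy DNA twist: the payload is first written as DNA bases (A, C, G, T for the bit pairs 00, 01, 10, 11) before being embedded at the histogram peak. This illustrates the ingredients the paper combines, not its exact algorithm; a decoder would reverse the shift, which is what makes the hiding reversible.

        import numpy as np

        BASE = {"A": "00", "C": "01", "G": "10", "T": "11"}

        def embed(img, bits):
            hist = np.bincount(img.ravel(), minlength=256)
            peak = int(hist.argmax())                  # most frequent grey level
            zero = int(np.where(hist == 0)[0][-1])     # an empty bin above the peak
            out = img.copy()
            out[(img > peak) & (img < zero)] += 1      # shift (peak, zero) right by 1
            flat, it = out.ravel(), iter(bits)
            for i in np.where(img.ravel() == peak)[0]: # peak pixels carry the bits
                b = next(it, None)
                if b is None:
                    break
                flat[i] = peak + int(b)                # 0 -> peak, 1 -> peak + 1
            return out, peak, zero

        rng = np.random.default_rng(0)
        cover = rng.integers(60, 70, (64, 64), dtype=np.uint8)  # narrow histogram
        payload = "".join(BASE[b] for b in "ACGT")              # DNA-coded message
        stego, peak, zero = embed(cover, payload)
        print("peak bin %d, pixels changed: %d" % (peak, int((stego != cover).sum())))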

  16. An Architecture Approach of Tourism Cloud Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    周相兵; 马洪江; 苗放

    2013-01-01

    The tourist industry has become an important source of income in many areas. However, how to achieve integrated management and unified planning of tourism resources has been a problem in urgent need of a solution in recent years. Therefore, a tourism cloud solution is presented which employs cloud computing to address this problem: tourism resources are virtualized into multiple cloud terminals, and IaaS, PaaS and SaaS are adopted to structure the tourism cloud. The resulting platform provides unified management, unified sale, unified consumption and unified service of the tourism resources within a region, thereby enabling clustered and collaborative development of the tourist industry, improving the reliability of tourism information processing and the parallel computing capability for tourism resources, and reducing the crises and risks that come with tourism industrialization.

  17. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  18. Moment Matrices, Border Bases and Real Radical Computation

    OpenAIRE

    Lasserre, Jean-Bernard; Laurent, Monique; Mourrain, Bernard; Rostalski, Philipp; Trébuchet, Philippe

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorpora...

  19. Computational biomechanics for medicine new approaches and new applications

    CERN Document Server

    Miller, Karol; Wittek, Adam; Nielsen, Poul

    2015-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises twelve of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, France, Spain and Switzerland. Some of the interesting topics discussed are: real-time simulations; growth and remodelling of soft tissues; inverse and meshless solutions; medical image analysis; and patient-specific solid mechanics simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  20. A distributed computing approach to mission operations support. [for spacecraft

    Science.gov (United States)

    Larsen, R. L.

    1975-01-01

    Computing mission operation support includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability, and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  1. Sustainable manufacturing for obsolete computers based on 3R engineering

    Institute of Scientific and Technical Information of China (English)

    SHI Pei-jing; XU Yi; WANG Hong-mei; XU Bin-shi

    2005-01-01

    The volume trends of in-use and end-of-life computers in China were analyzed; the emerging dangers of obsolete computers under incorrect treatment were summarized; integrated disposal technologies based on 3R (recycle, remanufacture and reuse) engineering, aimed at monitors, electronic devices, metals, plastics, and whole computers, were put forward; and the economic and social benefits were analyzed. The results show that the integrated disposal process for obsolete computers is an optimal approach to saving the resources embodied in electromechanical products. Remanufacturing and disposing of 100 thousand obsolete computers per year can create profits of about RMB 10 million yuan and provide employment for 300 persons. It can be deduced that there are great potential opportunities for the obsolete computer disposal industry encompassing recycle, remanufacture and reuse engineering.

  2. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  4. A Near-Term Quantum Computing Approach for Hard Computational Problems in Space Exploration

    CERN Document Server

    Smelyanskiy, Vadim N; Knysh, Sergey I; Williams, Colin P; Johnson, Mark W; Thom, Murray C; Macready, William G; Pudenz, Kristen L

    2012-01-01

    In this article, we show how to map a sampling of the hardest artificial intelligence problems in space exploration onto equivalent Ising models that can then be attacked using quantum annealing implemented in the D-Wave machine. We overview the existing results as well as propose new Ising model implementations for quantum annealing. We review supervised and unsupervised learning algorithms for classification and clustering with applications to feature identification and anomaly detection. We introduce algorithms for data fusion and image matching for remote sensing applications. We overview planning problems for space exploration mission applications and algorithms for diagnostics and recovery with applications to deep space missions. We describe combinatorial optimization algorithms for task assignment in the context of autonomous unmanned exploration. Finally, we discuss ways to circumvent the limitation of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing. In this ...
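
    A hedged sketch of the Ising-mapping step for one of the simplest cases, number partitioning: with spins s_i in {-1, +1} choosing a side for each number, the energy H(s) = (sum_i n_i s_i)^2 = s^T J s with couplings J_ij = n_i n_j. A quantum annealer would minimise H; brute force plays its role here, and the numbers are illustrative.

        import itertools
        import numpy as np

        numbers = np.array([4, 7, 13, 5, 9, 2])
        J = np.outer(numbers, numbers)                 # Ising couplings J_ij = n_i*n_j

        def energy(spins):
            s = np.array(spins)
            return s @ J @ s                           # == (sum_i n_i s_i)^2

        best = np.array(min(itertools.product([-1, 1], repeat=len(numbers)),
                            key=energy))
        print("set A:", numbers[best == 1], "set B:", numbers[best == -1],
              "imbalance:", abs(int(numbers[best == 1].sum())
                                - int(numbers[best == -1].sum())))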

  5. [Computational chemistry in structure-based drug design].

    Science.gov (United States)

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small molecular ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free-energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have played more important roles in structure-based drug design. Here we reviewed the newly developed computational chemistry methods in structure-based drug design as well as the elegant applications, including binding-site druggability assessment, large scale virtual screening of chemical database, and lead compound optimization. Importantly, here we address the current bottlenecks and propose practical solutions.

  6. An improved Hough transform-based fingerprint alignment approach

    CSIR Research Space (South Africa)

    Mlambo, CS

    2014-11-01

    An improved Hough Transform based fingerprint alignment approach is presented, which improves computing time and memory usage with accurate alignment parameter (rotation and translation) results. This is achieved by studying the strengths...
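
    A hedged sketch of the baseline Hough-alignment idea on synthetic minutiae (the paper's time and memory improvements are not reproduced here): every reference/query point pair votes for a quantised (rotation, translation) hypothesis, and the fullest accumulator cell gives the alignment estimate.

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(0)
        ref = rng.uniform(0, 200, (25, 2))                 # reference minutiae
        true_th = np.deg2rad(20.0)
        R = np.array([[np.cos(true_th), -np.sin(true_th)],
                      [np.sin(true_th),  np.cos(true_th)]])
        qry = ref @ R.T + np.array([14.0, -8.0])           # query print

        acc = Counter()
        for deg in range(0, 360, 2):                       # quantised rotations
            th = np.deg2rad(deg)
            Rt = np.array([[np.cos(th), -np.sin(th)],
                           [np.sin(th),  np.cos(th)]])
            for rp in ref @ Rt.T:
                for q in qry:
                    t = q - rp                             # implied translation
                    acc[(deg, round(t[0] / 4), round(t[1] / 4))] += 1

        (deg, tx, ty), votes = acc.most_common(1)[0]
        print("rotation %d deg, shift ~(%d, %d), votes %d"
              % (deg, tx * 4, ty * 4, votes))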

  7. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants, and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  8. Human Computer Interaction Approach in Developing Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Mohd H.N.M. Nasir

    2008-01-01

    Problem statement: Many published studies have found that more than 50% of Customer Relationship Management (CRM) system implementations have failed due to failures of system usability and unfulfilled user expectations. This study presents the issues that contributed to the failures of CRM systems and proposes a prototype CRM system developed using Human Computer Interaction approaches in order to resolve the identified issues. Approach: In order to capture the users' requirements, a single in-depth case study of a multinational company was chosen for this research, in which the background, current conditions and environmental interactions were observed, recorded and analyzed for patterns in relation to internal and external influences. Several blended data-gathering techniques, namely interviews, naturalistic observation and the study of user documentation, were employed, and a prototype CRM system was then developed which incorporated a User-Centered Design (UCD) approach, Hierarchical Task Analysis (HTA), metaphor, and the identification of users' behaviors and characteristics. The implementation of these techniques was then measured in terms of usability. Results: Based on the usability testing conducted, the results showed that most of the users agreed that the system is comfortable to work with, taking the quality attributes of learnability, memorizability, utility, sortability, font, visualization, user metaphor, ease of information viewing, and color as measurement parameters. Conclusions/Recommendations: By combining all these techniques, a comfort level that leads to user satisfaction and a higher degree of usability can be achieved in the proposed CRM system. Thus, it is important that companies take the usability quality attributes into consideration before developing or procuring a CRM system, to ensure the successful implementation of the CRM system.

  9. Development of Computer Science Disciplines - A Social Network Analysis Approach

    CERN Document Server

    Pham, Manh Cuong; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science places strong emphasis on conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss their work with peers. Previous work on knowledge mapping focused on mapping all sciences, or a particular domain, based on the ISI-published JCR (Journal Citation Report). Although this data covers most important journals, it lacks computer science conference and workshop proceedings, which results in an imprecise and incomplete analysis of computer science knowledge. This paper presents an analysis of the computer science knowledge network constructed from all types of publications, aiming to provide a complete view of computer science research. Based on the combination of two important digital libraries (DBLP and CiteSeerX), we study the knowledge network created at the journal/conference level using citation linkage, to identify the development of sub-disciplines. We investiga...
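
    As an illustration of this kind of venue-level citation-network analysis, here is a minimal sketch using networkx (the edge list is a toy stand-in, not the DBLP/CiteSeerX data):

```python
import networkx as nx

# Toy citation graph at venue level: an edge u -> v means papers in
# venue u cite papers in venue v (weights = citation counts).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("SIGMOD", "VLDB", 120), ("VLDB", "SIGMOD", 90),
    ("NeurIPS", "ICML", 200), ("ICML", "NeurIPS", 180),
    ("VLDB", "NeurIPS", 15),   # weak cross-discipline linkage
])

# PageRank highlights influential venues; the block structure of the
# graph (communities) hints at sub-discipline development.
rank = nx.pagerank(G, weight="weight")
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```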

  10. On computer-based assessment of mathematics

    OpenAIRE

    Pead, Daniel

    2010-01-01

    This work explores some issues arising from the widespread use of computer based assessment of Mathematics in primary and secondary education. In particular, it considers the potential of computer based assessment for testing “process skills” and “problem solving”. This is discussed through a case study of the World Class Tests project which set out to test problem solving skills. The study also considers how on-screen “eAssessment” differs from conventional paper tests and how transferri...

  11. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  13. Computational Scenario-based Capability Planning

    CERN Document Server

    Abbass, Hussein; Dam, Helen; Baker, Stephen; Whitacre, James M; Sarker, Ruhul; 10.1145/1389095.1389378

    2009-01-01

    Scenarios are pen-pictures of plausible futures, used for strategic planning. The aim of this investigation is to expand the horizon of scenario-based planning through computational models that are able to aid the analyst in the planning process. The investigation builds upon the advances of Information and Communication Technology (ICT) to create a novel, flexible and customizable computational capability-based planning methodology that is practical and theoretically sound. We will show how evolutionary computation, in particular evolutionary multi-objective optimization, can play a central role - both as an optimizer and as a source for innovation.

  14. A multidisciplinary approach to solving computer related vision problems.

    Science.gov (United States)

    Long, Jennifer; Helland, Magne

    2012-09-01

    This paper proposes a multidisciplinary approach to solving computer-related vision issues by including optometry as part of the problem-solving team. Computer workstation design is increasing in complexity. There are at least ten different professions that contribute to workstation design or that provide advice to improve worker comfort, safety and efficiency. Optometrists have a role in identifying and solving computer-related vision issues and in prescribing appropriate optical devices. However, it is possible that advice given by optometrists to improve visual comfort may conflict with other requirements and demands within the workplace. A multidisciplinary approach has been advocated for solving computer-related vision issues. There are opportunities for optometrists to collaborate with ergonomists, who coordinate information from physical, cognitive and organisational disciplines to enact holistic solutions to problems. This paper proposes a model of collaboration and examples of successful partnerships at a number of professional levels, including individual relationships between optometrists and ergonomists when they have mutual clients/patients, in undergraduate and postgraduate education, and in research. There is also scope for dialogue between optometry and ergonomics professional associations. A multidisciplinary approach offers the opportunity to solve vision-related computer issues in a cohesive, rather than fragmented, way. Further exploration is required to understand the barriers to these professional relationships. © 2012 The College of Optometrists.

  15. Computer Based Training Authors' and Designers' training

    Directory of Open Access Journals (Sweden)

    Frédéric GODET

    2016-03-01

    Full Text Available This communication, through a couple of studies conducted over the past 10 years, tries to show how important the training of authors is in Computer Based Training (CBT). We present an approach to preparing designers to master interactive multimedia modules in this domain. Which institutions are really dedicating their efforts to training authors and designers in the area of CBT? Television devices and broadcast organisations have offered, since the 1960s, a first support for distance learning. New media and New Information and Communication Technologies (NICT) then allowed several public and private organisations to start distance learning projects. As usual, some of them met their training objectives and others failed. Did they really fail? Currently, nobody has the right answer. Today, we do not have efficient enough tools for evaluating trainees' acquisition in the short term; training evaluation needs 10 to 20 years of elapsed time to yield reliable measures. Nevertheless, given the high investments already made in this area, we cannot wait for the final results of the pedagogical evaluation. Many analyses have identified relevant issues which can be used as directions for training CBT authors and designers. Warning - our studies and the derived conclusions are mainly based on projects conducted in the field. We additionally bring our several years' experience in training movie film authors in the design of interactive multimedia products. Some of our examples are extracted from vocational training projects where we were involved in all development phases, from the analysis of needs to the evaluation of what trainees acquired in their jobs. Obviously, we cannot bring an exhaustive approach to this domain, where so many parameters frame the training of CBT interactive multimedia module authors and designers.

  16. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    Full Text Available A variation on the data-flow model is proposed for developing parallel architectures. While the model is data-driven, it differs significantly from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them. The model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model. This is done using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  17. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform linear temporal properties verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations, using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to the existing verification methods based on simulation.

  18. Assessing Trustworthiness in Social Media: A Social Computing Approach

    Science.gov (United States)

    2015-11-17

    31-May-2015. Approved for public release; distribution unlimited. Final Report: Assessing Trustworthiness in Social Media: A Social Computing Approach. We propose to investigate research issues related to social media trustworthiness and its assessment by leveraging social research methods... This tool provides a way to combine different attributes of interest associated with a particular social media user related to the received information.

  19. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  20. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Mourrain, B.; Lasserre, J.B.; Laurent, M.; Rostalski, P.; Trebuchet, P.

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming.

  1. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Mourrain, B.; Lasserre, J.B.; Laurent, M.; Rostalski, P.; Trebuchet, P.

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming.

  2. Interface Design in Computer-Based Language Testing.

    Science.gov (United States)

    Fulcher, Glenn

    2003-01-01

    Describes a three-phase process model for interface design, drawing on practices developed in the software industry and adapting them for computer-based language tests. Describes good practice in initial design, emphasizes the importance of usability testing, and argues that only through following a principled approach to interface design can the…

  3. Improving Computer Based Speech Therapy Using a Fuzzy Expert System

    OpenAIRE

    Ovidiu Andrei Schipor; Stefan Gheorghe Pentiuc; Maria Doina Schipor

    2012-01-01

    In this paper we present our work on the optimization of Computer Based Speech Therapy systems. We focus especially on using a fuzzy expert system to determine specific parameters of personalized therapy, i.e. the number, length and content of training sessions. The efficiency of this new approach was tested in an experiment performed with our CBST, named LOGOMON.

  4. Intelligent Financial Portfolio Composition based on Evolutionary Computation Strategies

    CERN Document Server

    Gorgulho, Antonio; Horta, Nuno C G

    2013-01-01

    The management of financial portfolios or funds is a widely known problem in financial markets which normally requires rigorous analysis in order to select the most profitable assets. The subject is becoming popular among computer scientists, who try to adapt known Intelligent Computation techniques to the market's domain. This book proposes a system based on Genetic Algorithms which aims to manage a financial portfolio by using technical analysis indicators. The results are promising, since the approach clearly outperforms the remaining approaches during the recent market crash.
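
    As an illustrative sketch of a genetic algorithm evolving portfolio weights (the fitness function, toy return data and GA parameters are stand-ins, not the book's trading rules):

```python
import numpy as np

rng = np.random.default_rng(0)
n_assets, pop_size, gens = 5, 40, 200
returns = rng.normal(0.001, 0.02, size=(250, n_assets))   # toy daily returns

def fitness(w):
    port = returns @ w
    return port.mean() / (port.std() + 1e-9)   # Sharpe-like score

def normalize(pop):
    pop = np.abs(pop)
    return pop / pop.sum(axis=1, keepdims=True)   # long-only, weights sum to 1

pop = normalize(rng.random((pop_size, n_assets)))
for _ in range(gens):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]        # selection
    cut = rng.integers(1, n_assets, size=pop_size // 2)
    kids = np.array([np.concatenate([a[:c], b[c:]])           # crossover
                     for a, b, c in zip(parents, parents[::-1], cut)])
    kids += rng.normal(0, 0.05, kids.shape)                   # mutation
    pop = normalize(np.vstack([parents, kids]))

best = pop[np.argmax([fitness(w) for w in pop])]
print("best weights:", np.round(best, 3))
```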

  5. A GPU-Computing Approach to Solar Stokes Profile Inversion

    CERN Document Server

    Harker, Brian J

    2012-01-01

    We present a new computational approach to the inversion of solar photospheric Stokes polarization profiles, under the Milne-Eddington model, for vector magnetography. Our code, named GENESIS (GENEtic Stokes Inversion Strategy), employs multi-threaded parallel-processing techniques to harness the computing power of graphics processing units (GPUs), along with algorithms designed to exploit the inherent parallelism of the Stokes inversion problem. Using a genetic algorithm (GA) engineered specifically for use with a GPU, we produce full-disc maps of the photospheric vector magnetic field from polarized spectral line observations recorded by the Synoptic Optical Long-term Investigations of the Sun (SOLIS) Vector Spectromagnetograph (VSM) instrument. We show the advantages of pairing a population-parallel genetic algorithm with data-parallel GPU-computing techniques, and present an overview of the Stokes inversion problem, including a description of our adaptation to the GPU-computing paradigm. Full-disc vector ma...

  6. Cloud Computing – A Unified Approach for Surveillance Issues

    Science.gov (United States)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location, through networks. Cloud computing is gradually replacing the traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in the organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  7. Cloud computing approaches to accelerate drug discovery value chain.

    Science.gov (United States)

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer matching hardware and software infrastructure while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good-to-have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  8. Computational intelligence approaches for pattern discovery in biological systems.

    Science.gov (United States)

    Fogel, Gary B

    2008-07-01

    Biology, chemistry and medicine are faced by tremendous challenges caused by an overwhelming amount of data and the need for rapid interpretation. Computational intelligence (CI) approaches such as artificial neural networks, fuzzy systems and evolutionary computation are being used with increasing frequency to contend with this problem, in light of noise, non-linearity and temporal dynamics in the data. Such methods can be used to develop robust models of processes either on their own or in combination with standard statistical approaches. This is especially true for database mining, where modeling is a key component of scientific understanding. This review provides an introduction to current CI methods, their application to biological problems, and concludes with a commentary about the anticipated impact of these approaches in bioinformatics.

  9. Towards applied theories based on computability logic

    CERN Document Server

    Japaridze, Giorgi

    2008-01-01

    Computability logic (CL) (see http://www.cis.upenn.edu/~giorgi/cl.html) is a recently launched program for redeveloping logic as a formal theory of computability, as opposed to the formal theory of truth that logic has more traditionally been. Formulas in it represent computational problems, "truth" means existence of an algorithmic solution, and proofs encode such solutions. Within the line of research devoted to finding axiomatizations for ever more expressive fragments of CL, the present paper introduces a new deductive system CL12 and proves its soundness and completeness with respect to the semantics of CL. Conservatively extending classical predicate calculus and offering considerable additional expressive and deductive power, CL12 presents a reasonable, computationally meaningful, constructive alternative to classical logic as a basis for applied theories. To obtain a model example of such theories, this paper rebuilds the traditional, classical-logic-based Peano arithmetic into a computability-logic-b...

  10. Combined computational-experimental approach to predict blood-brain barrier (BBB) permeation based on "green" salting-out thin layer chromatography supported by simple molecular descriptors.

    Science.gov (United States)

    Ciura, Krzesimir; Belka, Mariusz; Kawczak, Piotr; Bączek, Tomasz; Markuszewski, Michał J; Nowakowska, Joanna

    2017-09-05

    The objective of this paper is to build QSRR/QSAR models for predicting blood-brain barrier (BBB) permeability. The obtained models are based on salting-out thin layer chromatography (SOTLC) constants and calculated molecular descriptors. Among chromatographic methods, SOTLC was chosen since its mobile phases are free of organic solvent; as a consequence, they are less toxic and have a lower environmental impact than classical reversed-phase liquid chromatography (RPLC). Three stationary phases were examined: silica gel, cellulose plates and neutral aluminum oxide. The model set of solutes covers a wide range of log BB values, containing compounds which cross the BBB readily as well as molecules poorly distributed to the brain, including drugs acting on the nervous system and peripherally acting drugs. Additionally, three regression models were compared: multiple linear regression (MLR), partial least squares (PLS) and orthogonal partial least squares (OPLS). The designed QSRR/QSAR models could be useful for predicting the BBB permeability of newly synthesized compounds in the drug development pipeline and are attractive alternatives to the time-consuming and demanding direct methods for log BB measurement. The study also showed that significant differences in model performance, measured by R² and Q², can be obtained among regression techniques; hence it is strongly suggested to evaluate all available options such as MLR, PLS and OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
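
    As an illustration of comparing MLR and PLS by cross-validated R² (an analogue of the Q² statistic mentioned above), here is a minimal scikit-learn sketch; the descriptor matrix is random stand-in data, not the paper's chromatographic measurements:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 8))                                # stand-in molecular descriptors
y = X[:, :3] @ [0.8, -0.5, 0.3] + rng.normal(0, 0.2, 40)    # stand-in log BB values

for name, model in [("MLR", LinearRegression()),
                    ("PLS", PLSRegression(n_components=3))]:
    q2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 (Q^2-like) = {q2:.3f}")
```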

  11. Structure-based design of a potent and selective small peptide inhibitor of Mycobacterium tuberculosis 6-hydroxymethyl-7, 8-dihydropteroate synthase: a computer modelling approach.

    Science.gov (United States)

    Rao, Gita Subba; Kumar, Manoj

    2008-06-01

    In an attempt to design novel anti-TB drugs, the target chosen is the enzyme 6-hydroxymethyl-7,8-dihydropteroate synthase (DHPS), which is an attractive target since it is present in microorganisms but not in humans. The existing drugs for this target are the sulfa drugs, which have been used for about seven decades. However, single mutations in the DHPS gene can cause resistance to sulfa drugs. Therefore, there is a need for the design of novel drugs. Based on the recently determined crystal structure of Mycobacterium tuberculosis (M.tb) DHPS complexed with a known substrate analogue, and on the crystal structures of E. coli DHPS and Staphylococcus aureus DHPS, we have identified a dipeptide inhibitor with the sequence WK. Docking calculations indicate that this peptide has a significantly higher potency than the sulfa drugs. In addition, the potency is 70-90 times higher for M.tb DHPS than for the pterin- and folate-binding sites of key human proteins. Thus, the designed inhibitor is a promising lead compound for the development of novel antimycobacterial agents.

  12. Moment Matrices, Border Bases and Real Radical Computation

    CERN Document Server

    Lasserre, Jean-Bernard; Mourrain, Bernard; Rostalski, Philipp; Trébuchet, Philippe

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorporation of additional polynomials, e.g., to restrict the computation to real roots or to eliminate multiple solutions. The proposed algorithm can be used to compute a border basis of the input ideal and, as opposed to other approaches, it can also compute the quotient structure of the (real) radical ideal directly, i.e., without prior algebraic techniques such as Gröbner bases. It thus combines the strength of existing algorithms and provides a unified treatment for the computation of border bases for the ideal, the radical ideal and the real r...

  13. Neuromolecular computing: a new approach to human brain evolution.

    Science.gov (United States)

    Wallace, R; Price, H

    1999-09-01

    Evolutionary approaches in human cognitive neurobiology traditionally emphasize macroscopic structures. It may soon be possible to supplement these studies with models of human information-processing at the molecular level. Thin-film, simulation, fluorescence microscopy, and high-resolution X-ray crystallographic studies provide evidence for transiently organized neural membrane molecular systems with possible computational properties. This review article examines evidence for hydrophobic-mismatch molecular interactions within phospholipid microdomains of a neural membrane bilayer. It is proposed that these interactions are a massively parallel algorithm which can rapidly compute near-optimal solutions to complex cognitive and physiological problems. Coupling of microdomain activity to permeant ion movements at ligand-gated and voltage-gated channels permits the conversion of molecular computations into neuron frequency codes. Evidence for microdomain transport of proteins to specific locations within the bilayer suggests that neuromolecular computation may be under some genetic control and thus modifiable by natural selection. A possible experimental approach for examining evolutionary changes in neuromolecular computation is briefly discussed.

  14. Dielectric properties of periodic heterostructures: A computational electrostatics approach

    Science.gov (United States)

    Brosseau, C.; Beroual, A.

    1999-04-01

    The dielectric properties of heterogeneous materials for various condensed-matter systems are important for several technologies, e.g. impregnated polymers for high-density capacitors, polymer-carbon black mixtures for automotive tires and current limiters in circuit protection. These multiscale systems lead to challenging problems of connecting microstructural features (shape, spatial arrangement and size distribution of inclusions) to macroscopic materials response (permittivity, conductivity). In this paper, we briefly discuss an ab initio computational electrostatics approach, based either on the use of the field calculation package FLUX3D (or FLUX2D) and a conventional finite elements method, or the use of the field calculation package PHI3D and the resolution of boundary integral equations, for calculating the effective permittivity of two-component dielectric heterostructures. Numerical results concerning inclusions of permittivity ε1 with various geometrical shapes periodically arranged in a host matrix of permittivity ε2 are provided. Next we discuss these results in terms of phenomenological mixing laws, analytical theory and connectedness. During the pursuit of these activities, several interesting phenomena were discovered that will stimulate further investigation.

  15. Applying a cloud computing approach to storage architectures for spacecraft

    Science.gov (United States)

    Baldor, Sue A.; Quiroz, Carlos; Wood, Paul

    As sensor technologies, processor speeds, and memory densities increase, spacecraft command, control, processing, and data storage systems have grown in complexity to take advantage of these improvements and expand the possible missions of spacecraft. Spacecraft systems engineers are increasingly looking for novel ways to address this growth in complexity and mitigate associated risks. Looking to conventional computing, many solutions have been executed to solve both the problem of complexity and heterogeneity in systems. In particular, the cloud-based paradigm provides a solution for distributing applications and storage capabilities across multiple platforms. In this paper, we propose utilizing a cloud-like architecture to provide a scalable mechanism for providing mass storage in spacecraft networks that can be reused on multiple spacecraft systems. By presenting a consistent interface to applications and devices that request data to be stored, complex systems designed by multiple organizations may be more readily integrated. Behind the abstraction, the cloud storage capability would manage wear-leveling, power consumption, and other attributes related to the physical memory devices, critical components in any mass storage solution for spacecraft. Our approach employs SpaceWire networks and SpaceWire-capable devices, although the concept could easily be extended to non-heterogeneous networks consisting of multiple spacecraft and potentially the ground segment.

  16. A Novel Approach for Reduce Energy Consumption in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Najmeh Moghadasi

    2015-09-01

    Full Text Available In recent years, mobile devices have taken a special place in human life, and their applicability has led to an increased number of users. Business companies have integrated them with cloud computing technology and have provided the mobile cloud in order to improve the use of mobile devices and overcome their energy limitations. In mobile cloud computing, the computations and storage of mobile device applications are transferred to cloud data centers, and mobile devices are used merely as user interfaces to access services. Therefore, cloud computing helps to reduce the energy consumption of mobile devices. In this paper, a new approach based on Learning Automata is given to reduce energy consumption in mobile cloud computing. Simulation results show that our proposed approach dramatically saves energy by determining the appropriate execution location for each application.
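
    As an illustrative sketch of how a learning automaton can choose between local execution and cloud offloading (here a linear reward-inaction, L_RI, scheme; the energy model and reward criterion are stand-ins for the paper's simulation):

```python
import random

actions = ["local", "cloud"]
p = [0.5, 0.5]            # action probabilities
alpha = 0.1               # learning rate (reward step)

def energy_cost(action):
    # Stand-in environment: offloading is cheaper on average here.
    return random.gauss(1.0, 0.3) if action == "local" else random.gauss(0.6, 0.3)

for _ in range(2000):
    i = 0 if random.random() < p[0] else 1
    rewarded = energy_cost(actions[i]) < 0.8      # toy reward criterion
    if rewarded:                                  # L_RI: update only on reward
        p[i] += alpha * (1 - p[i])
        p[1 - i] *= (1 - alpha)

print(dict(zip(actions, (round(x, 3) for x in p))))  # probability mass drifts to "cloud"
```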

  17. One approach for evaluating the Distributed Computing Design System (DCDS)

    Science.gov (United States)

    Ellis, J. T.

    1985-01-01

    The Distributed Computer Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.

  18. An evolutionary computational approach for the dynamic Stackelberg competition problems

    Directory of Open Access Journals (Sweden)

    Lorena Arboleda-Castro

    2016-06-01

    Full Text Available Stackelberg competition models are an important family of economic decision problems from game theory, in which the main goal is to find optimal strategies between two competitors, taking into account their hierarchical relationship. Although these models have been widely studied in the past, it is important to note that very few works deal with uncertainty scenarios, especially those that vary over time. In this regard, the present research studies this topic and proposes a computational method for efficiently solving dynamic Stackelberg competition models. The computational experiments suggest that the proposed approach is effective for problems of this nature.
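
    As a worked toy example of the leader-follower structure (not the paper's method): a static Stackelberg duopoly in which the leader picks its output anticipating the follower's best response, solved here by grid search:

```python
import numpy as np

a, c = 100.0, 10.0     # toy inverse demand P = a - (q1 + q2), common unit cost c

def follower_best_response(q1):
    # Follower maximizes (a - q1 - q2 - c) * q2  =>  q2 = (a - c - q1) / 2
    return max(0.0, (a - c - q1) / 2)

q1_grid = np.linspace(0, a - c, 10001)
profits = [(a - q1 - follower_best_response(q1) - c) * q1 for q1 in q1_grid]
q1_star = q1_grid[int(np.argmax(profits))]
print(f"leader q1* = {q1_star:.2f}, follower q2* = {follower_best_response(q1_star):.2f}")
# Analytic Stackelberg solution: q1* = (a - c)/2 = 45, q2* = (a - c)/4 = 22.5
```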

  19. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    Science.gov (United States)

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  20. Randomized benchmarking in measurement-based quantum computing

    Science.gov (United States)

    Alexander, Rafael N.; Turner, Peter S.; Bartlett, Stephen D.

    2016-09-01

    Randomized benchmarking is routinely used as an efficient method for characterizing the performance of sets of elementary logic gates in small quantum devices. In the measurement-based model of quantum computation, logic gates are implemented via single-site measurements on a fixed universal resource state. Here we adapt the randomized benchmarking protocol for a single qubit to a linear cluster state computation, which provides partial, yet efficient characterization of the noise associated with the target gate set. Applying randomized benchmarking to measurement-based quantum computation exhibits an interesting interplay between the inherent randomness associated with logic gates in the measurement-based model and the random gate sequences used in benchmarking. We consider two different approaches: the first makes use of the standard single-qubit Clifford group, while the second uses recently introduced (non-Clifford) measurement-based 2-designs, which harness inherent randomness to implement gate sequences.
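
    As an illustration of the standard randomized-benchmarking analysis this record builds on: the average sequence fidelity is fitted to F(m) = A·p^m + B, and the decay parameter p gives the average error rate. A minimal sketch with synthetic data (the numbers are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, p, B):
    return A * p**m + B

rng = np.random.default_rng(2)
m = np.arange(1, 101, 5)                                       # sequence lengths
data = rb_decay(m, 0.5, 0.98, 0.5) + rng.normal(0, 0.005, m.size)  # synthetic fidelities

(A, p, B), _ = curve_fit(rb_decay, m, data, p0=[0.5, 0.9, 0.5])
r = (1 - p) / 2    # average error rate per gate for a single qubit (d = 2)
print(f"decay p = {p:.4f}, error per gate ≈ {r:.2e}")
```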

  1. A Multi-step and Multi-level approach for Computer Aided Molecular Design

    DEFF Research Database (Denmark)

    A general multi-step approach for setting up, solving and analyzing the solutions of computer aided molecular design (CAMD) problems is presented. The approach differs from previous work within the field of CAMD since it also addresses the need for computer aided problem formulation and result analysis. The problem formulation step incorporates a knowledge base for the identification and setup of the design criteria. Candidate compounds are identified using a multi-level generate-and-test CAMD solution algorithm capable of designing molecules having a high level of molecular detail. A post-solution step using an Integrated Computer Aided System (ICAS) for result analysis and verification is included in the methodology. Keywords: CAMD, separation processes, knowledge base, molecular design, solvent selection, substitution, group contribution, property prediction, ICAS.

  2. Computational Approach for Multi Performances Optimization of EDM

    Directory of Open Access Journals (Sweden)

    Yusoff Yusliza

    2016-01-01

    Full Text Available This paper proposes a new computational approach for obtaining optimal parameters of multi-performance EDM. Regression and artificial neural networks (ANN) are used as the modeling techniques, while a multi-objective genetic algorithm (multiGA) is used as the optimization technique. An orthogonal array L256 is implemented in the procedure of network function and network architecture selection. Experimental studies are carried out to verify the machining performances suggested by this approach. The highest MRR value obtained from OrthoANN – MPR – MultiGA is 205.619 mg/min and the lowest Ra value is 0.0223 μm.

  3. COMPTEL skymapping: a new approach using parallel computing

    OpenAIRE

    Strong, A.W.; Bloemen, H.; Diehl, R.; Hermsen, W.; Schoenfelder, V.

    1998-01-01

    Large-scale skymapping with COMPTEL using the full survey database presents challenging problems on account of the complex response and time-variable background. A new approach which attempts to address some of these problems is described, in which the information about each observation is preserved throughout the analysis. In this method, a maximum-entropy algorithm is used to determine image and background simultaneously. Because of the extreme computing requirements, the method has been im...

  4. Review: the physiological and computational approaches for atherosclerosis treatment.

    Science.gov (United States)

    Wang, Wuchen; Lee, Yugyung; Lee, Chi H

    2013-09-01

    Cardiovascular disease has long caused severe losses in the population, especially conditions associated with arterial malfunction attributable to atherosclerosis and subsequent thrombus formation. This article reviews the physiological mechanisms that underlie the transition from plaque formation in the atherosclerotic process to platelet aggregation and eventually thrombosis. The physiological and computational approaches, such as percutaneous coronary intervention and stent design modeling, to detect, evaluate and mitigate this malicious progression are also discussed.

  5. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  6. Data Cloud Computing based on LINQ

    Institute of Scientific and Technical Information of China (English)

    Junwen Lu; Yongsheng Hao; Lubin Zheng; Guanfeng Liu

    2015-01-01

    Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming structure. In the work to date, however, the many available frameworks make it difficult to choose the best one. The LINQ (Language Integrated Query) programming model can be extended to massively parallel, data-driven computations. It not only provides a seamless transition path from computing on top of traditional stores like relational databases or XML to computing on the Cloud, but also offers an object-oriented, compositional model. In this paper, we introduce LINQ into the Cloud, argue that LINQ is a good choice for the Data Cloud, and then describe the details of file system management based on LINQ.

  7. Computer Mediated Learning: An Example of an Approach.

    Science.gov (United States)

    Arcavi, Abraham; Hadas, Nurit

    2000-01-01

    There are several possible approaches in which dynamic computerized environments play a significant and possibly unique role in supporting innovative learning trajectories in mathematics in general and geometry in particular. Describes an approach based on a problem situation and some experiences using it with students and teachers. (Contains 15…

  8. An approach to experimental evaluation of real-time fault-tolerant distributed computing schemes

    Science.gov (United States)

    Kim, K. H.

    1989-01-01

    A testbed-based approach to the evaluation of fault-tolerant distributed computing schemes is discussed. The approach is based on experimental incorporation of system structuring and design techniques into real-time distributed-computing testbeds centered around tightly coupled microcomputer networks. The effectiveness of this approach has been experimentally confirmed. Primary advantages of this approach include the accuracy of the timing and logical-complexity data and the degree of assurance of the practical effectiveness of the scheme evaluated. Various design issues encountered in the course of establishing the network testbed facilities are discussed, along with their augmentation to support some experiments. The shortcomings of the testbeds are also discussed together with the desired extensions of the testbeds.

  9. Understanding Computer-Based Digital Video.

    Science.gov (United States)

    Martindale, Trey

    2002-01-01

    Discussion of new educational media and technology focuses on producing and delivering computer-based digital video. Highlights include video standards, including international standards and aspect ratio; camera formats and features, including costs; shooting digital video; editing software; compression; and a list of informative Web sites. (LRW)

  10. Agent based computational model of trust

    NARCIS (Netherlands)

    A. Gorobets (Alexander); B. Nooteboom (Bart)

    2004-01-01

    This paper employs the methodology of Agent-Based Computational Economics (ACE) to investigate under what conditions trust can be viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents adapt their trust in a partner, the w

  11. Single electron tunneling based arithmetic computation

    NARCIS (Netherlands)

    Lageweg, C.R.

    2004-01-01

    In this dissertation we investigate the implementation of computer arithmetic operations with Single Electron Tunneling (SET) technology based circuits. In our research we focus on the effective utilization of the SET technology's specific characteristic, i.e., the ability to control the transport of individual electrons.

  12. Educator Beliefs Regarding Computer-Based Instruction.

    Science.gov (United States)

    Swann, D. LaDon; Branson, Floyd, Jr.; Talbert, B. Allen

    2003-01-01

    Extension educators (n=17) completed two of five technical sections from an aquaculture CD-ROM tutorial. Evidence from pre/post-training questionnaires, content assessments, and follow-up interviews reveals favorable attitudes toward computer-based inservice training. The ability to spend less time out of their county and to review materials after…

  13. Evaluation of a Computer-Based Narrative

    Science.gov (United States)

    Sharf, Richard S.

    1978-01-01

    A computer-based narrative report integrating results from the Strong Vocational Interest Blank, the Opinion Attitude and Interest Survey, and the Cooperative English Test was compared with a standard profile format. No differences were found between the two methods for males or females. (Author)

  14. A computational toy model for shallow landslides: Molecular dynamics approach

    Science.gov (United States)

    Martelloni, Gianluca; Bagnoli, Franco; Massaro, Emanuele

    2013-09-01

    The aim of this paper is to propose a 2D computational algorithm for modeling the triggering and propagation of shallow landslides caused by rainfall. We used a molecular dynamics (MD) approach, similar to the discrete element method (DEM), that is suitable for modeling granular material and for observing the trajectory of a single particle, so as to identify its dynamical properties. We consider that the triggering of shallow landslides is caused by the decrease of the static friction along the sliding surface due to water infiltration by rainfall. The triggering is thus governed by the two following conditions: (a) a threshold speed of the particles and (b) a condition on the static friction, between the particles and the slope surface, based on the Mohr-Coulomb failure criterion. The latter static condition is used in geotechnical models to estimate the possibility of landslide triggering. In the absence of experimental data, the interaction force between particles is modeled by means of a potential similar to the Lennard-Jones one. Viscosity is also introduced in the model, and for a large range of values of the model's parameters we observe a characteristic velocity pattern, with acceleration increments, typical of real landslides. The results of the simulations are quite promising: the energy and triggering-time distributions of local avalanches follow power laws, analogous to the observed Gutenberg-Richter and Omori power law distributions for earthquakes. Finally, it is possible to apply the method of the inverse surface displacement velocity [4] for predicting the failure time.
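
    As an illustration of the two ingredients the abstract names — a Lennard-Jones-style pairwise force and a Mohr-Coulomb triggering test — here is a minimal sketch (all parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def lj_force(r, eps=1.0, sigma=1.0):
    """Magnitude of a Lennard-Jones-type force between two particles at distance r."""
    return 24 * eps * (2 * (sigma / r)**13 - (sigma / r)**7) / sigma

def mohr_coulomb_triggered(tau, sigma_n, cohesion, phi_deg):
    """Failure (triggering) if shear stress exceeds the Coulomb strength."""
    return tau > cohesion + sigma_n * np.tan(np.radians(phi_deg))

print(lj_force(np.array([1.0, 1.2, 1.5])))   # repulsive near, attractive far

# Rainfall infiltration is typically modeled as a loss of cohesion/friction:
print(mohr_coulomb_triggered(tau=30.0, sigma_n=50.0, cohesion=5.0, phi_deg=20.0))
```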

  15. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    Science.gov (United States)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs able to reach a computing power of 300 GigaFlops (300×10^9 floating point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and due to increase every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained by a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storing. The AVES software package consists of about 50 specific programs. The whole computing time, compared to that provided by a personal computer with a single processor, has thus been enhanced by up to a factor of 70.

  16. Form-based Approaches vs. Task-Based Approaches

    Directory of Open Access Journals (Sweden)

    Zahra Talebi

    2015-07-01

    Full Text Available This study aimed at investigating whether task-based approaches bear any superiority over more traditional ones built around presentation, practice and production (PPP) phases. To fulfill the purpose of the study, participants within the age range of 11-19 took part. Following a pretest, treatment, and a posttest, the obtained data were analyzed using analysis of covariance (ANCOVA) to examine the effects of the variables. The results of the analysis showed that participants in the PPP group did significantly better on the grammar recognition part of the posttest than the task group. However, their counterparts in the task group gained better scores on the writing section of the test. This research study provides evidence in support of task proponents' claims about the merit of task-based activities in raising learners' implicit knowledge, which is claimed to play the primary role in spontaneous speech. Keywords: Task-based language teaching, PPP model, focus on form, focus on meaning

  17. A Crisis Management Approach To Mission Survivability In Computational Multi-Agent Systems

    Directory of Open Access Journals (Sweden)

    Aleksander Byrski

    2010-01-01

    Full Text Available In this paper we present a biologically-inspired approach for mission survivability (considered as the capability of fulfilling a task such as computation) that allows the system to be aware of the possible threats or crises that may arise. This approach uses the notion of resources used by living organisms to control their populations. We present the concept of energetic selection in agent-based evolutionary systems as well as the means to manipulate the configuration of the computation according to the crises or the user's specific demands.

  18. A radial basis function network approach for the computation of inverse continuous time variant functions.

    Science.gov (United States)

    Mayorga, René V; Carrera, Jonathan

    2007-06-01

    This paper presents an efficient approach for the fast computation of inverse continuous time-variant functions with the proper use of Radial Basis Function Networks (RBFNs). The approach is based on implementing RBFNs to compute inverse continuous time-variant functions via an overall damped least-squares solution that includes a novel null-space vector for singularity prevention. The singularity-avoidance null-space vector is derived from a sufficiency condition for singularity prevention, which leads to establishing some characterizing matrices and an associated performance index.
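
    As an illustration of the core numerical step referred to here — fitting an RBF network by damped (Tikhonov-regularized) least squares — a minimal sketch follows; the centers, widths and damping factor are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.05, x.size)   # target samples

centers = np.linspace(0, 1, 12)
width = 0.08
# Gaussian RBF design matrix: one column per center
Phi = np.exp(-((x[:, None] - centers[None, :])**2) / (2 * width**2))

lam = 1e-3                                    # damping (regularization) factor
A = Phi.T @ Phi + lam * np.eye(centers.size)
w = np.linalg.solve(A, Phi.T @ y)             # damped least-squares weights

y_hat = Phi @ w
print("RMSE:", np.sqrt(np.mean((y_hat - y)**2)))
```

    The damping term lam keeps the normal equations well-conditioned, which is the least-squares analogue of the singularity prevention the abstract describes.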

  19. Computational neuroscience approach to biomarkers and treatments for mental disorders.

    Science.gov (United States)

    Yahata, Noriaki; Kasai, Kiyoto; Kawato, Mitsuo

    2017-04-01

    Psychiatry research has long experienced a stagnation stemming from a lack of understanding of the neurobiological underpinnings of phenomenologically defined mental disorders. Recently, the application of computational neuroscience to psychiatry research has shown great promise in establishing a link between phenomenological and pathophysiological aspects of mental disorders, thereby recasting current nosology in more biologically meaningful dimensions. In this review, we highlight recent investigations into computational neuroscience that have undertaken either theory- or data-driven approaches to quantitatively delineate the mechanisms of mental disorders. The theory-driven approach, including reinforcement learning models, plays an integrative role in this process by enabling correspondence between behavior and disorder-specific alterations at multiple levels of brain organization, ranging from molecules to cells to circuits. Previous studies have explicated a plethora of defining symptoms of mental disorders, including anhedonia, inattention, and poor executive function. The data-driven approach, on the other hand, is an emerging field in computational neuroscience seeking to identify disorder-specific features among high-dimensional big data. Remarkably, various machine-learning techniques have been applied to neuroimaging data, and the extracted disorder-specific features have been used for automatic case-control classification. For many disorders, the reported accuracies have reached 90% or more. However, we note that rigorous tests on independent cohorts are critically required to translate this research into clinical applications. Finally, we discuss the utility of the disorder-specific features found by the data-driven approach to psychiatric therapies, including neurofeedback. Such developments will allow simultaneous diagnosis and treatment of mental disorders using neuroimaging, thereby establishing 'theranostics' for the first time in clinical

  20. An Efficient Approach for Fast and Accurate Voltage Stability Margin Computation in Large Power Grids

    Directory of Open Access Journals (Sweden)

    Heng-Yi Su

    2016-11-01

    Full Text Available This paper proposes an efficient approach for the computation of the voltage stability margin (VSM) in a large-scale power grid. The objective is to accurately and rapidly determine the load power margin that corresponds to the voltage collapse phenomenon. The proposed approach is based on both the impedance-match-based technique and the model-based technique: it combines the Thevenin equivalent (TE) network method with a cubic spline extrapolation technique and the continuation technique to achieve fast and accurate VSM computation for a bulk power grid. Moreover, the generator Q limits are taken into account for practical applications. Extensive case studies carried out on Institute of Electrical and Electronics Engineers (IEEE) benchmark systems and the Taiwan Power Company (Taipower, Taipei, Taiwan) system demonstrate the effectiveness of the proposed approach.
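
    As an illustration of the spline-extrapolation idea: fit a cubic spline to sampled points of a P-V curve and locate the nose (maximum loadability) point where dV/dP steepens toward collapse. The curve below is synthetic, not from the paper:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic P-V curve: V as a function of load P, collapsing near P = 1.0
P = np.linspace(0.0, 0.95, 20)
V = np.sqrt(1.0 - P)                  # toy voltage profile

spline = CubicSpline(P, V)
P_fine = np.linspace(0.0, 0.99, 1000)
dVdP = spline(P_fine, 1)              # first derivative of the spline

# Near the nose point dV/dP diverges; take the steepest slope as the estimate
P_nose = P_fine[np.argmin(dVdP)]
print(f"estimated loadability margin ≈ {P_nose:.3f} (true collapse at 1.0)")
```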

  1. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, while, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches are becoming more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
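
    As an illustration of the conjugate gradient method named in the abstract, here is a minimal dense-matrix sketch for a symmetric positive-definite system (a tiny stand-in for the sparse finite element system):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)     # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p # new A-conjugate direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD stand-in for a stiffness matrix
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))           # ~ [0.0909, 0.6364]
```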

  2. Understanding Plant Nitrogen Metabolism through Metabolomics and Computational Approaches

    Directory of Open Access Journals (Sweden)

    Perrin H. Beatty

    2016-10-01

    Full Text Available A comprehensive understanding of plant metabolism could provide a direct mechanism for improving nitrogen use efficiency (NUE in crops. One of the major barriers to achieving this outcome is our poor understanding of the complex metabolic networks, physiological factors, and signaling mechanisms that affect NUE in agricultural settings. However, an exciting collection of computational and experimental approaches has begun to elucidate whole-plant nitrogen usage and provides an avenue for connecting nitrogen-related phenotypes to genes. Herein, we describe how metabolomics, computational models of metabolism, and flux balance analysis have been harnessed to advance our understanding of plant nitrogen metabolism. We introduce a model describing the complex flow of nitrogen through crops in a real-world agricultural setting and describe how experimental metabolomics data, such as isotope labeling rates and analyses of nutrient uptake, can be used to refine these models. In summary, the metabolomics/computational approach offers an exciting mechanism for understanding NUE that may ultimately lead to more effective crop management and engineered plants with higher yields.

  3. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth

  4. A Computationally Efficient and Adaptive Approach for Online Embedded Machinery Diagnosis in Harsh Environments

    Directory of Open Access Journals (Sweden)

    Chuan Jiang

    2013-01-01

    Full Text Available Condition-based monitoring (CBM) has advanced to the stage where industry now demands machinery that possesses self-diagnosis ability. This need has spurred CBM research into ever wider application areas over the past decades. There are two critical issues in implementing CBM in harsh environments using embedded systems: computational efficiency and adaptability. In this paper, a computationally efficient and adaptive approach, combining simple principal component analysis (SPCA) for feature dimensionality reduction with K-means clustering for classification, is proposed for online embedded machinery diagnosis. Compared with standard principal component analysis (PCA) and kernel principal component analysis (KPCA), SPCA is adaptive in nature and has lower algorithmic complexity when dealing with a large amount of data. The effectiveness of the proposed approach is first validated using a standard rolling element bearing test dataset on a personal computer. It is then deployed on an embedded real-time controller and used to monitor a rotating shaft. It was found that the proposed approach scaled well, whereas the standard PCA-based approach broke down when the data quantity increased to a certain level. Furthermore, the proposed approach achieved 90% accuracy when diagnosing an induced fault, compared to 59% accuracy obtained using the standard PCA-based approach.
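
    As an illustration of the dimensionality-reduction-plus-clustering pipeline (here standard PCA and K-means via scikit-learn, as a stand-in for the paper's SPCA variant; the feature matrix is synthetic):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
healthy = rng.normal(0.0, 1.0, size=(100, 20))   # stand-in vibration features
faulty = rng.normal(3.0, 1.0, size=(100, 20))
X = np.vstack([healthy, faulty])

Z = PCA(n_components=2).fit_transform(X)          # reduce feature dimensionality
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)

# The two clusters should separate healthy from faulty samples
print("cluster sizes:", np.bincount(labels))
```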

  5. Solubility of nonelectrolytes: a first-principles computational approach.

    Science.gov (United States)

    Jackson, Nicholas E; Chen, Lin X; Ratner, Mark A

    2014-05-15

    Using a combination of classical molecular dynamics and symmetry adapted intermolecular perturbation theory, we develop a high-accuracy computational method for examining the solubility energetics of nonelectrolytes. This approach is used to accurately compute the cohesive energy density and Hildebrand solubility parameters of 26 molecular liquids. The energy decomposition of symmetry adapted perturbation theory is then utilized to develop multicomponent Hansen-like solubility parameters. These parameters are shown to reproduce the solvent categorizations (nonpolar, polar aprotic, or polar protic) of all molecular liquids studied while lending quantitative rigor to these qualitative categorizations via the introduction of simple, easily computable parameters. Notably, we find that by monitoring the first-order exchange energy contribution to the total interaction energy, one can rigorously determine the hydrogen bonding character of a molecular liquid. Finally, this method is applied to compute explicitly the Flory interaction parameter and the free energy of mixing for two different small molecule mixtures, reproducing the known miscibilities. This methodology represents an important step toward the prediction of molecular solubility from first principles.
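
    The Hildebrand parameter named above is the square root of the cohesive energy density. As a minimal worked example, the snippet below computes it from a textbook heat of vaporization and molar volume for water, rather than from the paper's MD/SAPT energies.

    ```python
    # Hildebrand parameter = sqrt(cohesive energy density). Computed here
    # from a textbook heat of vaporization, not from MD/SAPT energies.
    R, T = 8.314, 298.15            # J/(mol K), K

    def hildebrand(dH_vap, V_m):
        """delta = sqrt((dH_vap - R*T) / V_m), returned in Pa**0.5."""
        return ((dH_vap - R * T) / V_m) ** 0.5

    delta_water = hildebrand(dH_vap=40.65e3, V_m=18.07e-6)  # J/mol, m^3/mol
    print(f"water: {delta_water / 1e3:.1f} MPa^0.5")        # ~46 (lit. ~47.8)
    ```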

  6. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which

  7. WEB BASED LEARNING OF COMPUTER NETWORK COURSE

    Directory of Open Access Journals (Sweden)

    Hakan KAPTAN

    2004-04-01

    Full Text Available As a result of developments in the Internet and computer fields, web based education has become one of the areas in which many improvement and research studies are conducted. In this study, web based education materials are described for a multimedia animation and simulation aided Computer Networks course in Technical Education Faculties. Course content draws on university course books, web based education materials, and companies' technology web pages. The content is presented through texts, pictures, and figures to increase student motivation, and to facilitate learning some topics are supported by animations. Furthermore, simulators were constructed to support interactive learning of the working principles of routing algorithms and congestion control algorithms.

  8. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have detectable properties, such as having genomic signatures that differ from the rest of the host genome, and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.

  9. Benchmarking of computer codes and approaches for modeling exposure scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States); Rittmann, P.D.; Wood, M.I. [Westinghouse Hanford Co., Richland, WA (United States); Cook, J.R. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.
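
    At the spreadsheet level mentioned in the report, a pathway dose calculation reduces to multiplying a unit concentration by an intake rate and a dose conversion factor. The sketch below shows that arithmetic for a generic ingestion pathway; all numbers are illustrative placeholders, not values from the report.

    ```python
    # Spreadsheet-level ingestion dose for a unit water concentration:
    # dose = concentration x intake x dose conversion factor. All numbers
    # are illustrative placeholders, not values from the report.
    C_water = 1.0        # unit radionuclide concentration, Bq/L
    intake = 2.0 * 365   # drinking-water intake, L/yr
    dcf = 2.8e-8         # ingestion dose conversion factor, Sv/Bq (placeholder)

    annual_dose = C_water * intake * dcf   # Sv/yr
    print(f"annual ingestion dose: {annual_dose:.2e} Sv/yr")
    ```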

  10. Computational approaches for rational design of proteins with novel functionalities

    Directory of Open Access Journals (Sweden)

    Manish Kumar Tiwari

    2012-09-01

    Full Text Available Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein design has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  11. Computational approaches for rational design of proteins with novel functionalities.

    Science.gov (United States)

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein design has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  12. Challenges and possible approaches: towards the petaflops computers

    Institute of Scientific and Technical Information of China (English)

    Depei QIAN; Danfeng ZHU

    2009-01-01

    In parallel with the R&D efforts in the USA and Europe, China's National High-tech R&D program has set up its goal of developing petaflops computers. Researchers and engineers worldwide are looking for appropriate methods and technologies to achieve the petaflops computer system. Based on a discussion of important design issues in developing the petaflops computer, this paper raises the major technological challenges, including the memory wall, low power system design, interconnects, and programming support. Current efforts in addressing some of these challenges and in pursuing possible solutions for developing the petaflops systems are presented. Several existing systems are briefly introduced as examples, including Roadrunner, Cray XT5 Jaguar, Dawning 5000A/6000, and Lenovo DeepComp 7000. Architectures proposed by Chinese researchers for implementing the petaflops computer are also introduced. Advantages of the architecture as well as the difficulties in its implementation are discussed. Finally, future research directions in the development of high productivity computing systems are discussed.

  13. A Comparative Evaluation of Computer Based and Non-Computer Based Instructional Strategies.

    Science.gov (United States)

    Emerson, Ian

    1988-01-01

    Compares the computer assisted instruction (CAI) tutorial with its non-computerized pedagogical roots: the Socratic Dialog with Skinner's Programmed Instruction. Tests the effectiveness of a CAI tutorial on diffusion and osmosis against four other interactive and non-interactive instructional strategies. Notes computer based strategies were…

  14. Ontology Partitioning: Clustering Based Approach

    Directory of Open Access Journals (Sweden)

    Soraya Setti Ahmed

    2015-05-01

    Full Text Available The semantic web goal is to share and integrate data across different domains and organizations. The knowledge representations of semantic data are made possible by ontology. As the usage of the semantic web increases, construction of semantic web ontologies also increases. Moreover, due to the monolithic nature of the ontology, various semantic web operations like query answering, data sharing, data matching, data reuse and data integration become more complicated as the size of the ontology increases. Partitioning the ontology is the key solution to handle this scalability issue. In this work, we propose a revision and an enhancement of the K-means clustering algorithm based on a new semantic similarity measure for partitioning a given ontology into high quality modules. The results show that our approach produces more meaningful clusters than the traditional K-means algorithm.
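
    K-means proper operates on vectors, so one common way to cluster with a purely pairwise semantic similarity measure is to feed the similarity matrix to a clustering algorithm directly. The sketch below uses scikit-learn's spectral clustering on a precomputed affinity as a stand-in for the paper's modified K-means; the 4x4 similarity matrix is invented.

    ```python
    # Clustering ontology concepts from a pairwise semantic similarity
    # matrix. Spectral clustering on a precomputed affinity stands in for
    # the paper's modified K-means; the 4x4 matrix is invented.
    import numpy as np
    from sklearn.cluster import SpectralClustering

    sim = np.array([[1.0, 0.9, 0.1, 0.2],     # symmetric concept-to-concept
                    [0.9, 1.0, 0.2, 0.1],     # semantic similarities
                    [0.1, 0.2, 1.0, 0.8],
                    [0.2, 0.1, 0.8, 1.0]])
    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(sim)
    print(labels)   # concepts {0,1} and {2,3} land in separate modules
    ```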

  15. Computational systems biology approaches to anti-angiogenic cancer therapeutics.

    Science.gov (United States)

    Finley, Stacey D; Chu, Liang-Hui; Popel, Aleksander S

    2015-02-01

    Angiogenesis is an exquisitely regulated process that is required for physiological processes and is also important in numerous diseases. Tumors utilize angiogenesis to generate the vascular network needed to supply the cancer cells with nutrients and oxygen, and many cancer drugs aim to inhibit tumor angiogenesis. Anti-angiogenic therapy involves inhibiting multiple cell types, molecular targets, and intracellular signaling pathways. Computational tools are useful in guiding treatment strategies, predicting the response to treatment, and identifying new targets of interest. Here, we describe progress that has been made in applying mathematical modeling and bioinformatics approaches to study anti-angiogenic therapeutics in cancer.

  16. Approaches to Computer Modeling of Phosphate Hide-Out.

    Science.gov (United States)

    1984-06-28

    phosphate acts as a buffer to keep pH at a value above which acid corrosion occurs and below which caustic corrosion becomes significant. Difficulties are ... ionization of dihydrogen phosphate: H2PO4(-) <=> H(+) + HPO4(2-), K (B-7); H(+) + OH(-) <=> H2O, 1/Kw (B-8); H2PO4(-) + OH(-) <=> HPO4(2-) + H2O, K/Kw (B-9). ... (NRL Memorandum Report 5361, Approaches to Computer Modeling of Phosphate Hide-Out, K. A. S. Hardy and J. C.)

  17. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian Pearl is the most valuable export product of French Polynesia, contributing over 61 million Euros to more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl deemed for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a custom-developed heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measurement to account for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
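
    The nucleus segmentation step can be illustrated with a circle detector. The sketch below uses OpenCV's Hough circle transform as a stand-in for the authors' own heuristic detector; the file name and all parameters are hypothetical.

    ```python
    # Nucleus segmentation via OpenCV's Hough circle transform, standing in
    # for the authors' own heuristic circle detector. The input file name
    # and every parameter below are hypothetical.
    import cv2
    import numpy as np

    img = cv2.imread("pearl_xray.png", cv2.IMREAD_GRAYSCALE)
    assert img is not None, "hypothetical input image not found"
    img = cv2.medianBlur(img, 5)              # suppress X-ray noise

    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=80, param2=40, minRadius=20, maxRadius=120)
    if circles is not None:
        x, y, r = np.round(circles[0, 0]).astype(int)   # strongest candidate
        print(f"nucleus centre ({x}, {y}), radius {r} px")
        # Nacre thickness along a ray = pearl boundary radius - nucleus radius.
    ```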

  18. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  19. Exploiting Self-organization in Bioengineered Systems: A Computational Approach.

    Science.gov (United States)

    Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S

    2017-01-01

    The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.

  20. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Junaid Ali Khan; Muhammad Asif Zahoor Raja; Ijaz Mansoor Qureshi

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. The solution is provided on a continuous finite time interval, unlike other numerical techniques of comparable accuracy. With the advent of neuroprocessors and digital signal processors the method becomes particularly interesting due to the expected essential gains in execution speed.
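
    A minimal rendering of the idea: a trial solution built from a small feed-forward network satisfies the initial condition by construction, and its weights are found by minimizing the unsupervised ODE residual. SciPy's differential evolution stands in for the paper's genetic-algorithm/pattern-search hybrid; the toy problem y' = -y is not from the paper.

    ```python
    # Neural-network trial solution trained on the unsupervised ODE residual.
    # differential_evolution stands in for the GA + pattern-search hybrid.
    # Toy problem (not from the paper): y' = -y, y(0) = 1 on [0, 2].
    import numpy as np
    from scipy.optimize import differential_evolution

    t = np.linspace(0.0, 2.0, 21)          # collocation points
    H = 5                                  # hidden units

    def trial(params, t):
        w, b, v = params[:H], params[H:2*H], params[2*H:]
        return 1.0 + t * (np.tanh(np.outer(t, w) + b) @ v)  # y(0)=1 built in

    def residual(params):
        eps = 1e-4                         # numerical derivative of the trial
        dy = (trial(params, t + eps) - trial(params, t - eps)) / (2 * eps)
        return np.mean((dy + trial(params, t)) ** 2)

    res = differential_evolution(residual, bounds=[(-3, 3)] * (3 * H), seed=0)
    print("max error vs exp(-t):", np.abs(trial(res.x, t) - np.exp(-t)).max())
    ```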

  1. Computational approach for calculating bound states in quantum field theory

    Science.gov (United States)

    Lv, Q. Z.; Norris, S.; Brennan, R.; Stefanovich, E.; Su, Q.; Grobe, R.

    2016-09-01

    We propose a nonperturbative approach to calculate bound-state energies and wave functions for quantum field theoretical models. It is based on the direct diagonalization of the corresponding quantum field theoretical Hamiltonian in an effectively discretized and truncated Hilbert space. We illustrate this approach for a Yukawa-like interaction between fermions and bosons in one spatial dimension and show where it agrees with the traditional method based on the potential picture and where it deviates due to recoil and radiative corrections. This method permits us also to obtain some insight into the spatial characteristics of the distribution of the fermions in the ground state, such as the bremsstrahlung-induced widening.
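
    The numerical core, building a Hamiltonian matrix in a discretized, truncated space and diagonalizing it, can be shown in miniature. The sketch below diagonalizes a one-particle grid Hamiltonian with an attractive Yukawa-like potential; it is a toy analogue, not the paper's fermion-boson field theory.

    ```python
    # Toy analogue of Hamiltonian diagonalization in a truncated space:
    # H = -(1/2) d^2/dx^2 + V(x) on a grid, with an attractive Yukawa-like
    # potential V(x) = -g exp(-m|x|). All parameters are toy choices.
    import numpy as np

    N, L = 400, 40.0
    x = np.linspace(-L / 2, L / 2, N)
    dx = x[1] - x[0]
    V = -1.0 * np.exp(-0.5 * np.abs(x))          # g = 1, m = 0.5

    # Kinetic term via the 3-point stencil (psi[i-1] - 2 psi[i] + psi[i+1])/dx^2
    H = (np.diag(1.0 / dx**2 + V)
         - np.diag(np.full(N - 1, 0.5 / dx**2), 1)
         - np.diag(np.full(N - 1, 0.5 / dx**2), -1))

    E, psi = np.linalg.eigh(H)
    print("ground-state energy:", E[0])          # negative => bound state
    ```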

  2. Reconceptualizing Pedagogical Usability of and Teachers' Roles in Computer Game-Based Learning in School

    Science.gov (United States)

    Tzuo, Pei-Wen; Ling, Jennifer Isabelle Ong Pei; Yang, Chien-Hui; Chen, Vivian Hsueh-Hua

    2012-01-01

    At present, methods for the optimal use of two approaches to computer game-based learning in school to enhance students' learning, namely, computer game play and game design, are obscure because past research has been devoted more to designing rather than evaluating the implementation of these approaches in school. In addition, most studies…

  3. Reconceptualizing Pedagogical Usability of and Teachers' Roles in Computer Game-Based Learning in School

    Science.gov (United States)

    Tzuo, Pei-Wen; Ling, Jennifer Isabelle Ong Pei; Yang, Chien-Hui; Chen, Vivian Hsueh-Hua

    2012-01-01

    At present, methods for the optimal use of two approaches to computer game-based learning in school to enhance students' learning, namely, computer game play and game design, are obscure because past research has been devoted more to designing rather than evaluating the implementation of these approaches in school. In addition, most studies…

  4. Computing negentropy based signatures for texture recognition

    Directory of Open Access Journals (Sweden)

    Daniela COLTUC

    2007-12-01

    Full Text Available The proposed method aims to provide a new tool for texture recognition. For this purpose, a set of texture samples is decomposed using the FastICA algorithm and characterized by a negentropy based signature. For recognition, the texture signatures are compared by means of the Minkowski distance. The recognition rates, computed for a set of 320 texture samples, show medium recognition accuracy, and the method may be further improved.
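
    A sketch of the signature pipeline under stated assumptions: FastICA decomposes the samples, each component is scored with the standard log-cosh negentropy approximation, and signatures are compared with a Minkowski distance. Random matrices stand in for real texture patches.

    ```python
    # Signature pipeline sketch: FastICA decomposition, log-cosh negentropy
    # per component, Minkowski-distance comparison. Random matrices stand
    # in for real texture patches.
    import numpy as np
    from sklearn.decomposition import FastICA
    from scipy.spatial.distance import minkowski

    rng = np.random.default_rng(0)

    def negentropy(y):
        """J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log cosh(u)."""
        y = (y - y.mean()) / y.std()
        nu = rng.standard_normal(100_000)
        return (np.mean(np.log(np.cosh(y))) - np.mean(np.log(np.cosh(nu)))) ** 2

    def signature(patches, n_components=8):
        S = FastICA(n_components=n_components, random_state=0).fit_transform(patches)
        return np.sort([negentropy(S[:, k]) for k in range(n_components)])

    texture_a = rng.laplace(size=(500, 64))   # stand-in patch matrices
    texture_b = rng.normal(size=(500, 64))
    print(minkowski(signature(texture_a), signature(texture_b), p=2))
    ```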

  5. Materiality in a Practice-Based Approach

    Science.gov (United States)

    Svabo, Connie

    2009-01-01

    Purpose: The paper aims to provide an overview of the vocabulary for materiality which is used by practice-based approaches to organizational knowing. Design/methodology/approach: The overview is theoretically generated and is based on the anthology Knowing in Organizations: A Practice-based Approach edited by Nicolini, Gherardi and Yanow. The…

  6. Wavelet based approach for facial expression recognition

    Directory of Open Access Journals (Sweden)

    Zaenal Abidin

    2015-03-01

    Full Text Available Facial expression recognition is one of the most active fields of research. Many facial expression recognition methods have been developed and implemented. Neural networks (NNs) have the capability to undertake such pattern recognition tasks. The key factor in the use of NNs is their characteristics: they are capable of learning and generalization, non-linear mapping, and parallel computation. Backpropagation neural networks (BPNNs) are the most commonly used approach. In this study, BPNNs were used as classifiers to categorize facial expression images into seven classes of expression: anger, disgust, fear, happiness, sadness, neutral and surprise. For the feature extraction task, three discrete wavelet transforms were used to decompose images, namely the Haar wavelet, the Daubechies (4) wavelet and the Coiflet (1) wavelet. To analyze the proposed method, a facial expression recognition system was built. The proposed method was tested on static images from the JAFFE database.
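
    The feature-extraction and classification stages map directly onto standard libraries. The sketch below pairs a 2-D Haar wavelet decomposition (via PyWavelets) with a backpropagation-trained MLP; random images stand in for the JAFFE faces.

    ```python
    # Wavelet features + backpropagation MLP. Random images stand in for
    # JAFFE faces; 'db4' or 'coif1' could replace 'haar' as in the study.
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    def wavelet_features(img, wavelet="haar", level=2):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        return coeffs[0].ravel()          # low-frequency approximation band

    X = np.array([wavelet_features(rng.random((64, 64))) for _ in range(70)])
    y = np.repeat(np.arange(7), 10)       # 7 expression classes, 10 images each

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```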

  7. A MODEL BASED ALGORITHM FOR FAST DPIV COMPUTING

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Traditional DPIV (Digital Particle Image Velocimetry) methods are mostly based on area correlation (Willert, C.E., 1991). Though proven to be very time-consuming and error prone, they are widely adopted because they are conceptually simple and easily implemented, and also because there are few alternatives. This paper proposes a non-correlation, conceptually new, fast and efficient approach for DPIV, which takes the nature of the flow into consideration. An Incompressible Affined Flow Model (IAFM) is introduced to describe a flow that incorporates rational restraints into the computation. This IAFM, combined with a modified optical flow method named Total Optical Flow Computation (TOFC), provides a linear system solution to DPIV. Experimental results on real images showed our method to be a very promising approach for DPIV.
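
    The incompressible affine model can be written as u(x) = A x + b with trace(A) = 0. The sketch below fits such a field to synthetic displacement samples by least squares with the constraint built in (a22 = -a11); it illustrates the model only, not the full TOFC optical-flow computation.

    ```python
    # Fit u(x) = A x + b with trace(A) = 0 (incompressibility) by least
    # squares on synthetic displacement samples; a22 = -a11 is built into
    # the design matrix. Illustrates the IAFM only, not full TOFC.
    import numpy as np

    rng = np.random.default_rng(0)
    pts = rng.uniform(-1, 1, size=(50, 2))                # particle positions
    A_true = np.array([[0.3, 0.1], [-0.2, -0.3]])         # trace-free
    b_true = np.array([0.5, -0.1])
    disp = pts @ A_true.T + b_true + 0.01 * rng.normal(size=pts.shape)

    # Unknowns p = (a11, a12, a21, b1, b2):
    #   u = a11*x + a12*y + b1,   v = a21*x - a11*y + b2
    x, y = pts[:, [0]], pts[:, [1]]
    zero, one = np.zeros_like(x), np.ones_like(x)
    M = np.vstack([np.hstack([x, y, zero, one, zero]),
                   np.hstack([-y, zero, x, zero, one])])
    p, *_ = np.linalg.lstsq(M, np.concatenate([disp[:, 0], disp[:, 1]]), rcond=None)
    print(np.round(p, 2))   # ~ [0.3, 0.1, -0.2, 0.5, -0.1]
    ```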

  8. ICOHR: intelligent computer based oral health record.

    Science.gov (United States)

    Peterson, L C; Cobb, D S; Reynolds, D C

    1995-01-01

    The majority of work on computer use in the dental field has focused on non-clinical practice management information needs. Very few computer-based dental information systems provide management support of the clinical care process, particularly with respect to quality management. Traditional quality assurance methods rely on the paper record and provide only retrospective analysis. Today, proactive quality management initiatives are on the rise. Computer-based dental information systems are being integrated into the care environment, actively providing decision support as patient care is being delivered. These new systems emphasize assessment and improvement of patient care at the time of treatment, thus building internal quality management into the caregiving process. The integration of real time quality management and patient care will be expedited by the introduction of an information system architecture that emulates the gathering and storage of clinical care data currently provided by the paper record. As a proposed solution to the problems associated with existing dental record systems, the computer-based patient record has emerged as a possible alternative to the paper dental record. The Institute of Medicine (IOM) recently conducted a study on improving the efficiency and accuracy of patient record keeping. As a result of this study, the IOM advocates the development and implementation of computer-based patient records as the standard for all patient care records. This project represents the ongoing efforts of The University of Iowa College of Dentistry's collaboration with the University of Uppsala Data Center, Uppsala, Sweden, on a computer-based patient dental record model. ICOHR (Intelligent Computer Based Oral Health Record) is an information system which brings together five important parts of the patient's dental record: medical and dental history; oral status; treatment planning; progress notes; and a Patient Care Database, generated from their

  9. Computational Model of Music Sight Reading: A Reinforcement Learning Approach

    CERN Document Server

    Yahya, Keyvan

    2010-01-01

    Although the music sight reading process has usually been studied from cognitive or neurological viewpoints, computational learning methods such as reinforcement learning have not yet been used to model such processes. In this paper, with regard to the essential properties of our specific problem, we consider the value function concept and indicate that the optimal policy can be obtained by the method we offer without getting involved in computing the complex value functions, which are in most cases inexact. Also, the algorithm we offer here is a PDE based algorithm associated with stochastic optimization programming, and we consider that in this case it is more applicable than normative algorithms like the temporal difference method.

  10. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overwhelming. All of the above mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems.   This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high performance ``green computing’’ systems. The recent evolutionary and general metaheuristic-based solutions ...

  11. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt;

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stay confidential; the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping
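
    The benchmarking model is a linear program (a DEA-style efficiency score). The sketch below computes such a score in the clear with scipy, i.e., without the SPDZ multiparty-computation layer that keeps the inputs secret in the actual system; the farm input/output data are invented.

    ```python
    # DEA-style efficiency score as a linear program, computed in the clear
    # (no SPDZ layer). Inputs/outputs for three hypothetical customers.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0], [4.0], [3.0]])   # resources used (one input)
    Y = np.array([[1.0], [2.0], [2.4]])   # production (one output)

    def efficiency(k):
        """min theta s.t. X^T lam <= theta x_k, Y^T lam >= y_k, lam >= 0."""
        n = len(X)
        c = np.r_[1.0, np.zeros(n)]                    # minimize theta
        A_ub = np.vstack([np.c_[-X[k], X.T],           # X^T lam - theta x_k <= 0
                          np.c_[np.zeros((1, 1)), -Y.T]])  # -Y^T lam <= -y_k
        b_ub = np.r_[0.0, -Y[k]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    print([round(efficiency(k), 3) for k in range(3)])   # [0.625, 0.625, 1.0]
    ```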

  12. Computational approaches for microalgal biofuel optimization: a review.

    Science.gov (United States)

    Koussa, Joseph; Chaiboonchoe, Amphun; Salehi-Ashtiani, Kourosh

    2014-01-01

    The increased demand and consumption of fossil fuels have raised interest in finding renewable energy sources throughout the globe. Much focus has been placed on optimizing microorganisms, primarily microalgae, to efficiently produce compounds that can substitute for fossil fuels. However, the path to achieving economic feasibility is likely to require strain optimization using available tools and technologies in the fields of systems and synthetic biology. Such approaches invoke a deep understanding of the metabolic networks of the organisms and their genomic and proteomic profiles. The advent of next generation sequencing and other high throughput methods has led to a major increase in availability of biological data. Integration of such disparate data can help define the emergent metabolic system properties, which is of crucial importance in addressing biofuel production optimization. Herein, we review major computational tools and approaches developed and used in order to potentially identify target genes, pathways, and reactions of particular interest to biofuel production in algae. As the use of these tools and approaches has not been fully implemented in algal biofuel research, the aim of this review is to highlight the potential utility of these resources toward their future implementation in algal research.

  13. Computational Approaches for Microalgal Biofuel Optimization: A Review

    Directory of Open Access Journals (Sweden)

    Joseph Koussa

    2014-01-01

    Full Text Available The increased demand and consumption of fossil fuels have raised interest in finding renewable energy sources throughout the globe. Much focus has been placed on optimizing microorganisms, primarily microalgae, to efficiently produce compounds that can substitute for fossil fuels. However, the path to achieving economic feasibility is likely to require strain optimization using available tools and technologies in the fields of systems and synthetic biology. Such approaches invoke a deep understanding of the metabolic networks of the organisms and their genomic and proteomic profiles. The advent of next generation sequencing and other high throughput methods has led to a major increase in availability of biological data. Integration of such disparate data can help define the emergent metabolic system properties, which is of crucial importance in addressing biofuel production optimization. Herein, we review major computational tools and approaches developed and used in order to potentially identify target genes, pathways, and reactions of particular interest to biofuel production in algae. As the use of these tools and approaches has not been fully implemented in algal biofuel research, the aim of this review is to highlight the potential utility of these resources toward their future implementation in algal research.

  14. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    Science.gov (United States)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomenon is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomenon is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected

  15. A Novel Approach of Load Balancing in Cloud Computing using Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Shabnam Sharma

    2016-02-01

    Full Text Available Nature inspired meta-heuristic algorithms have proved to be beneficial for solving real world combinatorial problems such as the minimum spanning tree, the knapsack problem, process planning problems, load balancing and many more. In this research work, existing meta-heuristic approaches are discussed. Due to its astonishing echolocation feature, the bat algorithm has drawn major attention in recent years and is applicable to different problems such as vehicle routing optimization, timetabling in railway optimization problems, and load balancing in cloud computing. Later, the biological behaviour of bats is explored and various areas of further research are discussed. Finally, the main objective of the research paper is to propose an algorithm for one of the most important applications, namely load balancing in a cloud computing environment.
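
    A minimal sketch of the bat algorithm applied to load balancing, with loudness and pulse-rate dynamics simplified to a greedy acceptance rule: each bat encodes a task-to-VM assignment and the fitness is the makespan. All sizes, rates, and task lengths are invented.

    ```python
    # Bat algorithm sketch for load balancing: each bat encodes a task-to-VM
    # assignment (positions rounded to VM indices); fitness is the makespan.
    # Loudness/pulse-rate dynamics are simplified to greedy acceptance.
    import numpy as np

    rng = np.random.default_rng(0)
    task_len = rng.uniform(1, 10, size=12)    # 12 tasks, 3 VMs
    n_vms, n_bats = 3, 20

    def makespan(pos):
        vm = np.clip(np.round(pos), 0, n_vms - 1).astype(int)
        return max(task_len[vm == v].sum() for v in range(n_vms))

    pos = rng.uniform(0, n_vms - 1, size=(n_bats, task_len.size))
    vel = np.zeros_like(pos)
    fit = np.array([makespan(p) for p in pos])
    best = pos[fit.argmin()].copy()

    for _ in range(200):
        freq = rng.uniform(0, 2, size=(n_bats, 1))       # echolocation frequency
        vel += (pos - best) * freq
        cand = pos + vel
        walk = rng.random(n_bats) < 0.5                  # local walk around best
        cand[walk] = best + 0.3 * rng.normal(size=(walk.sum(), task_len.size))
        cand_fit = np.array([makespan(p) for p in cand])
        better = cand_fit < fit
        pos[better], fit[better] = cand[better], cand_fit[better]
        best = pos[fit.argmin()].copy()

    print("best makespan:", round(fit.min(), 2),
          "lower bound:", round(task_len.sum() / n_vms, 2))
    ```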

  16. An Approach to Computer Modeling of Geological Faults in 3D and an Application

    Institute of Scientific and Technical Information of China (English)

    ZHU Liang-feng; HE Zheng; PAN Xin; WU Xin-cai

    2006-01-01

    3D geological modeling, one of the most important applications in geosciences of 3D GIS, forms the basis and is a prerequisite for visualized representation and analysis of 3D geological data. Computer modeling of geological faults in 3D is currently a topical research area. Structural modeling techniques for complex geological entities containing reverse faults are discussed and a series of approaches are proposed. The geological concepts involved in computer modeling and visualization of geological faults in 3D are explained, the types of geological fault data based on geological exploration are analyzed, and a normative database format for geological faults is designed. Two kinds of modeling approaches for faults are compared: a modeling technique of faults based on stratum recovery and a modeling technique of faults based on interpolation in subareas. A novel approach, called the Unified Modeling Technique for stratum and fault, is presented to solve the puzzling problems of reverse faults, syn-sedimentary faults and faults terminated within geological models. A case study of a fault model of bedrock in the Beijing Olympic Green District is presented to show the practical result of this method. The principle and process of computer modeling of geological faults in 3D are discussed and a series of applied technical proposals established. This work deepens our comprehension of geological phenomena and the modeling approach, and establishes the basic techniques of 3D geological modeling for practical applications in the geosciences.

  17. A Security Based Data Mining Approach in Data Grid

    CERN Document Server

    Vidhya, S

    2010-01-01

    Grid computing is the next logical step in distributed computing. The main objective of grid computing is an innovative approach to sharing resources such as CPU cycles, memory and software. Data grids provide transparent access to semantically related data resources in a heterogeneous system. The system incorporates both data mining and grid computing techniques, where the grid application reduces the time for sending results to several clients at the same time, and the data mining application on computational grids gives fast and sophisticated results to users. In this work, a grid based data mining technique is used to perform automatic allocation based on a probabilistic frequent sequence mining algorithm. It finds frequent sequences for many users at a time with accurate results. It also includes a trust management architecture for trust enhanced security.

  18. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  19. Computer Aided Interpretation Approach for Optical Tomographic Images

    CERN Document Server

    Klose, Christian D; Netz, Uwe; Beuthan, Juergen; Hielscher, Andreas H

    2010-01-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) of human finger joints in optical tomographic images. The image interpretation method employs a multi-variate signal detection analysis aided by a machine learning classification algorithm, called Self-Organizing Mapping (SOM). Unlike in previous studies, this allows for combining multiple physical image parameters, such as minimum and maximum values of the absorption coefficient, for identifying affected and not affected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging and inspection of optical tomographic images) were used as "ground truth" benchmarks to determine the performance of image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities while...

  20. Computational Approaches for Probing the Formation of Atmospheric Molecular Clusters

    DEFF Research Database (Denmark)

    Elm, Jonas

    the performance of computational strategies in order to identify a sturdy methodology, which should be applicable for handling various issues related to atmospheric cluster formation. Density functional theory (DFT) is applied to study individual cluster formation steps. Utilizing large test sets of numerous...... atmospheric clusters I evaluate the performance of different DFT functionals, with a specific focus on how to control potential errors associated with the calculation of single point energies and evaluation of the thermal contribution to the Gibbs free energy. Using DFT I study two candidate systems (glycine...... acid could thereby enhance the further growth of an existing cluster by condensing on the surface. Conclusively, I find that the performance of a single DFT functional can lead to an inadequate description of investigated atmospheric systems and thereby recommend a joint DFT (J-DFT) approach...

  1. A computational approach to mechanistic and predictive toxicology of pesticides

    DEFF Research Database (Denmark)

    Kongsbak, Kristine Grønning; Vinggaard, Anne Marie; Hadrup, Niels

    2014-01-01

    Emerging challenges of managing and interpreting large amounts of complex biological data have given rise to the growing field of computational biology. We investigated the applicability of an integrated systems toxicology approach on five selected pesticides to get an overview of their modes...... of action in humans, to group them according to their modes of action, and to hypothesize on their potential effects on human health. We extracted human proteins associated to prochloraz, tebuconazole, epoxiconazole, procymidone, and mancozeb and enriched each protein set by using a high confidence human...... protein interactome. Then, we explored the modes of action of the chemicals by integrating protein-disease information into the resulting protein networks. The dominant human adverse effects were reproductive disorders, followed by adrenal diseases. Our results indicated that prochloraz, tebuconazole

  2. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system level of understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain on diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high throughput computational methods tightly integrated with translational systems medicine. These approaches have applications for both basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  3. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  4. A computational approach to evaluate the androgenic affinity of iprodione, procymidone, vinclozolin and their metabolites.

    Directory of Open Access Journals (Sweden)

    Corrado Lodovico Galli

    Full Text Available Our research is aimed at devising and assessing a computational approach to evaluate the affinity of endocrine active substances (EASs) and their metabolites towards the ligand binding domain (LBD) of the androgen receptor (AR) in three distantly related species: human, rat, and zebrafish. We computed the affinity for all the selected molecules following a computational approach based on molecular modelling and docking. Three different classes of molecules with well-known endocrine activity (iprodione, procymidone, vinclozolin, and a selection of their metabolites) were evaluated. Our approach was demonstrated useful as the first step of chemical safety evaluation since ligand-target interaction is a necessary condition for exerting any biological effect. Moreover, a different sensitivity concerning the AR LBD was computed for the tested species (rat being the least sensitive of the three). This evidence suggests that, in order not to over-/under-estimate the risks connected with the use of a chemical entity, further in vitro and/or in vivo tests should be carried out only after an accurate evaluation of the most suitable cellular system or animal species. The introduction of in silico approaches to evaluate hazard can accelerate discovery and innovation with a lower economic effort than with a fully wet strategy.

  5. Milestones Toward Majorana-Based Quantum Computing

    Science.gov (United States)

    Aasen, David; Hell, Michael; Mishmash, Ryan V.; Higginbotham, Andrew; Danon, Jeroen; Leijnse, Martin; Jespersen, Thomas S.; Folk, Joshua A.; Marcus, Charles M.; Flensberg, Karsten; Alicea, Jason

    2016-07-01

    We introduce a scheme for preparation, manipulation, and read out of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system's excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and read out schemes as well.

  6. A computational approach for deciphering the organization of glycosaminoglycans.

    Directory of Open Access Journals (Sweden)

    Jean L Spencer

    Full Text Available BACKGROUND: Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. METHODOLOGY/PRINCIPAL FINDINGS: To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS. Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. CONCLUSIONS/SIGNIFICANCE: This model establishes the conceptual framework for a new class of

  7. A computational approach for deciphering the organization of glycosaminoglycans.

    Science.gov (United States)

    Spencer, Jean L; Bernanke, Joel A; Buczek-Thomas, Jo Ann; Nugent, Matthew A

    2010-02-23

    Increasing evidence has revealed important roles for complex glycans as mediators of normal and pathological processes. Glycosaminoglycans are a class of glycans that bind and regulate the function of a wide array of proteins at the cell-extracellular matrix interface. The specific sequence and chemical organization of these polymers likely define function; however, identification of the structure-function relationships of glycosaminoglycans has been met with challenges associated with the unique level of complexity and the nontemplate-driven biosynthesis of these biopolymers. To address these challenges, we have devised a computational approach to predict fine structure and patterns of domain organization of the specific glycosaminoglycan, heparan sulfate (HS). Using chemical composition data obtained after complete and partial digestion of mixtures of HS chains with specific degradative enzymes, the computational analysis produces populations of theoretical HS chains with structures that meet both biosynthesis and enzyme degradation rules. The model performs these operations through a modular format consisting of input/output sections and three routines called chainmaker, chainbreaker, and chainsorter. We applied this methodology to analyze HS preparations isolated from pulmonary fibroblasts and epithelial cells. Significant differences in the general organization of these two HS preparations were observed, with HS from epithelial cells having a greater frequency of highly sulfated domains. Epithelial HS also showed a higher density of specific HS domains that have been associated with inhibition of neutrophil elastase. Experimental analysis of elastase inhibition was consistent with the model predictions and demonstrated that HS from epithelial cells had greater inhibitory activity than HS from fibroblasts. This model establishes the conceptual framework for a new class of computational tools to use to assess patterns of domain organization within
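
    The chainmaker/chainbreaker idea can be caricatured in a few lines: generate candidate chains under a composition constraint, then cleave them with an in-silico enzyme rule and compare fragment statistics against digest data. Real HS biosynthesis and lyase specificities are far richer; every rule and number below is a placeholder.

    ```python
    # Caricature of chainmaker/chainbreaker: random chains of N-sulfated (S)
    # and N-acetylated (A) units, cleaved by a toy "enzyme" at S-S junctions.
    # Every rule and number here is a placeholder, not the paper's model.
    import random
    from collections import Counter

    random.seed(0)

    def chainmaker(length=40, frac_s=0.5):
        return "".join(random.choices("SA", weights=[frac_s, 1 - frac_s], k=length))

    def chainbreaker(chain, site="SS"):
        fragments, start = [], 0
        for i in range(len(chain) - 1):
            if chain[i:i + 2] == site:          # cleave between the two S units
                fragments.append(chain[start:i + 1])
                start = i + 1
        fragments.append(chain[start:])
        return fragments

    frags = chainbreaker(chainmaker())
    print(Counter(len(f) for f in frags))       # compare with digest data
    ```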

  8. An Organic Computing Approach to Self-organising Robot Ensembles

    Directory of Open Access Journals (Sweden)

    Sebastian Albrecht von Mammen

    2016-11-01

    Full Text Available Similar to the Autonomic Computing initiative, which has mainly been advancing techniques for self-optimisation focussing on computing systems and infrastructures, Organic Computing (OC) has been driving the development of system design concepts and algorithms for self-adaptive systems at large. Examples of application domains include traffic management and control, cloud services, communication protocols, and robotic systems. Such an OC system typically consists of a potentially large set of autonomous and self-managed entities, where each entity acts with a local decision horizon. By means of cooperation of the individual entities, the behaviour of the entire ensemble system is derived. In this article, we present our work on how autonomous, adaptive robot ensembles can benefit from OC technology. Our elaborations are aligned with the different layers of an observer/controller framework which provides the foundation for the individuals' adaptivity at the system design level. Relying on an extended Learning Classifier System (XCS) in combination with adequate simulation techniques, this basic system design empowers robot individuals to improve their individual and collaborative performances, e.g. by means of adapting to changing goals and conditions. Not only for the sake of generalisability, but also because of its enormous transformative potential, we stage our research in the domain of robot ensembles that are typically comprised of several quad-rotors and that organise themselves to fulfil spatial tasks such as maintenance of building facades or the collaborative search for mobile targets. Our elaborations detail the architectural concept, provide examples of individual self-optimisation as well as of the optimisation of collaborative efforts, and we show how the user can control the ensembles at multiple levels of abstraction. We conclude with a summary of our approach and an outlook on possible future steps.

  9. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  10. Secure information transfer based on computing reservoir

    Science.gov (United States)

    Szmoski, R. M.; Ferrari, F. A. S.; de S. Pinto, S. E.; Baptista, M. S.; Viana, R. L.

    2013-04-01

    There is a broad area of research to ensure that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on-off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system capability for sending messages, and we obtain expressions for the operating time. We demonstrate the system efficiency for a wide range of parameters. We find similarities between our method and reservoir computing.
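
    The message-retrieval step above rests on transfer entropy. As a hedged illustration (with synthetic binary series standing in for the paper's chaotic network signals), the sketch below computes a plug-in histogram estimate of TE(X -> Y):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy TE(X -> Y) for binary series:
    TE = sum p(y+, y, x) * log2( p(y+ | y, x) / p(y+ | y) ).
    A minimal sketch of the quantity; practical estimators need care
    with binning, history lengths and bias."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (yp, yc, xc), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yc, xc)]
        p_cond_hist = pairs_yy[(yp, yc)] / singles_y[yc]
        te += p_joint * np.log2(p_cond_full / p_cond_hist)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)                  # y copies x with one step delay
y[0] = 0
print("TE(X->Y) ~", transfer_entropy(x, y))   # close to 1 bit
print("TE(Y->X) ~", transfer_entropy(y, x))   # close to 0 bits
```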

  11. Prestandardisation Activities for Computer Based Safety Systems

    DEFF Research Database (Denmark)

    Taylor, J. R.; Bologna, S.; Ehrenberger, W.

    1981-01-01

    Questions of technical safety become more and more important. Due to the higher complexity of their functions, computer based safety systems have special problems. Researchers, producers, licensing personnel and customers have met on a European basis to exchange knowledge and formulate positions. The Commission of the European Community supports the work. Major topics comprise hardware configuration and self supervision, software design, verification and testing, documentation, system specification and concurrent processing. Preliminary results have been used for the draft of an IEC standard and for some...

  12. Computer-Based Grammar Instruction in an EFL Context: Improving the Effectiveness of Teaching Adverbial Clauses

    Science.gov (United States)

    Kiliçkaya, Ferit

    2015-01-01

    This study aims to find out whether there are any statistically significant differences in participants' achievements on three different types of instruction: computer-based instruction, teacher-driven instruction, and teacher-driven grammar supported by computer-based instruction. Each type of instruction follows the deductive approach. The…

  14. Real-time computing without stable states: a new framework for neural computation based on perturbations.

    Science.gov (United States)

    Maass, Wolfgang; Natschläger, Thomas; Markram, Henry

    2002-11-01

    A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
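
    For readers meeting the reservoir idea for the first time, the sketch below trains a linear readout on a small rate-based echo state network, a non-spiking relative of the liquid state machine named above; the reservoir size, spectral-radius scaling and delayed-recall task are illustrative choices, not the authors' spiking-circuit setup.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 300, 2000                       # reservoir size, sequence length

# Fixed random recurrent circuit, scaled toward the edge of stability.
W = rng.normal(0, 1, (N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(0, 1, (N, 1))

u = rng.uniform(-1, 1, T)              # time-varying input stream
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    # Transient, fading-memory dynamics: no stable attractor is needed.
    x = np.tanh(W @ x + W_in[:, 0] * u[t])
    states[t] = x

# Train a linear readout to recover the input from 3 steps ago,
# a task that exercises the reservoir's fading memory.
delay = 3
X, y = states[delay:], u[:-delay]
w_out = np.linalg.lstsq(X, y, rcond=None)[0]
print("readout correlation:", np.corrcoef(X @ w_out, y)[0, 1])
```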

  15. Using Case-Based Reasoning for detecting computer virus

    Directory of Open Access Journals (Sweden)

    Abdellatif Berkat

    2011-07-01

    Full Text Available The typical antivirus approach consists of waiting for a number of computers to be infected, detecting the virus, designing a solution, and delivering and deploying that solution. In such a situation, it is very difficult to prevent every machine from being compromised by viruses. In this paper, we propose a new method for detecting computer viruses that is based on the technique of Case-Based Reasoning (CBR). In this method: (1) even new viruses that do not exist in the database can be detected; (2) the updating of the virus database is done automatically without connecting to the Internet. Whenever a new virus is detected, it is automatically added to the database used by our application. This presents a major advantage.

  16. A Computational Differential Geometry Approach to Grid Generation

    CERN Document Server

    Liseikin, Vladimir D

    2007-01-01

    The process of breaking up a physical domain into smaller sub-domains, known as meshing, facilitates the numerical solution of partial differential equations used to simulate physical systems. This monograph gives a detailed treatment of applications of geometric methods to advanced grid technology. It focuses on and describes a comprehensive approach based on the numerical solution of inverted Beltramian and diffusion equations with respect to monitor metrics for generating both structured and unstructured grids in domains and on surfaces. In this second edition the author takes a more detailed and practice-oriented approach towards explaining how to implement the method by: Employing geometric and numerical analyses of monitor metrics as the basis for developing efficient tools for controlling grid properties. Describing new grid generation codes based on finite differences for generating both structured and unstructured surface and domain grids. Providing examples of applications of the codes to the genera...

  17. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis, providing an investigator with the information needed to decide whether manual analysis is required.

  18. A Bayesian Approach for Parameter Estimation and Prediction using a Computationally Intensive Model

    CERN Document Server

    Higdon, Dave; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2014-01-01

    Bayesian methods have been very successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where $\theta$ denotes the uncertain, best input setting. Hence the statistical model is of the form $y = \eta(\theta) + \epsilon$, where $\epsilon$ accounts for measurement error, and possibly other error sources. When non-linearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and non-standard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. While quite generally applicable, MCMC requires thousands, or even millions of evaluations of the physics model $\eta(\cdot)$. This is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we pr...
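
    A minimal sketch of the setup, assuming a deliberately cheap toy stand-in for $\eta(\cdot)$: random-walk Metropolis on $y = \eta(\theta) + \epsilon$. Every iteration calls the model once, which is precisely the cost that becomes prohibitive when $\eta$ takes hours per evaluation.

```python
import numpy as np

rng = np.random.default_rng(2)

def eta(theta):
    # Toy, monotone stand-in for an expensive physics model eta(theta).
    return np.exp(0.5 * theta)

theta_true, sigma = 1.3, 0.1
y_obs = eta(theta_true) + rng.normal(0, sigma, 20)

def log_post(theta):
    # Gaussian likelihood for y = eta(theta) + eps, flat prior on (-5, 5).
    if not -5 < theta < 5:
        return -np.inf
    return -0.5 * np.sum((y_obs - eta(theta)) ** 2) / sigma**2

# Random-walk Metropolis: one eta() evaluation per step.
theta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])          # discard burn-in
print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f}")
```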

  19. A Programmable Approach to Maintenance of a Finite Knowledge Base

    Institute of Scientific and Technical Information of China (English)

    LUAN ShangMin(栾尚敏); DAI GuoZhong(戴国忠); LI Wei(李未)

    2003-01-01

    In this paper, we present a programmable method of revising a finite clause set. We first present a procedure whose formal parameters are a consistent clause set Γ and a clause A and whose output is a set of minimal subsets of Γ which are inconsistent with A. The maximal consistent subsets can be generated from all minimal inconsistent subsets. We develop a prototype system based on the above procedure, and discuss the implementation of knowledge base maintenance. Finally, we compare the approach presented in this paper with other related approaches. The main characteristic of the approach is that it can be implemented by a computer program.
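
    For small propositional instances, the core procedure can be sketched by brute force: enumerate subsets of Γ by increasing size and keep those whose union with A is unsatisfiable, skipping supersets of subsets already found. The integer-literal encoding and the exhaustive satisfiability test below are illustrative simplifications, not the authors' implementation.

```python
from itertools import combinations, product

def satisfiable(clauses):
    """Brute-force SAT check for small clause sets.
    Clauses are frozensets of nonzero ints; -v negates variable v."""
    vars_ = sorted({abs(l) for c in clauses for l in c})
    for bits in product([False, True], repeat=len(vars_)):
        val = dict(zip(vars_, bits))
        if all(any(val[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def minimal_inconsistent_subsets(gamma, a):
    """All minimal subsets S of gamma with S ∪ {a} inconsistent,
    enumerated by increasing size (a toy version of the procedure)."""
    found = []
    for k in range(len(gamma) + 1):
        for subset in combinations(gamma, k):
            if any(set(m) <= set(subset) for m in found):
                continue                  # superset of a known culprit
            if not satisfiable(list(subset) + [a]):
                found.append(subset)
    return found

# Gamma = {p, q, p -> r} encoded with 1=p, 2=q, 3=r; A = ~r.
gamma = [frozenset({1}), frozenset({2}), frozenset({-1, 3})]
a = frozenset({-3})
print(minimal_inconsistent_subsets(gamma, a))  # only culprit: {p, p -> r}
```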

  20. Age Classification Based On Integrated Approach

    Directory of Open Access Journals (Sweden)

    Pullela. SVVSR Kumar

    2014-05-01

    Full Text Available The present paper presents a new age classification method by integrating the features derived from the Grey Level Co-occurrence Matrix (GLCM) with a new structural approach derived from four distinct LBPs (4-DLBP) on a 3 x 3 image. The present paper derives four distinct patterns called Left Diagonal (LD), Right Diagonal (RD), Vertical Centre (VC) and Horizontal Centre (HC) LBPs. For all the LBPs the central pixel value of the 3 x 3 neighbourhood is significant. That is the reason why, in the present research, LBP values are evaluated by comparing all 9 pixels of the 3 x 3 neighbourhood with the average value of the neighbourhood. The four distinct LBPs are grouped into two distinct LBPs. Based on these two distinct LBPs a GLCM is computed and features are evaluated to classify the human age into four age groups, i.e.: Child (0-15), Young adult (16-30), Middle aged adult (31-50) and Senior adult (>50). The co-occurrence features extracted from the 4-DLBP provide complete texture information about an image which is useful for classification. The proposed 4-DLBP reduces the size of the LBP from 6561 to 79 in the case of the original texture spectrum and from 2020 to 79 in the case of the Fuzzy Texture approach.
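
    A sketch of the thresholding step just described: all 9 pixels of each 3 x 3 neighbourhood are compared with the neighbourhood average, and four directional three-bit patterns are read off. The bit ordering and the final reduction to 79 patterns are not reproduced here; in the full method, a GLCM would then be computed over these code maps.

```python
import numpy as np

def four_dlbp_codes(img):
    """Per 3x3 neighbourhood: threshold all 9 pixels against the
    neighbourhood average and read off the left-diagonal, right-diagonal,
    vertical-centre and horizontal-centre patterns. An illustrative
    sketch, not the authors' exact encoding."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2, 4), dtype=np.uint8)
    for i in range(h - 2):
        for j in range(w - 2):
            block = img[i:i + 3, j:j + 3]
            bits = (block >= block.mean()).astype(np.uint8)
            ld = bits[0, 0], bits[1, 1], bits[2, 2]   # left diagonal
            rd = bits[0, 2], bits[1, 1], bits[2, 0]   # right diagonal
            vc = bits[0, 1], bits[1, 1], bits[2, 1]   # vertical centre
            hc = bits[1, 0], bits[1, 1], bits[1, 2]   # horizontal centre
            for k, trio in enumerate((ld, rd, vc, hc)):
                out[i, j, k] = trio[0] * 4 + trio[1] * 2 + trio[2]
    return out

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (8, 8)).astype(float)
codes = four_dlbp_codes(img)
print(codes.shape, codes.max())   # (6, 6, 4), codes in 0..7
```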

  1. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or even tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  2. A Moving Human Tracking Approach Based on Semantic Interaction

    Institute of Scientific and Technical Information of China (English)

    ZHOU Ning; FANG Bao-hong; SUN Fu-liang

    2007-01-01

    In order to deal with partial occlusion, a semantic interaction based moving human tracking approach is put forward. First, a human is modeled as moving blobs, which are described by blob descriptions. Then the moving blobs are updated and verified by projecting these descriptions. The approach exploits an improved fast Gauss transform and chooses source and target samples to reduce computational cost. Multiple moving humans can be tracked simply, and partial occlusion is handled well.

  3. An Approach for Composing Services Based on Environment Ontology

    Directory of Open Access Journals (Sweden)

    Guangjun Cai

    2013-01-01

    Full Text Available Service-oriented computing is revolutionizing modern computing paradigms with its aim to boost software reuse and enable business agility. Under this paradigm, new services are fabricated by composing available services. The problem arises of how to effectively and efficiently compose heterogeneous services given the high complexity of service composition. Based on an environment ontology, this paper introduces a requirement-driven service composition approach. We propose algorithms to decompose the requirement, rules to deduce the relations between services, and an algorithm for composing services. The empirical results and the comparison with other service composition methodologies show that this approach is feasible and efficient.

  4. Some Computational Aspects of the Brain Computer Interfaces Based on Inner Music

    Science.gov (United States)

    Klonowski, Wlodzimierz; Duch, Wlodzisław; Perovic, Aleksandar; Jovanovic, Aleksandar

    2009-01-01

    We discuss a BCI based on inner tones and inner music. We had some success in the detection of inner tones, the imagined tones which are not sung aloud. Rather easily imagined and controlled, they offer a set of states usable for BCI, with high information capacity and high transfer rates. Imagination of sounds or musical tunes could provide a multicommand language for BCI, as if using natural language. Moreover, this approach could be used to test musical abilities. Such a BCI could be superior when there is a need for a broader command language. Some computational estimates and unresolved difficulties are presented. PMID:19503802

  5. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main assumption of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of the solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions and interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (the particle method and cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers the problems of load balancing, collision detection, process synchronization and distributed control of the animation.

  6. HiPPI-based parallel computing

    Science.gov (United States)

    Jung, Charles C.

    1993-02-01

    The IBM Enhanced Clustered Fortran (ECF) advanced technology project combines parallel computing technology with a HiPPI-based LAN network. The ECF environment is a clustered, parallel computing environment which consists of IBM ES/9000 complexes and possibly other parallel machines connected by HiPPI. The ECF software, including the language processor, is independent of hardware architectures, operating systems, and the Fortran compiler and runtime library. The ECF software is highly portable because it is based on well-known, standard technology and transport protocols such as Remote Procedure Call (RPC), X/Open Transport Interface (XTI), and TCP/IP. The ECF software is transport-independent, and can accommodate other transport protocols concurrently. This paper describes the IBM ECF environment including the language extensions, the programming model, and the software layers and components. Also, this paper explains how to achieve portability and scalability. Lastly, this paper describes how effective task communication is accomplished in ECF through RPC, XTI, TCP/IP, and a customized enhancement over HiPPI. An analysis of network performance in terms of bottleneck conditions is presented, and empirical data indicating improved throughput is provided. Comparisons to alternative methodologies and technologies are also presented.

  7. THE DISCIPLINE «COMPUTER-BASED PRODUCT LIFECYCLE MANAGEMENT» AND ITS ROLE IN TECHNICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Alena V. Fedotova

    2015-01-01

    Full Text Available The paper is devoted to the creation of the complex discipline «Computer-based Product Lifecycle Management» in accordance with the competency approach, and to the specification of its role in the education process at a technical university.

  8. Computer Based Information Systems and the Middle Manager.

    Science.gov (United States)

    Why do some computer based information systems succeed while others fail? The report concludes with eleven recommended areas that middle management must understand in order to effectively use computer based information systems. (Modified author abstract)

  9. Computing Gröbner Bases within Linear Algebra

    Science.gov (United States)

    Suzuki, Akira

    In this paper, we present an alternative algorithm to compute Gröbner bases, which is based on computations in sparse linear algebra. Both S-polynomial computations and monomial reductions are carried out via linear algebra simultaneously in this algorithm. So it can be implemented in any computational system which can handle linear algebra. For a given ideal in a polynomial ring, it calculates a Gröbner basis along with the corresponding term order appropriately.

  10. Model based feature fusion approach

    NARCIS (Netherlands)

    Schwering, P.B.W.

    2001-01-01

    In recent years different sensor data fusion approaches have been analyzed and evaluated in the field of mine detection. In various studies comparisons have been made between different techniques. Although claims can be made for advantages for using certain techniques, until now there has been no si

  11. A uniform approach for programming distributed heterogeneous computing systems.

    Science.gov (United States)

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  12. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives, as well as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g. communication, critical thinking or attitudes). However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors that are specialized in argumentation. Intelligent Computer Systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates the state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map, showing benefits, limitations and relations between various concepts focusing on the duality “learning to argue – arguing to learn”.

  13. Job shop scheduling problem based on DNA computing

    Institute of Scientific and Technical Information of China (English)

    Yin Zhixiang; Cui Jianzhong; Yang Yan; Ma Ying

    2006-01-01

    To solve the job shop scheduling problem, a new approach, DNA computing, is used. The approach to solving job shop scheduling with DNA computing is divided into three stages. Finally, optimum solutions are obtained by sequencing. A small job shop scheduling problem is solved by DNA computing, and the "operations" of the computation were performed with standard protocols, such as ligation, synthesis, electrophoresis, etc. This work represents further evidence of the ability of DNA computing to solve NP-complete search problems.

  14. Computational Approaches for Modeling the Multiphysics in Pultrusion Process

    DEFF Research Database (Denmark)

    Carlone, P.; Baran, Ismet; Hattel, Jesper Henri;

    2013-01-01

    Pultrusion is a continuous manufacturing process used to produce high strength composite profiles with constant cross section. The mutual interactions between heat transfer, resin flow and cure reaction, variation in the material properties, and stress/distortion evolutions strongly affect the process dynamics together with the mechanical properties and the geometrical precision of the final product. In the present work, pultrusion process simulations are performed for a unidirectional (UD) graphite/epoxy composite rod including several processing physics, such as fluid flow, heat transfer, chemical reaction, and solid mechanics. The pressure increase and the resin flow at the tapered inlet of the die are calculated by means of a computational fluid dynamics (CFD) finite volume model. Several models, based on different homogenization levels and solution schemes, are proposed and compared...

  15. Crack Propagation in Honeycomb Cellular Materials: A Computational Approach

    Directory of Open Access Journals (Sweden)

    Marco Paggi

    2012-02-01

    Full Text Available Computational models based on the finite element method and linear or nonlinear fracture mechanics are herein proposed to study the mechanical response of functionally designed cellular components. It is demonstrated that, via a suitable tailoring of the properties of interfaces present in the meso- and micro-structures, the tensile strength can be substantially increased as compared to that of a standard polycrystalline material. Moreover, numerical examples regarding the structural response of these components when subjected to loading conditions typical of cutting operations are provided. As a general trend, the occurrence of tortuous crack paths is highly favorable: stable crack propagation can be achieved in case of critical crack growth, whereas an increased fatigue life can be obtained for a sub-critical crack propagation.

  16. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

    Full Text Available People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces relies on the dimensions of both the person and her/his operated objects. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
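
    Steps (1) and (2) can be illustrated with circular obstacles: any pair whose clearance is smaller than the user dimension forms an inaccessible gap, and gap-connected obstacles are merged into groups with union-find. The circular shapes and single width parameter are simplifications for the sketch; the paper works with general polygons.

```python
import numpy as np

def group_obstacles(centers, radii, user_width):
    """Steps (1)-(2) of the approach: find gaps narrower than the user
    dimension and merge the obstacles they join (union-find)."""
    n = len(centers)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            gap = np.linalg.norm(centers[i] - centers[j]) - radii[i] - radii[j]
            if gap < user_width:            # inaccessible gap: merge groups
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

centers = np.array([[0.0, 0.0], [1.5, 0.0], [8.0, 8.0]])
radii = np.array([0.5, 0.5, 1.0])
print(group_obstacles(centers, radii, user_width=0.8))  # [[0, 1], [2]]
```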

  17. Parallel computing-based sclera recognition for human identification

    Science.gov (United States)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

    Compared to iris recognition, sclera recognition using a line descriptor can achieve comparable recognition accuracy in visible wavelengths. However, this method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce the sclera recognition time. We define a new descriptor to which the information of the KD tree structure and the sclera edge is added. The registration and matching task is divided into subtasks of various sizes according to their computational complexities. Affine transform parameters are generated by searching the KD tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can dramatically improve the sclera matching speed by hundreds of times without decreasing accuracy.

  18. Intelligent Image Based Computer Aided Education (IICAE)

    Science.gov (United States)

    David, Amos A.; Thiery, Odile; Crehange, Marion

    1989-03-01

    Artificial Intelligence (AI) has found its way into Computer Aided Education (CAE), and several systems have been constructed to demonstrate its advantages. We believe that images (graphic or real) play an important role in learning. However, the use of images beyond mere illustration makes applications such as AI necessary. We shall develop the application of AI in image-based CAE and briefly present the system under construction that embodies our concept. We shall also elaborate a methodology for constructing such a system. Furthermore, we shall briefly present the pedagogical and psychological activities in a learning process. Under the pedagogical and psychological aspects of learning, we shall develop areas such as the importance of the image in learning, both as a pedagogical object as well as a means for obtaining psychological information about the learner. We shall develop the learner's model, its use, what to build into it and how. Under the application of AI in image-based CAE, we shall develop the importance of AI in exploiting the knowledge base in the learning environment and its application as a means of implementing pedagogical strategies.

  19. A modular approach to computer-aided auscultation: analysis and parametric characterization of murmur acoustic qualities.

    Science.gov (United States)

    Shen, Chia-Hsuan; Choy, Fred K; Chen, Yuerong; Wang, Shengyong

    2013-07-01

    In the present work, a modularized approach to computer-aided auscultation based on the traditional cardiac auscultation of murmur is proposed. Under such an approach, the present paper concerns the task of evaluating murmur acoustic quality character. The murmurs were analyzed in their time-series representation, frequency representation as well as time-frequency representation, allowing extraction of interpretable features based on their signal structural and spectral characters. The features were evaluated using scatter plots, receiver operating characteristic curves (ROC), and numerical experiments using a KNN classifier. The possible physiological and hemodynamical associations with the feature set are made. The implication and advantage of the modular approach are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
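
    To make the modular idea concrete, the sketch below pairs two hand-picked interpretable features (a spectral centroid and a coarse envelope-shape ratio, illustrative choices rather than the paper's feature set) with a plain k-nearest-neighbour vote of the kind used in the numerical experiments.

```python
import numpy as np

def murmur_features(signal, fs):
    """Two interpretable acoustic-quality features (illustrative only):
    spectral centroid, and an envelope ratio separating decrescendo
    (>1) from crescendo (<1) shapes."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    centroid = np.sum(freqs * spec) / np.sum(spec)
    env = np.abs(signal)
    half = len(env) // 2
    shape = env[:half].mean() / (env[half:].mean() + 1e-12)
    return np.array([centroid, shape])

def knn_predict(train_X, train_y, x, k=3):
    # Plain k-nearest-neighbour majority vote over feature vectors.
    d = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

fs = 2000
t = np.arange(0, 1, 1 / fs)
soft = np.sin(2 * np.pi * 80 * t) * np.exp(-3 * t)          # low, decrescendo
harsh = np.sin(2 * np.pi * 220 * t) * np.exp(3 * (t - 1))   # higher, crescendo
X = np.array([murmur_features(s, fs) for s in (soft, harsh)])
y = np.array([0, 1])
print(knn_predict(X, y, murmur_features(soft, fs), k=1))    # -> 0
```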

  20. A Computational Approach for Analyzing and Detecting Emotions in Arabic Text

    Directory of Open Access Journals (Sweden)

    Amira F. El Gohary, Torky I. Sultan, Maha A. Hana, Mohamed M. El Dosoky

    2013-05-01

    Full Text Available The field of Affective Computing (AC) expects to narrow the communicative gap between the highly emotional human and the emotionally challenged computer by developing computational systems that recognize and respond to the affective states of the user. Affect-sensitive interfaces are being developed in a number of domains, including gaming, mental health, and learning technologies. Emotions are part of human life. Recently, interest has been growing among researchers to find ways of detecting subjective information used in blogs and other online social media. This paper is concerned with the automatic detection of emotions in Arabic text. This construction is based on a moderate-sized Arabic emotion lexicon used to annotate Arabic children's stories for the six basic emotions: joy, fear, sadness, anger, disgust, and surprise. Our approach achieves 65% accuracy for emotion detection in Arabic text.

  1. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy above hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for the experimental efforts stabilizing new materials and/or polymorphs through epitaxy. The current screening algorithm and data are being integrated within the Materials Project online framework and hence will be publicly available.

  2. Computer-aided interpretation approach for optical tomographic images

    Science.gov (United States)

    Klose, Christian D.; Klose, Alexander D.; Netz, Uwe J.; Scheel, Alexander K.; Beuthan, Jürgen; Hielscher, Andreas H.

    2010-11-01

    A computer-aided interpretation approach is proposed to detect rheumatoid arthritis (RA) in human finger joints using optical tomographic images. The image interpretation method employs a classification algorithm that makes use of a so-called self-organizing mapping scheme to classify fingers as either affected or unaffected by RA. Unlike in previous studies, this allows for combining multiple image features, such as minimum and maximum values of the absorption coefficient, for identifying affected and unaffected joints. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index, and mutual information. Different methods (i.e., clinical diagnostics, ultrasound imaging, magnetic resonance imaging, and inspection of optical tomographic images) were used to produce ground truth benchmarks to determine the performance of image interpretations. Using data from 100 finger joints, findings suggest that some parameter combinations lead to higher sensitivities, while others lead to higher specificities when compared to single-parameter classifications employed in previous studies. Maximum performances are reached when combining the minimum/maximum ratio of the absorption coefficient and image variance. In this case, sensitivities and specificities over 0.9 can be achieved. These values are much higher than those obtained when only single-parameter classifications were used, where sensitivities and specificities remained well below 0.8.
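
    The reported performance measures are simple to compute from a binary confusion matrix; the snippet below derives sensitivity, specificity and the Youden index J = Se + Sp - 1 (the 100-joint labels are simulated for the example, not the study's data).

```python
import numpy as np

def classifier_metrics(y_true, y_pred):
    """Sensitivity, specificity and Youden index J = Se + Sp - 1."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens + spec - 1

# Simulated 100 joints: 40 affected (1), 60 unaffected (0).
rng = np.random.default_rng(0)
y_true = np.array([1] * 40 + [0] * 60)
y_pred = y_true.copy()
flip = rng.choice(100, 8, replace=False)      # 8 misclassifications
y_pred[flip] = 1 - y_pred[flip]
print("Se=%.2f Sp=%.2f J=%.2f" % classifier_metrics(y_true, y_pred))
```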

  3. Lexical is as lexical does: computational approaches to lexical representation

    Science.gov (United States)

    Woollams, Anna M.

    2015-01-01

    In much of neuroimaging and neuropsychology, regions of the brain have been associated with ‘lexical representation’, with little consideration as to what this cognitive construct actually denotes. Within current computational models of word recognition, there are a number of different approaches to the representation of lexical knowledge. Structural lexical representations, found in original theories of word recognition, have been instantiated in modern localist models. However, such a representational scheme lacks neural plausibility in terms of economy and flexibility. Connectionist models have therefore adopted distributed representations of form and meaning. Semantic representations in connectionist models necessarily encode lexical knowledge. Yet when equipped with recurrent connections, connectionist models can also develop attractors for familiar forms that function as lexical representations. Current behavioural, neuropsychological and neuroimaging evidence shows a clear role for semantic information, but also suggests some modality- and task-specific lexical representations. A variety of connectionist architectures could implement these distributed functional representations, and further experimental and simulation work is required to discriminate between these alternatives. Future conceptualisations of lexical representations will therefore emerge from a synergy between modelling and neuroscience. PMID:25893204

  4. Milestones Toward Majorana-Based Quantum Computing

    Directory of Open Access Journals (Sweden)

    David Aasen

    2016-08-01

    Full Text Available We introduce a scheme for preparation, manipulation, and read out of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system’s excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and read out schemes as well.

  5. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    Science.gov (United States)

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
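
    The DVFS compromise described above follows from the standard dynamic-power model P ≈ C·V²·f together with an execution time that scales as 1/f for a fixed cycle count; the toy table below (with made-up voltage/frequency pairs, not real hardware levels) shows energy falling and makespan growing as the operating point is lowered.

```python
# Dynamic power under DVFS scales roughly as P = C * V^2 * f, while the
# time to finish a fixed workload scales as 1/f, so slowing down trades
# makespan for energy. The voltage/frequency pairs are invented for the
# illustration, not taken from the paper or any real processor.
levels = [  # (supply voltage V, clock frequency in GHz)
    (1.30, 1.00),
    (1.10, 0.80),
    (0.95, 0.60),
]
cycles, capacitance = 1e9, 1e-9        # fixed workload, switched capacitance
for v, f in levels:
    time = cycles / (f * 1e9)                       # seconds to finish
    energy = capacitance * v**2 * (f * 1e9) * time  # joules consumed
    print(f"V={v:.2f} f={f:.2f}GHz: time={time:.2f}s energy={energy:.3f}J")
```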

  6. Approaches to Affective Computing and Learning towards Interactive Decision Making in Process Control Engineering

    Institute of Scientific and Technical Information of China (English)

    SU Chong; LI Hong-Guang

    2013-01-01

    Numerous multi-objective decision-making problems related to industrial process control engineering, such as control and operation performance evaluation, are being resolved through human-computer interactions. With regard to the problems that traditional interactive evolutionary computing approaches suffer from, i.e., limited searching ability and humans' strong subjectivity in multi-objective-attribute decision-making, a novel affective computing and learning solution adapted to the human-computer interaction mechanism is explicitly proposed. Therein, a kind of stimulating-response based affective computing model (STAM) is constructed, along with quantitative relations between the affective space and humans' subjective preferences. Thereafter, affective learning strategies based on genetic algorithms are introduced, which are responsible for gradually grasping the essentials in humans' subjective judgments in decision-making, reducing humans' subjective fatigue as well as making the decisions more objective and scientific. The affective learning algorithm's complexity and convergence analysis are shown in Appendices A and B. To exemplify applications of the proposed methods, ad-hoc test functions and PID parameter tuning are suggested as case studies, giving rise to satisfying results and showing the validity of the contributions.

  7. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  8. Development of a personal-computer-based intelligent tutoring system

    Science.gov (United States)

    Mueller, Stephen J.

    1988-01-01

    A large number of Intelligent Tutoring Systems (ITSs) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. A prototype ITS for tutoring students in the use of the CLIPS language, CLIPSIT (CLIPS Intelligent Tutor), was developed. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. While most ITSs have been developed on powerful workstations, CLIPSIT is designed for use on the IBM PC/XT/AT personal computer family (and their clones). There are many issues to consider when developing an ITS on a personal computer, such as the teaching strategy, user interface, knowledge representation, and program design methodology. Based on experiences in developing CLIPSIT, results on how to address some of these issues are reported and approaches are suggested for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.

  9. A Monomial Chaos Approach for Efficient Uncertainty Quantification in Computational Fluid Dynamics

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2006-01-01

    A monomial chaos approach is proposed for efficient uncertainty quantification in nonlinear computational problems. Propagating uncertainty through nonlinear equations can still be computationally intensive for existing uncertainty quantification methods. It usually results in a set of nonlinear equ

  10. Rough K-means Outlier Factor Based on Entropy Computation

    Directory of Open Access Journals (Sweden)

    Djoko Budiyanto Setyohadi

    2014-07-01

    Full Text Available Many studies of outlier detection have been developed based on the cluster-based outlier detection approach, since it does not need any prior knowledge of the dataset. However, previous studies only regard the outlier factor computation with respect to a single point or a small cluster, which reflects how far it deviates from a common cluster. Furthermore, all objects within an outlier cluster are assumed to be similar. Intuitively, the outlier objects can be grouped into outlier clusters, and the outlier factors of the objects within an outlier cluster should differ gradually; it is not natural if the outlierness of each object within an outlier cluster is the same. This study proposes a new outlier detection method based on a hybrid of the Rough K-Means clustering algorithm and entropy computation. We introduce an outlier degree measure, namely the entropy outlier factor, for cluster-based outlier detection. The proposed algorithm sequentially finds the outlier cluster and calculates the outlier factor degree of the objects within the outlier cluster. Each object within the outlier cluster is evaluated using cluster-based entropy with respect to the whole cluster. The performance of the algorithm has been tested on four UCI benchmark data sets and shows that it outperforms existing methods, especially in detection rate.
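
    One simple way to obtain graded outlier factors inside a single cluster, in the spirit of the entropy outlier factor, is to score each object by the self-information of the attribute bins it occupies; the binning and scoring below are an illustrative reading of the idea, not the paper's exact formula.

```python
import numpy as np

def entropy_outlier_factors(cluster, bins=5):
    """Graded outlier factor per object: total self-information, -log2 p,
    of the bins the object's attribute values fall into. Rare attribute
    values give high scores, so members of the same outlier cluster get
    different degrees of outlierness."""
    n, _ = cluster.shape
    scores = np.zeros(n)
    for col in cluster.T:
        edges = np.histogram_bin_edges(col, bins)
        idx = np.clip(np.digitize(col, edges), 1, bins)  # bin per object
        counts = np.bincount(idx, minlength=bins + 1)
        scores += -np.log2(counts[idx] / n)              # self-information
    return scores

rng = np.random.default_rng(3)
cluster = rng.normal(0, 1, (20, 2))
cluster[0] = [6.0, 6.0]            # deviates most from the rest
print("most outlying index:", entropy_outlier_factors(cluster).argmax())  # 0
```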

  11. New Approaches to the Computer Simulation of Amorphous Alloys: A Review

    Directory of Open Access Journals (Sweden)

    Fernando Alvarez-Ramirez

    2011-04-01

    Full Text Available In this work we review our new methods to computer generate amorphous atomic topologies of several binary alloys: SiH, SiN, CN; binary systems based on group IV elements like SiC; the GeSe2 chalcogenide; aluminum-based systems: AlN and AlSi; and the CuZr amorphous alloy. We use an ab initio approach based on density functionals and computationally thermally-randomized periodically-continued cells with at least 108 atoms. The computational thermal process to generate the amorphous alloys is the undermelt-quench approach, or one of its variants, which consists in linearly heating the samples to just below their melting (or liquidus) temperatures, and then linearly cooling them afterwards. These processes are carried out from initial crystalline conditions using short and long time steps. We find that a time step four times the default is adequate for most of the simulations. Radial distribution functions (partial and total) are calculated and compared whenever possible with experimental results, and the agreement is very good. For some materials we report studies of the effect of the topological disorder on their electronic and vibrational densities of states and on their optical properties.

  12. Interactive computer-assisted approach for evaluation of ultrastructural cilia abnormalities

    Science.gov (United States)

    Palm, Christoph; Siegmund, Heiko; Semmelmann, Matthias; Grafe, Claudia; Evert, Matthias; Schroeder, Josef A.

    2016-03-01

    Introduction - Diagnosis of abnormal cilia function is based on the ultrastructural analysis of axoneme defects, especially the features of the inner and outer dynein arms, which are the motors of ciliary motility. Sub-optimal biopsy material, methodical, and intrinsic electron microscopy factors pose difficulties in the evaluation of ciliary defects. We present a computer-assisted approach based on state-of-the-art image analysis and object recognition methods, yielding a time-saving and efficient diagnosis of cilia dysfunction. Method - The presented approach is based on a pipeline of basic image processing methods like smoothing, thresholding and ellipse fitting. However, the integration of application-specific knowledge results in robust segmentations even in cases of image artifacts. The method is built hierarchically, starting with the detection of cilia within the image, followed by the detection of nine doublets within each analyzable cilium, and ending with the detection of the dynein arms of each doublet. The process is concluded by a rough classification of the dynein arms as a basis for a computer-assisted diagnosis. Additionally, the interaction possibilities are designed in such a way that the results are still reproducible given the completion report. Results - A qualitative evaluation showed reasonable detection results for cilia, doublets and dynein arms. However, since a ground truth is missing, the variation of the computer-assisted diagnosis should be within the subjective bias of human diagnosticians. The results of a first quantitative evaluation with five human experts and six images with 12 analyzable cilia showed that, with default parameterization, 91.6% of the cilia and 98% of the doublets were found. The computer-assisted approach rated 66% of those inner and outer dynein arms correctly where all human experts agree. However, the quality of the dynein arm classification in particular may be improved in future work.

  13. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

    Granular computing is a new intelligent computing theory based on the partition of problem concepts. It is an important problem in Rough Set theory to process incomplete information systems directly. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.

  14. Computer-Based Cognitive Tools: Description and Design.

    Science.gov (United States)

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  15. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Science.gov (United States)

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  17. Evaluating face trustworthiness: a model based approach.

    Science.gov (United States)

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response--as the untrustworthiness of faces increased, so did the amygdala response. Areas in the left and right putamen, with the latter area extending into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic--strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension.

  18. Energy based Efficient Resource Scheduling in Green Computing

    Directory of Open Access Journals (Sweden)

    B. Vasumathi

    2015-11-01

    Full Text Available Cloud Computing is an evolving area of efficient utilization of computing resources. Data centers accommodating Cloud applications consume massive quantities of energy, contributing to high operating expenditures and carbon footprints in the atmosphere. Hence, Green Cloud computing solutions are required not only to save energy for the environment but also to reduce operating costs. In this paper, we emphasize the development of an energy-based resource scheduling framework and present an algorithm that considers the synergy between various data center infrastructures (i.e., software, hardware, etc.) and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds; and (b) energy-efficient resource allocation strategies and a scheduling algorithm considering Quality of Service (QoS) outlooks. The performance of the proposed algorithm has been evaluated against existing energy-based scheduling algorithms. The experimental results demonstrate that this approach is effective in minimizing the cost and energy consumption of Cloud applications, thus moving towards the achievement of Green Clouds.

  19. A computational intelligence approach to the Mars Precision Landing problem

    Science.gov (United States)

    Birge, Brian Kent, III

    Various proposed Mars missions, such as the Mars Sample Return Mission (MRSR) and the Mars Smart Lander (MSL), require precise re-entry terminal position and velocity states. This is to achieve mission objectives including rendezvous with a previously landed mission, or reaching a particular geographic landmark. The current state-of-the-art footprint is on the order of kilometers. For this research a Mars Precision Landing is achieved with a landed footprint of no more than 100 meters, for a set of initial entry conditions representing worst-guess dispersions. Obstacles to reducing the landed footprint include trajectory dispersions due to initial atmospheric entry conditions (entry angle, parachute deployment height, etc.), environment (wind, atmospheric density, etc.), parachute deployment dynamics, unavoidable injection error (propagated error from launch on), etc. Weather and atmospheric models have been developed. Three descent scenarios have been examined. First, terminal re-entry is achieved via a ballistic parachute with concurrent thrusting events while on the parachute, followed by a gravity turn. Second, terminal re-entry is achieved via a ballistic parachute followed by a gravity turn to hover and then thrust vectoring to the desired location. Third, a guided parafoil approach followed by vectored thrusting to reach terminal velocity is examined. The guided parafoil is determined to be the best architecture. The purpose of this study is to examine the feasibility of using a computational intelligence strategy to facilitate precision planetary re-entry, specifically to take an approach that is somewhat more intuitive and less rigid, and see where it leads. The test problems used for all research are variations on proposed Mars landing mission scenarios developed by NASA. A relatively recent method of evolutionary computation is Particle Swarm Optimization (PSO), which can be considered to be in the same general class as Genetic Algorithms. An improvement over

  20. Peptide-Based Vaccinology: Experimental and Computational Approaches to Target Hypervariable Viruses through the Fine Characterization of Protective Epitopes Recognized by Monoclonal Antibodies and the Identification of T-Cell-Activating Peptides

    Directory of Open Access Journals (Sweden)

    Matteo Castelli

    2013-01-01

    Full Text Available Defining immunogenic domains of viral proteins capable of eliciting a protective immune response is crucial in the development of novel epitope-based prophylactic strategies. This is particularly important for the selective targeting of conserved regions shared among hypervariable viruses. Studying postinfection and postimmunization sera, as well as cloning and characterization of monoclonal antibodies (mAbs), still represents the best approach to identify protective epitopes. In particular, a protective mAb directed against conserved regions can play a key role in immunogen design and in human therapy as well. Experimental approaches aiming to characterize protective mAb epitopes or to identify T-cell-activating peptides are often burdened by technical limitations and can require a long time to be correctly addressed. Thus, in the last decade many epitope predictive algorithms have been developed. These algorithms are continually evolving, and their use to address the empirical research is widely increasing. Here, we review several strategies based on experimental techniques alone or addressed by in silico analysis that are frequently used to predict immunogens to be included in novel epitope-based vaccine approaches. We will list the main strategies aiming to design a new vaccine preparation conferring the protection of a neutralizing mAb combined with an effective cell-mediated response.

  1. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    Energy Technology Data Exchange (ETDEWEB)

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor-based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer-assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in one-step 99mTc labeling kit form and were studied by BNL for detailed in vivo imaging using various animal models of human disease. Two main peptides designed using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD-based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  2. A Stochastic Approach for Blurred Image Restoration and Optical Flow Computation on Field Image Sequence

    Institute of Scientific and Technical Information of China (English)

    高文; 陈熙霖

    1997-01-01

    The blur in target images caused by camera vibration due to robot motion or hand shaking, and by objects moving in the background scene, is difficult to deal with in computer vision systems. In this paper, the authors study the relation model between motion and blur in the case of object motion in a video image sequence, and work out a practical computation algorithm for both motion analysis and blurred image restoration. Combining general optical flow with a stochastic process, the paper presents an approach by which the motion velocity can be calculated from blurred images. Conversely, the blurred image can also be restored using the obtained motion information. To overcome the small-motion limitation of general optical flow computation, a multiresolution optical flow algorithm based on MAP estimation is proposed. For restoring the blurred image, an iterative algorithm and the obtained motion velocity are used. The experiments show that the proposed approach works well for both motion velocity computation and blurred image restoration.
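
    The motion-to-blur relationship lends itself to a compact sketch: once a motion velocity has been estimated, it fixes a linear blur kernel, and the image can be restored by iterative deconvolution. The snippet below uses classic Richardson-Lucy iteration as a stand-in for the paper's MAP-based scheme; the kernel length and iteration count are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def motion_psf(length):
        """Horizontal linear motion-blur kernel of a given pixel length
        (the length would follow from the estimated motion velocity and
        the exposure time)."""
        psf = np.zeros((1, length))
        psf[0, :] = 1.0 / length
        return psf

    def richardson_lucy(blurred, psf, iters=30):
        """Classic Richardson-Lucy deconvolution: not the paper's exact
        MAP scheme, but the same restore-from-known-motion idea."""
        est = np.full_like(blurred, blurred.mean())
        psf_mirror = psf[::-1, ::-1]
        for _ in range(iters):
            ratio = blurred / (convolve(est, psf) + 1e-12)
            est *= convolve(ratio, psf_mirror)
        return est

    # Toy demo: blur a synthetic image, then restore it.
    img = np.zeros((32, 32)); img[12:20, 12:20] = 1.0
    psf = motion_psf(5)
    blurred = convolve(img, psf)
    restored = richardson_lucy(blurred, psf)
    print(np.abs(restored - img).mean())   # mean restoration error
    ```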

  3. Use of computational modeling approaches in studying the binding interactions of compounds with human estrogen receptors.

    Science.gov (United States)

    Wang, Pan; Dang, Li; Zhu, Bao-Ting

    2016-01-01

    Estrogens have a whole host of physiological functions in many human organs and systems, including the reproductive, cardiovascular, and central nervous systems. Many naturally occurring compounds with estrogenic or antiestrogenic activity are present in our environment and food sources. Synthetic estrogens and antiestrogens are also important therapeutic agents. At the molecular level, estrogen receptors (ERs) mediate most of the well-known actions of estrogens. Given recent advances in computational modeling tools, it is now highly practical to use these tools to study the interaction of human ERs with various types of ligands. There are two common categories of modeling techniques: one is quantitative structure-activity relationship (QSAR) analysis, which uses the structural information of the interacting ligands to predict the binding site properties of a macromolecule, and the other is molecular docking-based computational analysis, which uses the 3-dimensional structural information of both the ligands and the receptor to predict the binding interaction. In this review, we discuss recent results that employed these and other related computational modeling approaches to characterize the binding interaction of various estrogens and antiestrogens with the human ERs. These examples clearly demonstrate that computational modeling approaches, when used in combination with other experimental methods, are powerful tools that can precisely predict the binding interaction of various estrogenic ligands and their derivatives with the human ERs.
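
    Of the two modeling categories described, QSAR is the easier to miniaturize. The sketch below fits a linear QSAR model by least squares on a synthetic descriptor table; the descriptors, coefficients, and affinity values are all fabricated placeholders, since real ER work would use curated ligand data.

    ```python
    import numpy as np

    # Hypothetical toy dataset: rows are ligands, columns are molecular
    # descriptors (e.g., logP, molar volume, H-bond donor count); y is a
    # measured binding affinity (pIC50-like scale).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 3))
    true_w = np.array([1.2, -0.8, 0.3])
    y = X @ true_w + 5.0 + rng.normal(scale=0.1, size=40)

    # Fit a linear QSAR model by ordinary least squares.
    A = np.column_stack([X, np.ones(len(X))])     # add intercept column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Predict the affinity of a new (hypothetical) ligand.
    new_ligand = np.array([0.5, -1.0, 2.0, 1.0])  # descriptors + intercept
    print("coefficients:", w)
    print("predicted affinity:", new_ligand @ w)
    ```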

  4. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing high-throughput computing applications common in High Energy Physics work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: first, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.
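
    The queue-driven orchestration idea reduces to a small control loop: poll the batch queue, derive a target pool size, and reconcile. The sketch below is a hypothetical rendering of that loop; the function names, thresholds, and jobs-per-worker ratio are invented for illustration and are not the INFN Torino implementation.

    ```python
    import math

    def workers_needed(queued_jobs, running_jobs, jobs_per_worker=4,
                       min_workers=1, max_workers=50):
        """Hypothetical scaling rule: size the worker pool from the
        batch-queue load, clamped to a fixed range."""
        demand = math.ceil((queued_jobs + running_jobs) / jobs_per_worker)
        return max(min_workers, min(max_workers, demand))

    def reconcile(queue_status, current_pool, scale_to):
        """One iteration of the control loop: poll the queue, then ask
        the cloud layer (stubbed here) to grow or shrink the pool."""
        target = workers_needed(queue_status["queued"], queue_status["running"])
        if target != current_pool:
            scale_to(target)   # e.g., an OpenStack/EC2 API call in practice
        return target

    pool = reconcile({"queued": 37, "running": 12}, current_pool=5,
                     scale_to=lambda n: print(f"scaling pool to {n} workers"))
    ```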

  5. Aircraft Engine Noise Scattering by Fuselage and Wings: A Computational Approach

    Science.gov (United States)

    Stanescu, D.; Hussaini, M. Y.; Farassat, F.

    2003-01-01

    The paper presents a time-domain method for computation of sound radiation from aircraft engine sources to the far-field. The effects of nonuniform flow around the aircraft and scattering of sound by fuselage and wings are accounted for in the formulation. The approach is based on the discretization of the inviscid flow equations through a collocation form of the Discontinuous Galerkin spectral element method. An isoparametric representation of the underlying geometry is used in order to take full advantage of the spectral accuracy of the method. Large-scale computations are made possible by a parallel implementation based on message passing. Results obtained for radiation from an axisymmetric nacelle alone are compared with those obtained when the same nacelle is installed in a generic configuration, with and without a wing.
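
    The case for a spectral element discretization rests on spectral accuracy for smooth solutions. The following toy demonstrates that property with plain Fourier collocation, which is far simpler than the paper's discontinuous Galerkin spectral element method but shows the characteristic faster-than-algebraic error decay as resolution grows.

    ```python
    import numpy as np

    # Differentiate the smooth periodic function u(x) = exp(sin x) with
    # the FFT and compare to the exact derivative cos(x) * exp(sin x).
    for n in (8, 16, 32, 64):
        x = 2 * np.pi * np.arange(n) / n
        u = np.exp(np.sin(x))
        k = np.fft.fftfreq(n, d=1.0 / n) * 1j        # ik spectral multipliers
        du = np.real(np.fft.ifft(k * np.fft.fft(u)))
        exact = np.cos(x) * np.exp(np.sin(x))
        print(n, np.abs(du - exact).max())           # error collapses rapidly
    ```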

  6. A Fluid Dynamics Approach for the Computation of Non-linear Force-Free Magnetic Field

    Institute of Scientific and Technical Information of China (English)

    Jing-Qun Li; Jing-Xiu Wang; Feng-Si Wei

    2003-01-01

    Inspired by the analogy between the magnetic field and the velocity field of incompressible fluid flow, we propose a fluid dynamics approach for computing nonlinear force-free magnetic fields. This method has the advantage that the divergence-free condition is automatically satisfied, which is a sticky issue for many other algorithms, and we can take advantage of modern high-resolution algorithms to process the force-free magnetic field. Several tests have been made based on the well-known analytic solution proposed by Low & Lou. The numerical results are in satisfactory agreement with the analytic ones. It is suggested that the newly proposed method is promising for extrapolating the active-region or whole-Sun magnetic fields in the solar atmosphere from the observed vector magnetic field on the photosphere.
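
    Two properties define the fields this method targets: zero divergence and current aligned with the field. The sketch below checks both numerically on the ABC (Beltrami) field, a linear force-free field with curl B = B, using central differences; it is a verification toy, not the proposed fluid dynamics solver.

    ```python
    import numpy as np

    n = 64
    ax = np.linspace(0, 2 * np.pi, n)
    X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
    A = B = C = 1.0
    Bx = A * np.sin(Z) + C * np.cos(Y)      # ABC field components
    By = B * np.sin(X) + A * np.cos(Z)
    Bz = C * np.sin(Y) + B * np.cos(X)

    d = ax[1] - ax[0]
    dBx = np.gradient(Bx, d, edge_order=2)  # derivatives along x, y, z
    dBy = np.gradient(By, d, edge_order=2)
    dBz = np.gradient(Bz, d, edge_order=2)
    div = dBx[0] + dBy[1] + dBz[2]          # dBx/dx + dBy/dy + dBz/dz
    curl_x = dBz[1] - dBy[2]                # x-component of curl B
    print("max |div B|:", np.abs(div).max())          # ~0 (divergence-free)
    print("max |curl_x - Bx|:", np.abs(curl_x - Bx).max())  # ~0 (force-free)
    ```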

  7. Hamilton Graph Based on DNA Computing

    Institute of Scientific and Technical Information of China (English)

    ZHANGJia-xiu

    2004-01-01

    DNA computing is a novel method for solving a class of intractable computational problems in which the computing time can grow exponentially with problem size. To date, many accomplishments have been achieved to improve its performance and increase its reliability. The Hamilton Graph Problem has been solved by means of molecular biology techniques. A small graph was encoded in molecules of DNA, and the "operations" of the computation were performed with standard protocols and enzymes. This work represents further evidence of the ability of DNA computing to solve NP-complete search problems.
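
    Adleman-style DNA computation can be caricatured in silico: the "test tube" enumerates candidate vertex sequences in parallel, and the wet-lab separation steps become Boolean filters. The graph below is an arbitrary six-vertex example, not the instance from the original experiment.

    ```python
    import itertools

    # Directed edges of a small example graph with a known Hamiltonian
    # path 0 -> 1 -> 2 -> 3 -> 4 -> 5 plus a few distractor edges.
    edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5),
             (0, 3), (1, 4), (2, 5), (3, 1)}

    def is_hamiltonian_path(path, start=0, end=5):
        ends_ok = path[0] == start and path[-1] == end      # "keep ends" filter
        legal = all((a, b) in edges for a, b in zip(path, path[1:]))
        return ends_ok and legal   # each vertex appears once by construction

    # "Synthesize" every permutation of the vertices, then "filter".
    solutions = [p for p in itertools.permutations(range(6))
                 if is_hamiltonian_path(p)]
    print(solutions)   # [(0, 1, 2, 3, 4, 5)]
    ```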

  8. Decomposition and Cross-Product-Based Method for Computing the Dynamic Equation of Robots

    Directory of Open Access Journals (Sweden)

    Ching-Long Shih

    2012-08-01

    Full Text Available This paper aims to demonstrate a clear relationship between Lagrange equations and Newton-Euler equations regarding computational methods for robot dynamics, from which we derive a systematic method for using either symbolic or on-line numerical computations. Based on the decomposition approach and the cross-product operation, a computing method for robot dynamics can be easily developed. The advantages of this computing framework are that it can be used for both symbolic and on-line numeric computation purposes, and that it can also be applied to biped systems as well as some simple closed-chain robot systems.
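
    To give the cross-product flavour of such methods a concrete face, here is a static fragment only: gravity torques of a planar two-link arm computed as moments r x F about each joint axis. It is a hypothetical toy, not the paper's full symbolic/numeric dynamic-equation framework, and the link parameters are arbitrary.

    ```python
    import numpy as np

    # tau_i = sum over distal links of (r_com - r_joint) x (m * g),
    # taking the z-component as the torque about the joint axis.
    l1 = l2 = 1.0            # link lengths [m]
    m1, m2 = 2.0, 1.0        # link masses [kg]
    g = np.array([0.0, -9.81, 0.0])

    def gravity_torques(q1, q2):
        j1 = np.zeros(3)                                     # joint 1 position
        j2 = j1 + np.array([l1 * np.cos(q1), l1 * np.sin(q1), 0.0])
        c1 = j1 + 0.5 * (j2 - j1)                            # link-1 COM (midpoint)
        tip = j2 + np.array([l2 * np.cos(q1 + q2), l2 * np.sin(q1 + q2), 0.0])
        c2 = j2 + 0.5 * (tip - j2)                           # link-2 COM
        tau1 = np.cross(c1 - j1, m1 * g)[2] + np.cross(c2 - j1, m2 * g)[2]
        tau2 = np.cross(c2 - j2, m2 * g)[2]
        return tau1, tau2

    print(gravity_torques(np.deg2rad(30), np.deg2rad(45)))
    ```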

  9. A quantum computer based on recombination processes in microelectronic devices

    Science.gov (United States)

    Theodoropoulos, K.; Ntalaperas, D.; Petras, I.; Konofaos, N.

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A "data element" and a "computational element" are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is demonstrated by applying the proposed computer to a well-known physical system involving traps in semiconductor devices.
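
    The Shockley-Read-Hall statistics underpinning the proposed elements have a closed form worth writing out: the net recombination rate through a single trap level. The sketch evaluates it with illustrative silicon-like parameter values, which are assumptions, not values taken from the paper.

    ```python
    import numpy as np

    # SRH net recombination rate through a single trap level:
    #   U = (n*p - ni^2) / (tau_p*(n + n1) + tau_n*(p + p1))
    # with n1 = ni*exp((Et-Ei)/kT) and p1 = ni*exp(-(Et-Ei)/kT).
    ni = 1.0e10            # intrinsic carrier density [cm^-3]
    tau_n = tau_p = 1e-6   # capture lifetimes [s]
    kT = 0.02585           # thermal energy at 300 K [eV]

    def srh_rate(n, p, Et_minus_Ei=0.0):
        n1 = ni * np.exp(Et_minus_Ei / kT)
        p1 = ni * np.exp(-Et_minus_Ei / kT)
        return (n * p - ni**2) / (tau_p * (n + n1) + tau_n * (p + p1))

    # Excess carriers recombine (U > 0); depleted regions generate (U < 0).
    print(srh_rate(n=1e15, p=1e12))   # recombination
    print(srh_rate(n=1e3,  p=1e3))    # generation (n*p < ni^2)
    ```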

  10. A quantum computer based on recombination processes in microelectronic devices

    Energy Technology Data Exchange (ETDEWEB)

    Theodoropoulos, K [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Ntalaperas, D [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Research Academic Computer Technology Institute, Riga Feraiou 61, 26110, Patras (Greece); Petras, I [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Konofaos, N [Computer Engineering and Informatics Department, University of Patras, Patras (Greece)

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is demonstrated by applying the proposed computer to a well-known physical system involving traps in semiconductor devices.

  11. Natural Language Processing for Building Computer-based Learning Tools

    Institute of Scientific and Technical Information of China (English)

    张颖; 李娜

    2015-01-01

    This paper outlines a framework for using computer and natural language techniques to help learners at various levels learn foreign languages in a computer-based learning environment. We propose some ideas for using the computer as a practical tool for foreign language learning, where most of the courseware is generated automatically. We then describe how to build computer-based learning tools, discuss their effectiveness, and conclude with some possibilities for using on-line resources.

  12. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  13. The Development of a Robot-Based Learning Companion: A User-Centered Design Approach

    Science.gov (United States)

    Hsieh, Yi-Zeng; Su, Mu-Chun; Chen, Sherry Y.; Chen, Gow-Dong

    2015-01-01

    A computer-vision-based method is widely employed to support the development of a variety of applications. In this vein, this study uses a computer-vision-based method to develop a playful learning system, which is a robot-based learning companion named RobotTell. Unlike existing playful learning systems, a user-centered design (UCD) approach is…

  14. Flexibility and practicality: the Graz brain-computer interface approach.

    Science.gov (United States)

    Scherer, Reinhold; Müller-Putz, Gernot R; Pfurtscheller, Gert

    2009-01-01

    "Graz brain-computer interface (BCI)" transforms changes in oscillatory electroencephalogram (EEG) activity into control signals for external devices and feedback. Steady-state evoked potentials (SSEPs) and event-related desynchronization (ERD) are employed to encode user messages. User-specific setup and training are important issues for robust and reliable classification. Furthermore, in order to implement small and thus affordable systems, focus is put on the minimization of the number of EEG sensors. The system also supports the self-paced operation mode, that is, users have on-demand access to the system at any time and can autonomously initiate communication. Flexibility, usability, and practicality are essential to increase user acceptance. Here, we illustrate the possibilities offered by now from EEG-based communication. Results of several studies with able-bodied and disabled individuals performed inside the laboratory and in real-world environments are presented; their characteristics are shown and open issues are mentioned. The applications include the control of neuroprostheses and spelling devices, the interaction with Virtual Reality, and the operation of off-the-shelf software such as Google Earth.

  15. A computational toy model for shallow landslides: Molecular Dynamics approach

    CERN Document Server

    Martelloni, Gianluca; Massaro, Emanuele

    2012-01-01

    The aim of this paper is to propose a 2D computational algorithm for modeling the triggering and propagation of shallow landslides caused by rainfall. We use a Molecular Dynamics (MD) inspired model, similar to the discrete element method (DEM), that is suitable for modeling granular material and for observing the trajectory of a single particle, so as to identify its dynamical properties. We consider that the triggering of shallow landslides is caused by the decrease of the static friction along the sliding surface due to water infiltration by rainfall. Triggering is thus governed by two conditions: (a) a threshold speed of the particles and (b) a condition on the static friction, between particles and slope surface, based on the Mohr-Coulomb failure criterion. The latter static condition is used in the geotechnical model to estimate the possibility of landslide triggering. Finally, the interaction force between particles is defined through a potential that, in the absence of experimental data, we have mode...
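
    The Mohr-Coulomb condition in (b) is easy to make concrete: sliding starts once the driving shear stress exceeds tau_max = c + (sigma - u) * tan(phi), with pore pressure u rising as rain infiltrates. The infinite-slope stress expressions and all parameter values below are illustrative assumptions, not the paper's calibration.

    ```python
    import numpy as np

    rho, g, h = 1800.0, 9.81, 1.5          # soil density [kg/m^3], gravity, depth [m]
    slope = np.deg2rad(30)                 # slope angle
    c, phi = 2000.0, np.deg2rad(30)        # cohesion [Pa], friction angle

    def is_triggered(pore_pressure):
        sigma = rho * g * h * np.cos(slope) ** 2              # normal stress
        tau = rho * g * h * np.sin(slope) * np.cos(slope)     # driving shear stress
        tau_max = c + (sigma - pore_pressure) * np.tan(phi)   # Mohr-Coulomb strength
        return tau > tau_max

    for u in (0.0, 2500.0, 5000.0, 10000.0):   # infiltration scenarios [Pa]
        print(f"u = {u:7.0f} Pa -> sliding: {is_triggered(u)}")
    ```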

  16. Data Mining Based on Computational Intelligence

    Institute of Scientific and Technical Information of China (English)

    WANG Yuan-zhen; ZHANG Zhi-bing; YI Bao-lin; LI Hua-yang

    2005-01-01

    This paper combines computational intelligence tools (neural network, fuzzy logic, and genetic algorithm) to develop a data mining architecture (NFGDM), which discovers patterns and represents them in understandable forms. In the NFGDM, input data are preprocessed by fuzzification, and the preprocessed data of the input variables are then used to train a radial basis probabilistic neural network to classify the dataset according to the classes considered. A rule extraction technique is then applied in order to extract explicit knowledge from the trained neural networks and represent it in the form of fuzzy if-then rules. In the final stage, a genetic algorithm is used as a rule-pruning module to eliminate weak rules that remain in the rule base. Compared with some known neural network classifiers, the architecture has a fast learning speed, and it is characterized by the incorporation of possibility information into the consequents of classification rules in human-understandable form. The experiments show that the NFGDM is more efficient and more robust than the traditional decision tree method.
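
    The pipeline's first stage, fuzzification, is the easiest to show in isolation: a crisp input becomes membership degrees for linguistic terms. The triangular membership functions and term names below are illustrative; the full architecture would pass such memberships to the RBF probabilistic network and then extract and GA-prune fuzzy if-then rules.

    ```python
    import numpy as np

    def triangular(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def fuzzify(x, lo=0.0, hi=10.0):
        """Map a crisp value to degrees for three linguistic terms."""
        mid = (lo + hi) / 2
        return {
            "low":    triangular(x, lo - 1e-9, lo, mid),
            "medium": triangular(x, lo, mid, hi),
            "high":   triangular(x, mid, hi, hi + 1e-9),
        }

    print(fuzzify(3.0))   # partly "low", partly "medium", not "high"
    ```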

  17. Attraction-Based Computation of Hyperbolic Lagrangian Coherent Structures

    CERN Document Server

    Karrasch, Daniel; Haller, George

    2014-01-01

    Recent advances enable the simultaneous computation of both attracting and repelling families of Lagrangian Coherent Structures (LCS) at the same initial or final time of interest. Obtaining LCS positions at intermediate times, however, has been problematic, because either the repelling or the attracting family is unstable with respect to numerical advection in a given time direction. Here we develop a new approach to compute arbitrary positions of hyperbolic LCS in a numerically robust fashion. Our approach only involves the advection of attracting material surfaces, thereby providing accurate LCS tracking at low computational cost. We illustrate the advantages of this approach on a simple model and on a turbulent velocity data set.
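
    A common entry point to hyperbolic LCS, and a useful contrast to the attraction-based method described above, is the finite-time Lyapunov exponent (FTLE): advect a particle grid, form the Cauchy-Green tensor from the flow-map gradient, and take ridges of its leading eigenvalue. The sketch below does this for the standard double-gyre test flow; it does not reproduce the paper's material-surface advection.

    ```python
    import numpy as np

    A, eps, om = 0.1, 0.25, 2 * np.pi / 10   # double-gyre parameters

    def vel(t, x, y):
        a = eps * np.sin(om * t)
        b = 1 - 2 * a
        f = a * x**2 + b * x
        dfdx = 2 * a * x + b
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return u, v

    def advect(x, y, t0=0.0, T=10.0, steps=200):
        """RK4 advection of the whole particle grid."""
        dt = T / steps
        t = t0
        for _ in range(steps):
            k1 = vel(t, x, y)
            k2 = vel(t + dt/2, x + dt/2 * k1[0], y + dt/2 * k1[1])
            k3 = vel(t + dt/2, x + dt/2 * k2[0], y + dt/2 * k2[1])
            k4 = vel(t + dt, x + dt * k3[0], y + dt * k3[1])
            x = x + dt/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
            y = y + dt/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
            t += dt
        return x, y

    nx, ny, T = 200, 100, 10.0
    x0, y0 = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
    xT, yT = advect(x0, y0, T=T)

    # Flow-map gradient -> Cauchy-Green tensor -> largest eigenvalue -> FTLE.
    dxdX = np.gradient(xT, x0[0], axis=1); dxdY = np.gradient(xT, y0[:, 0], axis=0)
    dydX = np.gradient(yT, x0[0], axis=1); dydY = np.gradient(yT, y0[:, 0], axis=0)
    C11 = dxdX**2 + dydX**2
    C12 = dxdX * dxdY + dydX * dydY
    C22 = dxdY**2 + dydY**2
    lam = 0.5 * (C11 + C22) + np.sqrt(0.25 * (C11 - C22)**2 + C12**2)
    ftle = np.log(np.maximum(lam, 1.0)) / (2 * T)
    print("max FTLE:", ftle.max())   # ridge values mark repelling LCS
    ```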

  18. Gesture Recognition by Computer Vision: An Integral Approach

    NARCIS (Netherlands)

    Lichtenauer, J.F.

    2009-01-01

    The fundamental objective of this Ph.D. thesis is to gain more insight into what is involved in the practical application of a computer vision system, when the conditions of use cannot be controlled completely. The basic assumption is that research on isolated aspects of computer vision often leads

  19. Computer Science Contests for Secondary School Students: Approaches to Classification

    Directory of Open Access Journals (Sweden)

    Wolfgang POHL

    2006-04-01

    Full Text Available The International Olympiad in Informatics currently provides a model which is imitated by the majority of contests for secondary school students in Informatics or Computer Science. However, the IOI model can be criticized, and alternative contest models exist. To support the discussion about contests in Computer Science, several dimensions for characterizing and classifying contests are suggested.