WorldWideScience

Sample records for dna-based computation solving

  1. Solving the 0/1 Knapsack Problem by a Biomolecular DNA Computer

    Directory of Open Access Journals (Sweden)

    Hassan Taghipour

    2013-01-01

    Full Text Available Solving some mathematical problems, such as NP-complete problems, with conventional silicon-based computers is problematic and extremely time-consuming. DNA computing is an alternative method of computing that uses DNA molecules for computing purposes. DNA computers have massive degrees of parallel processing capability. This massive parallelism is of particular interest for solving NP-complete and hard combinatorial problems. NP-complete problems such as the knapsack problem and other hard combinatorial problems can be solved by DNA computers in a very short time compared to conventional silicon-based computers. Sticker-based DNA computing is one method of DNA computing. In this paper, sticker-based DNA computing was used to solve the 0/1 knapsack problem. First, a biomolecular solution space was constructed using appropriate DNA memory complexes. Then, by applying a sticker-based parallel algorithm using biological operations, the knapsack problem was solved in polynomial time.
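
    A minimal Python sketch of the exhaustive solution-space idea behind the sticker model: each candidate subset of items corresponds to one memory complex, and the "separate" operation keeps only feasible complexes. The item weights, values, and capacity below are illustrative, not taken from the paper, and the DNA-level parallelism is simulated here by a sequential loop.

        from itertools import product

        # Illustrative 0/1 knapsack instance (weights, values, capacity are made up).
        weights = [3, 4, 5, 9]
        values = [2, 3, 6, 8]
        capacity = 12

        best_value, best_subset = 0, ()
        # Each bit string plays the role of one sticker-model memory complex:
        # bit i set  <->  sticker annealed at region i  <->  item i selected.
        for bits in product((0, 1), repeat=len(weights)):
            w = sum(wi for wi, b in zip(weights, bits) if b)
            v = sum(vi for vi, b in zip(values, bits) if b)
            if w <= capacity and v > best_value:   # "separate" step: keep feasible complexes
                best_value, best_subset = v, bits
        print(best_value, best_subset)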

  2. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely applied to probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
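
    For reference, the two probability computations the strand-displacement modules are built to emulate, total probability and conditional (Bayes) probability, can be written out directly. The two-hypothesis numbers below are hypothetical and unrelated to the paper's "read your mind" game.

        # Hypothetical two-hypothesis example of the total-probability and
        # conditional-probability (Bayes) reasoning the DNA modules emulate.
        prior = {"H1": 0.3, "H2": 0.7}        # P(H)
        likelihood = {"H1": 0.9, "H2": 0.2}   # P(E | H)

        # Total probability: P(E) = sum over H of P(E | H) * P(H)
        p_e = sum(likelihood[h] * prior[h] for h in prior)

        # Conditional probability derivation: P(H | E) = P(E | H) * P(H) / P(E)
        posterior = {h: likelihood[h] * prior[h] / p_e for h in prior}
        print(p_e, posterior)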

  3. A new fast algorithm for solving the minimum spanning tree problem based on DNA molecules computation.

    Science.gov (United States)

    Wang, Zhaocai; Huang, Dongmei; Meng, Huajun; Tang, Chengpei

    2013-10-01

    The minimum spanning tree (MST) problem is to find a minimum-weight set of edges connecting all the vertices of a given undirected graph. It is a vitally important NP-complete problem in graph theory and applied mathematics, having numerous real-life applications. Moreover, in previous studies DNA molecular operations were usually used to solve NP-complete head-to-tail path search problems, and rarely for problems whose solutions are multi-lateral path structures, such as the minimum spanning tree problem. In this paper, we present a new fast DNA algorithm for solving the MST problem using DNA molecular operations. For an undirected graph with n vertices and m edges, we design flexible-length DNA strands representing the vertices and edges, take appropriate steps, and obtain the solutions of the MST problem in the proper length range with O(3m+n) time complexity. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation. Results of computer simulation experiments show that the proposed method updates some of the best known values in a very short time and provides better solution accuracy than existing algorithms. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
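
    For comparison only, not the paper's molecular procedure, a classical Kruskal baseline for the MST problem is sketched below in Python; the edge list is an illustrative 4-vertex graph.

        # Classical Kruskal baseline for the MST problem, shown only for comparison
        # with the DNA algorithm; the edge list is illustrative.
        def kruskal(n, edges):                      # edges: (weight, u, v)
            parent = list(range(n))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]   # path halving
                    x = parent[x]
                return x
            tree, total = [], 0
            for w, u, v in sorted(edges):
                ru, rv = find(u), find(v)
                if ru != rv:                        # adding the edge creates no cycle
                    parent[ru] = rv
                    tree.append((u, v, w))
                    total += w
            return total, tree

        print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))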

  4. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing.

    Science.gov (United States)

    Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian

    2015-10-23

    The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m < n), such that the minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP) complete problem in operations management and applied mathematics, having numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation.

  5. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing

    Directory of Open Access Journals (Sweden)

    Zhaocai Wang

    2015-10-01

    Full Text Available The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m < n), such that the minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP) complete problem in operations management and applied mathematics, having numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation.
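
    A small brute-force sketch of one common reading of the UAP objective, in which each of the m individuals is assigned exactly one distinct job and the remaining jobs stay unassigned. The cost matrix is hypothetical, and this sequential enumeration only illustrates the search space the parallel DNA strands encode.

        from itertools import permutations

        # Hypothetical cost matrix: cost[i][j] = cost of individual i doing job j
        # (m = 2 individuals, n = 3 jobs).
        cost = [[4, 2, 8],
                [4, 3, 7]]
        m, n = len(cost), len(cost[0])

        best = min(
            (sum(cost[i][j] for i, j in enumerate(jobs)), jobs)
            for jobs in permutations(range(n), m)
        )
        print(best)   # (minimum total cost, chosen job index per individual)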

  6. Fast parallel molecular algorithms for DNA-based computation: solving the elliptic curve discrete logarithm problem over GF(2^n).

    Science.gov (United States)

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to an unrecognizable encryption and convert the unrecognizable data back into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms, a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n), are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
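
    The field arithmetic the parallel DNA adder and multiplier realize can be sketched conventionally: addition in GF(2^n) is carry-free XOR, and multiplication is polynomial multiplication reduced modulo an irreducible polynomial. The toy below uses GF(2^8) with the AES reduction polynomial purely as an illustration of the operations, not of the molecular encoding.

        # GF(2^8) toy arithmetic with the reduction polynomial x^8+x^4+x^3+x+1.
        IRRED = 0x11B               # bit pattern of the irreducible polynomial

        def gf_add(a, b):
            return a ^ b            # XOR: addition and subtraction coincide in GF(2^n)

        def gf_mul(a, b):
            result = 0
            while b:
                if b & 1:
                    result ^= a
                b >>= 1
                a <<= 1
                if a & 0x100:       # degree reached 8: reduce modulo the irreducible poly
                    a ^= IRRED
            return result

        print(hex(gf_add(0x57, 0x83)), hex(gf_mul(0x57, 0x83)))   # 0xd4, 0xc1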

  7. A DNA Computing Model for the Graph Vertex Coloring Problem Based on a Probe Graph

    Directory of Open Access Journals (Sweden)

    Jin Xu

    2018-02-01

    Full Text Available The biggest bottleneck in DNA computing is exponential explosion, in which the DNA molecules used as data in information processing grow exponentially with an increase of problem size. To overcome this bottleneck and improve the processing speed, we propose a DNA computing model to solve the graph vertex coloring problem. The main points of the model are as follows: ① The exponential explosion problem is solved by dividing subgraphs, reducing the vertex colors without losing the solutions, and ordering the vertices in subgraphs; and ② the bio-operation times are reduced considerably by a designed parallel polymerase chain reaction (PCR) technology that dramatically improves the processing speed. In this article, a 3-colorable graph with 61 vertices is used to illustrate the capability of the DNA computing model. The experiment showed that not only are all the solutions of the graph found, but also more than 99% of false solutions are deleted when the initial solution space is constructed. The powerful computational capability of the model was based on specific reactions among the large number of nanoscale oligonucleotide strands. All these tiny strands are operated by DNA self-assembly and parallel PCR. After thousands of accurate PCR operations, the solutions were found by recognizing, splicing, and assembling. We also prove that the searching capability of this model is up to O(3^59). By means of an exhaustive search, it would take more than 896 000 years for an electronic computer (5 × 10^14 s^−1) to achieve this enormous task. This searching capability is the largest among both the electronic and non-electronic computers that have been developed since the DNA computing model was proposed by Adleman’s research group in 2002 (with a searching capability of O(2^20)). Keywords: DNA computing, Graph vertex coloring problem, Polymerase chain reaction
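
    The "construct the solution space, then delete false solutions" view of vertex coloring can be sketched by brute force on a tiny illustrative graph (not the paper's 61-vertex instance); each color assignment plays the role of one candidate strand library member.

        from itertools import product

        # Brute-force 3-coloring of a tiny illustrative graph: enumerate all
        # colorings and keep the proper ones, mirroring the filtering the
        # DNA model performs with parallel PCR and self-assembly.
        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
        n = 4

        proper = [c for c in product(range(3), repeat=n)
                  if all(c[u] != c[v] for u, v in edges)]
        print(len(proper), proper[0])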

  8. Exploring the Feasibility of a DNA Computer: Design of an ALU Using Sticker-Based DNA Model.

    Science.gov (United States)

    Sarkar, Mayukh; Ghosal, Prasun; Mohanty, Saraju P

    2017-09-01

    Since its inception, DNA computing has advanced to offer an extremely powerful, energy-efficient emerging technology for solving hard computational problems with its inherent massive parallelism and extremely high data density. It would be much more powerful and general-purpose when combined with the well-known algorithmic solutions that exist for conventional computing architectures, using a suitable ALU. Thus, a specifically designed DNA Arithmetic and Logic Unit (ALU) that can address operations suitable for both domains can bridge the gap between the two. An ALU must be able to perform all possible logic operations, including NOT, OR, AND, XOR, NOR, NAND, and XNOR; compare and shift operations; and integer and floating-point arithmetic operations (addition, subtraction, multiplication, and division). In this paper, the design of an ALU using the sticker-based DNA model is proposed, with an experimental feasibility analysis. The novelties of this paper are manifold. First, the integer arithmetic operations performed here use 2's complement arithmetic, and the floating-point operations follow the IEEE 754 floating-point format, closely resembling a conventional ALU. Also, the output of each operation can be reused for any subsequent operation, so any algorithm or program logic that users can think of can be implemented directly on the DNA computer without modification. Second, once the basic operations of the sticker model can be automated, the implementations proposed in this paper become highly suitable for designing a fully automated ALU. Third, the proposed approaches are easy to implement. Finally, these approaches can work on sufficiently large binary numbers.
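
    As a conventional reference for one ALU stage, the sketch below performs two's-complement addition on fixed-width bit vectors with a ripple-carry loop; the word width and operands are arbitrary, and no sticker-model operation is modeled.

        # Bit-level two's-complement addition on a fixed 8-bit word, the kind of
        # stage a sticker-model ALU realizes with bitwise biological operations.
        WIDTH = 8

        def to_bits(x):                     # two's-complement encode, LSB first
            return [(x >> i) & 1 for i in range(WIDTH)]

        def add_bits(a, b):
            out, carry = [], 0
            for abit, bbit in zip(a, b):    # ripple-carry addition
                s = abit ^ bbit ^ carry
                carry = (abit & bbit) | (carry & (abit ^ bbit))
                out.append(s)
            return out                      # overflow carry dropped

        def from_bits(bits):                # two's-complement decode
            val = sum(b << i for i, b in enumerate(bits))
            return val - (1 << WIDTH) if bits[-1] else val

        print(from_bits(add_bits(to_bits(-7), to_bits(20))))   # -> 13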

  9. Analysis of problem solving on project based learning with resource based learning approach computer-aided program

    Science.gov (United States)

    Kuncoro, K. S.; Junaedi, I.; Dwijanto

    2018-03-01

    This study aimed to reveal the effectiveness of Project Based Learning with a Resource Based Learning approach in a computer-aided program and analyzed problem-solving abilities in terms of problem-solving steps based on Polya's stages. The research method used was a mixed method with a sequential explanatory design. The subjects of this research were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with the computer program, was less precise in writing the final conclusion, and could not reflect on the problem-solving process using Polya's steps. The W-BPS (Weak Bottom Problem Solving) subject met almost none of the problem-solving indicators and could not precisely construct the initial completion table, so the completion phase with Polya's steps was constrained.

  10. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…

  11. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational …

  12. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing software identical to that of large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples of both development and deployment of applications involving a blending of AI and traditional techniques are given.

  13. Solving Vertex Cover Problem Using DNA Tile Assembly Model

    Directory of Open Access Journals (Sweden)

    Zhihua Chen

    2013-01-01

    Full Text Available DNA tile assembly models are a class of mathematically distributed and parallel biocomputing models based on DNA tiles. In previous works, tile assembly models have been proved to be Turing-universal; that is, such a system can do what a Turing machine can do. In this paper, we use tile systems to solve computationally hard problems. Mathematically, we construct three tile subsystems, which can be combined together to solve the vertex cover problem. As a result, each of the proposed tile subsystems consists of Θ(1) types of tiles, and the assembly process is executed in a parallel way (like DNA’s biological function in cells); thus the systems can generate the solution of the problem in linear time with respect to the size of the graph.
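
    The decision the tile subsystems evaluate, whether a chosen vertex set covers every edge, is easy to state conventionally; the brute-force sketch below finds a minimum vertex cover of a small illustrative graph and is not a model of the tile-assembly process itself.

        from itertools import combinations

        # Brute-force minimum vertex cover of a small illustrative graph.
        edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
        n = 5

        def is_cover(subset):
            return all(u in subset or v in subset for u, v in edges)

        for k in range(n + 1):              # smallest cover first
            covers = [set(c) for c in combinations(range(n), k) if is_cover(set(c))]
            if covers:
                print(k, covers[0])
                break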

  14. Fast parallel DNA-based algorithms for molecular computation: the set-partition problem.

    Science.gov (United States)

    Chang, Weng-Long

    2007-12-01

    This paper demonstrates that basic biological operations can be used to solve the set-partition problem. In order to achieve this, we propose three DNA-based algorithms, a signed parallel adder, a signed parallel subtractor and a signed parallel comparator, that formally verify our designed molecular solutions for solving the set-partition problem.
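
    The underlying decision, whether a multiset can be split into two parts of equal sum, can be phrased as finding a +1/−1 signing with signed sum zero, which is the quantity the signed parallel adder, subtractor, and comparator are built to evaluate. The instance below is illustrative and the exhaustive loop only mimics the molecular parallelism.

        from itertools import product

        # Set-partition by exhaustive signing: look for a signed sum of zero.
        values = [3, 1, 1, 2, 2, 1]          # illustrative instance

        for signs in product((1, -1), repeat=len(values)):
            if sum(s * v for s, v in zip(signs, values)) == 0:
                print([v for s, v in zip(signs, values) if s == 1],
                      [v for s, v in zip(signs, values) if s == -1])
                break
        else:
            print("no equal-sum partition exists")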

  15. The use of gold nanoparticle aggregation for DNA computing and logic-based biomolecular detection

    International Nuclear Information System (INIS)

    Lee, In-Hee; Yang, Kyung-Ae; Zhang, Byoung-Tak; Lee, Ji-Hoon; Park, Ji-Yoon; Chai, Young Gyu; Lee, Jae-Hoon

    2008-01-01

    The use of DNA molecules as a physical computational material has attracted much interest, especially in the area of DNA computing. DNAs are also useful for logical control and analysis of biological systems if efficient visualization methods are available. Here we present a quick and simple visualization technique that displays the results of the DNA computing process based on a colorimetric change induced by gold nanoparticle aggregation, and we apply it to the logic-based detection of biomolecules. Our results demonstrate its effectiveness in both DNA-based logical computation and logic-based biomolecular detection

  16. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  17. Emotion Oriented Programming: Computational Abstractions for AI Problem Solving

    OpenAIRE

    Darty , Kevin; Sabouret , Nicolas

    2012-01-01

    In this paper, we present a programming paradigm for AI problem solving based on computational concepts drawn from Affective Computing. Emotions are believed to participate in human adaptability and reactivity, in behaviour selection, and in coping with complex and dynamic environments. We propose to define a mechanism inspired by this observation for general AI problem solving. To this purpose, we synthesize emotions as programming abstractions that represent the perception ...

  18. Fast parallel DNA-based algorithms for molecular computation: quadratic congruence and factoring integers.

    Science.gov (United States)

    Chang, Weng-Long

    2012-03-01

    Assume that n is a positive integer. If there is an integer M such that M^2 ≡ C (mod n), i.e., the congruence has a solution, then C is said to be a quadratic congruence (mod n). If the congruence does not have a solution, then C is said to be a quadratic noncongruence (mod n). The task of solving this problem is central to many important applications, the most obvious being cryptography. In this article, we describe a DNA-based algorithm for solving quadratic congruences and factoring integers. In addition to this novel contribution, we also show the utility of our encoding scheme and of the algorithm's submodules. We demonstrate how a variety of arithmetic, shift, and comparison operations, namely bitwise and full addition, subtraction, left shift, and comparison, can be performed using strands of DNA.
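
    The quadratic-congruence test itself, does M^2 ≡ C (mod n) have a solution, can be checked by exhaustive search over the residues, which is the search the DNA algorithm carries out in parallel. The modulus and the values of C below are illustrative.

        # Exhaustive test of the quadratic congruence M^2 = C (mod n).
        def quadratic_roots(C, n):
            return [M for M in range(n) if (M * M - C) % n == 0]

        n = 21                               # illustrative composite modulus
        for C in (4, 5, 15):
            roots = quadratic_roots(C, n)
            tag = "congruence" if roots else "noncongruence"
            print(C, tag, roots)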

  19. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  20. Computational modeling of a carbon nanotube-based DNA nanosensor

    Energy Technology Data Exchange (ETDEWEB)

    Kalantari-Nejad, R; Bahrami, M [Mechanical Engineering Department, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Rafii-Tabar, H [Department of Medical Physics and Biomedical Engineering and Research Centre for Medical Nanotechnology and Tissue Engineering, Shahid Beheshti University of Medical Sciences, Evin, Tehran (Iran, Islamic Republic of); Rungger, I; Sanvito, S, E-mail: mbahrami@aut.ac.ir [School of Physics and CRANN, Trinity College, Dublin 2 (Ireland)

    2010-11-05

    During the last decade the design of biosensors, based on quantum transport in one-dimensional nanostructures, has developed as an active area of research. Here we investigate the sensing capabilities of a DNA nanosensor, designed as a semiconductor single walled carbon nanotube (SWCNT) connected to two gold electrodes and functionalized with a DNA strand acting as a bio-receptor probe. In particular, we have considered both covalent and non-covalent bonding between the DNA probe and the SWCNT. The optimized atomic structure of the sensor is computed both before and after the receptor attaches itself to the target, which consists of another DNA strand. The sensor's electrical conductance and transmission coefficients are calculated at the equilibrium geometries via the non-equilibrium Green's function scheme combined with the density functional theory in the linear response limit. We demonstrate a sensing efficiency of 70% for the covalently bonded bio-receptor probe, which drops to about 19% for the non-covalently bonded one. These results suggest that a SWCNT may be a promising candidate for a bio-molecular FET sensor.

  1. Computational modeling of a carbon nanotube-based DNA nanosensor

    International Nuclear Information System (INIS)

    Kalantari-Nejad, R; Bahrami, M; Rafii-Tabar, H; Rungger, I; Sanvito, S

    2010-01-01

    During the last decade the design of biosensors, based on quantum transport in one-dimensional nanostructures, has developed as an active area of research. Here we investigate the sensing capabilities of a DNA nanosensor, designed as a semiconductor single walled carbon nanotube (SWCNT) connected to two gold electrodes and functionalized with a DNA strand acting as a bio-receptor probe. In particular, we have considered both covalent and non-covalent bonding between the DNA probe and the SWCNT. The optimized atomic structure of the sensor is computed both before and after the receptor attaches itself to the target, which consists of another DNA strand. The sensor's electrical conductance and transmission coefficients are calculated at the equilibrium geometries via the non-equilibrium Green's function scheme combined with the density functional theory in the linear response limit. We demonstrate a sensing efficiency of 70% for the covalently bonded bio-receptor probe, which drops to about 19% for the non-covalently bonded one. These results suggest that a SWCNT may be a promising candidate for a bio-molecular FET sensor.

  2. Solving a Hamiltonian Path Problem with a bacterial computer

    Directory of Open Access Journals (Sweden)

    Treece Jessica

    2009-07-01

    Full Text Available Abstract Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node

  3. Solving a Hamiltonian Path Problem with a bacterial computer

    Science.gov (United States)

    Baumgardner, Jordan; Acker, Karen; Adefuye, Oyinade; Crowley, Samuel Thomas; DeLoache, Will; Dickson, James O; Heard, Lane; Martens, Andrew T; Morton, Nickolaus; Ritter, Michelle; Shoecraft, Amber; Treece, Jessica; Unzicker, Matthew; Valencia, Amanda; Waters, Mike; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Eckdahl, Todd T

    2009-01-01

    Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node directed graph. This proof
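
    The search the bacterial computer performs, enumerating node orderings of a three-node directed graph and keeping those whose consecutive pairs are all edges, is small enough to write out directly. The edge set below is an illustrative three-node digraph and is not claimed to be the exact graph used in the paper.

        from itertools import permutations

        # Exhaustive Hamiltonian-path check on an illustrative three-node directed
        # graph (A, B, C); the bacterial computer explores such orderings by
        # Hin/hixC-driven shuffling and reports success as yellow (red + green).
        edges = {("A", "B"), ("B", "C"), ("C", "A")}
        nodes = ["A", "B", "C"]

        paths = [order for order in permutations(nodes)
                 if all((u, v) in edges for u, v in zip(order, order[1:]))]
        print(paths)     # each ordering listed is a Hamiltonian path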

  4. Statistical length of DNA based on AFM image measured by a computer

    International Nuclear Information System (INIS)

    Chen Xinqing; Qiu Xijun; Zhang Yi; Hu Jun; Wu Shiying; Huang Yibo; Ai Xiaobai; Li Minqian

    2001-01-01

    Taking advantage of image processing technology, the contour length of DNA molecules was measured automatically by a computer. Based on the AFM image of DNA, the topography of the DNA was simulated as a curve. Then the DNA length was measured automatically by an inserting mode. It was shown that the experimental length of naturally deposited DNA (180.4 ± 16.4 nm) was well consistent with the theoretical length (185.0 nm). Compared to other methods, the present approach has the advantages of precision and automation. The stretched DNA was also measured. It was shown that the experimental length (343.6 ± 20.7 nm) was much longer than the theoretical length (307.0 nm). This result indicated that the stretching process had a distinct effect on the DNA length. However, the method provided here avoided the DNA-stretching effect.

  5. Second International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Deep, Kusum; Pant, Millie; Bansal, Jagdish; Ray, Kanad; Gupta, Umesh

    2014-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2012), held at JK Lakshmipat University, Jaipur, India. This book provides the latest developments in the area of soft computing and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining, etc. The objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  6. Solving the Stokes problem on a massively parallel computer

    DEFF Research Database (Denmark)

    Axelsson, Owe; Barker, Vincent A.; Neytcheva, Maya

    2001-01-01

    … boundary value problem for each velocity component, are solved by the conjugate gradient method with a preconditioning based on the algebraic multi-level iteration (AMLI) technique. The velocity is found from the computed pressure. The method is optimal in the sense that the computational work … is proportional to the number of unknowns. Further, it is designed to exploit a massively parallel computer with distributed memory architecture. Numerical experiments on a Cray T3E computer illustrate the parallel performance of the method …

  7. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    Science.gov (United States)

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

    The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, on solving one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple-probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers for solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Third International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Nagar, Atulya; Bansal, Jagdish

    2014-01-01

    The present book is based on the research papers presented in the 3rd International Conference on Soft Computing for Problem Solving (SocProS 2013), held as a part of the golden jubilee celebrations of the Saharanpur Campus of IIT Roorkee, at the Noida Campus of Indian Institute of Technology Roorkee, India. This book is divided into two volumes and covers a variety of topics including mathematical modelling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining etc. Particular emphasis is laid on soft computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems, which are otherwise difficult to solve by the usual and traditional methods. The book is directed ...

  9. Nonlinear evolution equations and solving algebraic systems: the importance of computer algebra

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Kostov, N.A.

    1989-01-01

    In the present paper we study the application of computer algebra to solving the nonlinear polynomial systems which arise in the investigation of nonlinear evolution equations. We consider several systems which are obtained in the classification of integrable nonlinear evolution equations with uniform rank. Other polynomial systems are related to finding algebraic curves for finite-gap elliptic potentials of Lamé type and generalizations. All systems under consideration are solved using a method based on the construction of the Groebner basis for the corresponding polynomial ideals. The computations have been carried out using computer algebra systems. 20 refs
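
    As a small illustration of the Groebner-basis construction the paper relies on, the sketch below computes a lexicographic basis for a toy polynomial system (a circle intersected with a line) with SymPy; the system is purely illustrative and much simpler than the classification systems treated in the paper.

        from sympy import groebner, symbols

        # Toy polynomial system solved via a lexicographic Groebner basis.
        x, y = symbols("x y")
        system = [x**2 + y**2 - 1, x - y]

        G = groebner(system, x, y, order="lex")
        print(G)   # triangular basis: the last element is univariate in y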

  10. The Role of the Goal in Solving Hard Computational Problems: Do People Really Optimize?

    Science.gov (United States)

    Carruthers, Sarah; Stege, Ulrike; Masson, Michael E. J.

    2018-01-01

    The role that the mental, or internal, representation plays when people are solving hard computational problems has largely been overlooked to date, despite the reality that this internal representation drives problem solving. In this work we investigate how performance on versions of two hard computational problems differs based on what internal…

  11. Fast parallel molecular algorithms for DNA-based computation: factoring integers.

    Science.gov (United States)

    Chang, Weng-Long; Guo, Minyi; Ho, Michael Shan-Hui

    2005-06-01

    The RSA public-key cryptosystem is an algorithm that converts input data to an unrecognizable encryption and converts the unrecognizable data back into its original decrypted form. The security of the RSA public-key cryptosystem is based on the difficulty of factoring the product of two large prime numbers. This paper demonstrates how to factor the product of two large prime numbers, and is a breakthrough in basic biological operations using a molecular computer. In order to achieve this, we propose three DNA-based algorithms, a parallel subtractor, a parallel comparator, and parallel modular arithmetic, that formally verify our designed molecular solutions for factoring the product of two large prime numbers. Furthermore, this work indicates that cryptosystems using public keys are perhaps insecure and also presents clear evidence of the ability of molecular computing to perform complicated mathematical operations.
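
    The search that the molecular subtractor and comparator parallelize, testing candidate divisors of a semiprime, can be sketched as a sequential trial-division loop; the modulus below is a small illustrative semiprime, not an RSA-sized one.

        # Brute-force factoring of a small illustrative semiprime: the DNA design
        # tests candidate divisors in parallel, here the same search runs as a loop.
        def factor(n):
            d = 2
            while d * d <= n:
                if n % d == 0:               # comparator step: remainder equals zero?
                    return d, n // d
                d += 1
            return None                      # n is prime

        print(factor(91))                    # -> (7, 13)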

  12. Computational applications of DNA physical scales

    DEFF Research Database (Denmark)

    Baldi, Pierre; Chauvin, Yves; Brunak, Søren

    1998-01-01

    The authors study from a computational standpoint several different physical scales associated with structural features of DNA sequences, including dinucleotide scales such as base stacking energy and propeller twist, and trinucleotide scales such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models …

  13. Computational applications of DNA structural scales

    DEFF Research Database (Denmark)

    Baldi, P.; Chauvin, Y.; Brunak, Søren

    1998-01-01

    Studies several different physical scales associated with the structural features of DNA sequences from a computational standpoint, including dinucleotide scales, such as base stacking energy and propeller twist, and trinucleotide scales, such as bendability and nucleosome positioning. We show that these scales provide an alternative or complementary compact representation of DNA sequences. As an example, we construct a strand-invariant representation of DNA sequences. The scales can also be used to analyze and discover new DNA structural patterns, especially in combination with hidden Markov models …

  14. Computer problem-solving coaches for introductory physics: Design and usability studies

    Science.gov (United States)

    Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew

    2016-06-01

    The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.

  15. Experimental quantum computing to solve systems of linear equations.

    Science.gov (United States)

    Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei

    2013-06-07

    Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
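
    For scale, the classical counterpart of the demonstrated instance, solving a 2×2 linear system Ax = b directly, is shown below; the matrix and vector are illustrative, not the experimental inputs used on the photonic processor.

        import numpy as np

        # Classical solution of a 2x2 linear system Ax = b.
        A = np.array([[1.5, 0.5],
                      [0.5, 1.5]])
        b = np.array([1.0, 0.0])

        x = np.linalg.solve(A, b)
        print(x, np.allclose(A @ x, b))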

  16. Tyramine Hydrochloride Based Label-Free System for Operating Various DNA Logic Gates and a DNA Caliper for Base Number Measurements.

    Science.gov (United States)

    Fan, Daoqing; Zhu, Xiaoqing; Dong, Shaojun; Wang, Erkang

    2017-07-05

    DNA is believed to be a promising candidate for molecular logic computation, and the fluorogenic/colorimetric substrates of G-quadruplex DNAzyme (G4zyme) are broadly used as label-free output reporters of DNA logic circuits. Herein, for the first time, tyramine-HCl (a fluorogenic substrate of G4zyme) is applied to DNA logic computation and a series of label-free DNA-input logic gates, including elementary AND, OR, and INHIBIT logic gates, as well as a two to one encoder, are constructed. Furthermore, a DNA caliper that can measure the base number of target DNA as low as three bases is also fabricated. This DNA caliper can also perform concatenated AND-AND logic computation to fulfil the requirements of sophisticated logic computing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
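
    The Boolean behavior reported by the tyramine-HCl/G4zyme readout, elementary AND, OR, and INHIBIT gates plus a two-to-one encoder, can be tabulated conventionally; the sketch below shows only the truth tables and does not model the DNA-strand encoding.

        # Truth-table view of the gate logic; the encoder output is the index of
        # the single active input line.
        def AND(a, b):      return a & b
        def OR(a, b):       return a | b
        def INHIBIT(a, b):  return a & (1 - b)        # fires on a AND NOT b

        def encoder_2to1(i0, i1):
            assert i0 ^ i1, "exactly one input line must be active"
            return i1

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, AND(a, b), OR(a, b), INHIBIT(a, b))
        print(encoder_2to1(0, 1))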

  17. Internet computer coaches for introductory physics problem solving

    Science.gov (United States)

    Xu Ryan, Qing

    The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the educational system, national studies have shown that the majority of students emerge from such courses having made little progress toward developing good problem-solving skills. The Physics Education Research Group at the University of Minnesota has been developing Internet computer coaches to help students become more expert-like problem solvers. During the Fall 2011 and Spring 2013 semesters, the coaches were introduced into large sections (200+ students) of the calculus based introductory mechanics course at the University of Minnesota. This dissertation, will address the research background of the project, including the pedagogical design of the coaches and the assessment of problem solving. The methodological framework of conducting experiments will be explained. The data collected from the large-scale experimental studies will be discussed from the following aspects: the usage and usability of these coaches; the usefulness perceived by students; and the usefulness measured by final exam and problem solving rubric. It will also address the implications drawn from this study, including using this data to direct future coach design and difficulties in conducting authentic assessment of problem-solving.

  18. Engineering and Computing Portal to Solve Environmental Problems

    Science.gov (United States)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes architecture and services of the Engineering and Computing Portal, which is considered to be a complex solution that provides access to high-performance computing resources, enables to carry out computational experiments, teach parallel technologies and solve computing tasks, including technogenic safety ones.

  19. Proceedings of the International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Pant, Millie; Bansal, Jagdish

    2012-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  20. DNA Processing and Reassembly on General Purpose FPGA-based Development Boards

    Directory of Open Access Journals (Sweden)

    SZÁSZ Csaba

    2017-05-01

    Full Text Available The great majority of researchers involved in microelectronics generally agree that many scientific challenges in the life sciences carry with them a powerful computational requirement that must be met before scientific progress can be made. The current trend in Deoxyribonucleic Acid (DNA) computing technologies is to develop special hardware platforms capable of providing the needed processing performance at lower cost. In this endeavor, FPGA-based (Field Programmable Gate Array) configurations aimed at accelerating genome sequencing and reassembly play a leading role. This paper emphasizes the benefits and advantages of using general-purpose FPGA-based development boards in DNA reassembly applications, besides special hardware architecture solutions. An original approach is unfolded which outlines the versatility of high-performance, ready-to-use manufacturer development platforms endowed with powerful hardware resources fully optimized for high-speed processing applications. The theoretical arguments are supported by an intuitive implementation example in which the designer is relieved of any hardware development effort and can concentrate exclusively on software design issues, providing greatly reduced application development cycles. The experiments prove that such boards available on the market are suitable for a wide range of DNA sequencing and reassembly applications.

  1. Proceedings of the International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Nagar, Atulya; Pant, Millie; Bansal, Jagdish

    2012-01-01

    The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.

  2. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s theory of natural evolution has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process involves. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
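
    The evolutionary loop the lecture outlines, evaluation, selection, crossover, and mutation, can be sketched compactly. The toy below is a minimal Python genetic algorithm on the "one-max" problem with arbitrary demo parameters; it is unrelated to the ECJ framework itself.

        import random

        # Minimal generational GA on one-max (maximize the number of 1-bits).
        random.seed(0)
        GENOME, POP, GENERATIONS, P_MUT = 20, 30, 40, 0.02

        def fitness(ind):            # evaluation
            return sum(ind)

        def tournament(pop):         # selection (size-2 tournament)
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b

        pop = [[random.randint(0, 1) for _ in range(GENOME)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            nxt = []
            while len(nxt) < POP:
                p1, p2 = tournament(pop), tournament(pop)
                cut = random.randrange(1, GENOME)              # one-point crossover
                child = p1[:cut] + p2[cut:]
                child = [1 - g if random.random() < P_MUT else g for g in child]  # mutation
                nxt.append(child)
            pop = nxt
        print(max(fitness(ind) for ind in pop))   # best fitness after evolution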

  3. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s theory of natural evolution has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process involves. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.

  4. A homotopy method for solving Riccati equations on a shared memory parallel computer

    International Nuclear Information System (INIS)

    Zigic, D.; Watson, L.T.; Collins, E.G. Jr.; Davis, L.D.

    1993-01-01

    Although there are numerous algorithms for solving Riccati equations, there still remains a need for algorithms which can operate efficiently on large problems and on parallel machines. This paper gives a new homotopy-based algorithm for solving Riccati equations on a shared memory parallel computer. The central part of the algorithm is the computation of the kernel of the Jacobian matrix, which is essential for the corrector iterations along the homotopy zero curve. Using a Schur decomposition the tensor product structure of various matrices can be efficiently exploited. The algorithm allows for efficient parallelization on shared memory machines
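
    As a sequential reference point (not the homotopy algorithm itself), a small continuous-time algebraic Riccati equation can be solved and residual-checked as below; the double-integrator system is illustrative.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        # Classical solution of a small CARE, shown only as a reference point.
        A = np.array([[0.0, 1.0],
                      [0.0, 0.0]])
        B = np.array([[0.0],
                      [1.0]])
        Q = np.eye(2)
        R = np.array([[1.0]])

        P = solve_continuous_are(A, B, Q, R)
        # Residual of A'P + PA - P B R^-1 B' P + Q should be ~0
        residual = A.T @ P + P @ A - P @ B @ np.linalg.inv(R) @ B.T @ P + Q
        print(P.round(4), np.abs(residual).max() < 1e-6)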

  5. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    Science.gov (United States)

    Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses. PMID:29238314
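
    A minimal sketch of the continuous Bayesian evidence tracing described: a prior mastery probability for one sub-skill is updated with Bayes' rule from a conditional probability table each time a dialog response is observed. The table values and the observation stream below are hypothetical.

        # Minimal Bayesian evidence tracing for one sub-skill.
        cpt = {                      # P(response option | mastered / not mastered)
            "strong":  (0.70, 0.20),
            "neutral": (0.20, 0.40),
            "weak":    (0.10, 0.40),
        }

        def update(p_mastery, response):
            p_r_m, p_r_not = cpt[response]
            evidence = p_r_m * p_mastery + p_r_not * (1 - p_mastery)
            return p_r_m * p_mastery / evidence      # Bayes' rule

        p = 0.5                                       # prior mastery probability
        for response in ["strong", "weak", "strong", "strong"]:
            p = update(p, response)
            print(response, round(p, 3))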

  6. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    Directory of Open Access Journals (Sweden)

    Stephen T. Polyak

    2017-11-01

    Full Text Available This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses.

  7. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills.

    Science.gov (United States)

    Polyak, Stephen T; von Davier, Alina A; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses.

  8. A light weight secure image encryption scheme based on chaos & DNA computing

    Directory of Open Access Journals (Sweden)

    Bhaskar Mondal

    2017-10-01

    Full Text Available This paper proposes a new lightweight secure cryptographic scheme for secure image communication. In this scheme the plain image is first permuted using a sequence of pseudo-random numbers (PRN) and then encrypted by DeoxyriboNucleic Acid (DNA) computation. Two PRN sequences are generated by a Pseudo Random Number Generator (PRNG) based on a cross-coupled chaotic logistic map using two sets of keys. The first PRN sequence is used for permuting the plain image, whereas the second PRN sequence is used for generating a random DNA sequence. The number of rounds of permutation and encryption may be varied to increase security. The scheme is proposed for gray-level images, but it may be extended to color images and text data. Simulation results show that the proposed scheme can withstand various kinds of attacks.
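
    A sketch of the permutation stage only: a pseudo-random sequence from two cross-coupled logistic maps whose sort order permutes the pixel positions of a flattened image. The map parameters, key values, coupling rule, and the 4×4 stand-in image are all hypothetical, and the DNA-encoding stage is not modeled.

        import numpy as np

        # Chaotic permutation of pixel positions driven by cross-coupled logistic maps.
        def cross_coupled_logistic(n, x=0.31, y=0.62, r1=3.99, r2=3.97):
            seq = np.empty(n)
            for i in range(n):
                x, y = r1 * x * (1 - x), r2 * y * (1 - y)
                x, y = (x + y) / 2 % 1, (x * y) % 1    # simple illustrative coupling
                seq[i] = x
            return seq

        img = np.arange(16).reshape(4, 4)              # stand-in gray-level image
        perm = np.argsort(cross_coupled_logistic(img.size))
        scrambled = img.flatten()[perm].reshape(img.shape)

        restored = np.empty_like(scrambled.flatten())
        restored[perm] = scrambled.flatten()           # inverse permutation
        print(np.array_equal(restored.reshape(4, 4), img))   # True: reversible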

  9. A Cognitive Model for Problem Solving in Computer Science

    Science.gov (United States)

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  10. Knowledge-Based Instruction: Teaching Problem Solving in a Logo Learning Environment.

    Science.gov (United States)

    Swan, Karen; Black, John B.

    1993-01-01

    Discussion of computer programming and knowledge-based instruction focuses on three studies of elementary and secondary school students which show that five particular problem-solving strategies can be developed in students explicitly taught the strategies and given practice applying them to solve LOGO programming problems. (Contains 53…

  11. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

    Science.gov (United States)

    Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

    2010-05-04

    A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.
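
    A deliberately simple per-cycle view of PCR amplification, far coarser than the kinetic and thermodynamic modules the system describes: copy number grows by a factor of (1 + efficiency) each cycle until a saturation cap is reached. The efficiency and cap values are hypothetical.

        # Toy per-cycle PCR amplification model.
        def pcr_copies(n0, cycles, efficiency=0.9, cap=1e12):
            copies = [float(n0)]
            for _ in range(cycles):
                copies.append(min(copies[-1] * (1 + efficiency), cap))
            return copies

        trace = pcr_copies(n0=1000, cycles=30)
        print(f"after 30 cycles: {trace[-1]:.3e} copies")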

  12. Engineering Courses on Computational Thinking Through Solving Problems in Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Piyanuch Silapachote

    2017-09-01

    Full Text Available Computational thinking sits at the core of every engineering and computing related discipline. It has increasingly emerged as its own subject in all levels of education. It is a powerful cornerstone for cognitive development, creative problem solving, algorithmic thinking and design, and programming. How to effectively teach computational thinking skills poses real challenges and creates opportunities. Targeting entering computer science and engineering undergraduates, we resourcefully integrate elements from artificial intelligence (AI) into introductory computing courses. In addition to conveying the essence of computational thinking, practical exercises in AI inspire collaborative problem solving that goes beyond abstraction, logical reasoning, and critical and analytical thinking. Problems in machine intelligence systems intrinsically connect students to algorithm-oriented computing and essential mathematical foundations. Beyond knowledge representation, AI fosters a gentle introduction to data structures and algorithms. Focused on engaging mental tools, a computer is never a necessity. Neither coding nor programming is ever required. Instead, students enjoy constructivist classrooms designed to always be active, flexible, and highly dynamic. Learning to learn and reflecting on cognitive experiences, they rigorously construct knowledge from collectively solving exciting puzzles, competing in strategic games, and participating in intellectual discussions.

  13. Problem-Solving Test: Analysis of DNA Damage Recognizing Proteins in Yeast and Human Cells

    Science.gov (United States)

    Szeberenyi, Jozsef

    2013-01-01

    The experiment described in this test was aimed at identifying DNA repair proteins in human and yeast cells. Terms to be familiar with before you start to solve the test: DNA repair, germline mutation, somatic mutation, inherited disease, cancer, restriction endonuclease, radioactive labeling, [α-³²P]ATP, [γ-…

  14. DNA-based machines.

    Science.gov (United States)

    Wang, Fuan; Willner, Bilha; Willner, Itamar

    2014-01-01

    The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels", such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.

  15. Manage Your Life Online (MYLO): a pilot trial of a conversational computer-based intervention for problem solving in a student sample.

    Science.gov (United States)

    Gaffney, Hannah; Mansell, Warren; Edwards, Rachel; Wright, Jason

    2014-11-01

    Computerized self-help that has an interactive, conversational format holds several advantages, such as flexibility across presenting problems and ease of use. We designed a new program called MYLO that utilizes the principles of Method of Levels (MOL) therapy, based upon Perceptual Control Theory (PCT). We tested the efficacy of MYLO, tested whether the psychological change mechanisms described by PCT mediated its efficacy, and evaluated effects of client expectancy. Forty-eight student participants were randomly assigned to MYLO or a comparison program ELIZA. Participants discussed a problem they were currently experiencing with their assigned program and completed measures of distress, resolution and expectancy preintervention, postintervention and at 2-week follow-up. MYLO and ELIZA were associated with reductions in distress, depression, anxiety and stress. MYLO was considered more helpful and led to greater problem resolution. The psychological change processes predicted higher ratings of MYLO's helpfulness and reductions in distress. Positive expectancies towards computer-based problem solving correlated with MYLO's perceived helpfulness and greater problem resolution, and this was partly mediated by the psychological change processes identified. The findings provide provisional support for the acceptability of the MYLO program in a non-clinical sample although its efficacy as an innovative computer-based aid to problem solving remains unclear. Nevertheless, the findings provide tentative early support for the mechanisms of psychological change identified within PCT and highlight the importance of client expectations in predicting engagement in computer-based self-help.

  16. 5th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Bansal, Jagdish; Nagar, Atulya; Das, Kedar

    2016-01-01

    This two volume book is based on the research papers presented at the 5th International Conference on Soft Computing for Problem Solving (SocProS 2015) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  17. 4th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Deep, Kusum; Pant, Millie; Bansal, Jagdish; Nagar, Atulya

    2015-01-01

    This two volume book is based on the research papers presented at the 4th International Conference on Soft Computing for Problem Solving (SocProS 2014) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and healthcare, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.

  18. Solving satisfiability problems by the ground-state quantum computer

    International Nuclear Information System (INIS)

    Mao Wenjin

    2005-01-01

    A quantum algorithm is proposed to solve satisfiability (SAT) problems on a ground-state quantum computer. The scale of the energy gap of the ground-state quantum computer is analyzed for the 3-bit exact cover problem. The time cost of this algorithm on general SAT problems is discussed.

  19. Using Computer Simulations in Chemistry Problem Solving

    Science.gov (United States)

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group, equalized by pair groups (n_Exp = n_Ctrl = 78), research design was used. The students had no previous experience of chemical practical work. Student…

  20. The benefits of computer-generated feedback for mathematics problem solving.

    Science.gov (United States)

    Fyfe, Emily R; Rittle-Johnson, Bethany

    2016-07-01

    The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. A detailed experimental study of a DNA computer with two endonucleases.

    Science.gov (United States)

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, that is, increasing the number of states of the biomolecular automaton. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We present a detailed experimental verification of its feasibility. We describe the effect of the number of states, the length of the input data, and nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths, such as ab, aab, aaab, and ababa, and on the unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
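
    In software terms, the computation such an automaton carries out is ordinary finite-state word acceptance. The Python sketch below runs a small hypothetical two-symbol automaton over the words mentioned in the record; its three-state transition table is an illustrative stand-in, not the six-state laboratory design.

        # Software analogue of a finite automaton over the alphabet {a, b}. The
        # transition table is hypothetical; it merely reproduces the accept/reject
        # pattern of the example words (ab, aab, aaab, ababa accepted; ba rejected).

        transitions = {
            ("s0", "a"): "s1", ("s0", "b"): "s2",
            ("s1", "a"): "s1", ("s1", "b"): "s1",
            ("s2", "a"): "s2", ("s2", "b"): "s2",   # s2 is a trap state
        }
        accepting = {"s1"}

        def accepts(word, start="s0"):
            state = start
            for symbol in word:
                state = transitions[(state, symbol)]
            return state in accepting

        for w in ["ab", "aab", "aaab", "ababa", "ba"]:
            print(w, "accepted" if accepts(w) else "rejected")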

  2. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation of molecular computing, molecular operations of molecular computing and number representation of molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes on digital communication, to encode integers of different formats, single precision and double precision of floating-point numbers, to implement addition and subtraction of unsigned integers, to construct logic operations...

  3. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    Science.gov (United States)

    2010-03-01

    The hybridization that occurs between a DNA strand and its Watson-Crick (WC) complement can be used to perform mathematical computation. A duplex formed between a strand and its WC complement is a WC duplex; non-WC duplexes can also form, and such a formation is called a cross-hybridization. (Abbreviations in the record: ssDNA, single-stranded DNA; WC, Watson-Crick; A, adenine; C, cytosine; G, guanine; T, thymine.)

  4. Ontology Design for Solving Computationally-Intensive Problems on Heterogeneous Architectures

    Directory of Open Access Journals (Sweden)

    Hossam M. Faheem

    2018-02-01

    Full Text Available Viewing a computationally-intensive problem as a self-contained challenge with its own hardware, software and scheduling strategies is an approach that should be investigated. We might suggest assigning heterogeneous hardware architectures to solve a problem, while parallel computing paradigms may play an important role in writing efficient code to solve the problem; moreover, the scheduling strategies may be examined as a possible solution. Depending on the problem complexity, finding the best possible solution using an integrated infrastructure of hardware, software and scheduling strategy can be a complex job. Developing and using ontologies and reasoning techniques play a significant role in reducing the complexity of identifying the components of such integrated infrastructures. Undertaking reasoning and inferencing regarding the domain concepts can help to find the best possible solution through a combination of hardware, software and scheduling strategies. In this paper, we present an ontology and show how we can use it to solve computationally-intensive problems from various domains. As a potential use for the idea, we present examples from the bioinformatics domain. Validation by using problems from the Elastic Optical Network domain has demonstrated the flexibility of the suggested ontology and its suitability for use with any other computationally-intensive problem domain.

  5. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    Science.gov (United States)

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
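
    The doublet idea can be sketched in a few lines of Python: overlapping dinucleotides are mapped to 16 amplitude levels, and an alignment-free distance is then taken between the magnitude spectra of the resulting signals. The particular amplitude assignment and the Euclidean spectral distance used here are illustrative assumptions, not the exact GAFD definitions.

        import numpy as np

        # Map each dinucleotide (doublet) to a distinct amplitude, giving 16 levels
        # instead of 4. The numeric assignment below is an assumption.
        DOUBLETS = [a + b for a in "ACGT" for b in "ACGT"]
        AMPLITUDE = {d: i + 1 for i, d in enumerate(DOUBLETS)}

        def dna_to_signal(seq):
            """Convert a DNA string to a numeric signal via overlapping doublets."""
            return np.array([AMPLITUDE[seq[i:i + 2]] for i in range(len(seq) - 1)], dtype=float)

        def spectral_distance(seq_a, seq_b, n_bins=32):
            """Alignment-free distance: Euclidean distance between magnitude spectra."""
            fa = np.abs(np.fft.rfft(dna_to_signal(seq_a), n=n_bins))
            fb = np.abs(np.fft.rfft(dna_to_signal(seq_b), n=n_bins))
            return float(np.linalg.norm(fa - fb))

        print(spectral_distance("ACGTACGTGGCC", "ACGTACGAGGCC"))   # similar sequences
        print(spectral_distance("ACGTACGTGGCC", "TTTTTTTTAAAA"))   # dissimilar sequences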

  6. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Interim report: Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication, by John Lyman and Carla J. Conaway, University of California at Los Angeles. Related proceedings: The National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh.

  7. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  8. Using Computer Symbolic Algebra to Solve Differential Equations.

    Science.gov (United States)

    Mathews, John H.

    1989-01-01

    This article illustrates how mathematical theory can be incorporated into the process of solving differential equations with a computer algebra system, muMATH. After an introduction to functions of muMATH, several short programs for enhancing the capabilities of the system are discussed. Listed are six references. (YP)

  9. A Novel Image Encryption Algorithm Based on a Fractional-Order Hyperchaotic System and DNA Computing

    Directory of Open Access Journals (Sweden)

    Taiyong Li

    2017-01-01

    Full Text Available In the era of the Internet, image encryption plays an important role in information security. Chaotic systems and DNA operations have been proven to be powerful for image encryption. To further enhance the security of images, in this paper we propose a novel algorithm that combines the fractional-order hyperchaotic Lorenz system and DNA computing (FOHCLDNA) for image encryption. Specifically, the algorithm consists of four parts: firstly, we use a fractional-order hyperchaotic Lorenz system to generate a pseudorandom sequence that is utilized during the whole encryption process; secondly, a simple but effective diffusion scheme is performed to spread a small change in one pixel to all the other pixels; thirdly, the plain image is encoded by DNA rules and corresponding DNA operations are performed; finally, global permutation and 2D and 3D permutation are performed on pixels, bits, and nucleic acid bases. The extensive experimental results on eight publicly available testing images demonstrate that the encryption algorithm can achieve state-of-the-art performance in terms of security and robustness when compared with some existing methods, showing that FOHCLDNA is promising for image encryption.
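
    The DNA-encoding step can be sketched in Python: each pixel byte is split into four bases under one of the standard complementary rules and combined with a keystream byte through a DNA XOR. An ordinary logistic map stands in here for the fractional-order hyperchaotic Lorenz sequence, and the rule choice and sample pixels are placeholders rather than the paper's exact construction.

        # One of the eight standard DNA encoding rules: two bits per base (assumed rule).
        ENC = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
        DEC = {v: k for k, v in ENC.items()}

        def dna_xor(b1, b2):
            """DNA XOR on bases, equivalent to XOR on their 2-bit codes."""
            return ENC[DEC[b1] ^ DEC[b2]]

        def byte_to_dna(byte):
            return [ENC[(byte >> s) & 0b11] for s in (6, 4, 2, 0)]

        def dna_to_byte(bases):
            value = 0
            for b in bases:
                value = (value << 2) | DEC[b]
            return value

        def keystream(n, x=0.7, r=3.99):
            """Logistic-map keystream standing in for the hyperchaotic sequence."""
            out = []
            for _ in range(n):
                x = r * x * (1.0 - x)
                out.append(int(x * 256) & 0xFF)
            return out

        pixels = [52, 200, 17, 99]   # stand-in plain-image pixels
        cipher = [dna_to_byte([dna_xor(p, k) for p, k in zip(byte_to_dna(px), byte_to_dna(kb))])
                  for px, kb in zip(pixels, keystream(len(pixels)))]
        print(cipher)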

  10. Computer science. Heads-up limit hold'em poker is solved.

    Science.gov (United States)

    Bowling, Michael; Burch, Neil; Johanson, Michael; Tammelin, Oskari

    2015-01-09

    Poker is a family of games that exhibit imperfect information, where players do not have full knowledge of past events. Whereas many perfect-information games have been solved (e.g., Connect Four and checkers), no nontrivial imperfect-information game played competitively by humans has previously been solved. Here, we announce that heads-up limit Texas hold'em is now essentially weakly solved. Furthermore, this computation formally proves the common wisdom that the dealer in the game holds a substantial advantage. This result was enabled by a new algorithm, CFR(+), which is capable of solving extensive-form games orders of magnitude larger than previously possible. Copyright © 2015, American Association for the Advancement of Science.

  11. Research on Image Encryption Based on DNA Sequence and Chaos Theory

    Science.gov (United States)

    Tian Zhang, Tian; Yan, Shan Jun; Gu, Cheng Yan; Ren, Ran; Liao, Kai Xin

    2018-04-01

    Nowadays encryption is a common technique to protect image data from unauthorized access. In recent years, many scientists have proposed various encryption algorithms based on DNA sequences, providing a new idea for the design of image encryption algorithms. Therefore, a new method of image encryption based on DNA computing technology is proposed in this paper, in which the original image is encrypted by DNA coding and 1-D logistic chaotic mapping. First, the algorithm uses two modules as the encryption key. The first module uses a real DNA sequence, and the second module is made by one-dimensional logistic chaos mapping. Secondly, the algorithm uses DNA complementary rules to encode the original image, and uses the key and DNA computing technology to compute each pixel value of the original image, so as to realize the encryption of the whole image. Simulation results show that the algorithm has good encryption effect and security.

  12. An efficient computer based wavelets approximation method to solve Fuzzy boundary value differential equations

    Science.gov (United States)

    Alam Khan, Najeeb; Razzaq, Oyoon Abdul

    2016-03-01

    In the present work a wavelets approximation method is employed to solve fuzzy boundary value differential equations (FBVDEs). Essentially, a truncated Legendre wavelets series together with the Legendre wavelets operational matrix of derivative are utilized to convert an FBVDE into a simple computational problem by reducing it to a system of fuzzy algebraic linear equations. The capability of the scheme is investigated on a second-order FBVDE considered under generalized H-differentiability. Solutions are represented graphically, showing the competency and accuracy of this method.

  13. Superimposed Code Theoretic Analysis of Deoxyribonucleic Acid (DNA) Codes and DNA Computing

    Science.gov (United States)

    2010-01-01

    Hybridization between a DNA strand and its Watson-Crick (WC) complement can be used to perform mathematical computation; this research addresses how. Strands are written 5′→3′, strands shown with strikethrough are 3′→5′, and a dsDNA duplex formed between a strand and its reverse complement is called a WC duplex. (Abbreviations in the record: dsDNA, double-stranded DNA; MOSAIC, Mobile Stream Processing Cluster; PCR, polymerase chain reaction; RAM, random access memory; ssDNA, single-stranded DNA; WC, Watson-Crick; A, adenine; C, cytosine; G, guanine; T, thymine.)

  14. Towards a Standard-based Domain-specific Platform to Solve Machine Learning-based Problems

    Directory of Open Access Journals (Sweden)

    Vicente García-Díaz

    2015-12-01

    Full Text Available Machine learning is one of the most important subfields of computer science and can be used to solve a variety of interesting artificial intelligence problems. There are different languages, frameworks and tools for defining the data needed to solve machine learning-based problems. However, the great number of very diverse alternatives makes the intercommunication, portability and re-usability of the definitions, designs or algorithms that any developer may create difficult. In this paper, we take the first step towards a language and a development environment independent of the underlying technologies, allowing developers to design solutions for solving machine learning-based problems in a simple and fast way, automatically generating code for other technologies. That can be considered a transparent bridge among current technologies. We rely on the Model-Driven Engineering approach, focusing on the creation of models to abstract the definition of artifacts from the underlying technologies.

  15. DNA-Enabled Integrated Molecular Systems for Computation and Sensing

    Science.gov (United States)

    2014-05-21

    Computational devices can be chemically conjugated to different strands of DNA that are then self-assembled according to strict Watson-Crick binding rules. The guided folding of DNA, inspired by nature, allows designs to manipulate molecular-scale processes unlike any other material system. (Authors: Craig LaBoda, Heather Duschl, and Chris L. Dwyer.)

  16. 6th International Conference on Soft Computing for Problem Solving

    CERN Document Server

    Bansal, Jagdish; Das, Kedar; Lal, Arvind; Garg, Harish; Nagar, Atulya; Pant, Millie

    2017-01-01

    This two-volume book gathers the proceedings of the Sixth International Conference on Soft Computing for Problem Solving (SocProS 2016), offering a collection of research papers presented during the conference at Thapar University, Patiala, India. Providing a veritable treasure trove for scientists and researchers working in the field of soft computing, it highlights the latest developments in the broad area of “Computational Intelligence” and explores both theoretical and practical aspects using fuzzy logic, artificial neural networks, evolutionary algorithms, swarm intelligence, soft computing, computational intelligence, etc.

  17. DNA & Protein detection based on microbead agglutination

    KAUST Repository

    Kodzius, Rimantas

    2012-06-06

    We report a simple and rapid room temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on the aggregation of microparticles in the presence of a specific analyte, thus enabling macroscopic observation. Agglutination-based tests are most often used to explore antibody-antigen reactions. Agglutination has been used for model protein assays using a biotin/streptavidin two-component system, as well as a hybridization-based two-component assay; however, as our work shows, two-component systems are prone to self-termination of the linking analyte and thus have a lower sensitivity. Three-component systems have also been used with DNA hybridization, as in our work; however, their assay requires 48 hours for incubation, while our assay is performed in 5 minutes, making it a real candidate for POC testing. We demonstrate three assays: a two-component biotin/streptavidin assay, a three-component hybridization assay using single stranded DNA (ssDNA) molecules and a stepped three-component hybridization assay. The comparison of these three assays shows our simple stepped three-component agglutination assay to be rapid at room temperature and more sensitive than the two-component version by an order of magnitude. An agglutination assay was also performed in a PDMS microfluidic chip where agglutinated beads were trapped by filter columns for easy observation. We developed a rapid (5 minute) room temperature assay, which is based on microbead agglutination. Our three-component assay solves the linker self-termination issue, allowing an order of magnitude increase in sensitivity over two-component assays. Our stepped version of the three-component assay solves the issue of probe site saturation, thus enabling a wider range of detection. Detection of the agglutinated beads with the naked eye by trapping in microfluidic channels has been shown.

  18. Experimental realization of a one-way quantum computer algorithm solving Simon's problem.

    Science.gov (United States)

    Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G

    2014-11-14

    We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.

  19. Backtrack Programming: A Computer-Based Approach to Group Problem Solving.

    Science.gov (United States)

    Scott, Michael D.; Bodaken, Edward M.

    Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…

  1. Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations

    KAUST Repository

    Southern, J.A.; Plank, G.; Vigmond, E.J.; Whiteley, J.P.

    2009-01-01

    The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level, the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time while still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study, the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counterintuitive as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks, it is shown that the coupled method is up to 80% faster than the conventional uncoupled method-and that parallel performance is better for the larger coupled problem.

  2. Regressive Imagery in Creative Problem-Solving: Comparing Verbal Protocols of Expert and Novice Visual Artists and Computer Programmers

    Science.gov (United States)

    Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin

    2015-01-01

    We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…

  3. Computer Problem-Solving Coaches for Introductory Physics: Design and Usability Studies

    Science.gov (United States)

    Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew

    2016-01-01

    The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how…

  4. In vitro molecular machine learning algorithm via symmetric internal loops of DNA.

    Science.gov (United States)

    Lee, Ji-Hoon; Lee, Seung Hwan; Baek, Christina; Chun, Hyosun; Ryu, Je-Hwan; Kim, Jin-Woo; Deaton, Russell; Zhang, Byoung-Tak

    2017-08-01

    Programmable biomolecules, such as DNA strands, deoxyribozymes, and restriction enzymes, have been used to solve computational problems, construct large-scale logic circuits, and program simple molecular games. Although studies have shown the potential of molecular computing, the capability of computational learning with DNA molecules, i.e., molecular machine learning, has yet to be experimentally verified. Here, we present a novel molecular learning in vitro model in which symmetric internal loops of double-stranded DNA are exploited to measure the differences between training instances, thus enabling the molecules to learn from small errors. The model was evaluated on a data set of twenty dialogue sentences obtained from the television shows Friends and Prison Break. The wet DNA-computing experiments confirmed that the molecular learning machine was able to generalize the dialogue patterns of each show and successfully identify the show from which the sentences originated. The molecular machine learning model described here opens the way for solving machine learning problems in computer science and biology using in vitro molecular computing with the data encoded in DNA molecules. Copyright © 2017. Published by Elsevier B.V.

  5. An Improved Parallel DNA Algorithm of 3-SAT

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2007-09-01

    Full Text Available There are many large and difficult computational problems in mathematics and computer science. For many of these problems, traditional computers cannot handle the mass of data in acceptable time frames; such problems are called NP problems. DNA computing is a means of solving a class of intractable computational problems for which the computing time grows exponentially with problem size. This paper proposes a parallel algorithm model for the universal 3-SAT problem based on the Adleman-Lipton model and applies biological operations to handle the mass of data in the solution space. In this manner, we can control the run time of the algorithm to be finite and approximately constant.
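
    The data-parallel idea behind Adleman-Lipton-style algorithms can be mimicked in ordinary Python: enumerate the whole solution space (the analogue of the initial strand library) and then discard non-solutions clause by clause (the analogue of the biological extract operations). The example formula below is arbitrary; on silicon the enumeration is exponential, whereas each filtering pass corresponds to one parallel biochemical step.

        from itertools import product

        # Each 3-SAT clause is a tuple of literals: positive k means variable k,
        # negative k means its negation. The formula is an arbitrary example.
        clauses = [(1, -2, 3), (-1, 2, -3), (2, 3, -4), (-2, -3, 4)]
        n_vars = 4

        def satisfies(assignment, clause):
            return any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)

        # "Solution space": every truth assignment, the analogue of the DNA library.
        candidates = list(product([False, True], repeat=n_vars))

        # One "extract" pass per clause, the analogue of a biological filtering step.
        for clause in clauses:
            candidates = [a for a in candidates if satisfies(a, clause)]

        print(f"{len(candidates)} satisfying assignments, e.g. {candidates[:3]}")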

  6. Industrial application of a graphics computer-based training system

    International Nuclear Information System (INIS)

    Klemm, R.W.

    1985-01-01

    Graphics Computer Based Training (GCBT) roles include drilling, tutoring, simulation and problem solving. Of these, Commonwealth Edison uses mainly tutoring, simulation and problem solving. These roles are not separate in any particular program. They are integrated to provide tutoring and part-task simulation, part-task simulation and problem solving, or problem-solving tutoring. Commonwealth's Graphics Computer Based Training program was a result of over a year's worth of research and planning. The keys to the program are its flexibility and control. Flexibility is maintained through stand-alone units capable of program authoring and modification for plant/site-specific users. Yet the system has the capability to support up to 31 terminals with a 40 MB hard disk drive. Control of the GCBT program is accomplished through the establishment of development priorities and a central development facility (Commonwealth Edison's Production Training Center)

  7. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  8. Does Solving Insight-Based Problems Differ from Solving Learning-Based Problems? Some Evidence from an ERP Study

    Science.gov (United States)

    Leikin, Roza; Waisman, Ilana; Leikin, Mark

    2016-01-01

    We asked: "What are the similarities and differences in mathematical processing associated with solving learning-based and insight-based problems?" To answer this question, the ERP research procedure was employed with 69 male adolescent subjects who solved specially designed insight-based and learning-based tests. Solutions of…

  9. The semantic system is involved in mathematical problem solving.

    Science.gov (United States)

    Zhou, Xinlin; Li, Mengyi; Li, Leinian; Zhang, Yiyun; Cui, Jiaxin; Liu, Jie; Chen, Chuansheng

    2018-02-01

    Numerous studies have shown that the brain regions around bilateral intraparietal cortex are critical for number processing and arithmetical computation. However, the neural circuits for more advanced mathematics such as mathematical problem solving (with little routine arithmetical computation) remain unclear. Using functional magnetic resonance imaging (fMRI), this study (N = 24 undergraduate students) compared neural bases of mathematical problem solving (i.e., number series completion, mathematical word problem solving, and geometric problem solving) and arithmetical computation. Direct subject- and item-wise comparisons revealed that mathematical problem solving typically had greater activation than arithmetical computation in all 7 regions of the semantic system (which was based on a meta-analysis of 120 functional neuroimaging studies on semantic processing). Arithmetical computation typically had greater activation in the supplementary motor area and left precentral gyrus. The results suggest that the semantic system in the brain supports mathematical problem solving. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
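
    The gradient-based parameter-estimation loop described above can be sketched in Python with a toy forward model: a two-parameter exponential decay stands in for the time-dependent diffusion simulation, and a finite-difference gradient stands in for adjoint differentiation, which would supply the same gradient far more efficiently for a large simulation. All model and data values are made up for illustration.

        import numpy as np

        t = np.linspace(0.0, 5.0, 50)
        true_params = np.array([2.0, 0.8])
        data = true_params[0] * np.exp(-true_params[1] * t)   # noise-free "measurements"

        def forward(params):
            """Toy forward model y(t) = a * exp(-b * t)."""
            return params[0] * np.exp(-params[1] * t)

        def objective(params):
            return 0.5 * np.sum((forward(params) - data) ** 2)

        def fd_gradient(params, eps=1e-6):
            """Finite-difference gradient; adjoint differentiation would replace this."""
            grad = np.zeros_like(params)
            for i in range(len(params)):
                step = np.zeros_like(params)
                step[i] = eps
                grad[i] = (objective(params + step) - objective(params - step)) / (2 * eps)
            return grad

        params = np.array([1.0, 0.3])        # initial guess
        for _ in range(2000):                # plain gradient descent
            params -= 0.01 * fd_gradient(params)
        print("recovered parameters:", params)   # approximately [2.0, 0.8]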

  11. Logo Programming, Problem Solving, and Knowledge-Based Instruction.

    Science.gov (United States)

    Swan, Karen; Black, John B.

    The research reported in this paper was designed to investigate the hypothesis that computer programming may support the teaching and learning of problem solving, but that to do so, problem solving must be explicitly taught. Three studies involved students in several grades: 4th, 6th, 8th, 11th, and 12th. Findings collectively show that five…

  12. DNA-programmed dynamic assembly of quantum dots for molecular computation.

    Science.gov (United States)

    He, Xuewen; Li, Zhi; Chen, Muzi; Ma, Nan

    2014-12-22

    Despite the widespread use of quantum dots (QDs) for biosensing and bioimaging, QD-based bio-interfaceable and reconfigurable molecular computing systems have not yet been realized. DNA-programmed dynamic assembly of multi-color QDs is presented for the construction of a new class of fluorescence resonance energy transfer (FRET)-based QD computing systems. A complete set of seven elementary logic gates (OR, AND, NOR, NAND, INH, XOR, XNOR) are realized using a series of binary and ternary QD complexes operated by strand displacement reactions. The integration of different logic gates into a half-adder circuit for molecular computation is also demonstrated. This strategy is quite versatile and straightforward for logical operations and would pave the way for QD-biocomputing-based intelligent molecular diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. DENA: A Configurable Microarchitecture and Design Flow for Biomedical DNA-Based Logic Design.

    Science.gov (United States)

    Beiki, Zohre; Jahanian, Ali

    2017-10-01

    DNA is known as the building block for storing the code of life and transferring genetic features across generations. However, it has been found that DNA strands can be used for a new type of computation that opens fascinating horizons in computational medicine. Significant contributions have addressed the design of DNA-based logic gates for medical and computational applications, but serious challenges remain in designing medium- and large-scale DNA circuits. In this paper, a new microarchitecture and corresponding design flow is proposed to facilitate the design of multistage large-scale DNA logic systems. The feasibility and efficiency of the proposed microarchitecture are evaluated by implementing a full adder, and then its cascadability is determined by implementing a multistage 8-bit adder. Simulation results highlight the features of the proposed design style and microarchitecture in terms of the scalability, implementation cost, and signal integrity of the DNA-based logic system compared to traditional approaches.
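
    The logical functions used in the evaluation, a full adder cascaded into an 8-bit adder, are easy to state as a functional reference in Python; this models only the Boolean behavior the DNA microarchitecture implements, not the strand-level design itself.

        def full_adder(a, b, cin):
            """One-bit full adder: returns (sum, carry-out)."""
            s = a ^ b ^ cin
            cout = (a & b) | (cin & (a ^ b))
            return s, cout

        def ripple_carry_add(x, y, width=8):
            """Add two unsigned integers bit by bit, as a cascade of full adders."""
            carry, result = 0, 0
            for i in range(width):
                s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
                result |= s << i
            return result, carry             # sum modulo 2**width and the final carry-out

        print(ripple_carry_add(200, 100))    # (44, 1): 300 mod 256 with carry-out 1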

  14. Solving large sets of coupled equations iteratively by vector processing on the CYBER 205 computer

    International Nuclear Information System (INIS)

    Tolsma, L.D.

    1985-01-01

    The set of coupled linear second-order differential equations which has to be solved for the quantum-mechanical description of inelastic scattering of atomic and nuclear particles can be rewritten as an equivalent set of coupled integral equations. When a suitable class of functions is used as piecewise analytic reference solutions, the integrals that arise in this set can be evaluated analytically. The set of integral equations can be solved iteratively. For the results mentioned here, an inward-outward iteration scheme has been applied. A concept of vectorization of coupled-channel Fortran programs, based on this integral method, is presented for use on the Cyber 205 computer. It turns out that, for two heavy-ion nuclear scattering test cases, this vector algorithm gives an overall speed-up of about a factor of 2 to 3 compared to a highly optimized scalar algorithm on a one-vector-pipeline computer

  15. Computer-Presented Organizational/Memory Aids as Instruction for Solving Pico-Fomi Problems.

    Science.gov (United States)

    Steinberg, Esther R.; And Others

    1985-01-01

    Describes investigation of effectiveness of computer-presented organizational/memory aids (matrix and verbal charts controlled by computer or learner) as instructional technique for solving Pico-Fomi problems, and the acquisition of deductive inference rules when such aids are present. Results indicate chart use control should be adapted to…

  16. Improving the learning of clinical reasoning through computer-based cognitive representation.

    Science.gov (United States)

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores with the 4-week period. The cognitive representation approach was found to provide more formative assessment. The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  17. Encrypted Objects and Decryption Processes: Problem-Solving with Functions in a Learning Environment Based on Cryptography

    Science.gov (United States)

    White, Tobin

    2009-01-01

    This paper introduces an applied problem-solving task, set in the context of cryptography and embedded in a network of computer-based tools. This designed learning environment engaged students in a series of collaborative problem-solving activities intended to introduce the topic of functions through a set of linked representations. In a…

  18. Development of GPU Based Parallel Computing Module for Solving Pressure Equation in the CUPID Component Thermo-Fluid Analysis Code

    International Nuclear Information System (INIS)

    Lee, Jin Pyo; Joo, Han Gyu

    2010-01-01

    In the thermo-fluid analysis code named CUPID, the linear system of pressure equations must be solved in each iteration step. The time spent repeatedly solving this linear system can be quite significant because large sparse matrices of rank more than 50,000 are involved and the diagonal dominance of the system hardly holds. Therefore parallelization of the linear system solver is essential to reduce the computing time. Meanwhile, Graphics Processing Units (GPU) have been developed as highly parallel, multi-core processors for the global demand of high quality 3D graphics. If a suitable interface is provided, parallelization using GPUs becomes available to engineering computing. NVIDIA provides a Software Development Kit (SDK) named CUDA (Compute Unified Device Architecture) to code developers so that they can manage GPUs for parallelization using the C language. In this research, we implement parallel routines for the linear system solver using CUDA and examine the performance of the parallelization. In the next section, we will describe the method of CUDA parallelization for the CUPID code, and then the performance of the CUDA parallelization will be discussed

  19. Exploring hadronic physics by solving QCD with a teraflops computer

    International Nuclear Information System (INIS)

    Negele, J.

    1993-01-01

    Quantum chromodynamics, the theory believed to govern the nucleons, mesons, and other strongly interacting particles making up most of the known mass of the universe, is such a challenging, nonlinear many-body problem that it has never been solved using conventional analytical techniques. This talk will describe how this theory can be solved numerically on a space-time lattice, show what has already been understood about the structure of hadrons and the quark-gluon phase transition, and describe an exciting initiative to build a dedicated Teraflops computer capable of performing 10¹² operations per second to make fundamental advances in QCD

  20. An iterative algorithm for solving the multidimensional neutron diffusion nodal method equations on parallel computers

    International Nuclear Information System (INIS)

    Kirk, B.L.; Azmy, Y.Y.

    1992-01-01

    In this paper the one-group, steady-state neutron diffusion equation in two-dimensional Cartesian geometry is solved using the nodal integral method. The discrete variable equations comprise loosely coupled sets of equations representing the nodal balance of neutrons, as well as neutron current continuity along rows or columns of computational cells. An iterative algorithm that is more suitable for solving large problems concurrently is derived based on the decomposition of the spatial domain and is accelerated using successive overrelaxation. This algorithm is very well suited for parallel computers, especially since the spatial domain decomposition occurs naturally, so that the number of iterations required for convergence does not depend on the number of processors participating in the calculation. Implementation of the authors' algorithm on the Intel iPSC/2 hypercube and Sequent Balance 8000 parallel computer is presented, and measured speedup and efficiency for test problems are reported. The results suggest that the efficiency of the hypercube quickly deteriorates when many processors are used, while the Sequent Balance retains very high efficiency for a comparable number of participating processors. This leads to the conjecture that message-passing parallel computers are not as well suited for this algorithm as shared-memory machines
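
    The successive overrelaxation acceleration mentioned above can be illustrated with a small Python sketch: a Gauss-Seidel sweep over a five-point stencil is over-relaxed by a factor omega until the updates stall. The simple Laplacian test problem and the chosen omega are illustrative only; the record applies the acceleration to the nodal integral equations on a decomposed spatial domain.

        import numpy as np

        # SOR for a 2-D Poisson-type problem on a unit square with a 5-point stencil.
        n = 32
        h = 1.0 / (n + 1)
        source = np.ones((n, n))
        phi = np.zeros((n + 2, n + 2))   # includes a zero boundary layer
        omega = 1.8                      # overrelaxation factor (illustrative choice)

        for sweep in range(500):
            max_change = 0.0
            for i in range(1, n + 1):
                for j in range(1, n + 1):
                    gauss_seidel = 0.25 * (phi[i - 1, j] + phi[i + 1, j]
                                           + phi[i, j - 1] + phi[i, j + 1]
                                           + h * h * source[i - 1, j - 1])
                    new = (1.0 - omega) * phi[i, j] + omega * gauss_seidel
                    max_change = max(max_change, abs(new - phi[i, j]))
                    phi[i, j] = new
            if max_change < 1e-8:
                break
        print(f"stopped after {sweep + 1} sweeps, peak value {phi.max():.4e}")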

  1. Desktop Grid Computing with BOINC and its Use for Solving the RND telecommunication Problem

    International Nuclear Information System (INIS)

    Vega-Rodriguez, M. A.; Vega-Perez, D.; Gomez-Pulido, J. A.; Sanchez-Perez, J. M.

    2007-01-01

    An important problem in mobile/cellular technology is covering a certain geographical area using the smallest number of radio antennas while achieving the largest coverage rate. This is the well-known telecommunication problem identified as Radio Network Design (RND). This optimization problem can be solved by bio-inspired algorithms, among other options. In this work we use the PBIL (Population-Based Incremental Learning) algorithm, which has been little studied in this field but with which we have obtained very good results. PBIL is based on genetic algorithms and competitive learning (typical in neural networks), being a population evolution model based on probabilistic models. Due to the high number of configuration parameters of PBIL, and because we want to test the RND problem with numerous variants, we have used grid computing with BOINC (Berkeley Open Infrastructure for Network Computing). In this way, we have been able to execute thousands of experiments in a few days using around 100 computers at the same time. In this paper we present the most interesting results from our work. (Author)
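
    The core PBIL loop is compact enough to sketch in Python: a probability vector over candidate antenna sites is sampled into a population, and the vector is then nudged toward the best individual of each generation. The coverage sets, fitness weights, and learning rate below are made-up stand-ins for the RND fitness and for the paper's actual configuration.

        import random

        # Toy stand-in for the RND fitness: reward covered cells, penalize antennas used.
        N_SITES = 20
        COVERS = [set(range(i, min(i + 4, 25))) for i in range(N_SITES)]   # hypothetical coverage sets

        def fitness(bits):
            covered = set().union(*[COVERS[i] for i, b in enumerate(bits) if b])
            return 3 * len(covered) - sum(bits)

        def pbil(generations=200, pop_size=50, lr=0.1):
            prob = [0.5] * N_SITES                        # probability vector over bits
            best, best_fit = None, float("-inf")
            for _ in range(generations):
                population = [[1 if random.random() < p else 0 for p in prob]
                              for _ in range(pop_size)]
                leader = max(population, key=fitness)
                if fitness(leader) > best_fit:
                    best, best_fit = leader, fitness(leader)
                # Shift the probability model toward the generation's best individual.
                prob = [(1 - lr) * p + lr * b for p, b in zip(prob, leader)]
            return best, best_fit

        solution, score = pbil()
        print(score, solution)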

  2. Solving Inventory Routing Problems Using Location Based Heuristics

    Directory of Open Access Journals (Sweden)

    Paweł Hanczar

    2014-01-01

    Full Text Available Inventory routing problems (IRPs) occur where vendor-managed inventory replenishment strategies are implemented in supply chains. These problems are characterized by the presence of both transportation and inventory considerations, either as parameters or constraints. The research presented in this paper aims at extending the IRP formulation developed on the basis of the location-based heuristics proposed by Bramel and Simchi-Levi and continued by Hanczar. In the first phase of the proposed algorithms, mixed integer programming is used to determine the partitioning of customers as well as the dates and quantities of deliveries. Then, using the 2-opt algorithm for solving the traveling salesperson problem, the optimal routes for each partition are determined. In the main part of the research, the classical formulation is extended by additional constraints (visit spacing, vehicle filling rate, driver (vehicle) consistency, and a heterogeneous fleet of vehicles), and additional criteria are discussed. Then the impact of using each of the proposed extensions on the solution possibilities is evaluated. The results of computational tests are presented and discussed. The obtained results allow us to conclude that location-based heuristics should be considered when solving real-life instances of the IRP.
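
    The 2-opt route improvement used in the second phase can be sketched in Python for a single partition: segments of a tour are reversed whenever the reversal shortens the closed route. The random customer coordinates and plain Euclidean distances are illustrative; the full IRP model adds the inventory and scheduling decisions described above.

        import math
        import random

        random.seed(1)
        customers = [(random.random() * 100, random.random() * 100) for _ in range(12)]

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        def route_length(route):
            return sum(dist(customers[route[i]], customers[route[(i + 1) % len(route)]])
                       for i in range(len(route)))

        def two_opt(route):
            """Repeatedly reverse tour segments while doing so shortens the tour."""
            improved = True
            while improved:
                improved = False
                for i in range(1, len(route) - 1):
                    for j in range(i + 1, len(route)):
                        candidate = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
                        if route_length(candidate) < route_length(route):
                            route, improved = candidate, True
            return route

        tour = two_opt(list(range(len(customers))))
        print(round(route_length(tour), 1), tour)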

  3. Improving the learning of clinical reasoning through computer-based cognitive representation

    Directory of Open Access Journals (Sweden)

    Bian Wu

    2014-12-01

    Full Text Available Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores over the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  4. HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features.

    Science.gov (United States)

    Zaman, Rianon; Chowdhury, Shahana Yasmin; Rashid, Mahmood A; Sharma, Alok; Dehzangi, Abdollah; Shatabda, Swakkhar

    2017-01-01

    DNA-binding proteins often play an important role in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve this problem. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile based features for the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as a classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We experimentally show that our method outperforms the state-of-the-art methods found in the literature.

  5. HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features

    Directory of Open Access Journals (Sweden)

    Rianon Zaman

    2017-01-01

    Full Text Available DNA-binding proteins often play an important role in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve this problem. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile based features for the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as a classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We experimentally show that our method outperforms the state-of-the-art methods found in the literature.
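
    For a concrete picture of the feature construction, the sketch below computes monogram and bigram features from an L x 20 HMM profile matrix and stacks them into one vector; the profile format and normalisation are assumptions, since the records do not specify them, and an SVM (e.g. scikit-learn's SVC) would then be trained on these vectors.

    ```python
    import numpy as np

    def hmm_monogram_bigram(profile):
        """Monogram (20) and bigram (400) features from an L x 20 HMM profile matrix.

        profile[i, a] is assumed to hold a normalised emission value of amino acid a
        at sequence position i; the exact normalisation used by HMMBinder is not
        reproduced here.
        """
        L, A = profile.shape
        monogram = profile.mean(axis=0)                      # 20 values
        bigram = np.zeros((A, A))
        for i in range(L - 1):
            bigram += np.outer(profile[i], profile[i + 1])   # consecutive-position products
        bigram /= max(L - 1, 1)
        return np.concatenate([monogram, bigram.ravel()])    # 420-dimensional feature vector

    # Toy profile for a 5-residue sequence.
    features = hmm_monogram_bigram(np.random.rand(5, 20))
    print(features.shape)   # (420,)
    ```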

  6. Measuring scientific reasoning through behavioral analysis in a computer-based problem solving exercise

    Science.gov (United States)

    Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2016-12-01

    Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who choose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new

  7. Personalized Computer-Assisted Mathematics Problem-Solving Program and Its Impact on Taiwanese Students

    Science.gov (United States)

    Chen, Chiu-Jung; Liu, Pei-Lin

    2007-01-01

    This study evaluated the effects of a personalized computer-assisted mathematics problem-solving program on the performance and attitude of Taiwanese fourth grade students. The purpose of this study was to determine whether the personalized computer-assisted program improved student performance and attitude over the nonpersonalized program.…

  8. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
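
    As a minimal, hedged illustration of the kind of likelihood ratio involved (not of the Bonaparte software or its Bayesian-network machinery), the sketch below computes a single-locus LR for the hypothesis that an unidentified profile belongs to the missing child of two typed parents versus an unrelated member of the population, assuming Hardy-Weinberg proportions and ignoring mutation and dropout; the allele frequencies are hypothetical. A multi-locus LR would simply multiply the per-locus values.

    ```python
    from itertools import product

    def transmission_prob(genotype, father, mother):
        """P(child genotype | parental genotypes), Mendelian inheritance, no mutation."""
        prob = 0.0
        for a, b in product(father, mother):                 # one allele from each parent
            if sorted((a, b)) == sorted(genotype):
                prob += 0.25
        return prob

    def hwe_freq(genotype, freqs):
        """Population genotype frequency under Hardy-Weinberg equilibrium."""
        a, b = genotype
        return freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]

    def single_locus_lr(child, father, mother, freqs):
        """LR: 'child of these parents' versus 'unrelated member of the population'."""
        return transmission_prob(child, father, mother) / hwe_freq(child, freqs)

    freqs = {"12": 0.10, "14": 0.25, "16": 0.30}             # hypothetical allele frequencies
    print(single_locus_lr(("12", "14"), ("12", "16"), ("14", "14"), freqs))   # LR = 10
    ```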

  9. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Waker, A.J.; Prestwich, W.V.

    1998-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10⁹ dm³ mol⁻¹ s⁻¹. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (orig.)

  10. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Hamm, R.N.; Waker, A.J.; Prestwich, W.V.

    1988-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10⁹ dm³ mol⁻¹ s⁻¹. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (author)
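
    The "assumed exponential survival" option discussed in the two records above can be illustrated numerically: with a scavenging capacity k[S], a hydroxyl radical survives to time t with probability exp(-k[S]t), so its mean chemical lifetime is 1/(k[S]). Using the rate constant quoted in the abstracts and an assumed diffusion coefficient for the OH radical, a short sketch reproduces the order of magnitude of the lifetimes and diffusion lengths being compared; note that 0.5 M gives a lifetime close to the 10⁻⁹ s cut-off of the time-limited method.

    ```python
    import math, random

    K_SCAV = 3.0e9          # dm^3 mol^-1 s^-1, reaction rate constant from the abstract
    D_OH = 2.8e-9           # m^2 s^-1, assumed diffusion coefficient of the OH radical

    def mean_lifetime(conc):
        """Mean chemical lifetime of an OH radical under scavenging capacity k[S]."""
        return 1.0 / (K_SCAV * conc)

    def rms_diffusion_length(conc):
        """Root-mean-square 3D diffusion distance reached before scavenging."""
        return math.sqrt(6.0 * D_OH * mean_lifetime(conc))

    for conc in (0.5, 0.15):
        t = mean_lifetime(conc)
        print(f"[S] = {conc:4.2f} M: lifetime ~ {t:.2e} s, "
              f"diffusion length ~ {rms_diffusion_length(conc) * 1e9:.1f} nm")

    # Sampling one exponential survival time, as the 'exponential' option would do:
    print(random.expovariate(K_SCAV * 0.5))
    ```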

  11. Supporting Problem Solving with Case-Stories Learning Scenario and Video-based Collaborative Learning Technology

    Directory of Open Access Journals (Sweden)

    Chun Hu

    2004-04-01

    Full Text Available In this paper, we suggest that case-based resources, which are used for assisting cognition during problem solving, can be structured around the work of narratives in social cultural psychology. Theories and other research methods have proposed structures within narratives and stories which may be useful to the design of case-based resources. Moreover, embedded within cases are stories which are contextually rich, supporting the epistemological groundings of situated cognition. Therefore the purposes of this paper are to discuss possible frameworks of case-stories; derive design principles as to “what” constitutes a good case story or narrative; and suggest how technology can support story-based learning. We adopt video-based Computer-Supported Collaborative Learning (CSCL) technology to support problem solving with case-stories learning scenarios. Our hypothesis in this paper is that well-designed case-based resources are able to aid in the cognitive processes undergirding problem solving and meaning making. We also suggest the use of an emerging video-based collaborative learning technology to support such an instructional strategy.

  12. Inquiry-based problem solving in introductory physics

    Science.gov (United States)

    Koleci, Carolann

    What makes problem solving in physics difficult? How do students solve physics problems, and how does this compare to an expert physicist's strategy? Over the past twenty years, physics education research has revealed several differences between novice and expert problem solving. The work of Chi, Feltovich, and Glaser demonstrates that novices tend to categorize problems based on surface features, while experts categorize according to theory, principles, or concepts1. If there are differences between how problems are categorized, then are there differences between how physics problems are solved? Learning more about the problem solving process, including how students like to learn and what is most effective, requires both qualitative and quantitative analysis. In an effort to learn how novices and experts solve introductory electricity problems, a series of in-depth interviews were conducted, transcribed, and analyzed, using both qualitative and quantitative methods. One-way ANOVA tests were performed in order to learn if there are any significant problem solving differences between: (a) novices and experts, (b) genders, (c) students who like to answer questions in class and those who don't, (d) students who like to ask questions in class and those who don't, (e) students employing an interrogative approach to problem solving and those who don't, and (f) those who like physics and those who dislike it. The results of both the qualitative and quantitative methods reveal that inquiry-based problem solving is prevalent among novices and experts, and frequently leads to the correct physics. These findings serve as impetus for the third dimension of this work: the development of Choose Your Own Adventure Physics(c) (CYOAP), an innovative teaching tool in physics which encourages inquiry-based problem solving. 1Chi, M., P. Feltovich, R. Glaser, "Categorization and Representation of Physics Problems by Experts and Novices", Cognitive Science, 5, 121--152 (1981).

  13. Computer-assisted design for scaling up systems based on DNA reaction networks.

    Science.gov (United States)

    Aubert, Nathanaël; Mosca, Clément; Fujii, Teruo; Hagiya, Masami; Rondelez, Yannick

    2014-04-06

    In the past few years, there have been many exciting advances in the field of molecular programming, reaching a point where implementation of non-trivial systems, such as neural networks or switchable bistable networks, is a reality. Such systems require nonlinearity, be it through signal amplification, digitalization or the generation of autonomous dynamics such as oscillations. The biochemistry of DNA systems provides such mechanisms, but assembling them in a constructive manner is still a difficult and sometimes counterintuitive process. Moreover, realistic prediction of the actual evolution of concentrations over time requires a number of side reactions, such as leaks, cross-talks or competitive interactions, to be taken into account. In this case, the design of a system targeting a given function takes much trial and error before the correct architecture can be found. To speed up this process, we have created DNA Artificial Circuits Computer-Assisted Design (DACCAD), a computer-assisted design software that supports the construction of systems for the DNA toolbox. DACCAD is ultimately aimed to design actual in vitro implementations, which is made possible by building on the experimental knowledge available on the DNA toolbox. We illustrate its effectiveness by designing various systems, from Montagne et al.'s Oligator or Padirac et al.'s bistable system to new and complex networks, including a two-bit counter or a frequency divider as well as an example of very large system encoding the game Mastermind. In the process, we highlight a variety of behaviours, such as enzymatic saturation and load effect, which would be hard to handle or even predict with a simpler model. We also show that those mechanisms, while generally seen as detrimental, can be used in a positive way, as functional part of a design. Additionally, the number of parameters included in these simulations can be large, especially in the case of complex systems. For this reason, we included the

  14. Solving computationally expensive engineering problems

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2014-01-01

    Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...

  15. A Computer Algebra Approach to Solving Chemical Equilibria in General Chemistry

    Science.gov (United States)

    Kalainoff, Melinda; Lachance, Russ; Riegner, Dawn; Biaglow, Andrew

    2012-01-01

    In this article, we report on a semester-long study of the incorporation into our general chemistry course, of advanced algebraic and computer algebra techniques for solving chemical equilibrium problems. The method presented here is an alternative to the commonly used concentration table method for describing chemical equilibria in general…

  16. Design and Analysis of Compact DNA Strand Displacement Circuits for Analog Computation Using Autocatalytic Amplifiers.

    Science.gov (United States)

    Song, Tianqi; Garg, Sudhanshu; Mokhtar, Reem; Bui, Hieu; Reif, John

    2018-01-19

    A main goal in DNA computing is to build DNA circuits to compute designated functions using a minimal number of DNA strands. Here, we propose a novel architecture to build compact DNA strand displacement circuits to compute a broad scope of functions in an analog fashion. A circuit by this architecture is composed of three autocatalytic amplifiers, and the amplifiers interact to perform computation. We show DNA circuits to compute functions sqrt(x), ln(x) and exp(x) for x in tunable ranges with simulation results. A key innovation in our architecture, inspired by Napier's use of logarithm transforms to compute square roots on a slide rule, is to make use of autocatalytic amplifiers to do logarithmic and exponential transforms in concentration and time. In particular, we convert from the input that is encoded by the initial concentration of the input DNA strand, to time, and then back again to the output encoded by the concentration of the output DNA strand at equilibrium. This combined use of strand-concentration and time encoding of computational values may have impact on other forms of molecular computation.
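
    The slide-rule analogy rests on the identity sqrt(x) = exp(0.5 ln x). The sketch below is a deliberately idealized, ODE-free caricature of the concentration-to-time-and-back encoding: an exponential amplifier turns a concentration into a threshold-crossing time (a logarithm), the time difference is halved, and an exponential readout converts it back. The seed level, threshold, and rate constant are arbitrary assumptions, and no DNA strand displacement kinetics are modeled.

    ```python
    import math

    A0, THRESHOLD, K = 1e-3, 1.0, 1.0     # hypothetical seed level, detection threshold, growth rate

    def time_to_threshold(x):
        """Concentration -> time: an amplifier seeded with x*A0 and growing as exp(K*t)
        crosses THRESHOLD at t = ln(THRESHOLD/(x*A0))/K, i.e. later for smaller inputs."""
        return math.log(THRESHOLD / (x * A0)) / K

    def sqrt_via_log_exp(x):
        """sqrt(x) = exp(0.5*ln(x)): the log is read out as the time difference between
        the input amplifier and a reference amplifier, halved, and fed to an exponential
        (autocatalytic) readout."""
        ln_x = K * (time_to_threshold(1.0) - time_to_threshold(x))   # equals ln(x)
        return math.exp(0.5 * ln_x)                                  # exponential readout

    for x in (4.0, 9.0, 0.25):
        print(x, sqrt_via_log_exp(x), math.sqrt(x))
    ```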

  17. High-speed DNA-based rolling motors powered by RNase H

    Science.gov (United States)

    Yehl, Kevin; Mugler, Andrew; Vivek, Skanda; Liu, Yang; Zhang, Yun; Fan, Mengzhen; Weeks, Eric R.

    2016-01-01

    DNA-based machines that walk by converting chemical energy into controlled motion could be of use in applications such as next generation sensors, drug delivery platforms, and biological computing. Despite their exquisite programmability, DNA-based walkers are, however, challenging to work with due to their low fidelity and slow rates (~1 nm/min). Here, we report DNA-based machines that roll rather than walk, and consequently have a maximum speed and processivity that are three orders of magnitude greater than those of conventional DNA motors. The motors are made from DNA-coated spherical particles that hybridise to a surface modified with complementary RNA; motion is achieved through the addition of RNase H, which selectively hydrolyses hybridised RNA. Spherical motors move in a self-avoiding manner, whereas anisotropic particles, such as dimerised particles or rod-shaped particles, travel linearly without a track or external force. Finally, we demonstrate detection of a single nucleotide polymorphism by measuring particle displacement using a smartphone camera. PMID:26619152

  18. TrueAllele casework on Virginia DNA mixture evidence: computer and manual interpretation in 72 reported criminal cases.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Mixtures are a commonly encountered form of biological evidence that contain DNA from two or more contributors. Laboratory analysis of mixtures produces data signals that usually cannot be separated into distinct contributor genotypes. Computer modeling can resolve the genotypes up to probability, reflecting the uncertainty inherent in the data. Human analysts address the problem by simplifying the quantitative data in a threshold process that discards considerable identification information. Elevated stochastic threshold levels potentially discard more information. This study examines three different mixture interpretation methods. In 72 criminal cases, 111 genotype comparisons were made between 92 mixture items and relevant reference samples. TrueAllele computer modeling was done on all the evidence samples, and documented in DNA match reports that were provided as evidence for each case. Threshold-based Combined Probability of Inclusion (CPI and stochastically modified CPI (mCPI analyses were performed as well. TrueAllele's identification information in 101 positive matches was used to assess the reliability of its modeling approach. Comparison was made with 81 CPI and 53 mCPI DNA match statistics that were manually derived from the same data. There were statistically significant differences between the DNA interpretation methods. TrueAllele gave an average match statistic of 113 billion, CPI averaged 6.68 million, and mCPI averaged 140. The computer was highly specific, with a false positive rate under 0.005%. The modeling approach was precise, having a factor of two within-group standard deviation. TrueAllele accuracy was indicated by having uniformly distributed match statistics over the data set. The computer could make genotype comparisons that were impossible or impractical using manual methods. TrueAllele computer interpretation of DNA mixture evidence is sensitive, specific, precise, accurate and more informative than manual
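
    For orientation, the threshold-based CPI statistic that the study uses as a manual baseline can be sketched as follows: at each locus the frequencies of all alleles retained above the analytical threshold are summed and squared, the per-locus values are multiplied across independent loci, and the reported match statistic is the reciprocal. The allele frequencies below are hypothetical; the TrueAllele genotype modeling itself is far more involved and is not reproduced here.

    ```python
    def locus_cpi(allele_freqs):
        """Combined Probability of Inclusion at one locus:
        the squared sum of the frequencies of every allele kept above threshold."""
        return sum(allele_freqs) ** 2

    def combined_cpi(loci):
        """CPI over independent loci is the product of the per-locus values;
        the reported DNA match statistic is its reciprocal."""
        cpi = 1.0
        for freqs in loci:
            cpi *= locus_cpi(freqs)
        return cpi, 1.0 / cpi

    # Hypothetical three-locus mixture with the listed allele frequencies retained.
    loci = [[0.10, 0.22, 0.08], [0.15, 0.05], [0.30, 0.12, 0.02]]
    cpi, match_stat = combined_cpi(loci)
    print(f"CPI = {cpi:.3e}, match statistic = {match_stat:,.0f}")
    ```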

  19. Investigating the Usability and Efficacy of Customizable Computer Coaches for Introductory Physics Problem Solving

    Science.gov (United States)

    Aryal, Bijaya

    2016-03-01

    We have studied the impacts of web-based Computer Coaches on educational outputs and outcomes. This presentation will describe the technical and conceptual framework related to the Coaches and discuss undergraduate students' favorability of the Coaches. Moreover, its impacts on students' physics problem solving performance and on their conceptual understanding of physics will be reported. We used a qualitative research technique to collect and analyze interview data from 19 undergraduate students who used the Coaches in the interview setting. The empirical results show that the favorability and efficacy of the Computer Coaches differ considerably across students of different educational backgrounds, preparation levels, attitudes and epistemologies about physics learning. The interview data shows that female students tend to have more favorability supporting the use of the Coach. Likewise, our assessment suggests that female students seem to benefit more from the Coaches in their problem solving performance and in conceptual learning of physics. Finally, the analysis finds evidence that the Coach has potential for increasing efficiency in usage and for improving students' educational outputs and outcomes under its customized usage. This work was partially supported by the Center for Educational Innovation, Office of the Senior Vice President for Academic Affairs and Provost, University of Minnesota.

  20. Parallel computation with molecular-motor-propelled agents in nanofabricated networks.

    Science.gov (United States)

    Nicolau, Dan V; Lard, Mercy; Korten, Till; van Delft, Falco C M J M; Persson, Malin; Bengtsson, Elina; Månsson, Alf; Diez, Stefan; Linke, Heiner; Nicolau, Dan V

    2016-03-08

    The combinatorial nature of many important mathematical problems, including nondeterministic-polynomial-time (NP)-complete problems, places a severe limitation on the problem size that can be solved with conventional, sequentially operating electronic computers. There have been significant efforts in conceiving parallel-computation approaches in the past, for example: DNA computation, quantum computation, and microfluidics-based computation. However, these approaches have not proven, so far, to be scalable and practical from a fabrication and operational perspective. Here, we report the foundations of an alternative parallel-computation system in which a given combinatorial problem is encoded into a graphical, modular network that is embedded in a nanofabricated planar device. Exploring the network in a parallel fashion using a large number of independent, molecular-motor-propelled agents then solves the mathematical problem. This approach uses orders of magnitude less energy than conventional computers, thus addressing issues related to power consumption and heat dissipation. We provide a proof-of-concept demonstration of such a device by solving, in a parallel fashion, the small instance {2, 5, 9} of the subset sum problem, which is a benchmark NP-complete problem. Finally, we discuss the technical advances necessary to make our system scalable with presently available technology.
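
    The benchmark instance mentioned above is small enough to enumerate directly, which makes the role of the motor-propelled agents easy to picture: each agent physically traverses one path of the network, i.e. one subset, in parallel. A brute-force sketch of the same enumeration for {2, 5, 9}:

    ```python
    from itertools import combinations

    def reachable_sums(values):
        """Enumerate every subset sum; the nanofabricated network explores these
        paths physically in parallel, one agent per path."""
        sums = {}
        for r in range(len(values) + 1):
            for subset in combinations(values, r):
                sums.setdefault(sum(subset), []).append(subset)
        return sums

    for total, subsets in sorted(reachable_sums((2, 5, 9)).items()):
        print(f"{total:2d}: {subsets}")
    ```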

  1. A universal concept based on cellular neural networks for ultrafast and flexible solving of differential equations.

    Science.gov (United States)

    Chedjou, Jean Chamberlain; Kyamakya, Kyandoghere

    2015-04-01

    This paper develops and validates a comprehensive and universally applicable computational concept for solving nonlinear differential equations (NDEs) through a neurocomputing concept based on cellular neural networks (CNNs). High precision, stability, convergence, and the lowest possible memory requirements are ensured by the CNN processor architecture. A significant challenge solved in this paper is that all these cited computing features are ensured in all system-states (regular or chaotic ones) and in all bifurcation conditions that may be experienced by NDEs. One particular quintessence of this paper is to develop and demonstrate a solver concept that shows and ensures that CNN processors (realized either in hardware or in software) are universal solvers of NDE models. The solving logic or algorithm of given NDEs (possible examples are: Duffing, Mathieu, Van der Pol, Jerk, Chua, Rössler, Lorenz, Burgers, and the transport equations) through a CNN processor system is provided by a set of templates that are computed by our comprehensive templates calculation technique that we call nonlinear adaptive optimization. This paper is therefore a significant contribution and represents a cutting-edge real-time computational engineering approach, especially while considering the various scientific and engineering applications of this ultrafast, energy-and-memory-efficient, and high-precision NDE solver concept. For illustration purposes, three NDE models are demonstratively solved, and related CNN templates are derived and used: the periodically excited Duffing equation, the Mathieu equation, and the transport equation.
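
    As a point of reference for one of the demonstration cases, the forced Duffing equation x'' + δx' + αx + βx³ = γcos(ωt) can be integrated with a conventional fourth-order Runge-Kutta scheme, as sketched below; the parameter values are illustrative assumptions, and the CNN template machinery of the paper is not reproduced.

    ```python
    import math

    # Forced Duffing oscillator x'' + delta*x' + alpha*x + beta*x**3 = gamma*cos(omega*t)
    # with illustrative (not paper-specific) parameter values.
    DELTA, ALPHA, BETA, GAMMA, OMEGA = 0.3, -1.0, 1.0, 0.5, 1.2

    def duffing(t, state):
        x, v = state
        return (v, -DELTA * v - ALPHA * x - BETA * x**3 + GAMMA * math.cos(OMEGA * t))

    def rk4_step(f, t, state, h):
        """One classical fourth-order Runge-Kutta step."""
        k1 = f(t, state)
        k2 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k1)])
        k3 = f(t + h / 2, [s + h / 2 * k for s, k in zip(state, k2)])
        k4 = f(t + h, [s + h * k for s, k in zip(state, k3)])
        return [s + h / 6 * (a + 2 * b + 2 * c + d)
                for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

    t, state, h = 0.0, [0.1, 0.0], 0.01
    for _ in range(5000):                    # integrate 50 time units
        state = rk4_step(duffing, t, state, h)
        t += h
    print(t, state)
    ```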

  2. Profiling the miRNAs for Early Cancer Detection using DNA-based Logic Gates

    Directory of Open Access Journals (Sweden)

    Tahereh Yahya

    2017-12-01

    Full Text Available Abstract Background: DNA-based computing is an emerging research area that enables in-vivo computation and decision making with high correctness. Recent papers show that the expression levels of miRNAs are related to the progression status of some diseases such as cancers, and DNA computing has been introduced as a low-cost and concise technique for detecting these biomarkers. In this paper, DNA-based logic gates are implemented in the laboratory to detect the level of miR-21 as a cancer biomarker. Materials and Methods: First, the strands required for designing the DNA gates were synthesized. Then, the double-stranded gate was generated in the laboratory using a temperature gradient followed by an electrophoresis process. This double strand is the computation engine for detecting the miR-21 biomarker, with miR-21 serving as the input to the designed gate. Finally, the expression level of miR-21 is identified by measuring the generated fluorescence. Results: In the first stage, the proposed DNA-based logic gate was evaluated using the synthesized input strands and then tested on tumor tissue. Experimental results on synthesized strands show that its detection quality/correctness is 2.5x better than that of conventional methods. Conclusion: Experimental results on the tumor tissues were successful and matched those extracted from real-time PCR; the results also show that this method is significantly more suitable than real-time PCR in terms of time and cost.

  3. EDDYMULT: a computing system for solving eddy current problems in a multi-torus system

    International Nuclear Information System (INIS)

    Nakamura, Yukiharu; Ozeki, Takahisa

    1989-03-01

    A new computing system EDDYMULT based on the finite element circuit method has been developed to solve actual eddy current problems in a multi-torus system, which consists of many torus-conductors and various kinds of axisymmetric poloidal field coils. The EDDYMULT computing system can deal three-dimensionally with the modal decomposition of eddy current in a multi-torus system, the transient phenomena of eddy current distributions and the resultant magnetic field. Therefore, users can apply the computing system to the solution of the eddy current problems in a tokamak fusion device, such as the design of poloidal field coil power supplies, the mechanical stress design of the intensive electromagnetic loading on device components and the control analysis of plasma position. The present report gives a detailed description of the EDDYMULT system as a user's manual: 1) theory, 2) structure of the code system, 3) input description, 4) problem restrictions, 5) description of the subroutines, etc. (author)

  4. Effect of Computer-Presented Organizational/Memory Aids on Problem Solving Behavior.

    Science.gov (United States)

    Steinberg, Esther R.; And Others

    This research studied the effects of computer-presented organizational/memory aids on problem solving behavior. The aids were either matrix or verbal charts shown on the display screen next to the problem. The 104 college student subjects were randomly assigned to one of the four conditions: type of chart (matrix or verbal chart) and use of charts…

  5. Solving linear systems in FLICA-4, thermohydraulic code for 3-D transient computations

    International Nuclear Information System (INIS)

    Allaire, G.

    1995-01-01

    FLICA-4 is a computer code, developed at the CEA (France), devoted to steady state and transient thermal-hydraulic analysis of nuclear reactor cores, for small size problems (around 100 mesh cells) as well as for large ones (more than 100000), on either standard workstations or vector supercomputers. As in other time-implicit codes, the largest time and memory consuming part of FLICA-4 is the routine dedicated to solving the linear system (the size of which is of the order of the number of cells). Therefore, the efficiency of the code is crucially influenced by the optimization of the algorithms used in assembling and solving linear systems: direct methods such as the Gauss (or LU) decomposition for moderate size problems, iterative methods such as the preconditioned conjugate gradient for large problems. 6 figs., 13 refs

  6. Solving algebraic computational problems in geodesy and geoinformatics the answer to modern challenges

    CERN Document Server

    Awange, Joseph L

    2004-01-01

    While preparing and teaching 'Introduction to Geodesy I and II' to undergraduate students at Stuttgart University, we noticed a gap which motivated the writing of the present book: Almost every topic that we taught required some skills in algebra, and in particular, computer algebra! From positioning to transformation problems inherent in geodesy and geoinformatics, knowledge of algebra and application of computer algebra software were required. In preparing this book therefore, we have attempted to put together basic concepts of abstract algebra which underpin the techniques for solving algebraic problems. Algebraic computational algorithms useful for solving problems which require exact solutions to nonlinear systems of equations are presented and tested on various problems. Though the present book focuses mainly on the two fields, the concepts and techniques presented herein are nonetheless applicable to other fields where algebraic computational problems might be encountered. In Engineering for example, network densification and robo...

  7. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  8. PRECONDITIONED CONJUGATE-GRADIENT 2 (PCG2), a computer program for solving ground-water flow equations

    Science.gov (United States)

    Hill, Mary C.

    1990-01-01

    This report documents PCG2: a numerical code to be used with the U.S. Geological Survey modular three-dimensional, finite-difference, ground-water flow model. PCG2 uses the preconditioned conjugate-gradient method to solve the equations produced by the model for hydraulic head. Linear or nonlinear flow conditions may be simulated. PCG2 includes two preconditioning options: modified incomplete Cholesky preconditioning, which is efficient on scalar computers; and polynomial preconditioning, which requires less computer storage and, with modifications that depend on the computer used, is most efficient on vector computers. Convergence of the solver is determined using both head-change and residual criteria. Nonlinear problems are solved using Picard iterations. This documentation provides a description of the preconditioned conjugate gradient method and the two preconditioners, detailed instructions for linking PCG2 to the modular model, sample data inputs, a brief description of PCG2, and a FORTRAN listing.
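
    A compact sketch of the preconditioned conjugate-gradient iteration itself is given below, with a simple Jacobi (diagonal) preconditioner standing in for PCG2's modified incomplete Cholesky or polynomial options, and a plain residual-norm stopping test rather than the combined head-change/residual criteria.

    ```python
    import numpy as np

    def pcg(A, b, tol=1e-10, max_iter=1000):
        """Preconditioned conjugate gradient with a Jacobi (diagonal) preconditioner."""
        M_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:          # residual convergence criterion
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])   # SPD test matrix
    b = np.array([1.0, 2.0, 3.0])
    print(pcg(A, b), np.linalg.solve(A, b))
    ```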

  9. The effect of problem-based and lecture-based instructional strategies on learner problem solving performance, problem solving processes, and attitudes

    Science.gov (United States)

    Visser, Yusra Laila

    This study compared the effect of lecture-based instruction to that of problem-based instruction on learner performance (on near-transfer and far-transfer problems), problem solving processes (reasoning strategy usage and reasoning efficiency), and attitudes (overall motivation and learner confidence) in a Genetics course. The study also analyzed the effect of self-regulatory skills and prior-academic achievement on performance for both instructional strategies. Sixty 11th grade students at a public math and science academy were assigned to either a lecture-based instructional strategy or a problem-based instructional strategy. Both treatment groups received 18 weeks of Genetics instruction through the assigned instructional strategy. In terms of problem solving performance, results revealed that the lecture-based group performed significantly better on near-transfer post-test problems. The problem-based group performed significantly better on far-transfer post-test problems. In addition, results indicated the learners in the lecture-based instructional treatment were significantly more likely to employ data-driven reasoning in the solving of problems, whereas learners in the problem-based instructional treatment were significantly more likely to employ hypothesis-driven reasoning in problem solving. No significant differences in reasoning efficiency were uncovered between treatment groups. Preliminary analysis of the motivation data suggested that there were no significant differences in motivation between treatment groups. However, a post-research exploratory analysis suggests that overall motivation was significantly higher in the lecture-based instructional treatment than in the problem-based instructional treatment. Learner confidence was significantly higher in the lecture-based group than in the problem-based group. A significant positive correlation was detected between self-regulatory skills scores and problem solving performance scores in the problem-based

  10. Solving Large-Scale Computational Problems Using Insights from Statistical Physics

    Energy Technology Data Exchange (ETDEWEB)

    Selman, Bart [Cornell University]

    2012-02-29

    Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.

  11. Solving Coupled Gross--Pitaevskii Equations on a Cluster of PlayStation 3 Computers

    Science.gov (United States)

    Edwards, Mark; Heward, Jeffrey; Clark, C. W.

    2009-05-01

    At Georgia Southern University we have constructed an 8+1-node cluster of Sony PlayStation 3 (PS3) computers with the intention of using this computing resource to solve problems related to the behavior of ultra-cold atoms in general with a particular emphasis on studying Bose-Bose and Bose-Fermi mixtures confined in optical lattices. As a first project that uses this computing resource, we have implemented a parallel solver of the coupled time-dependent, one-dimensional Gross-Pitaevskii (TDGP) equations. These equations govern the behavior of dual-species bosonic mixtures. We chose the split-operator/FFT method to solve the coupled 1D TDGP equations. The fast Fourier transform component of this solver can be readily parallelized on the PS3 CPU known as the Cell Broadband Engine (CellBE). Each CellBE chip contains a single 64-bit PowerPC Processor Element known as the PPE and eight "Synergistic Processor Elements" identified as the SPEs. We report on this algorithm and compare its performance to a non-parallel solver as applied to modeling evaporative cooling in dual-species bosonic mixtures.
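
    The split-operator/FFT step is easy to state for a single 1D Gross-Pitaevskii equation; the cluster code couples two such equations, one per species, and parallelizes the FFT across the SPEs. The sketch below uses dimensionless units, a harmonic trap, and arbitrary grid and interaction parameters, all of which are assumptions for illustration.

    ```python
    import numpy as np

    # Split-operator/FFT stepping for a single 1D Gross-Pitaevskii equation (dimensionless units).
    N, L, dt, g = 256, 20.0, 1e-3, 1.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
    V = 0.5 * x**2                                   # harmonic trap
    psi = np.exp(-x**2) / np.sqrt(np.sum(np.exp(-2 * x**2)) * (L / N))   # normalised start state

    def split_step(psi):
        """Half potential/nonlinear kick, full kinetic step in Fourier space, half kick again."""
        psi = psi * np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))
        psi = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(psi))
        psi = psi * np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))
        return psi

    for _ in range(1000):
        psi = split_step(psi)
    norm = np.sum(np.abs(psi) ** 2) * (L / N)
    print(f"norm after propagation: {norm:.6f}")      # stays ~1 (unitary evolution)
    ```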

  12. EP BASED PSO METHOD FOR SOLVING PROFIT BASED MULTI AREA UNIT COMMITMENT PROBLEM

    Directory of Open Access Journals (Sweden)

    K. VENKATESAN

    2015-04-01

    Full Text Available This paper presents a new approach to solve the profit based multi area unit commitment problem (PBMAUCP) using an evolutionary programming based particle swarm optimization (EPPSO) method. The objective of this paper is to maximize the profit of generation companies (GENCOs) while considering system social benefit. The proposed method helps GENCOs to decide how much power and reserve should be sold in markets, and how to schedule generators in order to receive the maximum profit. Joint operation of generation resources can result in significant operational cost savings. Power transfer between the areas through the tie lines depends upon the operating cost of generation at each hour and the tie line transfer limits. The tie line transfer limits were considered as a set of constraints during the optimization process to ensure system security and reliability. The overall algorithm can be implemented on an IBM PC, which can process a fairly large system in a reasonable period of time. A case study of four areas with different load patterns, each containing 7 units (NTPS) and 26 units connected via tie lines, has been taken for analysis. Numerical results compare the profit of the evolutionary programming-based particle swarm optimization (EPPSO) method with the conventional dynamic programming (DP), evolutionary programming (EP), and particle swarm optimization (PSO) methods. Experimental results show that this evolutionary programming based particle swarm optimization method has the potential to solve the profit based multi area unit commitment problem with less computation time.
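
    For readers unfamiliar with the PSO kernel that EPPSO builds on, the sketch below shows the canonical velocity and position update (inertia plus cognitive and social pulls) applied to a simple sphere function standing in for the negated profit objective; the evolutionary-programming mutation layer and the unit-commitment constraints of the paper are omitted, and all parameter values are assumptions.

    ```python
    import random

    def pso(objective, dim, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Canonical particle swarm optimisation: velocity = inertia + cognitive + social pull."""
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [objective(p) for p in pos]
        gbest = min(zip(pbest_val, pbest))[1][:]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                val = objective(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < objective(gbest):
                        gbest = pos[i][:]
        return gbest, objective(gbest)

    # Minimise a simple sphere function as a stand-in for the (negated) profit objective.
    print(pso(lambda p: sum(x * x for x in p), dim=3, bounds=(-5.0, 5.0)))
    ```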

  13. Organization of the secure distributed computing based on multi-agent system

    Science.gov (United States)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, the development of methods for distributed computing receives much attention. One such method is the use of multi-agent systems. Distributed computing organized on conventional networked computers can be exposed to security threats originating from the computational processes themselves. The authors have developed a unified agent algorithm for the control system governing the operation of the computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system for distributed computing makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by creating a distributed computing system. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the agent-operated computers, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers connected to the network can be increased by connecting computers to the new computer system, which leads to an increase in overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces the problem-solving time and increases the fault tolerance (vitality) of the computing processes in a changing computing environment (dynamic changes in the number of computers on the network). The developed multi-agent system detects cases of falsification of the results of the distributed system, which could otherwise lead to wrong decisions; in addition, the system checks and corrects wrong results.

  14. The multilevel fast multipole algorithm (MLFMA) for solving large-scale computational electromagnetics problems

    CERN Document Server

    Ergul, Ozgur

    2014-01-01

    The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on the parallel computation, and a number of application examples; covers solutions of electromagnetic problems involving dielectric objects and perfectly-conducting objects; and discusses applications including scattering from airborne targets, scattering from red

  15. Measurement and theory of hydrogen bonding contribution to isosteric DNA base pairs.

    Science.gov (United States)

    Khakshoor, Omid; Wheeler, Steven E; Houk, K N; Kool, Eric T

    2012-02-15

    We address the recent debate surrounding the ability of 2,4-difluorotoluene (F), a low-polarity mimic of thymine (T), to form a hydrogen-bonded complex with adenine in DNA. The hydrogen bonding ability of F has been characterized as small to zero in various experimental studies, and moderate to small in computational studies. However, recent X-ray crystallographic studies of difluorotoluene in DNA/RNA have indicated, based on interatomic distances, possible hydrogen bonding interactions between F and natural bases in nucleic acid duplexes and in a DNA polymerase active site. Since F is widely used to measure electrostatic contributions to pairing and replication, it is important to quantify the impact of this isostere on DNA stability. Here, we studied the pairing stability and selectivity of this compound and a closely related variant, dichlorotoluene deoxyriboside (L), in DNA, using both experimental and computational approaches. We measured the thermodynamics of duplex formation in three sequence contexts and with all possible pairing partners by thermal melting studies using the van't Hoff approach, and for selected cases by isothermal titration calorimetry (ITC). Experimental results showed that internal F-A pairing in DNA is destabilizing by 3.8 kcal/mol (van't Hoff, 37 °C) as compared with T-A pairing. At the end of a duplex, base-base interactions are considerably smaller; however, the net F-A interaction remains repulsive while T-A pairing is attractive. As for selectivity, F is found to be slightly selective for adenine over C, G, T by 0.5 kcal mol, as compared with thymine's selectivity of 2.4 kcal/mol. Interestingly, dichlorotoluene in DNA is slightly less destabilizing and slightly more selective than F, despite the lack of strongly electronegative fluorine atoms. Experimental data were complemented by computational results, evaluated at the M06-2X/6-31+G(d) and MP2/cc-pVTZ levels of theory. These computations suggest that the pairing energy of F to A

  16. Excited state dynamics of DNA bases

    Czech Academy of Sciences Publication Activity Database

    Kleinermanns, K.; Nachtigallová, Dana; de Vries, M. S.

    2013-01-01

    Vol. 32, No. 2 (2013), pp. 308-342 ISSN 0144-235X R&D Projects: GA ČR GAP208/12/1318 Grant - others: National Science Foundation (US) CHE-0911564; NASA (US) NNX12AG77G; Deutsche Forschungsgemeinschaft (DE) SFB 663; Deutsche Forschungsgemeinschaft (DE) KI 531-29 Institutional support: RVO:61388963 Keywords: DNA bases * nucleobases * excited state * dynamics * computations * gas phase * conical intersections Subject RIV: CF - Physical; Theoretical Chemistry Impact factor: 4.920, year: 2013

  17. Integrating cut-and-solve and semi-Lagrangean based dual ascent for the single-source capacitated facility location problem

    DEFF Research Database (Denmark)

    Gadegaard, Sune Lauth

    This paper describes how the cut-and-solve framework and semi-Lagrangean based dual ascent algorithms can be integrated in two natural ways in order to solve the single source capacitated facility location problem. The first uses the cut-and-solve framework both as a heuristic and as an exact solver for the semi-Lagrangean subproblems. The other uses a semi-Lagrangean based dual ascent algorithm to solve the sparse problems arising in the cut-and-solve algorithm. Furthermore, we developed a simple way to separate a special type of cutting planes from what we denote the effective capacity polytope with generalized upper bounds. From our computational study, we show that the semi-Lagrangean relaxation approach has its merits when the instances are tightly constrained with regards to the capacity of the system, but that it is very hard to compete with a standalone implementation of the cut-and-solve framework.

  18. DNA-based watermarks using the DNA-Crypt algorithm

    Directory of Open Access Journals (Sweden)

    Barnekow Angelika

    2007-05-01

    Full Text Available Abstract Background The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs) protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information; however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. Results The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae show that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. Conclusion The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms.

  19. DNA-based watermarks using the DNA-Crypt algorithm.

    Science.gov (United States)

    Heider, Dominik; Barnekow, Angelika

    2007-05-29

    The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs) protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae shows that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms.

  20. DNA-based watermarks using the DNA-Crypt algorithm

    Science.gov (United States)

    Heider, Dominik; Barnekow, Angelika

    2007-01-01

    Background The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs) protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. Results The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae shows that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. Conclusion The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms. PMID:17535434
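
    A highly simplified illustration of the least-significant-base idea shared by the three DNA-Crypt records above: one watermark bit is hidden per codon by choosing between two synonymous codons, so the translated protein is unchanged. The tiny codon table, the bit assignment, and the absence of the encryption (AES/RSA/Blowfish) and mutation-correction layers are all simplifications, not the actual DNA-Crypt scheme.

    ```python
    # One watermark bit per codon, hidden in the wobble (least significant) base by
    # picking between two synonymous codons; translation is unaffected.
    SYNONYM_PAIRS = {          # tiny, hypothetical subset of the codon table: bit 0 / bit 1
        "GCT": ("GCT", "GCC"),  # Ala
        "GCC": ("GCT", "GCC"),
        "CTT": ("CTT", "CTC"),  # Leu
        "CTC": ("CTT", "CTC"),
        "GGA": ("GGA", "GGG"),  # Gly
        "GGG": ("GGA", "GGG"),
    }

    def embed(seq, bits):
        codons = [seq[i:i + 3] for i in range(0, len(seq), 3)]
        out, it = [], iter(bits)
        for c in codons:
            if c in SYNONYM_PAIRS:
                try:
                    c = SYNONYM_PAIRS[c][next(it)]   # choose synonym according to next bit
                except StopIteration:
                    pass
            out.append(c)
        return "".join(out)

    def extract(seq):
        codons = [seq[i:i + 3] for i in range(0, len(seq), 3)]
        return [SYNONYM_PAIRS[c].index(c) for c in codons if c in SYNONYM_PAIRS]

    marked = embed("GCTCTTGGAGCC", [1, 0, 1, 1])
    print(marked, extract(marked))   # recovers [1, 0, 1, 1]
    ```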

  1. A computer-based teaching programme (CBTP) developed for ...

    African Journals Online (AJOL)

    The nursing profession, like other professions, is focused on preparing students for practice, and particular attention must be paid to the ability of student nurses to extend their knowledge and to solve nursing care problems effectively. A computer-based teaching programme (CBTP) for clinical practice to achieve these ...

  2. The extended RBAC model based on grid computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian-gang; WANG Ru-chuan; WANG Hai-yan

    2006-01-01

    This article proposes an extended role-based access control (RBAC) model for solving dynamic and multidomain problems in grid computing, and a formal description of the model is provided. The introduction of context and of the mapping relations of context-to-role and context-to-permission helps the model adapt to the dynamic properties of the grid environment. The multidomain role inheritance relation, established through the authorization agent service, realizes multidomain authorization among the autonomous domains. A function is proposed for resolving role inheritance conflicts during the establishment of the multidomain role inheritance relation.

  3. A Novel Image Encryption Algorithm Based on DNA Encoding and Spatiotemporal Chaos

    Directory of Open Access Journals (Sweden)

    Chunyan Song

    2015-10-01

    Full Text Available DNA computing based image encryption is a new, promising field. In this paper, we propose a novel image encryption scheme based on DNA encoding and spatiotemporal chaos. In particular, after the plain image is primarily diffused with the bitwise Exclusive-OR operation, the DNA mapping rule is introduced to encode the diffused image. In order to enhance the encryption, the spatiotemporal chaotic system is used to confuse the rows and columns of the DNA encoded image. The experiments demonstrate that the proposed encryption algorithm is of high key sensitivity and large key space, and it can resist brute-force attack, entropy attack, differential attack, chosen-plaintext attack, known-plaintext attack and statistical attack.
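
    The two ingredients named in the abstract, bitwise XOR diffusion followed by DNA encoding, can be sketched in a few lines. Here a plain logistic map stands in for the spatiotemporal (coupled-map-lattice) chaotic system, the row/column confusion step is omitted, and the chosen DNA rule and key value are arbitrary assumptions.

    ```python
    # Simplified sketch: XOR diffusion of pixel bytes with a chaotic keystream,
    # then DNA encoding of each 2-bit group using one of the standard mapping rules.
    DNA_RULE_1 = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}   # one of the 8 valid rules

    def logistic_keystream(x0, n, r=3.99):
        """Chaotic keystream bytes from a logistic map (stand-in for the CML system)."""
        x, out = x0, []
        for _ in range(n):
            x = r * x * (1 - x)
            out.append(int(x * 256) % 256)
        return out

    def encrypt(pixels, x0=0.3456):
        key = logistic_keystream(x0, len(pixels))
        diffused = [p ^ k for p, k in zip(pixels, key)]         # bitwise XOR diffusion
        encoded = []
        for byte in diffused:                                   # DNA-encode 4 x 2 bits per byte
            encoded.append("".join(DNA_RULE_1[(byte >> shift) & 0b11]
                                   for shift in (6, 4, 2, 0)))
        return encoded

    print(encrypt([52, 200, 7, 129]))
    ```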

  4. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  5. Neural bases for basic processes in heuristic problem solving: Take solving Sudoku puzzles as an example.

    Science.gov (United States)

    Qin, Yulin; Xiang, Jie; Wang, Rifeng; Zhou, Haiyan; Li, Kuncheng; Zhong, Ning

    2012-12-01

    Newell and Simon postulated that the basic steps in human problem-solving involve iteratively applying operators to transform the state of the problem to eventually achieve a goal. To check the neural basis of this framework, the present study focused on the basic processes of human heuristic problem-solving, in which participants identified the current problem state and then recalled and applied the corresponding heuristic rules to change the problem state. A new paradigm, solving simplified Sudoku puzzles, was developed for an event-related functional magnetic resonance imaging (fMRI) study of problem solving. Regions of interest (ROIs), including the left prefrontal cortex, the bilateral posterior parietal cortex, the anterior cingulate cortex, the bilateral caudate nuclei, the bilateral fusiform, as well as the bilateral frontal eye fields, were found to be involved in the task. To obtain convergent evidence, in addition to traditional statistical analysis, we used a multivariate voxel classification method to check the accuracy of predicting the task condition from the blood oxygen level dependent (BOLD) response of the ROIs, using a new classifier developed in this study for fMRI data. To reveal the roles that the ROIs play in problem solving, we developed an ACT-R computational model of the information-processing steps in human problem solving and tried to predict the BOLD response of the ROIs from the task. Advances in human problem-solving research after Newell and Simon are then briefly discussed. © 2012 The Institute of Psychology, Chinese Academy of Sciences and Blackwell Publishing Asia Pty Ltd.

  6. Simulation of Si:P spin-based quantum computer architecture

    International Nuclear Information System (INIS)

    Chang Yiachung; Fang Angbo

    2008-01-01

    We present realistic simulations of single and double phosphorus donors in a silicon-based quantum computer design by solving a valley-orbit coupled effective-mass equation describing phosphorus donors in a strained silicon quantum well (QW). Using a generalized unrestricted Hartree-Fock method, we solve the two-electron effective-mass equation with quantum well confinement and realistic gate potentials. The effects of QW width, gate voltages, donor separation, and donor position shift on the lowest singlet and triplet energies and their charge distributions for a neighboring donor pair in the quantum computer (QC) architecture are analyzed. The gate tunability is defined and evaluated for a typical QC design. Estimates are obtained for the duration of the spin half-swap gate operation.

  7. Problem solving and inference mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A

    1982-01-01

    The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.

  8. A highly efficient parallel algorithm for solving the neutron diffusion nodal equations on shared-memory computers

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Kirk, B.L.

    1990-01-01

    Modern parallel computer architectures offer an enormous potential for reducing CPU and wall-clock execution times of large-scale computations commonly performed in various applications in science and engineering. Recently, several authors have reported their efforts in developing and implementing parallel algorithms for solving the neutron diffusion equation on a variety of shared- and distributed-memory parallel computers. Testing of these algorithms for a variety of two- and three-dimensional meshes showed significant speedup of the computation. Even for very large problems (i.e., three-dimensional fine meshes) executed concurrently on a few nodes in serial (nonvector) mode, however, the measured computational efficiency is very low (40 to 86%). In this paper, the authors present a highly efficient (∼85 to 99.9%) algorithm for solving the two-dimensional nodal diffusion equations on the Sequent Balance 8000 parallel computer. Also presented is a model for the performance, represented by the efficiency, as a function of problem size and the number of participating processors. The model is validated through several tests and then extrapolated to larger problems and more processors to predict the performance of the algorithm in more computationally demanding situations
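
    The abstract does not give the authors' performance model explicitly; the sketch below is a generic fixed-overhead efficiency model of the same shape (efficiency as a function of problem size and processor count), with purely illustrative constants.

```python
# Generic efficiency model E(N, P): serial fraction plus per-processor communication
# overhead. The constants are illustrative, not taken from the paper.

def efficiency(n_cells, n_procs, t_cell=1.0, t_comm=50.0, serial_frac=0.001):
    t_serial = serial_frac * n_cells * t_cell
    t_parallel = (1.0 - serial_frac) * n_cells * t_cell / n_procs
    t_overhead = t_comm * n_procs                 # synchronization/communication cost
    t_total = t_serial + t_parallel + t_overhead
    t_ideal = n_cells * t_cell / n_procs
    return t_ideal / t_total

for p in (1, 2, 4, 8):
    print(p, round(efficiency(1_000_000, p), 3))
```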

  9. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
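
    A minimal two-dimensional free-form deformation in Bernstein/Bezier form, to illustrate how displacing lattice control points morphs the embedded shape points; the evolutionary optimization loop and CFD evaluation of the MbSO method are not shown, and all names below are hypothetical.

```python
import numpy as np
from math import comb

def ffd(points, control, lattice_min, lattice_max):
    """Deform 2D points embedded in a Bezier control lattice (classic FFD)."""
    l, m = control.shape[0] - 1, control.shape[1] - 1
    st = (points - lattice_min) / (lattice_max - lattice_min)      # local (s, t) in [0, 1]^2
    out = np.zeros_like(points, dtype=float)
    for i in range(l + 1):
        bi = comb(l, i) * st[:, 0] ** i * (1 - st[:, 0]) ** (l - i)
        for j in range(m + 1):
            bj = comb(m, j) * st[:, 1] ** j * (1 - st[:, 1]) ** (m - j)
            out += (bi * bj)[:, None] * control[i, j]
    return out

# undeformed lattice: control points on a regular 3x3 grid over the unit square
grid = np.stack(np.meshgrid(np.linspace(0, 1, 3), np.linspace(0, 1, 3), indexing="ij"), -1)
control = grid.copy()
control[1, 2] += [0.0, 0.3]                     # a design variable: lift one control point

airfoil_like = np.array([[0.1, 0.5], [0.5, 0.6], [0.9, 0.5]])
print(ffd(airfoil_like, control, np.array([0.0, 0.0]), np.array([1.0, 1.0])))
```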

  10. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, big data are extremely large, discrete, and non- or semi-structured, and have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical approach to massive data mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. Parallel association rule mining based on a cloud computing platform can greatly improve the execution speed of data mining.
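
    A sketch of the MapReduce flavor of the first pass of association rule mining (support counting for item pairs), simulated in-process rather than on a real cluster; the mapper/reducer split mirrors the parallel architecture described above, but the code is illustrative only.

```python
from collections import Counter
from itertools import combinations

# Map phase: each "mapper" emits (itemset, 1) for every pair in its transactions.
def mapper(transactions):
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            yield pair, 1

# Reduce phase: sum the counts per itemset and keep those above the support threshold.
def reducer(emitted, min_support):
    counts = Counter()
    for key, value in emitted:
        counts[key] += value
    return {k: v for k, v in counts.items() if v >= min_support}

shards = [
    [["milk", "bread"], ["milk", "beer", "bread"]],      # data handled by mapper 1
    [["bread", "butter"], ["milk", "bread", "butter"]],  # data handled by mapper 2
]
emitted = (kv for shard in shards for kv in mapper(shard))
print(reducer(emitted, min_support=2))    # e.g. {('bread', 'milk'): 3, ('bread', 'butter'): 2}
```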

  11. Learning Matlab a problem solving approach

    CERN Document Server

    Gander, Walter

    2015-01-01

    This comprehensive and stimulating introduction to Matlab, a computer language now widely used for technical computing, is based on an introductory course held at Qian Weichang College, Shanghai University, in the fall of 2014.  Teaching and learning a substantial programming language aren’t always straightforward tasks. Accordingly, this textbook is not meant to cover the whole range of this high-performance technical programming environment, but to motivate first- and second-year undergraduate students in mathematics and computer science to learn Matlab by studying representative problems, developing algorithms and programming them in Matlab. While several topics are taken from the field of scientific computing, the main emphasis is on programming. A wealth of examples are completely discussed and solved, allowing students to learn Matlab by doing: by solving problems, comparing approaches and assessing the proposed solutions.

  12. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  13. Principles of DNA architectonics: design of DNA-based nanoobjects

    International Nuclear Information System (INIS)

    Vinogradova, O A; Pyshnyi, D V

    2012-01-01

    The methods of preparation of monomeric DNA blocks that serve as key building units for the construction of complex DNA objects are described. Examples are given of the formation of DNA blocks based on native and modified oligonucleotide components using hydrogen bonding and nucleic acid-specific types of bonding and also some affinity interactions with RNA, proteins, ligands. The static discrete and periodic two- and three-dimensional DNA objects reported to date are described systematically. Methods used to prove the structures of DNA objects and the prospects for practical application of nanostructures based on DNA and its analogues in biology, medicine and biophysics are considered. The bibliography includes 195 references.

  14. DNA-based construction at the nanoscale: emerging trends and applications

    Science.gov (United States)

    Lourdu Xavier, P.; Chandrasekaran, Arun Richard

    2018-02-01

    The field of structural DNA nanotechnology has evolved remarkably—from the creation of artificial immobile junctions to the recent DNA-protein hybrid nanoscale shapes—in a span of about 35 years. It is now possible to create complex DNA-based nanoscale shapes and large hierarchical assemblies with greater stability and predictability, thanks to the development of computational tools and advances in experimental techniques. Although it started with the original goal of DNA-assisted structure determination of difficult-to-crystallize molecules, DNA nanotechnology has found applications in a myriad of fields. In this review, we cover some of the basic and emerging assembly principles: hybridization, base stacking/shape complementarity, and protein-mediated formation of nanoscale structures. We also review various applications of DNA nanostructures, with special emphasis on some of the biophysical applications that have been reported in recent years. In the outlook, we discuss further improvements in the assembly of such structures, and explore possible future applications involving super-resolved fluorescence, single-particle cryo-electron microscopy (cryo-EM) and x-ray free electron laser (XFEL) nanoscopic imaging techniques, and the creation of new synergistic designer materials.

  15. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research on algorithmic and computer-based approaches to medical imaging applications and to utilize them in real-world clinical applications. The book is divided into four parts, Part-I: Clinical Applications of Medical Imaging, Part-II: Classification and Clustering, Part-III: Computer Aided Diagnosis (CAD) Tools and Case Studies, and Part-IV: Bio-inspired Computer Aided Diagnosis Techniques.

  16. DNA Self-Assembly and Computation Studied with a Coarse-grained Dynamic Bonded Model

    DEFF Research Database (Denmark)

    Svaneborg, Carsten; Fellermann, Harold; Rasmussen, Steen

    2012-01-01

    We utilize a coarse-grained directional dynamic bonding DNA model [C. Svaneborg, Comp. Phys. Comm. (In Press DOI:10.1016/j.cpc.2012.03.005)] to study DNA self-assembly and DNA computation. In our DNA model, a single nucleotide is represented by a single interaction site, and complementary sites can...

  17. Cloud-based adaptive exon prediction for DNA analysis.

    Science.gov (United States)

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of such sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs can be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using the variable normalised least mean square algorithm and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
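
    A minimal normalized-LMS adaptive predictor applied to a binary base-indicator sequence, assuming a toy sequence in which the exon-like stretch has perfect three-base periodicity; the paper's maximum-normalized variants and real genomic datasets are not reproduced.

```python
import numpy as np

def nlms(d, x, taps=8, mu=0.5, eps=1e-6):
    """Normalized LMS: adapt w so that w . x[n-taps:n] tracks d[n]; returns squared-error trace."""
    w = np.zeros(taps)
    err = np.zeros(len(d))
    for n in range(taps, len(d)):
        xn = x[n - taps:n][::-1]
        e = d[n] - w @ xn
        w += mu * e * xn / (eps + xn @ xn)
        err[n] = e * e
    return err

rng = np.random.default_rng(0)
exon = "GCA" * 60                                       # toy stretch with perfect three-base periodicity
intron = "".join(rng.choice(list("ACGT"), size=180))    # toy stretch with no periodic structure
seq = exon + intron

x = np.array([1.0 if b == "G" else 0.0 for b in seq])   # binary indicator of one base
d = np.roll(x, -3)                                      # desired signal: the indicator 3 bases ahead
err = nlms(d, x)

half = len(exon)
print(err[:half].mean(), err[half:].mean())             # lower error where period-3 structure is present
```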

  18. Fast-Solving Quasi-Optimal LS-S3VM Based on an Extended Candidate Set.

    Science.gov (United States)

    Ma, Yuefeng; Liang, Xun; Kwok, James T; Li, Jianping; Zhou, Xiaoping; Zhang, Haiyan

    2018-04-01

    The semisupervised least squares support vector machine (LS-S3VM) is an important enhancement of least squares support vector machines in semisupervised learning. Given that most data collected from the real world are without labels, semisupervised approaches are more applicable than standard supervised approaches. Although a few training methods for LS-S3VM exist, the problem of deriving the optimal decision hyperplane efficiently and effectually has not been solved. In this paper, a fully weighted model of LS-S3VM is proposed, and a simple integer programming (IP) model is introduced through an equivalent transformation to solve the model. Based on the distances between the unlabeled data and the decision hyperplane, a new indicator is designed to represent the possibility that the label of an unlabeled datum should be reversed in each iteration during training. Using the indicator, we construct an extended candidate set consisting of the indices of unlabeled data with high possibilities, which integrates more information from unlabeled data. Our algorithm is degenerated into a special scenario of the previous algorithm when the extended candidate set is reduced into a set with only one element. Two strategies are utilized to determine the descent directions based on the extended candidate set. Furthermore, we developed a novel method for locating a good starting point based on the properties of the equivalent IP model. Combined with the extended candidate set and the carefully computed starting point, a fast algorithm to solve LS-S3VM quasi-optimally is proposed. The choice of quasi-optimal solutions results in low computational cost and avoidance of overfitting. Experiments show that our algorithm equipped with the two designed strategies is more effective than other algorithms in at least one of the following three aspects: 1) computational complexity; 2) generalization ability; and 3) flexibility. However, our algorithm and other algorithms have

  19. DistAMo: A web-based tool to characterize DNA-motif distribution on bacterial chromosomes

    Directory of Open Access Journals (Sweden)

    Patrick eSobetzko

    2016-03-01

    Full Text Available Short DNA motifs are involved in a multitude of functions such as, for example, chromosome segregation, DNA replication or mismatch repair. The distribution of such motifs is often not random, and the specific chromosomal pattern relates to the respective motif function. Computational approaches which quantitatively assess such chromosomal motif patterns are necessary. Here we present a new computer tool, DistAMo (Distribution Analysis of DNA Motifs). The algorithm uses codon redundancy to calculate the relative abundance of short DNA motifs from single genes to entire chromosomes. Comparative genomics analyses of the GATC-motif distribution in γ-proteobacterial genomes using DistAMo revealed that (i) genes near the replication origin are enriched in GATCs, (ii) genome-wide GATC distribution follows a distinct pattern, and (iii) genes involved in DNA replication and repair are enriched in GATCs. These features are specific for bacterial chromosomes encoding a Dam methyltransferase. The new software is available as a stand-alone or as an easy-to-use web-based server version at http://www.computational.bio.uni-giessen.de/distamo.
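
    The counting step in miniature: observed occurrences of a motif in sliding windows along a sequence. DistAMo additionally normalizes observed counts by the number expected from codon redundancy; that normalization is omitted in this sketch, and the toy sequence is hypothetical.

```python
def motif_counts(sequence, motif="GATC", window=1000, step=500):
    """Observed motif occurrences in sliding windows along a chromosome."""
    counts = []
    for start in range(0, max(1, len(sequence) - window + 1), step):
        chunk = sequence[start:start + window]
        n = sum(1 for i in range(len(chunk) - len(motif) + 1)
                if chunk[i:i + len(motif)] == motif)
        counts.append((start, n))
    return counts

chromosome = ("GATC" + "ACGTTGCA" * 5) * 200          # toy sequence with regularly spaced GATC sites
for start, n in motif_counts(chromosome, window=2000, step=2000)[:3]:
    print(start, n)
```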

  20. Cognitive processes in solving variants of computer-based problems used in logic teaching

    NARCIS (Netherlands)

    Eysink, Tessa H.S.; Dijkstra, S.; Kuper, Jan

    2001-01-01

    The effect of two instructional variables, visualisation and manipulation of objects, in learning to use the logical connective, conditional, was investigated. Instructions for 66 first-year social science students were varied in the computer-based learning environment Tarski's World, designed for

  1. Developing an agent-based model on how different individuals solve complex problems

    Directory of Open Access Journals (Sweden)

    Ipek Bozkurt

    2015-01-01

    Full Text Available Purpose: Research that focuses on the emotional, mental, behavioral and cognitive capabilities of individuals has been abundant within disciplines such as psychology, sociology, and anthropology, among others. However, when facing complex problems, a new perspective for understanding individuals is necessary. The main purpose of this paper is to develop an agent-based model and simulation to gain understanding of the decision-making and problem-solving abilities of individuals. Design/Methodology/Approach: The micro-level modeling and simulation paradigm of Agent-Based Modeling is used. Through Agent-Based Modeling, insight is gained into how different individuals with different profiles deal with complex problems. Using previous literature from different bodies of knowledge, established theories and certain assumptions as input parameters, a model is built and executed through a computer simulation. Findings: The results indicate that individuals with certain profiles have better capabilities to deal with complex problems. Moderate profiles could solve the entire complex problem, whereas profiles at extreme conditions could not. This indicates that having a strong predisposition is not ideal when approaching complex problems, and there should always be a component from the other perspective. The probability that an individual may use the capabilities provided by the opposite predisposition proves to be a useful option. Originality/value: The originality of the present research stems from how individuals are profiled, and from the model and simulation built to understand how they solve complex problems. The development of the agent-based model adds value to the existing body of knowledge within both the social sciences and modeling and simulation.

  2. Hybrid Taguchi DNA Swarm Intelligence for Optimal Inverse Kinematics Redundancy Resolution of Six-DOF Humanoid Robot Arms

    Directory of Open Access Journals (Sweden)

    Hsu-Chih Huang

    2014-01-01

    Full Text Available This paper presents a hybrid Taguchi deoxyribonucleic acid (DNA) swarm intelligence for solving the inverse kinematics redundancy problem of six degree-of-freedom (DOF) humanoid robot arms. The inverse kinematics problem of the multi-DOF humanoid robot arm is redundant and has no general closed-form or analytical solutions. The optimal joint configurations are obtained by minimizing a predefined performance index in the DNA algorithm for real-world humanoid robotics applications. The Taguchi method is employed to determine the DNA parameters so as to search for the joint solutions of the six-DOF robot arms more efficiently. This approach circumvents the disadvantage of the time-consuming tuning procedure in conventional DNA computing. Simulations are conducted to illustrate the effectiveness and merit of the proposed methods. This Taguchi-based DNA (TDNA) solver outperforms conventional solvers, such as the geometric solver, the Jacobian-based solver, the genetic algorithm (GA) solver and the ant colony optimization (ACO) solver.

  3. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
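
    The monomer-reconstruction step in miniature, assuming error-free toy reads from a tandem repeat: decompose reads into k-mers and chain the most frequent extensions into a consensus. TAREAN's graph-based read clustering and circularity detection are not reproduced, and the output is a circular rotation of the repeat unit.

```python
from collections import Counter

def kmers(read, k):
    return (read[i:i + k] for i in range(len(read) - k + 1))

def greedy_consensus(reads, k=8, length=20):
    """Reconstruct a repeat monomer by chaining the most frequent k-mers."""
    counts = Counter(km for r in reads for km in kmers(r, k))
    contig = counts.most_common(1)[0][0]              # seed with the most frequent k-mer
    while len(contig) < length:
        suffix = contig[-(k - 1):]
        candidates = {km: c for km, c in counts.items() if km.startswith(suffix)}
        if not candidates:
            break
        contig += max(candidates, key=candidates.get)[-1]
    return contig

monomer = "ACGGTTCAGATTCCAGGTAC"                       # toy tandem-repeat unit
reads = [(monomer * 3)[i:i + 30] for i in range(0, 40, 3)]
print(greedy_consensus(reads, k=8, length=len(monomer)))   # one circular rotation of the monomer
```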

  4. DNA Origami-Graphene Hybrid Nanopore for DNA Detection.

    Science.gov (United States)

    Barati Farimani, Amir; Dibaeinia, Payam; Aluru, Narayana R

    2017-01-11

    DNA origami nanostructures can be used to functionalize solid-state nanopores for single molecule studies. In this study, we characterized a nanopore in a DNA origami-graphene heterostructure for DNA detection. The DNA origami nanopore is functionalized with a specific nucleotide type at the edge of the pore. Using extensive molecular dynamics (MD) simulations, we computed and analyzed the ionic conductivity of nanopores in heterostructures carpeted with one or two layers of DNA origami on graphene. We demonstrate that a nanopore in DNA origami-graphene gives rise to distinguishable dwell times for the four DNA base types, whereas for a nanopore in bare graphene, the dwell time is almost the same for all types of bases. The specific interactions (hydrogen bonds) between DNA origami and the translocating DNA strand yield different residence times and ionic currents. We also conclude that the speed of DNA translocation decreases due to the friction between the dangling bases at the pore mouth and the sequencing DNA strands.

  5. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning has become very popular as a learning system. However, current e-learning systems cannot instruct students effectively since they do not consider the student's emotional state in the context of instruction. The emergence of the theory of "affective computing" can solve this problem: it allows the computer's intelligence to be more than purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with the teaching style chosen in consideration of the student's personality traits. A "man-to-man" learning environment is built to simulate the traditional classroom's pedagogy in the system.

  6. Improved teaching-learning-based and JAYA optimization algorithms for solving flexible flow shop scheduling problems

    Science.gov (United States)

    Buddala, Raviteja; Mahapatra, Siba Sankar

    2017-11-01

    Flexible flow shop (or a hybrid flow shop) scheduling problem is an extension of classical flow shop scheduling problem. In a simple flow shop configuration, a job having `g' operations is performed on `g' operation centres (stages) with each stage having only one machine. If any stage contains more than one machine for providing alternate processing facility, then the problem becomes a flexible flow shop problem (FFSP). FFSP which contains all the complexities involved in a simple flow shop and parallel machine scheduling problems is a well-known NP-hard (Non-deterministic polynomial time) problem. Owing to high computational complexity involved in solving these problems, it is not always possible to obtain an optimal solution in a reasonable computation time. To obtain near-optimal solutions in a reasonable computation time, a large variety of meta-heuristics have been proposed in the past. However, tuning algorithm-specific parameters for solving FFSP is rather tricky and time consuming. To address this limitation, teaching-learning-based optimization (TLBO) and JAYA algorithm are chosen for the study because these are not only recent meta-heuristics but they do not require tuning of algorithm-specific parameters. Although these algorithms seem to be elegant, they lose solution diversity after few iterations and get trapped at the local optima. To alleviate such drawback, a new local search procedure is proposed in this paper to improve the solution quality. Further, mutation strategy (inspired from genetic algorithm) is incorporated in the basic algorithm to maintain solution diversity in the population. Computational experiments have been conducted on standard benchmark problems to calculate makespan and computational time. It is found that the rate of convergence of TLBO is superior to JAYA. From the results, it is found that TLBO and JAYA outperform many algorithms reported in the literature and can be treated as efficient methods for solving the FFSP.
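
    A minimal, parameter-free JAYA loop on a continuous toy objective (a sphere function standing in for a makespan evaluator); the paper's discrete FFSP encoding, local search procedure, and mutation strategy are not included.

```python
import numpy as np

def jaya(objective, bounds, pop=20, iters=200, seed=1):
    """Minimal JAYA: move every solution toward the best and away from the worst."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(pop, len(lo)))
    f = np.apply_along_axis(objective, 1, x)
    for _ in range(iters):
        best, worst = x[f.argmin()], x[f.argmax()]
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        cand = x + r1 * (best - np.abs(x)) - r2 * (worst - np.abs(x))   # the JAYA update rule
        cand = np.clip(cand, lo, hi)
        fc = np.apply_along_axis(objective, 1, cand)
        improved = fc < f
        x[improved], f[improved] = cand[improved], fc[improved]         # greedy acceptance
    return x[f.argmin()], f.min()

sphere = lambda v: float(np.sum(v ** 2))                  # stand-in for a makespan evaluator
best, value = jaya(sphere, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(best, value)
```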

  7. Self-Assembling Molecular Logic Gates Based on DNA Crossover Tiles.

    Science.gov (United States)

    Campbell, Eleanor A; Peterson, Evan; Kolpashchikov, Dmitry M

    2017-07-05

    DNA-based computational hardware has attracted ever-growing attention due to its potential to be useful in the analysis of complex mixtures of biological markers. Here we report the design of self-assembling logic gates that recognize DNA inputs and assemble into crossover tiles when the output signal is high; the crossover structures disassemble to form separate DNA strands when the output is low. The output signal can be conveniently detected by fluorescence using a molecular beacon probe as a reporter. AND, NOT, and OR logic gates were designed. We demonstrate that the gates can connect to each other to produce other logic functions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Sequential addition of short DNA oligos in DNA-polymerase-based synthesis reactions

    Science.gov (United States)

    Gardner, Shea N; Mariella, Jr., Raymond P; Christian, Allen T; Young, Jennifer A; Clague, David S

    2013-06-25

    A method of preselecting a multiplicity of DNA sequence segments that will comprise the DNA molecule of user-defined sequence, separating the DNA sequence segments temporally, and combining the multiplicity of DNA sequence segments with at least one polymerase enzyme wherein the multiplicity of DNA sequence segments join to produce the DNA molecule of user-defined sequence. Sequence segments may be of length n, where n is an odd integer. In one embodiment the length of desired hybridizing overlap is specified by the user and the sequences and the protocol for combining them are guided by computational (bioinformatics) predictions. In one embodiment sequence segments are combined from multiple reading frames to span the same region of a sequence, so that multiple desired hybridizations may occur with different overlap lengths.
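
    A sketch of the kind of bioinformatics guidance mentioned above: checking that consecutive oligos share a hybridizing overlap of at least the user-specified length. The sequences and the minimum length are illustrative only.

```python
def overlap(a, b, min_len=8):
    """Longest suffix of oligo `a` that equals a prefix of oligo `b`."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-n:] == b[:n]:
            return n
    return 0

def check_assembly(segments, min_len=8):
    """Verify that sequentially added segments hybridize with the required overlap."""
    return [overlap(segments[i], segments[i + 1], min_len) for i in range(len(segments) - 1)]

oligos = ["ATGGCTTAGATCGTAT",        # toy segments; adjacent pairs share an 8-base overlap
          "GATCGTATTGCCAGGA",
          "TGCCAGGATTACCGGT"]
print(check_assembly(oligos))        # [8, 8]
```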

  9. Computational Recognition of RNA Splice Sites by Exact Algorithms for the Quadratic Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Anja Fischer

    2015-06-01

    Full Text Available One fundamental problem of bioinformatics is the computational recognition of DNA and RNA binding sites. Given a set of short DNA or RNA sequences of equal length, such as transcription factor binding sites or RNA splice sites, the task is to learn a pattern from this set that allows the recognition of similar sites in another set of DNA or RNA sequences. Permuted Markov (PM) models and permuted variable length Markov (PVLM) models are two powerful models for this task, but the problem of finding an optimal PM model or PVLM model is NP-hard. While the problem of finding an optimal PM model or PVLM model of order one is equivalent to the traveling salesman problem (TSP), the problem of finding an optimal PM model or PVLM model of order two is equivalent to the quadratic TSP (QTSP). Several exact algorithms exist for solving the QTSP, but it is unclear whether these algorithms are capable of solving QTSP instances resulting from RNA splice sites of at least 150 base pairs in a reasonable time frame. Here, we investigate the performance of three exact algorithms for solving the QTSP on ten datasets of splice acceptor sites and splice donor sites of five different species and find that one of these algorithms is capable of solving QTSP instances of up to 200 base pairs with a running time of less than two days.

  10. Read-only-memory-based quantum computation: Experimental explorations using nuclear magnetic resonance and future prospects

    International Nuclear Information System (INIS)

    Sypher, D.R.; Brereton, I.M.; Wiseman, H.M.; Hollis, B.L.; Travaglione, B.C.

    2002-01-01

    Read-only-memory-based (ROM-based) quantum computation (QC) is an alternative to oracle-based QC. It has the advantages of being less 'magical', and being more suited to implementing space-efficient computation (i.e., computation using the minimum number of writable qubits). Here we consider a number of small (one- and two-qubit) quantum algorithms illustrating different aspects of ROM-based QC. They are: (a) a one-qubit algorithm to solve the Deutsch problem; (b) a one-qubit binary multiplication algorithm; (c) a two-qubit controlled binary multiplication algorithm; and (d) a two-qubit ROM-based version of the Deutsch-Jozsa algorithm. For each algorithm we present experimental verification using nuclear magnetic resonance ensemble QC. The average fidelities for the implementation were in the ranges 0.9-0.97 for the one-qubit algorithms, and 0.84-0.94 for the two-qubit algorithms. We conclude with a discussion of future prospects for ROM-based quantum computation. We propose a four-qubit algorithm, using Grover's iterate, for solving a miniature 'real-world' problem relating to the lengths of paths in a network

  11. Special data base of Informational - Computational System 'INM RAS - Black Sea' for solving inverse and data assimilation problems

    Science.gov (United States)

    Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly

    2014-05-01

    The development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, numerical methods theory, numerical algebra and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work, the results of the development of the special database for the ICS "INM RAS - Black Sea" are presented. The input information for the ICS is discussed and some special data processing procedures are described. Results of forecasts using the ICS "INM RAS - Black Sea" with assimilation of operational observation data are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black sea as an imitational ocean model"). References 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S. V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea"// Ecological

  12. Simultaneous G-Quadruplex DNA Logic.

    Science.gov (United States)

    Bader, Antoine; Cockroft, Scott L

    2018-04-03

    A fundamental principle of digital computer operation is Boolean logic, where inputs and outputs are described by binary integer voltages. Similarly, inputs and outputs may be processed on the molecular level as exemplified by synthetic circuits that exploit the programmability of DNA base-pairing. Unlike modern computers, which execute large numbers of logic gates in parallel, most implementations of molecular logic have been limited to single computing tasks, or sensing applications. This work reports three G-quadruplex-based logic gates that operate simultaneously in a single reaction vessel. The gates respond to unique Boolean DNA inputs by undergoing topological conversion from duplex to G-quadruplex states that were resolved using a thioflavin T dye and gel electrophoresis. The modular, addressable, and label-free approach could be incorporated into DNA-based sensors, or used for resolving and debugging parallel processes in DNA computing applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
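
    A behavioral model of three gates operating in one vessel: inputs are represented by the presence of named strands, and a high output stands for duplex-to-G-quadruplex conversion. This abstracts away the chemistry entirely; the strand names are hypothetical.

```python
# Behavioral model of simultaneously operating DNA logic gates: each gate watches its own
# input strands in the shared "reaction vessel" (a set of strand names). Purely illustrative.

VESSEL = {"input_A", "input_C"}                     # strands actually added to the tube

def present(strand):
    return strand in VESSEL

GATES = {
    "AND(A,B)": lambda: present("input_A") and present("input_B"),
    "OR(C,D)":  lambda: present("input_C") or present("input_D"),
    "NOT(E)":   lambda: not present("input_E"),
}

# High output corresponds to folding into the G-quadruplex state (readable by thioflavin T);
# low output corresponds to the duplex state.
for name, gate in GATES.items():
    state = "G-quadruplex (high)" if gate() else "duplex (low)"
    print(f"{name}: {state}")
```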

  13. Student conceptions about the DNA structure within a hierarchical organizational level: Improvement by experiment- and computer-based outreach learning.

    Science.gov (United States)

    Langheinrich, Jessica; Bogner, Franz X

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module. A specific feature of our study was that participants were given the hierarchical organizational level which they had to draw. We introduced two objective category systems for analyzing drawings and inscriptions. Our results indicated a long-term as well as a short-term increase in the level of conceptual understanding and in the number of drawn elements and their grades concerning the DNA structure. Consequently, we regard the "draw and write" technique as a tool for teachers to get to know students' alternative conceptions. Furthermore, our study points to the modification potential of hands-on and computer-supported learning modules. © 2015 The International Union of Biochemistry and Molecular Biology.

  14. Curriculum providing cognitive knowledge and problem-solving skills for anesthesia systems-based practice.

    Science.gov (United States)

    Wachtel, Ruth E; Dexter, Franklin

    2010-12-01

    Residency programs accredited by the ACGME are required to teach core competencies, including systems-based practice (SBP). Projects are important for satisfying this competency, but the level of knowledge and problem-solving skills required presupposes a basic understanding of the field. The responsibilities of anesthesiologists include the coordination of patient flow in the surgical suite. Familiarity with this topic is crucial for many improvement projects. A course in operations research for surgical services was originally developed for hospital administration students. It satisfies 2 of the Institute of Medicine's core competencies for health professionals: evidence-based practice and work in interdisciplinary teams. The course lasts 3.5 days (e.g., 2 weekends) and consists of 45 cognitive objectives taught using 7 published articles, 10 lectures, and 156 computer-assisted problem-solving exercises based on 17 case studies. We tested the hypothesis that the cognitive objectives of the curriculum provide the knowledge and problem-solving skills necessary to perform projects that satisfy the SBP competency. Standardized terminology was used to define each component of the SBP competency for the minimum level of knowledge needed. The 8 components of the competency were examined independently. Most cognitive objectives contributed to at least 4 of the 8 core components of the SBP competency. Each component of SBP is addressed at the minimum requirement level of "exemplify" by at least 6 objectives. There is at least 1 cognitive objective at the level of "summarize" for each SBP component. A curriculum in operating room management can provide the knowledge and problem-solving skills anesthesiologists need for participation in projects that satisfy the SBP competency.

  15. Analysis of mathematical problem-solving ability based on metacognition on problem-based learning

    Science.gov (United States)

    Mulyono; Hadiyanti, R.

    2018-03-01

    Problem-solving is the primary purpose of the mathematics curriculum. Problem-solving abilities are influenced by beliefs and metacognition. Metacognition, as a superordinate capability, can direct and regulate cognition and motivation, and thereby the problem-solving process. This study aims to (1) test and analyze the quality of problem-based learning and (2) investigate problem-solving capabilities based on metacognition. The research uses a mixed-method design. The subjects are class XI Mathematics and Science students at Kesatrian 2 High School, Semarang, divided into tacit use, aware use, strategic use and reflective use levels. Data were collected using a scale, interviews, and tests, and were processed with a proportion test, a t-test, and a paired-samples t-test. The results show that students at the tacit use level were able to complete the whole problem given, but did not understand what strategy was used or why. Students at the aware use level were able to solve the problem and to build new knowledge through problem-solving up to the indicators of understanding the problem and determining the strategy used, although not correctly. Students at the strategic use level could apply and adapt a wide variety of appropriate strategies to solve the problems and achieved the indicator of re-examining the process and the outcome. No student at the reflective use level was found in this study. Based on the results, further study on the identification of metacognition in problem-solving with a larger sample is suggested, so that the characteristics of each level of metacognition become clearer. Teachers also need in-depth knowledge of students' metacognitive activity and its relationship with mathematical problem-solving.

  16. Computational Approach for Studying Optical Properties of DNA Systems in Solution

    DEFF Research Database (Denmark)

    Nørby, Morten Steen; Svendsen, Casper Steinmann; Olsen, Jógvan Magnus Haugaard

    2016-01-01

    In this paper we present a study of the methodological aspects regarding calculations of optical properties for DNA systems in solution. Our computational approach will be built upon a fully polarizable QM/MM/Continuum model within a damped linear response theory framework. In this approach the environment is given a highly advanced description in terms of the electrostatic potential through the polarizable embedding model. Furthermore, bulk solvent effects are included in an efficient manner through a conductor-like screening model. With the aim of reducing the computational cost we develop a set of averaged partial charges and distributed isotropic dipole-dipole polarizabilities for DNA suitable for describing the classical region in ground-state and excited-state calculations. Calculations of the UV-spectrum of the 2-aminopurine optical probe embedded in a DNA double helical structure are presented...

  17. DEMONSTRATION COMPUTER MODELS USE WHILE SOLVING THE BUILDING OF THE CUT OF THE CYLINDER

    Directory of Open Access Journals (Sweden)

    Inna O. Gulivata

    2010-10-01

    Full Text Available The relevance of the material presented in the article lies in the use of effective methods of illustrating geometric material for the development of students' spatial imagination. As one way to improve problem solving, we propose illustrating the objects under investigation with demonstration computer models (DCM) created in the PowerPoint software environment. The technique of applying DCM while solving problems on constructing a section of a cylinder makes it possible to build an effective learning process and promotes the formation of students' spatial representations, taking into account their individual characteristics and the principles of differentiated instruction.

  18. The essential component in DNA-based information storage system: robust error-tolerating module

    Directory of Open Access Journals (Sweden)

    Aldrin Kay-Yuen eYim

    2014-11-01

    Full Text Available The size of digital data is ever increasing and is expected to grow to 40,000 EB by 2020, yet the estimated global information storage capacity in 2011 was less than 300 EB, indicating that most of the data are transient. DNA, as a very stable nano-molecule, is an ideal massive storage device for long-term data archiving. The two most notable illustrations are from Church et al. and Goldman et al., whose approaches are well optimized for most sequencing platforms – short synthesized DNA fragments without homopolymers. Here we suggest improvements to the error-handling methodology that could enable the integration of DNA-based computational processes, e.g. algorithms based on self-assembly of DNA. As a proof of concept, a picture of 438 bytes was encoded to DNA with a Low-Density Parity-Check error-correction code. We salvaged a significant portion of sequencing reads containing mutations generated during DNA synthesis and sequencing and successfully reconstructed the entire picture. A modular programming framework, DNAcodec, with an XML-based data format is also introduced. Our experiments demonstrate the practicability of long DNA message recovery with high error tolerance, which opens the field to biocomputing and synthetic biology.
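
    A minimal byte-to-base codec with a per-byte parity base, standing in for the Low-Density Parity-Check code used in the paper; simple parity only detects (not corrects) errors, so this is just an illustration of layering an error-tolerating module on top of the base mapping.

```python
# Minimal byte<->base codec with a per-byte parity base appended. The paper uses a proper
# LDPC code; simple parity is used here only to illustrate the layered error-tolerating module.

BASES = "ACGT"                                   # 2 bits per base: 00->A, 01->C, 10->G, 11->T

def encode(data: bytes) -> str:
    dna = []
    for byte in data:
        quads = [(byte >> shift) & 0b11 for shift in (6, 4, 2, 0)]
        parity = quads[0] ^ quads[1] ^ quads[2] ^ quads[3]        # 2-bit parity symbol
        dna.extend(BASES[q] for q in quads + [parity])
    return "".join(dna)

def decode(dna: str) -> bytes:
    out = bytearray()
    for i in range(0, len(dna), 5):
        quads = [BASES.index(b) for b in dna[i:i + 5]]
        if quads[0] ^ quads[1] ^ quads[2] ^ quads[3] != quads[4]:
            raise ValueError(f"corrupted block at base {i}")      # detected, not corrected
        out.append(quads[0] << 6 | quads[1] << 4 | quads[2] << 2 | quads[3])
    return bytes(out)

payload = b"DNA archive"
assert decode(encode(payload)) == payload
```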

  19. Using graph theory for automated electric circuit solving

    International Nuclear Information System (INIS)

    Toscano, L; Stella, S; Milotti, E

    2015-01-01

    Graph theory plays many important roles in modern physics and in many different contexts, spanning diverse topics such as the description of scale-free networks and the structure of the universe as a complex directed graph in causal set theory. Graph theory is also ideally suited to describe many concepts in computer science. Therefore it is increasingly important for physics students to master the basic concepts of graph theory. Here we describe a student project where we develop a computational approach to electric circuit solving which is based on graph theoretic concepts. This highly multidisciplinary approach combines abstract mathematics, linear algebra, the physics of circuits, and computer programming to reach the ambitious goal of implementing automated circuit solving. (paper)
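
    A compact node-voltage solver of the kind such a student project might build from the circuit graph (nodes plus resistor and current-source edges); this is a generic sketch, not the authors' code.

```python
import numpy as np

def solve_nodal(n_nodes, resistors, current_sources, ground=0):
    """Node-voltage analysis: resistors as (a, b, ohms), sources as (a, b, amps into b)."""
    G = np.zeros((n_nodes, n_nodes))
    I = np.zeros(n_nodes)
    for a, b, r in resistors:                    # stamp conductances into the Laplacian-like matrix
        g = 1.0 / r
        G[a, a] += g; G[b, b] += g
        G[a, b] -= g; G[b, a] -= g
    for a, b, amps in current_sources:           # KCL right-hand side: current injected into b
        I[a] -= amps; I[b] += amps
    keep = [n for n in range(n_nodes) if n != ground]
    v = np.zeros(n_nodes)
    v[keep] = np.linalg.solve(G[np.ix_(keep, keep)], I[keep])
    return v

# 1 A source driving two parallel resistors: 0 --[2 ohm]-- 1 --[3 ohm]-- 0
volts = solve_nodal(2, resistors=[(0, 1, 2.0), (1, 0, 3.0)], current_sources=[(0, 1, 1.0)])
print(volts)    # node 1 sits at 1 A * (2 || 3) = 1.2 V
```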

  20. EISPACK-J: subprogram package for solving eigenvalue problems

    International Nuclear Information System (INIS)

    Fujimura, Toichiro; Tsutsui, Tsuneo

    1979-05-01

    EISPACK-J, a subprogram package for solving eigenvalue problems, has been developed and subprograms with a variety of functions have been prepared. These subprograms can solve standard problems of complex matrices, general problems of real matrices and special problems in which only the required eigenvalues and eigenvectors are calculated. They are compared to existing subprograms, showing their features through benchmark tests. Many test problems, including realistic scale problems, are provided for the benchmark tests. Discussions are made on computer core storage and computing time required for each subprogram, and accuracy of the solution. The results show that the subprograms of EISPACK-J, based on Householder, QR and inverse iteration methods, are the best in computing time and accuracy. (author)
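
    The same class of problems is now routinely solved through LAPACK bindings, which descend from the Householder-reduction family of methods mentioned above; a tiny standard symmetric eigenproblem is shown for illustration.

```python
import numpy as np

# Standard symmetric eigenproblem A x = lambda x, solved via NumPy's LAPACK bindings
# (Householder tridiagonal reduction followed by a modern tridiagonal eigensolver).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)
# residual check: ||A v - lambda v|| should be at round-off level
print(np.linalg.norm(A @ eigenvectors - eigenvectors * eigenvalues))
```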

  1. Gener: a minimal programming module for chemical controllers based on DNA strand displacement.

    Science.gov (United States)

    Kahramanoğulları, Ozan; Cardelli, Luca

    2015-09-01

    Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research's DSD tool as well as to LaTeX. Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. ozan@cosbi.eu. © The Author 2015. Published by Oxford University Press.

  2. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available An agent can play a key role in bringing suitable cloud services to the customer based on the customer's requirements. In agent-based cloud computing, the agent performs negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service caused by flooding attacks on other involved agents; and (c) misuse of some exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connections of all the other agents participating in the negotiation of services. This paper also proposes algorithms to solve these issues and to ensure that no malicious activity can intervene during agent interaction.

  3. TRUCE: A Coordination Action for Unconventional Computation

    DEFF Research Database (Denmark)

    Amos, M.; Stepney, S.; Doursat, R.

    2012-01-01

    Unconventional computation (UCOMP) is an important and emerging area of scientific research, which explores new ways of computing that go beyond the traditional model, as well as quantum- and brain-inspired computing. Such alternatives may encompass novel substrates (e.g., DNA, living cells... quickly, and has the potential to revolutionize not only our fundamental understanding of the nature of computing, but the way in which we solve problems, design networks, do industrial fabrication, make drugs or construct buildings. The problems we already face in the 21st century will require new

  4. An Analysis of Collaborative Problem-Solving Activities Mediated by Individual-Based and Collaborative Computer Simulations

    Science.gov (United States)

    Chang, C.-J.; Chang, M.-H.; Liu, C.-C.; Chiu, B.-C.; Fan Chiang, S.-H.; Wen, C.-T.; Hwang, F.-K.; Chao, P.-Y.; Chen, Y.-L.; Chai, C.-S.

    2017-01-01

    Researchers have indicated that the collaborative problem-solving space afforded by the collaborative systems significantly impact the problem-solving process. However, recent investigations into collaborative simulations, which allow a group of students to jointly manipulate a problem in a shared problem space, have yielded divergent results…

  5. EPA, Notre Dame researchers discuss challenges in adopting DNA-based methods for monitoring invasive species in U.S. water bodies

    Science.gov (United States)

    DNA-based technology helps people solve problems. It can be used to correctly match organ donors with recipients, identify victims of natural and man-made disasters, and detect bacteria and other organisms that may pollute air, soil, food, or water.

  6. Internet Computer Coaches for Introductory Physics Problem Solving

    Science.gov (United States)

    Xu Ryan, Qing

    2013-01-01

    The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the…

  7. Building problem solving environments with the arches framework

    Energy Technology Data Exchange (ETDEWEB)

    Debardeleben, Nathan [Los Alamos National Laboratory; Sass, Ron [U NORTH CAROLINA; Stanzione, Jr., Daniel [ASU; Ligon, Ill, Walter [CLEMSON UNIV

    2009-01-01

    The computational problems that scientists face are rapidly escalating in size and scope. Moreover, the computer systems used to solve these problems are becoming significantly more complex than the familiar, well-understood sequential model on their desktops. While it is possible to re-train scientists to use emerging high-performance computing (HPC) models, it is much more effective to provide them with a higher-level programming environment that has been specialized to their particular domain. By fostering interaction between HPC specialists and domain scientists, problem-solving environments (PSEs) provide a collaborative setting. A PSE allows scientists to focus on expressing their computational problem while the PSE and associated tools support mapping that domain-specific problem to a high-performance computing system. This article describes Arches, an object-oriented framework for building domain-specific PSEs. The framework was designed to support a wide range of problem domains and to be extensible to support very different high-performance computing targets. To demonstrate this flexibility, two PSEs have been developed from the Arches framework to solve problems in two different domains and target very different computing platforms. The Coven PSE supports parallel applications that require the large-scale parallelism found in cost-effective Beowulf clusters. In contrast, RCADE targets FPGA-based reconfigurable computing and was originally designed to aid NASA Earth scientists studying satellite instrument data.

  8. DNA-Based Applications in Nanobiotechnology

    Directory of Open Access Journals (Sweden)

    Khalid M. Abu-Salah

    2010-01-01

    Full Text Available Biological molecules such as deoxyribonucleic acid (DNA have shown great potential in fabrication and construction of nanostructures and devices. The very properties that make DNA so effective as genetic material also make it a very suitable molecule for programmed self-assembly. The use of DNA to assemble metals or semiconducting particles has been extended to construct metallic nanowires and functionalized nanotubes. This paper highlights some important aspects of conjugating the unique physical properties of dots or wires with the remarkable recognition capabilities of DNA which could lead to miniaturizing biological electronics and optical devices, including biosensors and probes. Attempts to use DNA-based nanocarriers for gene delivery are discussed. In addition, the ecological advantages and risks of nanotechnology including DNA-based nanobiotechnology are evaluated.

  9. Measurement and Theory of Hydrogen Bonding Contribution to Isosteric DNA Base Pairs

    OpenAIRE

    Khakshoor, Omid; Wheeler, Steven E.; Houk, K. N.; Kool, Eric T.

    2012-01-01

    We address the recent debate surrounding the ability of 2,4-difluorotoluene (F), a low-polarity mimic of thymine (T), to form a hydrogen-bonded complex with adenine in DNA. The hydrogen bonding ability of F has been characterized as small to zero in various experimental studies, and moderate to small in computational studies. However, recent X-ray crystallographic studies of difluorotoluene in DNA/RNA have indicated, based on interatomic distances, possible hydrogen bonding interactions betwe...

  10. Flexibility in Mathematics Problem Solving Based on Adversity Quotient

    Science.gov (United States)

    Dina, N. A.; Amin, S. M.; Masriyah

    2018-01-01

    Flexibility is an ability which is needed in problem solving. One of the influences on problem solving is the Adversity Quotient (AQ). AQ is the capacity to face difficulties. There are three categories of AQ, namely climber, camper, and quitter. This is a descriptive study using a qualitative approach. The aim of this research is to describe flexibility in mathematics problem solving based on the Adversity Quotient. The subjects of this research are a climber student, a camper student, and a quitter student. The research started by administering the Adversity Response Profile (ARP) questionnaire, followed by a problem-solving task and interviews. The validity of the data was checked using time triangulation. The results of this research show that the climber student uses two strategies in solving the problem and has no difficulty. The camper student uses two strategies in solving the problem but has difficulty finishing the second strategy. The quitter student uses one strategy in solving the problem and has difficulty finishing it.

  11. Intelligent DNA-based molecular diagnostics using linked genetic markers

    Energy Technology Data Exchange (ETDEWEB)

    Pathak, D.K.; Perlin, M.W.; Hoffman, E.P.

    1994-12-31

    This paper describes a knowledge-based system for molecular diagnostics, and its application to fully automated diagnosis of X-linked genetic disorders. Molecular diagnostic information is used in clinical practice for determining genetic risks, such as carrier determination and prenatal diagnosis. Initially, blood samples are obtained from related individuals, and PCR amplification is performed. Linkage-based molecular diagnosis then entails three data analysis steps. First, for every individual, the alleles (i.e., DNA composition) are determined at specified chromosomal locations. Second, the flow of genetic material among the individuals is established. Third, the probability that a given individual is either a carrier of the disease or affected by the disease is determined. The current practice is to perform each of these three steps manually, which is costly, time consuming, labor-intensive, and error-prone. As such, the knowledge-intensive data analysis and interpretation supersede the actual experimentation effort as the major bottleneck in molecular diagnostics. By examining the human problem solving for the task, we have designed and implemented a prototype knowledge-based system capable of fully automating linkage-based molecular diagnostics in X-linked genetic disorders, including Duchenne Muscular Dystrophy (DMD). Our system uses knowledge-based interpretation of gel electrophoresis images to determine individual DNA marker labels, a constraint satisfaction search for consistent genetic flow among individuals, and a blackboard-style problem solver for risk assessment. We describe the system's successful diagnosis of DMD carrier and affected individuals from raw clinical data.

  12. An Operational Matrix Technique for Solving Variable Order Fractional Differential-Integral Equation Based on the Second Kind of Chebyshev Polynomials

    Directory of Open Access Journals (Sweden)

    Jianping Liu

    2016-01-01

    Full Text Available An operational matrix technique is proposed in this paper to solve variable order fractional differential-integral equations based on the second kind of Chebyshev polynomials. The differential operational matrix and the integral operational matrix are derived from the second kind of Chebyshev polynomials. Using the two types of operational matrices, the original equation is transformed into a product of several dependent matrices, which can be viewed as an algebraic system after adopting the collocation points. Further, the numerical solution of the original equation is obtained by solving the algebraic system. Finally, several examples show that the numerical algorithm is computationally efficient.
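
    The core idea can be summarized schematically (a generic operational-matrix collocation sketch in our own notation, not the paper's exact derivation): expand the unknown in second-kind Chebyshev polynomials and replace each operator by a matrix acting on the basis vector,

    $$ y(x) \approx \sum_{i=0}^{N} c_i\,U_i(x) = C^{T}\Phi(x), \qquad D^{\alpha(x)}\Phi(x) \approx \mathbf{P}^{(\alpha)}\Phi(x), \qquad \int_0^{x}\Phi(t)\,dt \approx \mathbf{Q}\,\Phi(x), $$

    so that substituting these approximations and enforcing the equation at the collocation points x_k leaves an algebraic system in the coefficient vector C.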

  13. Students' Errors in Solving the Permutation and Combination Problems Based on Problem Solving Steps of Polya

    Science.gov (United States)

    Sukoriyanto; Nusantara, Toto; Subanji; Chandra, Tjang Daniel

    2016-01-01

    This article was written based on the results of a study evaluating students' errors in solving permutation and combination problems in terms of the problem-solving steps of Polya. Twenty-five students were asked to do four problems related to permutation and combination. The research results showed that the students still made mistakes in…

  14. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  15. 4P: fast computing of population genetics statistics from large DNA polymorphism panels.

    Science.gov (United States)

    Benazzo, Andrea; Panziera, Alex; Bertorelle, Giorgio

    2015-01-01

    Massive DNA sequencing has significantly increased the amount of data available for population genetics and molecular ecology studies. However, the parallel computation of simple statistics within and between populations from large panels of polymorphic sites is not yet available, making the exploratory analyses of a set or subset of data a very laborious task. Here, we present 4P (parallel processing of polymorphism panels), a stand-alone software program for the rapid computation of genetic variation statistics (including the joint frequency spectrum) from millions of DNA variants in multiple individuals and multiple populations. It handles a standard input file format commonly used to store DNA variation from empirical or simulation experiments. The computational performance of 4P was evaluated using large SNP (single nucleotide polymorphism) datasets from human genomes or obtained by simulations. 4P was faster or much faster than other comparable programs, and the impact of parallel computing using multicore computers or servers was evident. 4P is a useful tool for biologists who need a simple and rapid computer program to run exploratory population genetics analyses in large panels of genomic data. It is also particularly suitable to analyze multiple data sets produced in simulation studies. Unix, Windows, and macOS versions are provided, as well as the source code for easier pipeline implementations.
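
    As a rough illustration of the kind of statistic 4P computes (not its actual code or input format, which we do not reproduce), the unfolded site frequency spectrum of a 0/1 variant panel reduces to a couple of NumPy calls; the array layout below is an assumption made for the example.

    ```python
    import numpy as np

    # Hypothetical panel: rows are variant sites, columns are haploid genotypes (0/1).
    rng = np.random.default_rng(0)
    panel = rng.integers(0, 2, size=(100_000, 40))

    derived_counts = panel.sum(axis=1)          # derived-allele count at each site
    sfs = np.bincount(derived_counts, minlength=panel.shape[1] + 1)
    print(sfs[:5])                              # sites with 0, 1, 2, ... derived copies
    ```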

  16. ENGAGE: A Game Based Learning and Problem Solving Framework

    Science.gov (United States)

    2012-07-13

    Monthly progress, status, and management report (Task 1, Month 4) for ENGAGE: A Game Based Learning and Problem Solving Framework, covering the reporting period 6/1/2012 to 6/30/2012 (contract number N/A; author Popović). Dissemination venues noted for the period include the Gamification Summit 2012, Mensa Colloquium 2012.2: Social and Video Games, the Seattle Science Festival, and a TED Salon in Vancouver.

  17. Multilocus DNA fingerprinting in paternity analysis: a Chilean experience

    Directory of Open Access Journals (Sweden)

    Cifuentes O. Lucía

    2000-01-01

    Full Text Available DNA polymorphism is very useful in paternity analysis. The present paper describes paternity studies done using DNA profiles obtained with the (CAC)5 probe. All of the subjects studied were involved in nonjudicial cases of paternity. Genomic DNA digested with HaeIII was run on agarose gels and hybridized in the gel with the (CAC)5 probe labeled with 32P. The mean number of bands larger than 4.3 kb per individual was 16.1. The mean proportion of bands shared among unrelated individuals was 0.08 and the mean number of test bands was 7.1. This corresponded to an exclusion probability greater than 0.999999. Paternity was excluded in 34.5% of the cases. The mutation frequency estimated from non-excluded cases was 0.01143 bands per child. In these cases, paternity was confirmed by a locus-specific analysis of eight independent PCR-based loci. The paternity index was computed in all non-excluded cases. It can be concluded that this method is a powerful and inexpensive alternative for resolving paternity doubts.
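
    The reported exclusion probability can be sanity-checked with a simple back-of-the-envelope argument (our illustration, not the authors' exact formula): if unrelated individuals share a given band with probability x ≈ 0.08 and a non-excluded man must match about n ≈ 7 paternal test bands, then

    $$ P(\text{random man not excluded}) \approx x^{n} \approx 0.08^{7} \approx 2\times 10^{-8}, \qquad P(\text{exclusion}) \approx 1 - x^{n} > 0.999999 . $$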

  18. Validation of DNA-based identification software by computation of pedigree likelihood ratios

    NARCIS (Netherlands)

    Slooten, K.

    Disaster victim identification (DVI) can be aided by DNA evidence, by comparing the DNA profiles of unidentified individuals with those of surviving relatives. The DNA evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually
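
    In this setting the likelihood ratio compares two hypotheses about the unidentified profile together with the relatives' profiles (the standard kinship-LR formulation, stated here for context rather than taken from the paper):

    $$ \mathrm{LR} = \frac{P(\text{DNA profiles} \mid H_1:\ \text{the unidentified individual is the missing relative})}{P(\text{DNA profiles} \mid H_2:\ \text{the unidentified individual is unrelated})} . $$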

  19. Metallic Nanostructures Based on DNA Nanoshapes

    Directory of Open Access Journals (Sweden)

    Boxuan Shen

    2016-08-01

    Full Text Available Metallic nanostructures have inspired extensive research over several decades, particularly within the field of nanoelectronics and increasingly in plasmonics. Due to the limitations of conventional lithography methods, the development of bottom-up fabricated metallic nanostructures has become increasingly in demand. The remarkable development of DNA-based nanostructures has provided many successful methods and realizations for these needs, such as chemical DNA metallization via seeding or ionization, as well as DNA-guided lithography and casting of metallic nanoparticles by DNA molds. These methods offer high resolution, versatility and throughput and could enable the fabrication of arbitrarily shaped structures with a 10-nm feature size, thus bringing novel applications into view. In this review, we cover the evolution of DNA-based metallic nanostructures, starting from metallized double-stranded DNA for electronics and progressing to sophisticated plasmonic structures based on DNA origami objects.

  20. Schoenfeld's problem solving theory in a student controlled learning environment

    NARCIS (Netherlands)

    Harskamp, E.; Suhre, C.

    2007-01-01

    This paper evaluates the effectiveness of a student controlled computer program for high school mathematics based on instruction principles derived from Schoenfeld's theory of problem solving. The computer program allows students to choose problems and to make use of hints during different episodes

  1. An Examination of the Relationship between Computation, Problem Solving, and Reading

    Science.gov (United States)

    Cormier, Damien C.; Yeo, Seungsoo; Christ, Theodore J.; Offrey, Laura D.; Pratt, Katherine

    2016-01-01

    The purpose of this study is to evaluate the relationship of mathematics calculation rate (curriculum-based measurement of mathematics; CBM-M), reading rate (curriculum-based measurement of reading; CBM-R), and mathematics application and problem solving skills (mathematics screener) among students at four levels of proficiency on a statewide…

  2. Hide and seek: How do DNA glycosylases locate oxidatively damaged DNA bases amidst a sea of undamaged bases?

    Science.gov (United States)

    Lee, Andrea J; Wallace, Susan S

    2017-06-01

    The first step of the base excision repair (BER) pathway responsible for removing oxidative DNA damage utilizes DNA glycosylases to find and remove the damaged DNA base. How glycosylases find the damaged base amidst a sea of undamaged bases has long been a question in the BER field. Single molecule total internal reflection fluorescence microscopy (SM TIRFM) experiments have allowed for an exciting look into this search mechanism and have found that DNA glycosylases scan along the DNA backbone in a bidirectional and random fashion. By comparing the search behavior of bacterial glycosylases from different structural families and with varying substrate specificities, it was found that glycosylases search for damage by periodically inserting a wedge residue into the DNA stack as they redundantly search tracks of DNA that are 450-600 bp in length. These studies open up a wealth of possibilities for further study in real time of the interactions of DNA glycosylases and other BER enzymes with various DNA substrates. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' problem-solving work and recorded interviews regarding their difficulties, analyzed descriptively using the steps of Miles and Huberman. The results show that students' difficulties in solving probabilistic problems fall into three categories: difficulties in understanding the problem, difficulties in choosing and using appropriate solution strategies, and difficulties with the computational process. It appears that students are not yet able to apply their knowledge and abilities to probabilistic problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  4. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

    Full Text Available Cloud computing has emerged as today's most exciting computing paradigm for providing services using a shared framework, opening a new door to addressing the explosive growth of digital resource demands. With the exponential growth in the number of data types and in data size in so-called big data work, the backbone network is under great pressure because its transmission capacity grows more slowly than the data volume; without an effective remedy this would seriously hinder the development of the network. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which can reduce the amount of data in the network effectively and play a basic supporting role in the development of cloud computing, is put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme can effectively route data with a high aggregation ratio to the data center through the same routing path so as to reduce the amount of data that the network transmits. Theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network.

  5. Forensic DNA methylation profiling from evidence material for investigative leads

    Science.gov (United States)

    Lee, Hwan Young; Lee, Soong Deok; Shin, Kyoung-Jin

    2016-01-01

    DNA methylation is emerging as an attractive marker providing investigative leads for solving crimes in forensic genetics. Identification of body fluids using tissue-specific DNA methylation can contribute to solving crimes by predicting the activity related to the evidence material. Age estimation based on DNA methylation is expected to reduce the number of potential suspects when the DNA profile from the evidence does not match any known person, including those stored in the forensic database. Moreover, DNA methylation varies with environmental exposures such as cigarette smoking and alcohol consumption, suggesting its possible use as a marker for predicting the lifestyle of a potential suspect. In this review, we describe recent advances in our understanding of DNA methylation variation and the utility of DNA methylation as a forensic marker for advanced investigative leads from evidence materials. [BMB Reports 2016; 49(7): 359-369] PMID:27099236

  6. A Decomposition-Based Pricing Method for Solving a Large-Scale MILP Model for an Integrated Fishery

    Directory of Open Access Journals (Sweden)

    M. Babul Hasan

    2007-01-01

    The IFP can be decomposed into a trawler-scheduling subproblem and a fish-processing subproblem in two different ways by relaxing different sets of constraints. We tried conventional decomposition techniques including subgradient optimization and Dantzig-Wolfe decomposition, both of which were unacceptably slow. We then developed a decomposition-based pricing method for solving the large fishery model, which gives excellent computation times. Numerical results for several planning horizon models are presented.
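
    Schematically, such a model has the block structure below; relaxing the coupling constraints with prices (duals) λ splits it into the two subproblems, which is what a decomposition-based pricing method exploits. This is a generic sketch in our own notation, not the IFP formulation itself:

    $$ \min_{x \in X,\; y \in Y} \; c_1^{T}x + c_2^{T}y \quad \text{s.t.} \quad Ax + By \le b \;(\text{coupling constraints}), $$

    $$ L(\lambda) = \Big[ \min_{x \in X} \big(c_1^{T}x + \lambda^{T}Ax\big) \Big] + \Big[ \min_{y \in Y} \big(c_2^{T}y + \lambda^{T}By\big) \Big] - \lambda^{T}b , \qquad \lambda \ge 0, $$

    where X and Y stand for the trawler-scheduling and fish-processing feasible sets, respectively.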

  7. Processing of free radical damaged DNA bases

    International Nuclear Information System (INIS)

    Wallace, S.

    2003-01-01

    Free radicals produced during the radiolysis of water give rise to a plethora of DNA damages including single strand breaks, sites of base loss and a wide variety of purine and pyrimidine base lesions. All these damages are processed in cells by base excision repair. The oxidative DNA glycosylases, which catalyze the first step in the removal of a base damage during base excision repair, evolved primarily to protect cells from the deleterious mutagenic effects of single free radical-induced DNA lesions arising during oxidative metabolism. This is evidenced by the high spontaneous mutation rate in bacterial mutants lacking the oxidative DNA glycosylases. However, when a low LET photon traverses the DNA molecule, a burst of free radicals is produced during the radiolysis of water that leads to the formation of clustered damages in the DNA molecule, which are recognized by the oxidative DNA glycosylases. When substrates containing two closely opposed sugar damages, or base and sugar damages, are incubated with the oxidative DNA glycosylases in vitro, one strand is readily incised by the lyase activity of the DNA glycosylase. Whether or not the second strand is incised depends on the distance between the strand break resulting from the incised first strand and the remaining DNA lesion on the other strand. If the lesions are more than two or three base pairs apart, the second strand is readily cleaved by the DNA glycosylase, giving rise to a double strand break. Even if the entire base excision repair system is reconstituted in vitro, whether or not a double strand break ensues depends solely upon the ability of the DNA glycosylase to cleave the second strand. These data predicted that cells deficient in the oxidative DNA glycosylases would be radioresistant while those that overproduce an oxidative DNA glycosylase would be radiosensitive. This prediction was indeed borne out in Escherichia coli; that is, mutants lacking the oxidative DNA glycosylases are radioresistant

  8. Model Integrated Problem Solving Based Learning pada Perkuliahan Dasar-dasar Kimia Analitik

    Directory of Open Access Journals (Sweden)

    Indarini Dwi Pursitasari

    2013-07-01

    Full Text Available Abstract: Integrated Problem Solving Based Learning Model on Foundations of Analytical Chemistry. This study was conducted to determine the effects of the Integrated Problem Solving Based Learning (IPSBL) model on the problem-solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre-service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem-solving skills, a test on cognitive ability, and a questionnaire on the students' opinions on the use of the IPSBL model. The quantitative data were analyzed using a t-test and one-way ANOVA with SPSS 16.0, and the qualitative data were analyzed by counting percentages. The results show that the implementation of the IPSBL model increased the problem-solving skills and cognitive ability of the pre-service teachers, and the model was responded to positively by the research subjects.

  9. Model Integrated Problem Solving Based Learning pada Perkuliahan Dasar-dasar Kimia Analitik

    OpenAIRE

    Indarini Dwi Pursitasari; Anna Permanasari

    2013-01-01

    Abstract: Integrated Problem Solving Based Learning Model on Foundation of Analytical Chemistry. This study was conducted to know the effects of Integrated Problem Solving Based Learning (IPSBL) model on problem solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre- service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem solving skills, a test on cognitive ability, and a questio...

  10. Model Integrated Problem Solving Based Learning Pada Perkuliahan Dasar-dasar Kimia Analitik

    OpenAIRE

    Pursitasari, Indarini Dwi; Permanasari, Anna

    2012-01-01

    : Integrated Problem Solving Based Learning Model on Foundation of Analytical Chemistry. This study was conducted to know the effects of Integrated Problem Solving Based Learning (IPSBL) model on problem solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre- service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem solving skills, a test on cognitive ability, and a questionnaire o...

  11. [Single-molecule detection and characterization of DNA replication based on DNA origami].

    Science.gov (United States)

    Wang, Qi; Fan, Youjie; Li, Bin

    2014-08-01

    This study investigated single-molecule detection and characterization of DNA replication. Single-stranded DNA (ssDNA), serving as the template of DNA replication, was attached to DNA origami by a hybridization reaction based on the complementary base-pairing principle. DNA replication catalyzed by the E. coli DNA polymerase I Klenow Fragment (KF) was detected using atomic force microscopy (AFM). The height differences between the ssDNA and the double-stranded DNA (dsDNA), the distribution of KF during DNA replication, and the biotin-streptavidin (BA) complexes on the DNA strand after replication were examined. Agarose gel electrophoresis was employed to analyze the changes in the DNA after replication. The designed ssDNA could be anchored on the target positions of over 50% of the DNA origami. The KF was capable of binding to the ssDNA fixed on the DNA origami and performing its catalytic activities, and finally dissociated from the DNA after replication. The height of the DNA strand increased by about 0.7 nm after replication. The addition of streptavidin resulted in a further height increase, to about 4.9 nm, due to the formation of BA complexes on the biotinylated dsDNA. The resulting dsDNA and BA complexes were subsequently confirmed by agarose gel electrophoresis. The combination of AFM and DNA origami allows detection and characterization of DNA replication at the single-molecule level, and this approach provides better insights into the mechanism of DNA polymerase and the factors affecting DNA replication.

  12. Aviram–Ratner rectifying mechanism for DNA base-pair sequencing through graphene nanogaps

    International Nuclear Information System (INIS)

    Agapito, Luis A; Gayles, Jacob; Wolowiec, Christian; Kioussis, Nicholas

    2012-01-01

    We demonstrate that biological molecules such as Watson–Crick DNA base pairs can behave as biological Aviram–Ratner electrical rectifiers because of the spatial separation and weak hydrogen bonding between the nucleobases. We have performed a parallel computational implementation of the ab initio non-equilibrium Green’s function (NEGF) theory to determine the electrical response of graphene—base-pair—graphene junctions. The results show an asymmetric (rectifying) current–voltage response for the cytosine–guanine base pair adsorbed on a graphene nanogap. In sharp contrast we find a symmetric response for the thymine–adenine case. We propose applying the asymmetry of the current–voltage response as a sensing criterion to the technological challenge of rapid DNA sequencing via graphene nanogaps. (paper)
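
    In NEGF transport calculations of this kind, the junction current is typically obtained from the Landauer expression (a standard textbook formula quoted here for context, not a result specific to this paper):

    $$ I(V) = \frac{2e}{h} \int T(E,V)\,\big[f_L(E) - f_R(E)\big]\, dE , $$

    where T(E,V) is the transmission through the graphene/base-pair/graphene junction and f_L, f_R are the Fermi functions of the two contacts; rectification shows up as an asymmetry I(V) ≠ -I(-V).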

  13. Ultrasensitive FRET-based DNA sensor using PNA/DNA hybridization.

    Science.gov (United States)

    Yang, Lan-Hee; Ahn, Dong June; Koo, Eunhae

    2016-12-01

    In the diagnosis of genetic diseases, rapid and highly sensitive DNA detection is crucial, and many strategies for detecting target DNA have been developed, including electrical, optical, and mechanical methods. Herein, a highly sensitive FRET-based sensor was developed using a PNA (peptide nucleic acid) probe and quantum dots (QDs), in which red QDs are coupled to capture probes by EDC-NHS chemistry and hybridized with reporter probes and target DNAs. The probe hybridized with target DNA gives off a fluorescent signal due to energy transfer from the QD to the Cy5 dye in the reporter probe. Compared with a conventional DNA sensor using DNA probes, the sensor using PNA probes shows a higher FRET factor and efficiency due to the higher reactivity between PNA and target DNA. In addition, to elucidate the effect of the distance between the donor and the acceptor, we investigated two types of reporter probes with Cy5 dyes attached at different positions. The results show that the shorter the distance between the QDs and the Cy5 dyes, the stronger the signal intensity. Furthermore, based on fluorescence microscopy images using microcapillary chips, the FRET signal is enhanced by up to 276% compared with the signal obtained in a cuvette with a fluorescence spectrometer. These results suggest that the PNA probe system conjugated with QDs can be used as an ultrasensitive DNA nanosensor. Copyright © 2016. Published by Elsevier B.V.
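
    The distance dependence noted above follows the standard Förster relation (general FRET theory, not a result of this particular sensor):

    $$ E = \frac{R_0^{6}}{R_0^{6} + r^{6}} , $$

    where r is the donor-acceptor (QD to Cy5) separation and R_0 is the Förster radius of the pair; shortening r therefore increases the transferred energy and hence the Cy5 emission.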

  14. Problem-Solving Test: Conditional Gene Targeting Using the Cre/loxP Recombination System

    Science.gov (United States)

    Szeberényi, József

    2013-01-01

    Terms to be familiar with before you start to solve the test: gene targeting, knock-out mutation, bacteriophage, complementary base-pairing, homologous recombination, deletion, transgenic organisms, promoter, polyadenylation element, transgene, DNA replication, RNA polymerase, Shine-Dalgarno sequence, restriction endonuclease, polymerase chain…

  15. A DNA-based semantic fusion model for remote sensing data.

    Directory of Open Access Journals (Sweden)

    Heng Sun

    Full Text Available Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontologies to collect and organize the semantic information of data in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved; for example, the cluster-based multi-process program reduces the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the size of the result file recording the DNA sequences is 54.51 GB for the parallel C program compared with 57.89 GB for the sequential Perl version, showing that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation; the process depends solely on ligation reactions and screening operations instead of the ontology.
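
    The bit-wise encoding step can be pictured with a minimal sketch; the one-bit-per-base mapping below is our own assumption for illustration, since the paper's actual coding scheme is not reproduced here.

    ```python
    # Hypothetical mapping: one bit per base (the paper's real scheme may differ).
    BIT_TO_BASE = {"0": "A", "1": "T"}

    def bits_to_dna(data: bytes) -> str:
        """Convert raw bytes to a DNA string, one base per bit."""
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BIT_TO_BASE[b] for b in bits)

    print(bits_to_dna(b"ok"))  # 16 bases encoding 2 bytes
    ```

    Run in parallel over independent file chunks, as the cluster-based program reportedly is, each worker can emit its own stretch of sequence, which is where the quoted speed-up would come from.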

  16. A DNA-based semantic fusion model for remote sensing data.

    Science.gov (United States)

    Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H

    2013-01-01

    Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontologies to collect and organize the semantic information of data in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved; for example, the cluster-based multi-process program reduces the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the size of the result file recording the DNA sequences is 54.51 GB for the parallel C program compared with 57.89 GB for the sequential Perl version, showing that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation; the process depends solely on ligation reactions and screening operations instead of the ontology.

  17. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPUs to the major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of the GPU with other platforms will also be presented. (topical review)

  18. Analytical derivation: An epistemic game for solving mathematically based physics problems

    Science.gov (United States)

    Bajracharya, Rabindra R.; Thompson, John R.

    2016-06-01

    Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the analytical derivation game. This game involves deriving an equation through symbolic manipulations and routine mathematical operations, usually without any physical interpretation of the processes. This game often creates cognitive obstacles in students, preventing them from using alternative resources or better approaches during problem solving. We conducted hour-long, semi-structured, individual interviews with fourteen introductory physics students. Students were asked to solve four "pseudophysics" problems containing algebraic and graphical representations. The problems required the application of the fundamental theorem of calculus (FTC), which is one of the most frequently used mathematical concepts in physics problem solving. We show that the analytical derivation game is necessary, but not sufficient, to solve mathematically based physics problems, specifically those involving graphical representations.
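
    For reference, the theorem around which the tasks were built states (standard form, quoted for the reader's convenience):

    $$ F(x) = \int_a^x f(t)\,dt \;\Rightarrow\; F'(x) = f(x), \qquad \int_a^b f(t)\,dt = F(b) - F(a) $$

    for any antiderivative F of a continuous f, which is exactly the link between the algebraic and graphical representations that the "pseudophysics" problems probed.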

  19. A quantum theoretical study of reactions of methyldiazonium ion with DNA base pairs

    International Nuclear Information System (INIS)

    Shukla, P.K.; Ganapathy, Vinay; Mishra, P.C.

    2011-01-01

    Highlights: Methylation of the DNA bases is important as it can cause mutation and cancer; methylation reactions of the GC and AT base pairs with CH3N2+ had not previously been studied theoretically; experimental observations are explained using theoretical methods. Abstract: Methylation of the DNA bases in the Watson-Crick GC and AT base pairs by the methyldiazonium ion was investigated employing density functional and second-order Møller-Plesset (MP2) perturbation theories. Methylation at the N3, N7 and O6 sites of guanine, the N1, N3 and N7 sites of adenine, the O2 and N3 sites of cytosine, and the O2 and O4 sites of thymine was considered. The computed reactivities for methylation follow the order N7(guanine) > N3(adenine) > O6(guanine), which is in agreement with experiment. The base pairing in DNA is found to play a significant role in the reactivities of the different sites.

  20. DNA barcoding for molecular identification of Demodex based on mitochondrial genes.

    Science.gov (United States)

    Hu, Li; Yang, YuanJun; Zhao, YaE; Niu, DongLing; Yang, Rui; Wang, RuiLing; Lu, Zhaohui; Li, XiaoQi

    2017-12-01

    There has been no widely accepted DNA barcode for species identification of Demodex. In this study, we attempted to solve this issue. First, mitochondrial cox1-5' and 12S gene fragments of Demodex folliculorum, D. brevis, D. canis, and D. caprae were amplified, cloned, and sequenced for the first time; intra/interspecific divergences were computed and phylogenetic trees were reconstructed. Then, divergence frequency distribution plots of those two gene fragments were drawn together with those of the mtDNA cox1-middle region and 16S obtained in previous studies. Finally, their identification efficiency was evaluated by comparing barcoding gaps. The results indicated that 12S had the higher identification efficiency. Specifically, for the cox1-5' region of the four Demodex species, intraspecific divergences were less than 2.0% and interspecific divergences were 21.1-31.0%; for 12S, intraspecific divergences were less than 1.4% and interspecific divergences were 20.8-26.9%. The phylogenetic trees demonstrated that the four Demodex species clustered separately, and the divergence frequency distribution plots showed that the largest intraspecific divergence of 12S (1.4%) was less than that of the cox1-5' region (2.0%), the cox1-middle region (3.1%), and 16S (2.8%). The barcoding gap of 12S was 19.4%, larger than that of the cox1-5' region (19.1%), the cox1-middle region (11.3%), and 16S (13.0%); the interspecific divergence span of 12S was 6.2%, smaller than that of the cox1-5' region (10.0%), the cox1-middle region (14.1%), and 16S (11.4%). Moreover, 12S has a moderate length (517 bp) that can be sequenced in a single run. Therefore, we propose that mtDNA 12S is more suitable than cox1 and 16S as a DNA barcode for classification and identification of Demodex at lower category levels.
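
    The intra- and interspecific divergences and the barcoding gap come down to pairwise distance computations; a minimal uncorrected p-distance sketch (our illustration only; the authors' pipeline may use model-corrected distances) looks like this:

    ```python
    def p_distance(seq1: str, seq2: str) -> float:
        """Uncorrected pairwise distance: fraction of differing aligned sites,
        ignoring positions where either sequence has a gap."""
        pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
        return sum(a != b for a, b in pairs) / len(pairs)

    # Hypothetical aligned 12S fragments.
    print(p_distance("ACGTTAGC", "ACGATAGC"))  # 0.125
    # Barcoding gap = smallest interspecific distance minus largest intraspecific one.
    ```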

  1. Use of a genetic algorithm to solve two-fluid flow problems on an NCUBE multiprocessor computer

    International Nuclear Information System (INIS)

    Pryor, R.J.; Cline, D.D.

    1992-01-01

    A method of solving the two-phase fluid flow equations using a genetic algorithm on a NCUBE multiprocessor computer is presented. The topics discussed are the two-phase flow equations, the genetic representation of the unknowns, the fitness function, the genetic operators, and the implementation of the algorithm on the NCUBE computer. The efficiency of the implementation is investigated using a pipe blowdown problem. Effects of varying the genetic parameters and the number of processors are presented
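
    The ingredients listed above (representation of the unknowns, fitness function, genetic operators) can be illustrated with a generic real-coded genetic algorithm that minimizes a residual; this is a toy sketch, not the NCUBE implementation, and the quadratic "residual" below merely stands in for the actual two-phase flow equations.

    ```python
    import random

    def fitness(x):
        # Toy stand-in for the flow-equation residual to be minimized.
        return sum((xi - 1.0) ** 2 for xi in x)

    def genetic_algorithm(dim=4, pop_size=40, generations=200, p_mut=0.1, sigma=0.2):
        pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]                    # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, dim)                # one-point crossover
                child = a[:cut] + b[cut:]
                child = [g + random.gauss(0, sigma) if random.random() < p_mut else g
                         for g in child]                      # Gaussian mutation
                children.append(child)
            pop = parents + children
        return min(pop, key=fitness)

    print(genetic_algorithm())
    ```

    On a multiprocessor such as the NCUBE, the fitness evaluations of a population are the natural unit to distribute across processors, since they are independent of one another.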

  2. Use of a genetic algorithm to solve two-fluid flow problems on an NCUBE multiprocessor computer

    International Nuclear Information System (INIS)

    Pryor, R.J.; Cline, D.D.

    1993-01-01

    A method of solving the two-phase fluid flow equations using a genetic algorithm on an NCUBE multiprocessor computer is presented. The topics discussed are the two-phase flow equations, the genetic representation of the unknowns, the fitness function, the genetic operators, and the implementation of the algorithm on the NCUBE computer. The efficiency of the implementation is investigated using a pipe blowdown problem. Effects of varying the genetic parameters and the number of processors are presented. (orig.)

  3. Parallel Algorithm Solves Coupled Differential Equations

    Science.gov (United States)

    Hayashi, A.

    1987-01-01

    Numerical methods adapted to concurrent processing. The algorithm solves a set of coupled partial differential equations by numerical integration. Adapted to run on a hypercube computer, the algorithm separates the problem into smaller problems solved concurrently. The increase in computing speed with concurrent processing over that achievable with conventional sequential processing is appreciable, especially for large problems.

  4. Synthesis of furan-based DNA binders and their interaction with DNA

    International Nuclear Information System (INIS)

    Voege, Andrea; Hoffmann, Sascha; Gabel, Detlef

    2006-01-01

    In recent years, many substances based on naturally occurring DNA-binding molecules have been developed for use in cancer therapy and as virostatics. Most of these substances bind specifically to A-T-rich sequences in the DNA minor groove; both neutral and positively charged DNA binders are known. BNCT is most effective when the boron is located directly in the cell nucleus, so that the interaction with thermal neutrons can damage the DNA directly. To reach this aim, we have connected ammonioundecahydrododecaborate(1-) to DNA-binding structures such as 2,5-bis(4-formylphenyl)furan via a Schiff-base reaction followed by reduction of the imine to a secondary amine. In a following step the amine can be alkylated to introduce positive charges, which prevent repulsion between the compounds and the negatively charged sugar-phosphate backbone of the DNA. (author)

  5. Computational Approaches to Nucleic Acid Origami.

    Science.gov (United States)

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami, with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags far behind that of DNA origami. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  6. The effect of regulation feedback in a computer-based formative assessment on information problem solving

    NARCIS (Netherlands)

    Timmers, Caroline; Walraven, Amber; Veldkamp, Bernard P.

    2015-01-01

    This study examines the effect of regulation feedback in a computer-based formative assessment in the context of searching for information online. Fifty 13-year-old students completed two randomly selected assessment tasks, receiving automated regulation feedback between them. Student performance

  7. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  8. Reinforcement learning in computer vision

    Science.gov (United States)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving the corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account the specifics of their subsequent use in model-based predictive control. Reinforcement learning is a modern machine learning technology in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as the processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes reinforcement learning technology and its use for solving computer vision problems.
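
    The interaction-driven learning referred to above is commonly formalized by value updates of the form (standard tabular Q-learning, given only as background and not taken from the paper):

    $$ Q(s_t, a_t) \leftarrow Q(s_t, a_t) + \alpha \big[ r_{t+1} + \gamma \max_{a'} Q(s_{t+1}, a') - Q(s_t, a_t) \big] , $$

    where the state s_t would typically be derived from the visual input, a_t is the chosen action, r_{t+1} the reward, α the learning rate, and γ the discount factor.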

  9. Interactive problem solving using LOGO

    CERN Document Server

    Boecker, Heinz-Dieter; Fischer, Gerhard

    2014-01-01

    This book is unique in that its stress is not on the mastery of a programming language, but on the importance and value of interactive problem solving. The authors focus on several specific interest worlds: mathematics, computer science, artificial intelligence, linguistics, and games; however, their approach can serve as a model that may be applied easily to other fields as well. Those who are interested in symbolic computing will find that Interactive Problem Solving Using LOGO provides a gentle introduction from which one may move on to other, more advanced computational frameworks or more

  10. Solving applied mathematical problems with Matlab

    CERN Document Server

    Xue, Dingyu

    2008-01-01

    Computer Mathematics Language-An Overview. Fundamentals of MATLAB Programming. Calculus Problems. MATLAB Computations of Linear Algebra Problems. Integral Transforms and Complex Variable Functions. Solutions to Nonlinear Equations and Optimization Problems. MATLAB Solutions to Differential Equation Problems. Solving Interpolations and Approximations Problems. Solving Probability and Mathematical Statistics Problems. Nontraditional Solution Methods for Mathematical Problems.

  11. Solving large mixed linear models using preconditioned conjugate gradient iteration.

    Science.gov (United States)

    Strandén, I; Lidauer, M

    1999-12-01

    Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique, feasible in Jacobi and conjugate gradient based iterative methods using iteration on data, is presented. In the new technique, the multiplication of a vector by a matrix is reorganized into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third of that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software: programs keeping the mixed model equations in random access memory required at least 20% and 435% more time to solve the univariate and multivariate animal models, respectively, and the second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than the new program. The good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
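
    For context, a bare-bones preconditioned conjugate gradient loop of the kind such programs are built on is sketched below (a generic textbook version with a Jacobi preconditioner; the three-step iteration-on-data technique itself is not reproduced here). In the iteration-on-data setting the product A @ p is of course never formed from an explicit matrix, which is precisely where that reorganization applies.

    ```python
    import numpy as np

    def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=1000):
        """Solve A x = b (A symmetric positive definite) with preconditioned
        conjugate gradients; M_inv_diag is the inverse of a Jacobi preconditioner."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv_diag * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv_diag * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(pcg(A, b, 1.0 / np.diag(A)))
    ```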

  12. Problem Solving Reasoning and Problem Based Instruction in Geometry Learning

    Science.gov (United States)

    Sulistyowati, F.; Budiyono, B.; Slamet, I.

    2017-09-01

    This research compares Problem Solving Reasoning (PSR) and Problem Based Instruction (PBI) with respect to problem-solving and mathematical communication abilities, viewed from Self-Regulated Learning (SRL). The instruction was given to 8th-grade junior high school students. The research used a quasi-experimental method followed by descriptive analysis. Data were analyzed using two-way multivariate analysis of variance (MANOVA) and one-way analysis of variance (ANOVA) with different cells. The analysis showed that the learning model had a different effect, the level of SRL had the same effect, and there was no interaction between the learning model and SRL with respect to problem-solving and mathematical communication abilities. A t-test was used to determine the more effective learning model. Based on the test, regardless of the level of SRL, PSR is more effective than PBI for problem-solving ability. The descriptive analysis indicated that PSR has the advantage of creating learning that optimizes learners' reasoning ability in solving mathematical problems. Consequently, PSR is an appropriate learning model to apply in the classroom to improve learners' problem-solving ability.

  13. LUCKY-TD code for solving the time-dependent transport equation with the use of parallel computations

    Energy Technology Data Exchange (ETDEWEB)

    Moryakov, A. V., E-mail: sailor@orc.ru [National Research Centre Kurchatov Institute (Russian Federation)

    2016-12-15

    An algorithm for solving the time-dependent transport equation in the PmSn group approximation with the use of parallel computations is presented. The algorithm is implemented in the LUCKY-TD code for supercomputers employing the MPI standard for the data exchange between parallel processes.

  14. Computational analyses of ancient pathogen DNA from herbarium samples: challenges and prospects.

    Science.gov (United States)

    Yoshida, Kentaro; Sasaki, Eriko; Kamoun, Sophien

    2015-01-01

    The application of DNA sequencing technology to the study of ancient DNA has enabled the reconstruction of past epidemics from genomes of historically important plant-associated microbes. Recently, the genome sequences of the potato late blight pathogen Phytophthora infestans were analyzed from 19th century herbarium specimens. These herbarium samples originated from infected potatoes collected during and after the Irish potato famine. Herbaria therefore have great potential to help elucidate past epidemics of crops, date the emergence of pathogens, and inform about past pathogen population dynamics. DNA preservation in herbarium samples was unexpectedly good, raising the possibility of a whole new research area in plant and microbial genomics. However, the recovered DNA can be extremely fragmented, resulting in specific challenges in reconstructing genome sequences. Here we review some of the challenges in computational analyses of ancient DNA from herbarium samples. We also applied the recently developed linkage method to haplotype reconstruction of diploid or polyploid genomes from fragmented ancient DNA.

  15. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To utilize fully digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. The current trends in such fields as optimization theory, control system theory, and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as an integral part of the solving system to process large amounts of data, to implement a control law, and even to make decisions. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and large computers are used; why economical process computers will be allocated to nuclear plants in the future; and why the supercomputer should be demonstrated at once. (Mori, K.)

  16. Equilibrium Design Based on Design Thinking Solving: An Integrated Multicriteria Decision-Making Methodology

    Directory of Open Access Journals (Sweden)

    Yi-Xiong Feng

    2013-01-01

    Full Text Available A multicriteria decision-making model is proposed to identify the optimum among different product design schemes. The VIKOR method is introduced to compute the ranking value of each scheme. A multiobjective optimization model for the criteria weights is established: the projection pursuit method is employed to identify a criteria weight set that preserves the classification information of the original schemes to the greatest extent, while PROMETHEE II is adopted to preserve the sorting information. A dominance-based multiobjective simulated annealing algorithm (D-MOSA) is introduced to solve the optimization model. Finally, an example is given to demonstrate the feasibility and efficiency of this model.
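
    A compact sketch of the VIKOR ranking step might look as follows (the textbook formulation with an example decision matrix of our own; the weight-optimization and D-MOSA parts of the paper are not shown). Higher criterion values are treated as better, and a smaller Q means a better compromise rank.

    ```python
    import numpy as np

    def vikor(F, w, v=0.5):
        """Rank alternatives (rows of F) on criteria (columns) with weights w.
        Returns the VIKOR compromise index Q; smaller Q ranks higher."""
        f_best, f_worst = F.max(axis=0), F.min(axis=0)
        norm = (f_best - F) / (f_best - f_worst)
        S = (w * norm).sum(axis=1)          # group utility
        R = (w * norm).max(axis=1)          # individual regret
        Q = (v * (S - S.min()) / (S.max() - S.min())
             + (1 - v) * (R - R.min()) / (R.max() - R.min()))
        return Q

    # Hypothetical decision matrix: 3 design schemes scored on 3 criteria.
    F = np.array([[7.0, 3.0, 9.0],
                  [8.0, 5.0, 6.0],
                  [6.0, 8.0, 7.0]])
    w = np.array([0.4, 0.3, 0.3])
    print(vikor(F, w))
    ```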

  17. Dense image correspondences for computer vision

    CERN Document Server

    Liu, Ce

    2016-01-01

    This book describes the fundamental building-block of many new computer vision systems: dense and robust correspondence estimation. Dense correspondence estimation techniques are now successfully being used to solve a wide range of computer vision problems, very different from the traditional applications such techniques were originally developed to solve. This book introduces the techniques used for establishing correspondences between challenging image pairs, the novel features used to make these techniques robust, and the many problems dense correspondences are now being used to solve. The book provides information to anyone attempting to utilize dense correspondences in order to solve new or existing computer vision problems. The editors describe how to solve many computer vision problems by using dense correspondence estimation. Finally, it surveys resources, code, and data necessary for expediting the development of effective correspondence-based computer vision systems.

  18. [Interactions of DNA bases with individual water molecules. Molecular mechanics and quantum mechanics computation results vs. experimental data].

    Science.gov (United States)

    Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I

    2013-01-01

    To elucidate details of the DNA-water interactions, we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two molecular mechanics (MM) force fields and the correlated ab initio MP2/6-31G(d,p) method of quantum mechanics (QM) were compared with one another and with experimental data. The calculations demonstrated qualitative agreement between the geometric characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base and forming hydrogen bonds with both of them. Nevertheless, the relative depth of some minima and the details of the mutual water-base positions in these minima depend on the method used. The analysis revealed that some of the differences between the methods are insignificant, while others are important for the description of DNA hydration. The MM calculations quantitatively reproduce all the experimental data on the enthalpies of complex formation of a single water molecule with a set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in crystals of DNA duplex fragments, while some of these data cannot be rationalized by the QM calculations.

  19. Interaction of anthraquinone anti-cancer drugs with DNA: Experimental and computational quantum chemical study

    Science.gov (United States)

    Al-Otaibi, Jamelah S.; Teesdale Spittle, Paul; El Gogary, Tarek M.

    2017-01-01

    Anthraquinones form the basis of several anticancer drugs. Anthraquinone anticancer drugs exert their cytotoxic activity through interaction with DNA and inhibition of topoisomerase II activity. Anthraquinones (AQ4 and AQ4H) were synthesized and studied along with 1,4-DAAQ by computational and experimental tools. The purpose of this study is to shed more light on the mechanism of interaction between anthraquinone DNA-affinic agents and different types of DNA, leading to information useful for drug design and development. Molecular structures were optimized using DFT B3LYP/6-31 + G(d). Depending on intramolecular hydrogen bonding interactions, two conformers of AQ4 were detected and computed to be 25.667 kcal/mol apart. Molecular reactivity of the anthraquinone compounds was explored using global and condensed descriptors (electrophilicity and Fukui functions). Molecular docking studies for the inhibition of CDK2 and DNA binding were carried out to explore the anticancer potency of these drugs. NMR and UV-VIS electronic absorption spectra of the anthraquinone/DNA systems were investigated at physiological pH. The interactions of the three anthraquinones (AQ4, AQ4H and 1,4-DAAQ) with three DNAs (calf thymus DNA, Poly[dA].Poly[dT] and Poly[dG].Poly[dC]) were studied. The NMR study shows a qualitative pattern of drug/DNA interaction in terms of band shift and broadening. UV-VIS electronic absorption spectra were employed to measure the affinity constants of drug/DNA binding using Scatchard analysis.
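
    The Scatchard analysis mentioned above fits the binding data to the standard relation (the general method, stated here for clarity rather than reproduced from the paper):

    $$ \frac{r}{c_f} = K_a\,(n - r) , $$

    where r is the number of moles of drug bound per mole of DNA, c_f is the free drug concentration, n is the number of binding sites, and K_a is the affinity constant, obtained from the slope of a plot of r/c_f against r.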

  20. Optimal calculational schemes for solving multigroup photon transport problem

    International Nuclear Information System (INIS)

    Dubinin, A.A.; Kurachenko, Yu.A.

    1987-01-01

    A complex algorithm for solving the multigroup radiation transport equation is suggested. The algorithm is based on the method of successive collisions, the method of forward scattering and the spherical harmonics method, and is realized in the FORAP program (FORTRAN, BESM-6 computer). As an example, the results of calculating reactor photon transport in water are presented. With modifications, the algorithm may also be used for solving neutron transport problems

  1. A variable neighborhood descent based heuristic to solve the capacitated location-routing problem

    Directory of Open Access Journals (Sweden)

    M. S. Jabal-Ameli

    2011-01-01

    Full Text Available Location-routing problem (LRP) is established as a new research area in the context of location analysis. The primary concern of LRP is locating facilities and routing vehicles among the established facilities and existing demand points. In this work, we address the capacitated LRP, which arises in many practical applications within logistics and supply chain management. The objective is to minimize the overall system costs, which include the fixed costs of opening depots and using vehicles at each depot site, and the variable costs associated with delivery activities. A novel heuristic based on the variable neighborhood descent (VND) algorithm is proposed to solve the resulting problem. The computational study indicates that the proposed VND-based heuristic is highly competitive with existing solution algorithms in terms of solution quality.
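
    As an editorial aside, the record above only names the VND scheme; the following is a minimal, hypothetical sketch of a generic variable neighborhood descent loop (not the authors' LRP heuristic), with placeholder neighborhood operators and a toy cost function.

```python
def vnd(solution, cost, neighborhoods):
    """Generic variable neighborhood descent: cycle through neighborhood
    structures, restarting from the first one whenever an improvement is found."""
    best, best_cost = solution, cost(solution)
    k = 0
    while k < len(neighborhoods):
        improved = False
        for cand in neighborhoods[k](best):      # scan the k-th neighborhood
            c = cost(cand)
            if c < best_cost:
                best, best_cost, improved = cand, c, True
        k = 0 if improved else k + 1             # restart or try the next neighborhood
    return best, best_cost

# Toy usage: minimize a separable quadratic over integer vectors with two simple moves.
def shift_moves(x):      # neighborhood 1: change one coordinate by +/-1
    return [x[:i] + (x[i] + d,) + x[i+1:] for i in range(len(x)) for d in (-1, 1)]

def swap_moves(x):       # neighborhood 2: swap two coordinates
    return [tuple(x[j] if k == i else x[i] if k == j else x[k] for k in range(len(x)))
            for i in range(len(x)) for j in range(i + 1, len(x))]

cost = lambda x: sum((xi - t) ** 2 for xi, t in zip(x, (3, -1, 4)))
print(vnd((0, 0, 0), cost, [shift_moves, swap_moves]))
```

    In an LRP setting the neighborhoods would typically be routing and location moves (relocating a customer, swapping depot assignments, and so on) rather than these toy operators.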

  2. Computer programs for solving systems of nonlinear equations

    International Nuclear Information System (INIS)

    Asaoka, Takumi

    1978-03-01

    Computer programs to find a solution, usually the one closest to some initial guess, of a system of simultaneous nonlinear equations are provided for real functions of real arguments. These are based on quasi-Newton methods or projection methods, which are briefly reviewed in the present report. Benchmark tests were performed on these subroutines to grasp their characteristics. As programs not requiring analytical forms of the derivatives for the Jacobian matrix, we have dealt with NS01A of Powell, NS03A of Reid (for systems with a sparse Jacobian) and NONLIN of Brown. Of these three quasi-Newton subroutines, NONLIN is shown to be the most useful because of its stable algorithm and short computation time. On the other hand, as subroutines for which the derivatives of the Jacobian are to be supplied analytically, we have tested INTECH, a quasi-Newton method based on Boggs' algorithm, PROJA of Georg and Keller based on the projection method, and an option of NS03A. The results have shown that INTECH, which treats variables that appear only linearly in the functions separately, takes the shortest computation time on the whole, while the projection method requires further research to find an optimal algorithm. (auth.)
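
    The report above concerns specific library routines (NS01A, NS03A, NONLIN, INTECH); as a generic illustration of the derivative-free quasi-Newton idea they share, here is a small Broyden-method sketch with a finite-difference starting Jacobian. The test system is made up for the example and is not from the report.

```python
import numpy as np

def fd_jacobian(f, x, eps=1e-6):
    """Forward-difference Jacobian, used only to start the iteration."""
    fx = f(x)
    J = np.zeros((len(fx), len(x)))
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += eps
        J[:, i] = (f(xp) - fx) / eps
    return J

def broyden(f, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' quasi-Newton method for F(x) = 0: the approximate
    Jacobian is updated from successive residuals, so no analytical
    derivatives are required after the initial finite-difference estimate."""
    x = np.asarray(x0, dtype=float)
    J = fd_jacobian(f, x)
    fx = f(x)
    for _ in range(max_iter):
        dx = np.linalg.solve(J, -fx)
        x_new = x + dx
        fx_new = f(x_new)
        if np.linalg.norm(fx_new) < tol:
            return x_new
        df = fx_new - fx
        J += np.outer(df - J @ dx, dx) / (dx @ dx)   # rank-one secant update
        x, fx = x_new, fx_new
    return x

# Example: intersection of the circle x^2 + y^2 = 4 with the parabola y = x^2.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]**2])
print(broyden(F, [1.0, 1.5]))
```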

  3. DNA interaction with platinum-based cytostatics revealed by DNA sequencing.

    Science.gov (United States)

    Smerkova, Kristyna; Vaculovic, Tomas; Vaculovicova, Marketa; Kynicky, Jindrich; Brtnicky, Martin; Eckschlager, Tomas; Stiborova, Marie; Hubalek, Jaromir; Adam, Vojtech

    2017-12-15

    The main mechanism of action of platinum-based cytostatic drugs - cisplatin, oxaliplatin and carboplatin - is the formation of DNA cross-links, which restricts transcription because the cross-linked DNA cannot enter the active site of the polymerase. The polymerase chain reaction (PCR) was employed as a simplified model of the amplification process in the cell nucleus. PCR with fluorescently labelled dideoxynucleotides, commonly employed for DNA sequencing, was used to monitor the effect of platinum-based cytostatics on DNA in terms of the decrease in labelling efficiency caused by the presence of the DNA-drug cross-link. It was found that significantly different amounts of the drugs - cisplatin (0.21 μg/mL), oxaliplatin (5.23 μg/mL), and carboplatin (71.11 μg/mL) - were required to cause the same quenching effect (50%) on the fluorescent labelling of 50 μg/mL of DNA. Moreover, even though the amounts of the drugs applied to the reaction mixture differed by several orders of magnitude, the amount of incorporated platinum, quantified by inductively coupled plasma mass spectrometry, was in all cases at the level of tenths of μg per 5 μg of DNA. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Use of capillary GC-MS for identification of radiation-induced DNA base damage: Implications for base-excision repair of DNA

    International Nuclear Information System (INIS)

    Dizdaroglu, M.

    1985-01-01

    Application of GC-MS to the characterization of radiation-induced base products of DNA and DNA base-amino acid crosslinks is presented. Samples of γ-irradiated DNA were hydrolyzed with formic acid, trimethylsilylated and subjected to GC-MS analysis using a fused silica capillary column. Hydrolysis conditions suitable for the simultaneous analysis of the radiation-induced products of all four DNA bases in a single run were determined. The trimethylsilyl derivatives of these products had excellent GC properties and easily interpretable mass spectra. The complementary use of t-butyldimethylsilyl derivatives was also demonstrated. Moreover, the usefulness of this method for the identification of radiation-induced DNA base-amino acid crosslinks was shown using γ-irradiated mixtures of thymine and tyrosine or phenylalanine. Because of the excellent resolving power of capillary GC and the instant and highly sensitive identification by MS, GC-MS is suggested as a suitable technique for identification of altered bases removed from DNA by base-excision repair enzymes

  5. Charge transport through DNA based electronic barriers

    Science.gov (United States)

    Patil, Sunil R.; Chawda, Vivek; Qi, Jianqing; Anantram, M. P.; Sinha, Niraj

    2018-05-01

    We report charge transport in electronic 'barriers' constructed by sequence engineering in DNA. Considering the ionization potentials of thymine-adenine (AT) and guanine-cytosine (GC) base pairs, we treat AT as the 'barriers'. The effect of DNA conformation (A and B form) on charge transport is also investigated, in particular the effect of the 'barrier' width on hole transport. Density functional theory (DFT) calculations are performed on energy-minimized DNA structures to obtain the electronic Hamiltonian. The quantum transport calculations are performed using the Landauer-Buttiker framework. Our main findings are contrary to previous studies. We find that a longer A-DNA with more AT base pairs can conduct better than a shorter A-DNA with a smaller number of AT base pairs. We also find that some sequences of A-DNA can conduct better than the corresponding B-DNA with the same sequence. Counterion-mediated charge transport and long-range interactions are speculated to be responsible for the counter-intuitive length and AT-content dependence of the conductance of A-DNA.

  6. A Hybrid Genetic Algorithm with a Knowledge-Based Operator for Solving the Job Shop Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Hamed Piroozfard

    2016-01-01

    Full Text Available Scheduling is an important topic in production management and combinatorial optimization, and it exists ubiquitously in real-world applications. Attempts to find optimal or near-optimal solutions for job shop scheduling problems are deemed important because these problems are highly complex and NP-hard. This paper describes the development of a hybrid genetic algorithm for solving the nonpreemptive job shop scheduling problem with the objective of minimizing makespan. In order to solve the presented problem more effectively, an operation-based representation was used to enable the construction of feasible schedules. In addition, a new knowledge-based operator was designed, based on the problem's characteristics, to use machines' idle times to improve the solution quality; it was developed in the context of function evaluation. A machine-based precedence-preserving order-based crossover was proposed to generate the offspring. Furthermore, a simulated annealing based neighborhood search technique was used to improve the local exploitation ability of the algorithm and to increase its population diversity. In order to prove the efficiency and effectiveness of the proposed algorithm, numerous benchmark instances were collected from the Operations Research Library. Computational results of the proposed hybrid genetic algorithm demonstrate its effectiveness.
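
    To make the operation-based representation mentioned above concrete, here is a hedged sketch (not the paper's hybrid GA) of how such a chromosome can be decoded into a non-preemptive schedule whose makespan serves as the fitness value; the three-job instance is invented for illustration.

```python
def decode(chromosome, jobs):
    """Decode an operation-based chromosome into a semi-active schedule.

    `jobs[j]` is the ordered list of operations of job j as (machine, duration).
    `chromosome` is a permutation with repetition: job j appears len(jobs[j])
    times, and its k-th occurrence refers to its k-th operation."""
    next_op = [0] * len(jobs)          # next operation index per job
    job_ready = [0] * len(jobs)        # time each job becomes available
    mach_ready = {}                    # time each machine becomes available
    makespan = 0
    for j in chromosome:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(machine, 0))
        finish = start + duration
        job_ready[j] = finish
        mach_ready[machine] = finish
        next_op[j] += 1
        makespan = max(makespan, finish)
    return makespan

# 3 jobs x 2 machines toy instance: each job is a list of (machine, duration).
jobs = [[(0, 3), (1, 2)],
        [(1, 4), (0, 1)],
        [(0, 2), (1, 3)]]
print(decode([0, 1, 2, 0, 2, 1], jobs))   # makespan of this chromosome (9 here)
```

    A genetic algorithm would then evolve permutations of this form, using decode() as the fitness function and crossover/mutation operators that preserve the per-job operation counts.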

  7. Classical versus Computer Algebra Methods in Elementary Geometry

    Science.gov (United States)

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra, such as Groebner bases of ideals and elimination of variables, make it possible to solve complex, elementary and non-elementary problems of geometry which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…
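
    As a small, generic illustration of the Groebner-basis elimination technique referred to above (not taken from the article), the following SymPy snippet eliminates a parameter to recover the implicit equation of a curve.

```python
from sympy import symbols, groebner

x, y, t = symbols('x y t')

# Parametric curve x = t**2, y = t**3; eliminate t to find its implicit equation.
G = groebner([x - t**2, y - t**3], t, x, y, order='lex')

# With lexicographic order t > x > y, the basis contains a polynomial free of t,
# which generates the elimination ideal (expected: x**3 - y**2).
implicit = [g for g in G.exprs if t not in g.free_symbols]
print(implicit)
```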

  8. Applying Groebner bases to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.; Smirnov, Vladimir A.

    2006-01-01

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential

  9. Applying Groebner bases to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexander V. [Mechanical and Mathematical Department and Scientific Research Computer Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, Vladimir A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-01-15

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  10. Domain decomposition methods for solving an image problem

    Energy Technology Data Exchange (ETDEWEB)

    Tsui, W.K.; Tong, C.S. [Hong Kong Baptist College (Hong Kong)

    1994-12-31

    The domain decomposition method is a technique to break up a problem so that the ensuing sub-problems can be solved on a parallel computer. In order to improve the convergence rate of the capacitance systems, preconditioned conjugate gradient methods are commonly used. In the last decade, most of the efficient preconditioners have been based on elliptic partial differential equations and are therefore particularly useful for solving elliptic partial differential equations. In this paper, the authors apply the so-called covering preconditioner, which is based on information about the operator under investigation and is therefore suitable for various kinds of applications. Specifically, they apply the preconditioned domain decomposition method to solving an image restoration problem. The image restoration problem is to extract an original image which has been degraded by a known convolution process and additive Gaussian noise.
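
    The covering preconditioner itself is specific to the paper; as a generic illustration of the preconditioned conjugate gradient framework it plugs into, here is a sketch using a plain Jacobi (diagonal) preconditioner on a toy symmetric positive definite system.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradients for A x = b, where M_inv(r)
    applies an approximation of M^{-1} to the residual r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system (1-D Laplacian) with a simple Jacobi preconditioner.
n = 50
A = (np.diag(2.0 * np.ones(n)) + np.diag(-1.0 * np.ones(n - 1), 1)
     + np.diag(-1.0 * np.ones(n - 1), -1))
b = np.ones(n)
jacobi = lambda r: r / np.diag(A)
x = pcg(A, b, jacobi)
print(np.linalg.norm(A @ x - b))       # residual norm, should be ~1e-10
```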

  11. Detection of DNA damage based on metal-mediated molecular beacon and DNA strands displacement reaction

    Science.gov (United States)

    Xiong, Yanxiang; Wei, Min; Wei, Wei; Yin, Lihong; Pu, Yuepu; Liu, Songqin

    2014-01-01

    DNA hairpin structure probes are usually designed by forming an intra-molecular duplex based on Watson-Crick hydrogen bonds. In this paper, a molecular beacon based on silver-ion-mediated cytosine-Ag+-cytosine base pairs was used to detect DNA. Compared to traditional DNA hairpin structure probes, the inherent characteristics of the metal ligation facilitated the design of the functional probe and the adjustment of its binding strength, allowing DNA to be detected in a simple and rapid way with the help of a DNA strand displacement reaction. The method was sensitive and also possessed good specificity for differentiating single-base mismatched DNA from complementary DNA. It was also successfully applied to study the DNA-damaging effect of classic genotoxic chemicals such as styrene oxide and sodium arsenite, which is significant in food science, environmental science and pharmaceutical science.

  12. Examining the Effects of Field Dependence-Independence on Learners' Problem-Solving Performance and Interaction with a Computer Modeling Tool: Implications for the Design of Joint Cognitive Systems

    Science.gov (United States)

    Angeli, Charoula

    2013-01-01

    An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…

  13. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in a cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to handle the constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of the task node as affected by the constraint relations among task nodes, and the task node list is generated from these priority values. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of each task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first, as the completion time of the task graph is indirectly influenced by the finishing time of task nodes on the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task makespan in most cases and meet a high-quality performance objective.
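
    The full DDEP algorithm is more involved than can be shown here; the following hedged sketch only illustrates the general list-scheduling pattern it builds on, using a static longest path as a simplified stand-in for the dynamic essential path and an invented five-task DAG.

```python
import heapq
from collections import defaultdict

def list_schedule(tasks, deps, cost, n_machines):
    """Minimal list-scheduling sketch for a task DAG on identical machines.

    Priority = predecessor layer (depth of the longest predecessor chain);
    ties are broken by the static longest path from the task to an exit node,
    a simplified stand-in for the 'essential path'."""
    succ, pred = defaultdict(list), defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for a, b in deps:                      # a must finish before b starts
        succ[a].append(b)
        pred[b].append(a)
        indeg[b] += 1

    path = {}                              # longest path to an exit node
    def longest(t):
        if t not in path:
            path[t] = cost[t] + max((longest(s) for s in succ[t]), default=0)
        return path[t]

    layer = {}                             # depth of the longest predecessor chain
    def get_layer(t):
        if t not in layer:
            layer[t] = 1 + max((get_layer(p) for p in pred[t]), default=0)
        return layer[t]

    ready = [(get_layer(t), -longest(t), t) for t in tasks if indeg[t] == 0]
    heapq.heapify(ready)
    machine_free = [0.0] * n_machines
    finish = {}
    while ready:
        _, _, t = heapq.heappop(ready)
        m = min(range(n_machines), key=lambda i: machine_free[i])
        start = max([machine_free[m]] + [finish[p] for p in pred[t]])
        finish[t] = start + cost[t]
        machine_free[m] = finish[t]
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                heapq.heappush(ready, (get_layer(s), -longest(s), s))
    return max(finish.values())

tasks = ['a', 'b', 'c', 'd', 'e']
deps = [('a', 'c'), ('b', 'c'), ('c', 'e'), ('b', 'd'), ('d', 'e')]
cost = {'a': 2, 'b': 3, 'c': 2, 'd': 4, 'e': 1}
print(list_schedule(tasks, deps, cost, n_machines=2))   # makespan of the toy DAG
```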

  14. DNA based radiological dosimetry technology

    International Nuclear Information System (INIS)

    Diaz Quijada, Gerardo A.; Roy, Emmanuel; Veres, Teodor; Dumoulin, Michel M.; Vachon, Caroline; Blagoeva, Rosita; Pierre, Martin

    2008-01-01

    Full text: The purpose of this project is to develop a personal and wearable dosimeter using a highly innovative approach based on the specific recognition of DNA damage with a polymer hybrid. Our biosensor will be sensitive to breaks in nucleic acid macromolecules and relevant to mixed-field radiation. The proposed dosimeter will be small, field deployable and will sense damage from all radiation types at the DNA level. The generalized concept for the novel DNA-based radiological dosimeter is as follows: 1) single- or double-stranded oligonucleotide is immobilized on a surface; 2) single-stranded DNA has a higher cross-section for fragmentation; 3) double-stranded DNA is more biologically relevant; 4) radiation induces fragmentation; 5) ultra-sensitive detection of the fragments provides the radiation dose. Successful efforts have been made towards a proof-of-concept personal wearable DNA-based dosimeter that is appropriate for mixed-field radiation. The covalent immobilization of oligonucleotides on large areas of plastic surfaces has been demonstrated and corroborated spectroscopically. The surface concentration of DNA was determined to be 8 x 10^10 molecules/cm^2 from a Ce(IV)-catalyzed hydrolysis study of a fluorescently labelled oligonucleotide. Current efforts are directed at studying radiation-induced fragmentation of DNA followed by its ultra-sensitive detection via a novel method. In addition, proof-of-concept wearable personal devices and a detection platform are presently being fabricated. (author)

  15. Inference rule and problem solving

    Energy Technology Data Exchange (ETDEWEB)

    Goto, S

    1982-04-01

    Intelligent information processing signifies the opportunity of having man's intellectual activity executed on the computer, in which inference, in place of ordinary calculation, is used as the basic operational mechanism for such information processing. Many inference rules are derived from syllogisms in formal logic. The problem of programming this inference function is referred to as problem solving. Although inference and problem solving are logically closely related, the calculation ability of current computers is at a low level for inferring. For clarifying the relation between inference and computers, nonmonotonic logic has been considered. The paper deals with the above topics. 16 references.

  16. Heterogeneous quantum computing for satellite constellation optimization: solving the weighted k-clique problem

    Science.gov (United States)

    Bass, Gideon; Tomlin, Casey; Kumar, Vaibhaw; Rihaczek, Pete; Dulny, Joseph, III

    2018-04-01

    NP-hard optimization problems scale very rapidly with problem size, becoming unsolvable with brute force methods, even with supercomputing resources. Typically, such problems have been approximated with heuristics. However, these methods still take a long time and are not guaranteed to find an optimal solution. Quantum computing offers the possibility of producing significant speed-up and improved solution quality. Current quantum annealing (QA) devices are designed to solve difficult optimization problems, but they are limited by hardware size and qubit connectivity restrictions. We present a novel heterogeneous computing stack that combines QA and classical machine learning, allowing the use of QA on problems larger than the hardware limits of the quantum device. These results come from experiments on a real-world problem formulated as the weighted k-clique problem. Through this experiment, we provide insight into the state of quantum machine learning.
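
    The record does not give the authors' embedding; purely as a generic illustration, the sketch below builds a standard QUBO for the weighted k-clique problem (reward vertex weights, penalize wrong selection size and non-adjacent pairs) and checks it by classical brute force. The penalty weights A and B and the toy graph are arbitrary choices, not taken from the paper.

```python
import itertools
import numpy as np

def kclique_qubo(weights, edges, k, A=10.0, B=10.0):
    """Build a QUBO for the weighted k-clique problem.

    Minimise  -sum_i w_i x_i + A (sum_i x_i - k)^2 + B sum_{(i,j) not in E} x_i x_j
    so that low-energy states select k mutually adjacent, heavy vertices."""
    n = len(weights)
    Q = np.zeros((n, n))
    edge_set = {frozenset(e) for e in edges}
    for i in range(n):
        Q[i, i] += -weights[i]            # reward vertex weight
        Q[i, i] += A * (1 - 2 * k)        # linear part of the size penalty
        for j in range(i + 1, n):
            Q[i, j] += 2 * A              # cross terms of the size penalty
            if frozenset((i, j)) not in edge_set:
                Q[i, j] += B              # penalise non-adjacent pairs
    return Q, A * k * k                   # constant offset

def brute_force(Q, offset, n):
    best = None
    for bits in itertools.product((0, 1), repeat=n):
        x = np.array(bits)
        e = x @ Q @ x + offset
        if best is None or e < best[0]:
            best = (e, bits)
    return best

weights = [3, 1, 4, 2, 5]
edges = [(0, 2), (0, 4), (2, 4), (1, 3), (3, 4)]
Q, off = kclique_qubo(weights, edges, k=3)
print(brute_force(Q, off, len(weights)))   # should pick the triangle {0, 2, 4}
```

    A quantum annealer would minimise the same Q matrix in hardware; the brute-force loop here simply verifies that the encoding rewards the heaviest genuine clique.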

  17. Instructional Supports for Representational Fluency in Solving Linear Equations with Computer Algebra Systems and Paper-and-Pencil

    Science.gov (United States)

    Fonger, Nicole L.; Davis, Jon D.; Rohwer, Mary Lou

    2018-01-01

    This research addresses the issue of how to support students' representational fluency--the ability to create, move within, translate across, and derive meaning from external representations of mathematical ideas. The context of solving linear equations in a combined computer algebra system (CAS) and paper-and-pencil classroom environment is…

  18. A Predictor-Corrector Method for Solving Equilibrium Problems

    Directory of Open Access Journals (Sweden)

    Zong-Ke Bao

    2014-01-01

    Full Text Available We suggest and analyze a predictor-corrector method for solving nonsmooth convex equilibrium problems based on the auxiliary problem principle. In the main algorithm, each stage of computation requires two proximal steps: one step serves to predict the next point, and the other helps to correct the new prediction. We also present a convergence analysis under both perfect and imperfect foresight. In particular, we introduce a stopping criterion which gives rise to Δ-stationary points. Moreover, we apply the algorithm to the particular case of variational inequalities.

  19. Amplification of DNA mixtures--Missing data approach

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2008-01-01

    This paper presents a model for the interpretation of results of STR typing of DNA mixtures based on a multivariate normal distribution of peak areas. From previous analyses of controlled experiments with mixed DNA samples, we exploit the linear relationship between peak heights and peak areas...... DNA samples, it is only possible to observe the cumulative peak heights and areas. Complying with this latent structure, we use the EM-algorithm to impute the missing variables based on a compound symmetry model. That is the measurements are subject to intra- and inter-loci correlations not depending...... on the actual alleles of the DNA profiles. Due to factorization of the likelihood, properties of the normal distribution and use of auxiliary variables, an ordinary implementation of the EM-algorithm solves the missing data problem. We estimate the parameters in the model based on a training data set. In order...

  20. On a Recursive-Parallel Algorithm for Solving the Knapsack Problem

    Directory of Open Access Journals (Sweden)

    Vladimir V. Vasilchikov

    2018-01-01

    Full Text Available In this paper, we offer an efficient parallel algorithm for solving the NP-complete knapsack problem in its basic, so-called 0-1 variant. To find its exact solution, algorithms belonging to the category of "branch and bound methods" have long been used. To speed up the solution with varying degrees of efficiency, various options for parallelizing the computations are also used. We propose here an algorithm for solving the problem based on the paradigm of recursive-parallel computations. We consider it well suited for problems of this kind, where it is difficult to immediately break the computation up into a sufficient number of subtasks that are comparable in complexity, since they appear dynamically at run time. We used the RPM ParLib library, developed by the author, as the main tool to program the algorithm. This library allows us to develop effective applications for parallel computing on a local network in the .NET Framework. Such applications have the ability to generate parallel branches of computation directly during program execution and to dynamically redistribute work between computing modules. Any language with support for the .NET Framework can be used as a programming language in conjunction with this library. For our experiments, we developed some C# applications using this library. The main purpose of these experiments was to study the acceleration achieved by recursive-parallel computing. A detailed description of the algorithm and its testing, as well as the results obtained, are also given in the paper.
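
    The actual implementation uses C# and the RPM ParLib library; purely as an illustration of the underlying branch-and-bound recursion whose calls could be farmed out to parallel workers, here is a compact sketch with a fractional (LP) bound. The instance at the bottom is the classic textbook example, not data from the paper.

```python
def knapsack_bb(values, weights, capacity):
    """Recursive branch-and-bound for the 0/1 knapsack problem.

    Items are pre-sorted by value density; the bound is the fractional (LP)
    relaxation of the remaining items.  Each recursive call is the kind of
    independent sub-task that a recursive-parallel runtime could hand to
    another worker."""
    order = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    best = 0

    def bound(i, cap, val):
        # optimistic value: greedily fill remaining capacity, last item fractionally
        while i < len(v) and w[i] <= cap:
            cap -= w[i]; val += v[i]; i += 1
        if i < len(v):
            val += v[i] * cap / w[i]
        return val

    def branch(i, cap, val):
        nonlocal best
        if val > best:
            best = val
        if i == len(v) or bound(i, cap, val) <= best:
            return                                   # prune: bound cannot beat incumbent
        if w[i] <= cap:
            branch(i + 1, cap - w[i], val + v[i])    # take item i
        branch(i + 1, cap, val)                      # skip item i

    branch(0, capacity, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))   # classic instance: 220
```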

  1. Solving black box computation problems using expert knowledge theory and methods

    International Nuclear Information System (INIS)

    Booker, Jane M.; McNamara, Laura A.

    2004-01-01

    The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation

  2. (CBTP) on knowledge, problem-solving and learning approach

    African Journals Online (AJOL)

    In the first instance attention is paid to the effect of a computer-based teaching programme (CBTP) on the knowledge, problem-solving skills and learning approach of student ... In the practice group (oncology wards) no statistically significant change in the learning approach of respondents was found after using the CBTP.

  3. MEDUSA - An overset grid flow solver for network-based parallel computer systems

    Science.gov (United States)

    Smith, Merritt H.; Pallis, Jani M.

    1993-01-01

    Continuing improvement in processing speed has made it feasible to solve the Reynolds-Averaged Navier-Stokes equations for simple three-dimensional flows on advanced workstations. Combining multiple workstations into a network-based heterogeneous parallel computer allows the application of programming principles learned on MIMD (Multiple Instruction Multiple Data) distributed memory parallel computers to the solution of larger problems. An overset-grid flow solution code has been developed which uses a cluster of workstations as a network-based parallel computer. Inter-process communication is provided by the Parallel Virtual Machine (PVM) software. Solution speed equivalent to one-third of a Cray-YMP processor has been achieved from a cluster of nine commonly used engineering workstation processors. Load imbalance and communication overhead are the principal impediments to parallel efficiency in this application.

  4. The role of guidance in computer-based problem solving for the development of concepts of logic

    NARCIS (Netherlands)

    Eysink, Tessa H.S.; Dijkstra, S.; Kuper, Jan

    The effect of two instructional variables, manipulation of objects and guidance, in learning to use the logical connective, conditional, was investigated. Instructions for 72 first- and second year social science students were varied in the computer-based learning environment Tarski’s World,

  5. A nonparametric Bayesian approach for clustering bisulfate-based DNA methylation profiles.

    Science.gov (United States)

    Zhang, Lin; Meng, Jia; Liu, Hui; Huang, Yufei

    2012-01-01

    DNA methylation occurs in the context of a CpG dinucleotide. It is an important epigenetic modification, which can be inherited through cell division. The two major types of methylation include hypomethylation and hypermethylation. Unique methylation patterns have been shown to exist in diseases including various types of cancer. DNA methylation analysis promises to become a powerful tool in cancer diagnosis, treatment and prognostication. Large-scale methylation arrays are now available for studying methylation genome-wide. The Illumina methylation platform simultaneously measures cytosine methylation at more than 1500 CpG sites associated with over 800 cancer-related genes. Cluster analysis is often used to identify DNA methylation subgroups for prognosis and diagnosis. However, due to their unique non-Gaussian characteristics, traditional clustering methods may not be appropriate for DNA methylation data, and the determination of the optimal cluster number is still problematic. A Dirichlet process beta mixture model (DPBMM) is proposed that models the DNA methylation measurements as an infinite mixture of beta distributions. The model allows automatic learning of the relevant parameters, such as the cluster mixing proportions, the parameters of the beta distribution for each cluster, and especially the number of potential clusters. Since the model is high dimensional and analytically intractable, we propose a Gibbs sampling "no-gaps" solution for computing the posterior distributions, and hence the estimates of the parameters. The proposed algorithm was tested on simulated data as well as methylation data from 55 Glioblastoma multiforme (GBM) brain tissue samples. To reduce the computational burden due to the high data dimensionality, a dimension reduction method is adopted. The two GBM clusters yielded by DPBMM are based on data from different numbers of loci (P-value < 0.1), while hierarchical clustering cannot yield statistically significant clusters.
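
    The full DPBMM Gibbs sampler is beyond a short snippet; the sketch below only simulates the Chinese restaurant process that underlies the Dirichlet process prior, to show how the number of clusters is sampled rather than fixed in advance. The values of n and alpha are arbitrary toy choices, not taken from the paper.

```python
import random

def chinese_restaurant_process(n, alpha, seed=0):
    """Sample a partition of n items from the Dirichlet-process (CRP) prior.

    Each new item joins an existing cluster with probability proportional to
    its size, or opens a new cluster with probability proportional to alpha;
    the number of clusters is therefore learned rather than fixed."""
    rng = random.Random(seed)
    clusters = []                       # sizes of existing clusters
    assignment = []
    for _ in range(n):
        weights = clusters + [alpha]    # last entry = open a new cluster
        r = rng.uniform(0, sum(weights))
        acc = 0.0
        for k, wgt in enumerate(weights):
            acc += wgt
            if r <= acc:
                break
        if k == len(clusters):
            clusters.append(1)          # new cluster
        else:
            clusters[k] += 1            # join existing cluster k
        assignment.append(k)
    return assignment, clusters

labels, sizes = chinese_restaurant_process(n=55, alpha=1.0)
print(len(sizes), sizes)                # number of clusters and their sizes
```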

  6. Developing a Blended Learning-Based Method for Problem-Solving in Capability Learning

    Science.gov (United States)

    Dwiyogo, Wasis D.

    2018-01-01

    The main objectives of the study were to develop and investigate the implementation of a blended learning-based method for problem solving. Three experts were involved in the study, and all three stated that the model was ready to be applied in the classroom. The implementation of the blended learning-based design for problem solving was…

  7. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  8. FPGA-based distributed computing microarchitecture for complex physical dynamics investigation.

    Science.gov (United States)

    Borgese, Gianluca; Pace, Calogero; Pantano, Pietro; Bilotta, Eleonora

    2013-09-01

    In this paper, we present a distributed computing system, called DCMARK, aimed at solving the partial differential equations that lie at the basis of many fields of investigation, such as solid state physics, nuclear physics, and plasma physics. This distributed architecture is based on the cellular neural network paradigm, which allows us to divide the solution of the differential equation system into many parallel integration operations to be executed by a custom multiprocessor system. We push the number of processors to the limit of one processor for each equation. In order to test the present idea, we choose to implement DCMARK on a single FPGA, designing the single processor so as to minimize its hardware requirements and to obtain a large number of easily interconnected processors. This approach is particularly suited to studying the properties of 1-, 2- and 3-D locally interconnected dynamical systems. In order to test the computing platform, we implement a 200-cell Korteweg-de Vries (KdV) equation solver and perform a comparison between simulations conducted on a high-performance PC and on our system. Since our distributed architecture takes a constant computing time to solve the equation system, independently of the number of dynamical elements (cells) of the CNN array, it reduces the elaboration time more than other similar systems in the literature. To ensure a high level of reconfigurability, we design a compact system on programmable chip managed by a softcore processor, which controls the fast data/control communication between our system and a PC host. An intuitive graphical user interface allows us to change the calculation parameters and plot the results.
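
    The FPGA design itself is not reproduced here; as a purely illustrative sketch of the locally coupled, cell-per-processor structure such a system exploits, the following naive explicit finite-difference update of the KdV equation uses only nearest-neighbour values at each grid cell. The grid size, step sizes and initial profile are arbitrary, and this explicit scheme is only usable with a very small time step.

```python
import numpy as np

# Naive forward-Euler update of the KdV equation u_t + 6 u u_x + u_xxx = 0
# on a periodic grid.  Every cell only needs its four nearest neighbours,
# which is the local coupling a cell-per-processor architecture exploits.
N, L = 200, 40.0
dx = L / N
dt = 1e-5                                    # must be tiny for this explicit scheme
x = np.arange(N) * dx
u = 0.5 / np.cosh(0.5 * (x - L / 3)) ** 2    # single-soliton initial profile

def step(u):
    u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    u_xxx = (np.roll(u, -2) - 2 * np.roll(u, -1)
             + 2 * np.roll(u, 1) - np.roll(u, 2)) / (2 * dx ** 3)
    return u + dt * (-6.0 * u * u_x - u_xxx)

for _ in range(2000):
    u = step(u)
print(u.max())                               # soliton amplitude should stay near 0.5
```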

  9. Computer-based liquid radioactive waste control with plant emergency and generator temperature monitoring

    International Nuclear Information System (INIS)

    Plotnick, R.J.; Schneider, M.I.; Shaffer, C.E.

    1986-01-01

    At the start of the design of the liquid radwaste control system for a nuclear generating station under construction, several serious problems were detected. The solution incorporated a new approach utilizing a computer and a blend of standard and custom software to replace the existing conventionally instrumented benchboard. The computer-based system, in addition to solving the problems associated with the benchboard design, also provided other enhancements which significantly improved the operability and reliability of the radwaste system. The functionality of the computer-based radwaste control system also enabled additional applications to be added to an expanded multitask version of the radwaste computer: 1) a Nuclear Regulatory Commission (NRC) requirement that all nuclear power plants have an emergency response facility status monitoring system; and 2) the sophisticated temperature monitoring and trending requested by the electric generator manufacturer to continue its warranty commitments. The addition of these tasks to the radwaste computer saved the cost of one or more computers that would be dedicated to these work requirements

  10. Pedagogy and/or technology: Making difference in improving students' problem solving skills

    Science.gov (United States)

    Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.

    2013-01-01

    Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.

  11. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.

  12. Duplex Interrogation by a Direct DNA Repair Protein in Search of Base Damage

    Science.gov (United States)

    Yi, Chengqi; Chen, Baoen; Qi, Bo; Zhang, Wen; Jia, Guifang; Zhang, Liang; Li, Charles J.; Dinner, Aaron R.; Yang, Cai-Guang; He, Chuan

    2012-01-01

    ALKBH2 is a direct DNA repair dioxygenase guarding mammalian genome against N1-methyladenine, N3-methylcytosine, and 1,N6-ethenoadenine damage. A prerequisite for repair is to identify these lesions in the genome. Here we present crystal structures of ALKBH2 bound to different duplex DNAs. Together with computational and biochemical analyses, our results suggest that DNA interrogation by ALKBH2 displays two novel features: i) ALKBH2 probes base-pair stability and detects base pairs with reduced stability; ii) ALKBH2 does not have nor need a “damage-checking site”, which is critical for preventing spurious base-cleavage for several glycosylases. The demethylation mechanism of ALKBH2 insures that only cognate lesions are oxidized and reversed to normal bases, and that a flipped, non-substrate base remains intact in the active site. Overall, the combination of duplex interrogation and oxidation chemistry allows ALKBH2 to detect and process diverse lesions efficiently and correctly. PMID:22659876

  13. An improved computational version of the LTSN method to solve transport problems in a slab

    International Nuclear Information System (INIS)

    Cardona, Augusto V.; Oliveira, Jose Vanderlei P. de; Vilhena, Marco Tullio de; Segatto, Cynthia F.

    2008-01-01

    In this work, we present an improved computational version of the LTSN method to solve transport problems in a slab. The key feature relies on the reordering of the set of SN equations. This procedure reduces by a factor of two the task of evaluating the eigenvalues of the matrix associated with the SN approximations. We present numerical simulations and comparisons with those of the classical LTSN approach. (author)

  14. Free energy landscape and transition pathways from Watson–Crick to Hoogsteen base pairing in free duplex DNA

    Science.gov (United States)

    Yang, Changwon; Kim, Eunae; Pak, Youngshang

    2015-01-01

    Hoogsteen (HG) base pairing plays a central role in the DNA binding of proteins and small ligands. Probing the detailed transition mechanism from Watson-Crick (WC) to HG base pair (bp) formation in duplex DNAs is of fundamental importance for revealing intrinsic functions of double helical DNAs beyond their sequence-determined functions. We investigated the free energy landscape of a free B-DNA with an adenine-thymine (A-T) rich sequence to probe its conformational transition pathways from WC to HG base pairing. The free energy landscape was computed with a state-of-the-art two-dimensional umbrella molecular dynamics simulation at the all-atom level. The present simulation showed that in an isolated duplex DNA, the spontaneous transition from WC to HG bp takes place via multiple pathways. Notably, base flipping into the major and minor grooves was found to play an important role in forming these multiple transition pathways. This finding suggests that naked B-DNA under normal conditions has an inherent ability to form HG bps via spontaneous base opening events. PMID:26250116

  15. Solving SAT Problem Based on Hybrid Differential Evolution Algorithm

    Science.gov (United States)

    Liu, Kunqi; Zhang, Jingmin; Liu, Gang; Kang, Lishan

    The satisfiability (SAT) problem is an NP-complete problem. Based on an analysis of the problem, SAT is translated equivalently into an optimization problem of minimizing an objective function. A hybrid differential evolution algorithm is proposed to solve the satisfiability problem. It makes full use of the strong local search capacity of the hill-climbing algorithm and the strong global search capability of the differential evolution algorithm, which compensates for their individual disadvantages, improves the efficiency of the algorithm and avoids stagnation. The experimental results show that the hybrid algorithm is efficient in solving the SAT problem.
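
    The record describes the hybrid only at a high level; the toy sketch below follows its spirit (differential evolution on a continuous relaxation of the assignment plus a greedy bit-flip hill-climbing step) without claiming to reproduce the authors' algorithm. All parameter values and the four-clause formula are arbitrary.

```python
import random

def satisfied(clauses, assign):
    """Count satisfied clauses; literal +v means variable v is True, -v means False."""
    return sum(any((lit > 0) == assign[abs(lit) - 1] for lit in c) for c in clauses)

def hybrid_de_sat(clauses, n_vars, pop_size=30, gens=200, F=0.8, CR=0.9, seed=1):
    rng = random.Random(seed)
    decode = lambda vec: [x > 0.5 for x in vec]
    fit = lambda vec: satisfied(clauses, decode(vec))
    pop = [[rng.random() for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # global search: DE/rand/1 mutation with binomial crossover
            trial = [min(1.0, max(0.0, a[k] + F * (b[k] - c[k])))
                     if rng.random() < CR else pop[i][k] for k in range(n_vars)]
            # local search: greedy hill-climbing bit flips on the decoded assignment
            assign = decode(trial)
            score = satisfied(clauses, assign)
            for k in range(n_vars):
                assign[k] = not assign[k]
                new = satisfied(clauses, assign)
                if new > score:
                    trial[k] = 1.0 - trial[k]   # keep the improving flip
                    score = new
                else:
                    assign[k] = not assign[k]   # undo the flip
            if fit(trial) >= fit(pop[i]):       # greedy selection
                pop[i] = trial
        best = max(pop, key=fit)
        if fit(best) == len(clauses):
            return decode(best), True           # all clauses satisfied
    return decode(max(pop, key=fit)), False

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3) and (x1 or x3)
clauses = [[1, -2], [2, 3], [-1, -3], [1, 3]]
print(hybrid_de_sat(clauses, n_vars=3))
```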

  16. Studies of base pair sequence effects on DNA solvation based on all-atom molecular dynamics simulations.

    Science.gov (United States)

    Dixit, Surjit B; Mezei, Mihaly; Beveridge, David L

    2012-07-01

    Detailed analyses of the sequence-dependent solvation and ion atmosphere of DNA are presented, based on molecular dynamics (MD) simulations of all 136 unique tetranucleotide steps obtained by the ABC consortium using the AMBER suite of programs. Significant sequence effects on solvation and ion localization were observed in these simulations. The results were compared to essentially all known experimental data on the subject. Proximity analysis was employed to highlight the sequence-dependent differences in solvation and ion localization properties in the grooves of DNA. Comparison of the MD-calculated DNA structure with canonical A- and B-forms supports the idea that the G/C-rich sequences are closer to canonical A- than B-form structures, while the reverse is true for the poly A sequences, with the exception of the alternating ATAT sequence. Analysis of hydration density maps reveals that the flexibility of the solute molecule has a significant effect on the nature of the observed hydration. Energetic analysis of solute-solvent interactions based on proximity analysis of the solvent reveals that the GC or CG base pairs interact more strongly with water molecules in the minor groove of DNA than the AT or TA base pairs, while the interactions of the AT or TA pairs in the major groove are stronger than those of the GC or CG pairs. Computation of the solvent-accessible surface area of the nucleotide units in the simulated trajectories reveals excellent agreement with results derived from analysis of a database of crystallographic structures. The MD trajectories tend to follow Manning's counterion condensation theory, presenting a region of condensed counterions within a radius of about 17 Å from the DNA surface, independent of sequence. The GC and CG pairs tend to associate with cations in the major groove of the DNA structure to a greater extent than the AT and TA pairs. Cation association is more frequent in the minor groove of the AT than the GC pairs. In general, the…

  17. DNA nanotechnology: a future perspective

    Science.gov (United States)

    2013-01-01

    In addition to its genetic function, DNA is one of the most distinct and smart self-assembling nanomaterials. DNA nanotechnology exploits the predictable self-assembly of DNA oligonucleotides to design and assemble innovative and highly discrete nanostructures. Highly ordered DNA motifs are capable of providing an ultra-fine framework for the next generation of nanofabrications. The majority of these applications are based upon the complementarity of DNA base pairing: adenine with thymine, and guanine with cytosine. DNA provides an intelligent route for the creation of nanoarchitectures with programmable and predictable patterns. DNA strands twist along one helix for a number of bases before switching to the other helix by passing through a crossover junction. The association of two crossovers keeps the helices parallel and holds them tightly together, allowing the assembly of bigger structures. Because of the DNA molecule's unique and novel characteristics, it can easily be applied in a vast variety of multidisciplinary research areas like biomedicine, computer science, nano/optoelectronics, and bionanotechnology. PMID:23497147

  18. Ni(II) complexes of arginine Schiff-bases and its interaction with DNA

    Energy Technology Data Exchange (ETDEWEB)

    Sallam, S.A., E-mail: shehabsallam@yahoo.com [Chemistry Department, Faculty of Science, Suez Canal University, Isamilia (Egypt); Abbas, A.M. [Chemistry Department, Faculty of Science, Suez Canal University, Isamilia (Egypt)

    2013-04-15

    Ni(II) complexes with Schiff-bases obtained by condensation of arginine with salicylaldehyde; 2,3-; 2,4-; 2,5-dihydroxybenzaldehyde and o-hydroxynaphthaldehyde have been synthesized using the template method in ethanol or ammonia media. They were characterized by elemental analyses, conductivity measurements, magnetic moment, UV, IR and 1H NMR spectra as well as thermal analysis (TG, DTG and DTA). The Schiff-bases are dibasic tridentate donors and the complexes have diamagnetic square planar and octahedral structures. The complexes decompose in three steps, and the kinetic and thermodynamic parameters of the decomposition steps were computed. The interactions of the formed complexes with FM-DNA were monitored by UV and fluorescence spectroscopy. -- Highlights: ► Arginine Schiff-bases and their nickel(II) complexes have been synthesized. ► Magnetic and spectral data show diamagnetic square planar and octahedral complexes. ► The complexes thermally decompose in three stages. Interaction with FM-DNA shows hyperchromism with blue shift.

  19. Ni(II) complexes of arginine Schiff-bases and its interaction with DNA

    International Nuclear Information System (INIS)

    Sallam, S.A.; Abbas, A.M.

    2013-01-01

    Ni(II) complexes with Schiff-bases obtained by condensation of arginine with salicylaldehyde; 2,3-; 2,4-; 2,5-dihydroxybenzaldehyde and o-hydroxynaphthaldehyde have been synthesized using the template method in ethanol or ammonia media. They were characterized by elemental analyses, conductivity measurements, magnetic moment, UV, IR and 1H NMR spectra as well as thermal analysis (TG, DTG and DTA). The Schiff-bases are dibasic tridentate donors and the complexes have diamagnetic square planar and octahedral structures. The complexes decompose in three steps, and the kinetic and thermodynamic parameters of the decomposition steps were computed. The interactions of the formed complexes with FM-DNA were monitored by UV and fluorescence spectroscopy. -- Highlights: ► Arginine Schiff-bases and their nickel(II) complexes have been synthesized. ► Magnetic and spectral data show diamagnetic square planar and octahedral complexes. ► The complexes thermally decompose in three stages. Interaction with FM-DNA shows hyperchromism with blue shift

  20. Fast phylogenetic DNA barcoding

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Boomsma, Wouter Krogh; Willerslev, Eske

    2008-01-01

    We present a heuristic approach to the DNA assignment problem based on phylogenetic inferences using constrained neighbour joining and non-parametric bootstrapping. We show that this method performs as well as the more computationally intensive full Bayesian approach in an analysis of 500 insect...... DNA sequences obtained from GenBank. We also analyse a previously published dataset of environmental DNA sequences from soil from New Zealand and Siberia, and use these data to illustrate the fact that statistical approaches to the DNA assignment problem allow for more appropriate criteria...... for determining the taxonomic level at which a particular DNA sequence can be assigned....

  1. Web-Based Problem-Solving Assignment and Grading System

    Science.gov (United States)

    Brereton, Giles; Rosenberg, Ronald

    2014-11-01

    In engineering courses with very specific learning objectives, such as fluid mechanics and thermodynamics, it is conventional to reinforce concepts and principles with problem-solving assignments and to measure success in problem solving as an indicator of student achievement. While the modern-day ease of copying and searching for online solutions can undermine the value of traditional assignments, web-based technologies also provide opportunities to generate individualized, well-posed problems with an essentially unlimited number of different combinations of initial/final/boundary conditions, so that the probability of any two students being assigned identical problems in a course is vanishingly small. Such problems can be designed and programmed to be single- or multiple-step and self-grading, to allow students single or multiple attempts, to provide feedback when incorrect, to be selectable according to difficulty, to be incorporated within gaming packages, etc. In this talk, we discuss the use of a homework/exam generating program of this kind in a single-semester course, within a web-based client-server system that ensures secure operation.

  2. Immunogenicity of a DNA-launched replicon-based canine parvovirus DNA vaccine expressing VP2 antigen in dogs.

    Science.gov (United States)

    Dahiya, Shyam S; Saini, Mohini; Kumar, Pankaj; Gupta, Praveen K

    2012-10-01

    A replicon-based DNA vaccine encoding the VP2 gene of canine parvovirus (CPV) was developed by cloning the CPV-VP2 gene into a replicon-based DNA vaccine vector (pAlpha). The characteristics of a replicon-based DNA vaccine, such as self-amplification of transcripts and induction of apoptosis, were analyzed in transfected mammalian cells. When pAlpha-CPV-VP2 was injected intradermally as a DNA-launched replicon-based DNA vaccine in dogs, it induced CPV-specific humoral and cell-mediated immune responses. The virus neutralization antibody and lymphocyte proliferative responses were higher than those induced by the conventional CPV DNA vaccine and the commercial CPV vaccine. These results indicate that the DNA-launched replicon-based CPV DNA vaccine was effective in inducing both CPV-specific humoral and cellular immune responses and can be considered an effective alternative to the conventional CPV DNA vaccine and the commercial CPV vaccine. Crown Copyright © 2012. Published by Elsevier India Pvt Ltd. All rights reserved.

  3. A Novel Fast and Secure Approach for Voice Encryption Based on DNA Computing

    Science.gov (United States)

    Kakaei Kate, Hamidreza; Razmara, Jafar; Isazadeh, Ayaz

    2018-06-01

    Today, in the world of information communication, voice information has particular importance. One way to protect voice data from attacks is voice encryption. Encryption algorithms use various techniques such as hashing, chaotic maps, mixing, and many others. In this paper, an algorithm is proposed for voice encryption based on three different schemes to increase the flexibility and strength of the algorithm. The proposed algorithm uses an innovative encoding scheme, the DNA encryption technique and a permutation function to provide a secure and fast solution for voice encryption. The algorithm is evaluated based on various measures including signal-to-noise ratio, peak signal-to-noise ratio, correlation coefficient, signal similarity and signal frequency content. The results demonstrate the applicability of the proposed method for secure and fast encryption of voice files.
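
    The paper's three-scheme algorithm is not specified in the record; the following toy sketch only illustrates the general flavour of DNA-based encryption: bytes are mapped to nucleotides (two bits per base) and scrambled by a keyed permutation. It is not cryptographically secure, and the key handling is purely illustrative.

```python
import random

BASES = "ACGT"                       # 2 bits per nucleotide: 00, 01, 10, 11

def bytes_to_dna(data):
    """Map each byte to four nucleotides (2 bits each, most-significant first)."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def dna_to_bytes(seq):
    vals = [BASES.index(c) for c in seq]
    return bytes((vals[i] << 6) | (vals[i+1] << 4) | (vals[i+2] << 2) | vals[i+3]
                 for i in range(0, len(vals), 4))

def permute(seq, key, inverse=False):
    """Keyed transposition of the nucleotide string (a toy permutation stage)."""
    order = list(range(len(seq)))
    random.Random(key).shuffle(order)
    out = [None] * len(seq)
    for dst, src in enumerate(order):
        if inverse:
            out[src] = seq[dst]      # undo the transposition
        else:
            out[dst] = seq[src]      # apply the transposition
    return "".join(out)

samples = bytes([12, 200, 37, 255])          # stand-in for PCM voice samples
cipher = permute(bytes_to_dna(samples), key="secret")
print(cipher)
print(dna_to_bytes(permute(cipher, key="secret", inverse=True)) == samples)  # True
```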

  4. Design of character-based DNA barcode motif for species identification: A computational approach and its validation in fishes.

    Science.gov (United States)

    Chakraborty, Mohua; Dhar, Bishal; Ghosh, Sankar Kumar

    2017-11-01

    The DNA barcodes are generally interpreted using distance-based and character-based methods. The former uses clustering of comparable groups, based on the relative genetic distance, while the latter is based on the presence or absence of discrete nucleotide substitutions. The distance-based approach has a limitation in defining a universal species boundary across the taxa as the rate of mtDNA evolution is not constant throughout the taxa. However, character-based approach more accurately defines this using a unique set of nucleotide characters. The character-based analysis of full-length barcode has some inherent limitations, like sequencing of the full-length barcode, use of a sparse-data matrix and lack of a uniform diagnostic position for each group. A short continuous stretch of a fragment can be used to resolve the limitations. Here, we observe that a 154-bp fragment, from the transversion-rich domain of 1367 COI barcode sequences can successfully delimit species in the three most diverse orders of freshwater fishes. This fragment is used to design species-specific barcode motifs for 109 species by the character-based method, which successfully identifies the correct species using a pattern-matching program. The motifs also correctly identify geographically isolated population of the Cypriniformes species. Further, this region is validated as a species-specific mini-barcode for freshwater fishes by successful PCR amplification and sequencing of the motif (154 bp) using the designed primers. We anticipate that use of such motifs will enhance the diagnostic power of DNA barcode, and the mini-barcode approach will greatly benefit the field-based system of rapid species identification. © 2017 John Wiley & Sons Ltd.
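
    To illustrate the character-based idea in miniature (with invented sequences far shorter than the 154-bp COI motifs of the study), the sketch below extracts positions at which each species shows a base that no other reference species shows, and classifies a query by those diagnostic characters.

```python
def diagnostic_characters(reference):
    """For each species, collect (position, base) pairs that occur in that
    species' aligned sequence but in no other species at the same position."""
    diags = {}
    for sp in reference:
        others = [reference[o] for o in reference if o != sp]
        diags[sp] = [(i, b) for i, b in enumerate(reference[sp])
                     if all(o[i] != b for o in others)]
    return diags

def classify(query, diags):
    """Assign the query to the species whose diagnostic characters it matches best."""
    scores = {sp: sum(query[i] == b for i, b in chars) / max(len(chars), 1)
              for sp, chars in diags.items()}
    return max(scores, key=scores.get), scores

# Toy aligned fragments (same length); real motifs would be 154 bp of COI.
reference = {
    "species_A": "ATGCCGTA",
    "species_B": "ATGTCGAA",
    "species_C": "TTGCCGTC",
}
diags = diagnostic_characters(reference)
print(classify("ATGTCGAA", diags))     # expected to match species_B
```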

  5. Problem Solving Method Based on E-Learning System for Engineering Education

    Science.gov (United States)

    Khazaal, Hasan F.

    2015-01-01

    Encouraging engineering students to handle advanced technology with multimedia, as well as motivating them to develop problem-solving skills, are the missions of the teacher in preparing students for a modern professional career. This research proposes a scenario of problem solving in basic electrical circuits based on an e-learning system…

  6. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  7. Solving Problem of Graph Isomorphism by Membrane-Quantum Hybrid Model

    Directory of Open Access Journals (Sweden)

    Artiom Alhazov

    2015-10-01

    Full Text Available This work presents the application of new parallelization methods, based on membrane-quantum hybrid computing, to solving the graph isomorphism problem. The applied membrane-quantum hybrid computational model was developed by the authors. The massive parallelism of unconventional computing is used to implement the classic brute force algorithm efficiently. This approach does not impose any restrictions on the types of graphs considered. The estimated performance of the model is less than quadratic, which is a very good result for a problem of NP complexity.

  8. Computing one of Victor Moll's irresistible integrals with computer algebra

    Directory of Open Access Journals (Sweden)

    Christoph Koutschan

    2008-04-01

    Full Text Available We investigate a certain quartic integral from V. Moll's book “Irresistible Integrals” and demonstrate how it can be solved by computer algebra methods, namely by using non-commutative Gröbner bases. We present recent implementations in the computer algebra systems SINGULAR and MATHEMATICA.

  9. An information gap in DNA evidence interpretation.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Forensic DNA evidence often contains mixtures of multiple contributors, or is present in low template amounts. The resulting data signals may appear to be relatively uninformative when interpreted using qualitative inclusion-based methods. However, these same data can yield greater identification information when interpreted by computer using quantitative data-modeling methods. This study applies both qualitative and quantitative interpretation methods to a well-characterized DNA mixture and dilution data set, and compares the inferred match information. The results show that qualitative interpretation loses identification power at low culprit DNA quantities (below 100 pg), but that quantitative methods produce useful information down into the 10 pg range. Thus there is a ten-fold information gap that separates the qualitative and quantitative DNA mixture interpretation approaches. With low quantities of culprit DNA (10 pg to 100 pg), computer-based quantitative interpretation provides greater match sensitivity.

  10. Solving the scalability issue in quantum-based refinement: Q|R#1.

    Science.gov (United States)

    Zheng, Min; Moriarty, Nigel W; Xu, Yanting; Reimers, Jeffrey R; Afonine, Pavel V; Waller, Mark P

    2017-12-01

    Accurately refining biomacromolecules using a quantum-chemical method is challenging because the cost of a quantum-chemical calculation scales approximately as n^m, where n is the number of atoms and m (≥3) is based on the quantum method of choice. This fundamental problem means that quantum-chemical calculations become intractable when the size of the system requires more computational resources than are available. In the development of the software package called Q|R, this issue is referred to as Q|R#1. A divide-and-conquer approach has been developed that fragments the atomic model into small manageable pieces in order to solve Q|R#1. Firstly, the atomic model of a crystal structure is analyzed to detect noncovalent interactions between residues, and the results of the analysis are represented as an interaction graph. Secondly, a graph-clustering algorithm is used to partition the interaction graph into a set of clusters in such a way as to minimize disruption to the noncovalent interaction network. Thirdly, the environment surrounding each individual cluster is analyzed and any residue that is interacting with a particular cluster is assigned to the buffer region of that particular cluster. A fragment is defined as a cluster plus its buffer region. The gradients for all atoms from each of the fragments are computed, and only the gradients from each cluster are combined to create the total gradients. A quantum-based refinement is carried out using the total gradients as chemical restraints. In order to validate this interaction graph-based fragmentation approach in Q|R, the entire atomic model of an amyloid cross-β spine crystal structure (PDB entry 2oNA) was refined.
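
    The cluster-plus-buffer fragmentation described above can be sketched with a generic graph-clustering routine. The Python sketch below uses networkx's greedy modularity communities as a stand-in for the paper's graph-clustering algorithm; the residue identifiers and the interaction test are placeholders, not the actual Q|R code.

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        def build_fragments(residues, interacts):
            """Partition residues into clusters and attach buffer regions.

            residues: iterable of residue identifiers.
            interacts(r1, r2): user-supplied test for a noncovalent contact
            (placeholder for the interaction analysis in the abstract).
            Returns a list of fragments, each a (cluster, buffer) pair.
            """
            g = nx.Graph()
            res = list(residues)
            g.add_nodes_from(res)
            for i, r1 in enumerate(res):
                for r2 in res[i + 1:]:
                    if interacts(r1, r2):
                        g.add_edge(r1, r2)

            fragments = []
            for cluster in greedy_modularity_communities(g):
                cluster = set(cluster)
                # Buffer: residues outside the cluster that contact it.
                buffer = {n for c in cluster for n in g.neighbors(c)} - cluster
                fragments.append((cluster, buffer))
            return fragments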

  11. Solving multiconstraint assignment problems using learning automata.

    Science.gov (United States)

    Horn, Geir; Oommen, B John

    2010-02-01

    This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where the elements which are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index that is inappropriate for the type of multiconstraint problems considered here and where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem: First, a fixed-structure stochastic automata algorithm is presented, where the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. All three VSSA algorithms model the processes as automata having first the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes. This paper, which, we believe, comprehensively reports the
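
    The variable-structure automata discussed here maintain action probabilities that are reinforced by feedback from the environment. A minimal sketch of the classical linear reward-inaction update that such VSSA schemes typically build on is shown below; the learning rate and the interface to the mapping problem are illustrative assumptions, not the authors' exact algorithm.

        import random

        class LinearRewardInaction:
            """Minimal L_RI variable-structure learning automaton."""

            def __init__(self, n_actions, reward_rate=0.1):
                self.p = [1.0 / n_actions] * n_actions  # action probabilities
                self.lam = reward_rate                   # learning rate lambda

            def choose(self):
                return random.choices(range(len(self.p)), weights=self.p)[0]

            def update(self, action, rewarded):
                # On reward, move probability mass toward the chosen action;
                # on penalty, L_RI leaves the probabilities unchanged.
                if rewarded:
                    for i in range(len(self.p)):
                        if i == action:
                            self.p[i] += self.lam * (1.0 - self.p[i])
                        else:
                            self.p[i] *= (1.0 - self.lam)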

  12. A universal DNA-based protein detection system.

    Science.gov (United States)

    Tran, Thua N N; Cui, Jinhui; Hartman, Mark R; Peng, Songming; Funabashi, Hisakage; Duan, Faping; Yang, Dayong; March, John C; Lis, John T; Cui, Haixin; Luo, Dan

    2013-09-25

    Protein immune detection requires secondary antibodies which must be carefully selected in order to avoid interspecies cross-reactivity, and is therefore restricted by the limited availability of primary/secondary antibody pairs. Here we present a versatile DNA-based protein detection system using a universal adapter to interface between IgG antibodies and DNA-modified reporter molecules. As a demonstration of this capability, we successfully used DNA nano-barcodes, quantum dots, and horseradish peroxidase enzyme to detect multiple proteins using our DNA-based labeling system. Our system not only eliminates secondary antibodies but also serves as a novel method platform for protein detection with modularity, high capacity, and multiplexed capability.

  13. Forensic DNA testing.

    Science.gov (United States)

    Butler, John M

    2011-12-01

    Forensic DNA testing has a number of applications, including parentage testing, identifying human remains from natural or man-made disasters or terrorist attacks, and solving crimes. This article provides background information followed by an overview of the process of forensic DNA testing, including sample collection, DNA extraction, PCR amplification, short tandem repeat (STR) allele separation and sizing, typing and profile interpretation, statistical analysis, and quality assurance. The article concludes with discussions of possible problems with the data and other forensic DNA testing techniques.

  14. An integrated information management system based DSS for problem solving and decision making in open & distance learning institutions of India

    Directory of Open Access Journals (Sweden)

    Pankaj Khanna

    2014-04-01

    Full Text Available An integrated information system based DSS is developed for Open and Distance Learning (ODL) institutions in India. The system has been web structured with the most suitable newly developed modules. A DSS model has been developed for solving semi-structured and unstructured problems, including decision making with regard to the various programmes and activities operating in the ODL institutions (ODLIs). The DSS model designed for problem solving is generally based on quantitative formulas, whereas for problems involving imprecision and uncertainty a fuzzy-theory-based DSS is employed. The computer-operated system thus developed helps the ODLI management to quickly identify programmes and activities that require immediate attention. It also provides guidance for obtaining the most appropriate managerial decisions without any loss of time. As a result, the various subsystems operating in the ODLI are able to administer their activities more efficiently and effectively and to enhance the overall performance of the concerned ODL institution to a new level.

  15. DNAzyme-Based Logic Gate-Mediated DNA Self-Assembly.

    Science.gov (United States)

    Zhang, Cheng; Yang, Jing; Jiang, Shuoxing; Liu, Yan; Yan, Hao

    2016-01-13

    Controlling DNA self-assembly processes using rationally designed logic gates is a major goal of DNA-based nanotechnology and programming. Such controls could facilitate the hierarchical engineering of complex nanopatterns responding to various molecular triggers or inputs. Here, we demonstrate the use of a series of DNAzyme-based logic gates to control DNA tile self-assembly onto a prescribed DNA origami frame. Logic systems such as "YES," "OR," "AND," and "logic switch" are implemented based on DNAzyme-mediated tile recognition with the DNA origami frame. DNAzyme is designed to play two roles: (1) as an intermediate messenger to motivate downstream reactions and (2) as a final trigger to report fluorescent signals, enabling information relay between the DNA origami-framed tile assembly and fluorescent signaling. The results of this study demonstrate the plausibility of DNAzyme-mediated hierarchical self-assembly and provide new tools for generating dynamic and responsive self-assembly systems.

  16. Capturing Problem-Solving Processes Using Critical Rationalism

    Science.gov (United States)

    Chitpin, Stephanie; Simon, Marielle

    2012-01-01

    The examination of problem-solving processes continues to be a current research topic in education. Knowing how to solve problems is not only a key aspect of learning mathematics but is also at the heart of cognitive theories, linguistics, artificial intelligence, and computer science. Problem solving is a multistep, higher-order cognitive task…

  17. Effects of formic acid hydrolysis on the quantitative analysis of radiation-induced DNA base damage products assayed by gas chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Swarts, S.G.; Smith, G.S.; Miao, L.; Wheeler, K.T.

    1996-01-01

    Gas chromatography/mass spectrometry (GC/MS-SIM) is an excellent technique for performing both qualitative and quantitative analysis of DNA base damage products that are formed by exposure to ionizing radiation or by the interaction of intracellular DNA with activated oxygen species. This technique commonly uses a hot formic acid hydrolysis step to degrade the DNA to individual free bases. However, due to the harsh nature of this degradation procedure, the quantitation of DNA base damage products may be adversely affected. Consequently, we examined the effects of various formic acid hydrolysis procedures on the quantitation of a number of DNA base damage products and identified several factors that can influence this quantitation. These factors included (1) the inherent acid stabilities of both the lesions and the internal standards; (2) the hydrolysis temperature; (3) the source and grade of the formic acid; and (4) the sample mass during hydrolysis. Our data also suggested that the N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) derivatization efficiency can be adversely affected, presumably by trace contaminants either in the formic acid or from the acid-activated surface of the glass derivatization vials. Where adverse effects were noted, modifications were explored in an attempt to improve the quantitation of these DNA lesions. Although experimental steps could be taken to minimize the influence of these factors on the quantitation of some base damage products, no single procedure solved the quantitation problem for all base lesions. However, a significant improvement in the quantitation was achieved if the relative molecular response factor (RMRF) values for these lesions were generated with authentic DNA base damage products that had been treated exactly like the experimental samples. (orig.)
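
    The RMRF correction mentioned in the last sentence amounts to standard internal-standard quantitation. A short illustrative calculation follows; the variable names and single-point calibration are assumptions, not the authors' exact protocol.

        def rmrf(area_analyte, area_standard, amount_analyte, amount_standard):
            """Relative molecular response factor from a calibration sample:
            the analyte/internal-standard area ratio per unit amount ratio."""
            return (area_analyte / area_standard) / (amount_analyte / amount_standard)

        def quantify(area_analyte, area_standard, amount_standard, rmrf_value):
            """Amount of base damage product in an unknown sample, using an
            RMRF measured on authentic lesions hydrolysed like the samples."""
            return (area_analyte / area_standard) * amount_standard / rmrf_value

        # Example: calibration with 10 pmol lesion and 20 pmol internal standard.
        f = rmrf(area_analyte=1.5e5, area_standard=2.0e5,
                 amount_analyte=10, amount_standard=20)
        print(quantify(area_analyte=9.0e4, area_standard=2.0e5,
                       amount_standard=20, rmrf_value=f))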

  18. Analysis of rotary engine combustion processes based on unsteady, three-dimensional computations

    Science.gov (United States)

    Raju, M. S.; Willis, E. A.

    1990-01-01

    A new computer code was developed for predicting the turbulent and chemically reacting flows with sprays occurring inside of a stratified charge rotary engine. The solution procedure is based on an Eulerian Lagrangian approach where the unsteady, three-dimensional Navier-Stokes equations for a perfect gas mixture with variable properties are solved in generalized, Eulerian coordinates on a moving grid by making use of an implicit finite volume, Steger-Warming flux vector splitting scheme, and the liquid phase equations are solved in Lagrangian coordinates. Both the details of the numerical algorithm and the finite difference predictions of the combustor flow field during the opening of exhaust and/or intake, and also during fuel vaporization and combustion, are presented.

  19. An overview of the prediction of protein DNA-binding sites.

    Science.gov (United States)

    Si, Jingna; Zhao, Rui; Wu, Rongling

    2015-03-06

    Interactions between proteins and DNA play an important role in many essential biological processes such as DNA replication, transcription, splicing, and repair. The identification of amino acid residues involved in DNA-binding sites is critical for understanding the mechanism of these biological activities. In the last decade, numerous computational approaches have been developed to predict protein DNA-binding sites based on protein sequence and/or structural information, which play an important role in complementing experimental strategies. At this time, approaches can be divided into three categories: sequence-based DNA-binding site prediction, structure-based DNA-binding site prediction, and homology modeling and threading. In this article, we review existing research on computational methods to predict protein DNA-binding sites, which includes data sets, various residue sequence/structural features, machine learning methods for comparison and selection, evaluation methods, performance comparison of different tools, and future directions in protein DNA-binding site prediction. In particular, we detail the meta-analysis of protein DNA-binding sites. We also propose specific implications that are likely to result in novel prediction methods, increased performance, or practical applications.

  20. Variability of worked examples and transfer of geometrical problem-solving skills : a cognitive-load approach

    NARCIS (Netherlands)

    Paas, Fred G.W.C.; van Merrienboer, Jeroen J.G.; van Merrienboer, J.J.G.

    1994-01-01

    Four computer-based training strategies for geometrical problem solving in the domain of computer numerically controlled machinery programming were studied with regard to their effects on training performance, transfer performance, and cognitive load. A low- and a high-variability conventional

  1. A Computational/Experimental Platform for Investigating Three-Dimensional Puzzle Solving of Comminuted Articular Fractures

    Science.gov (United States)

    Thomas, Thaddeus P.; Anderson, Donald D.; Willis, Andrew R.; Liu, Pengcheng; Frank, Matthew C.; Marsh, J. Lawrence; Brown, Thomas D.

    2011-01-01

    Reconstructing highly comminuted articular fractures poses a difficult surgical challenge, akin to solving a complicated three-dimensional (3D) puzzle. Pre-operative planning using CT is critically important, given the desirability of less invasive surgical approaches. The goal of this work is to advance 3D puzzle solving methods toward use as a pre-operative tool for reconstructing these complex fractures. Methodology for generating typical fragmentation/dispersal patterns was developed. Five identical replicas of human distal tibia anatomy were machined from blocks of high-density polyetherurethane foam (bone fragmentation surrogate), and were fractured using an instrumented drop tower. Pre- and post-fracture geometries were obtained using laser scans and CT. A semi-automatic virtual reconstruction computer program aligned fragment native (non-fracture) surfaces to a pre-fracture template. The tibias were precisely reconstructed with alignment accuracies ranging from 0.03-0.4 mm. This novel technology has potential to significantly enhance surgical techniques for reconstructing comminuted intra-articular fractures, as illustrated for a representative clinical case. PMID:20924863

  2. Free energy landscape and transition pathways from Watson-Crick to Hoogsteen base pairing in free duplex DNA.

    Science.gov (United States)

    Yang, Changwon; Kim, Eunae; Pak, Youngshang

    2015-09-18

    Hoogsteen (HG) base pairing plays a central role in the DNA binding of proteins and small ligands. Probing the detailed transition mechanism from Watson-Crick (WC) to HG base pair (bp) formation in duplex DNAs is of fundamental importance in terms of revealing intrinsic functions of double helical DNAs beyond their sequence determined functions. We investigated a free energy landscape of a free B-DNA with an adenine-thymine (A-T) rich sequence to probe its conformational transition pathways from WC to HG base pairing. The free energy landscape was computed with a state-of-the-art two-dimensional umbrella molecular dynamics simulation at the all-atom level. The present simulation showed that in an isolated duplex DNA, the spontaneous transition from WC to HG bp takes place via multiple pathways. Notably, base flipping into the major and minor grooves was found to play an important role in forming these multiple transition pathways. This finding suggests that naked B-DNA under normal conditions has an inherent ability to form HG bps via spontaneous base opening events. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
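
    The final step of turning sampled reaction-coordinate values into a free energy landscape is Boltzmann inversion, F = -kT ln P. A minimal sketch of that step is shown below; combining the biased umbrella windows themselves requires WHAM or MBAR, which is not shown, and the bin count and kT value are illustrative.

        import numpy as np

        def free_energy_surface(x, y, bins=50, kT=0.593):
            """Boltzmann inversion of a 2D probability histogram, F = -kT ln P.

            x, y: sampled values of the two reaction coordinates (assumed to be
            already unbiased/reweighted; merging biased umbrella windows needs
            WHAM or MBAR, not shown). kT is in kcal/mol at ~298 K.
            """
            hist, xedges, yedges = np.histogram2d(x, y, bins=bins, density=True)
            with np.errstate(divide="ignore"):
                f = -kT * np.log(hist)
            f -= np.nanmin(f[np.isfinite(f)])  # shift the minimum to zero
            return f, xedges, yedges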

  3. Selective base excision repair of DNA damage by the non-base-flipping DNA glycosylase AlkC

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Rongxin; Mullins, Elwood A.; Shen, Xing-Xing; Lay, Kori T.; Yuen, Philip K.; David, Sheila S.; Rokas, Antonis; Eichman, Brandt F. (UCD); (Vanderbilt)

    2017-10-20

    DNA glycosylases preserve genome integrity and define the specificity of the base excision repair pathway for discrete, detrimental modifications, and thus, the mechanisms by which glycosylases locate DNA damage are of particular interest. Bacterial AlkC and AlkD are specific for cationic alkylated nucleobases and have a distinctive HEAT-like repeat (HLR) fold. AlkD uses a unique non-base-flipping mechanism that enables excision of bulky lesions more commonly associated with nucleotide excision repair. In contrast, AlkC has a much narrower specificity for small lesions, principally N3-methyladenine (3mA). Here, we describe how AlkC selects for and excises 3mA using a non-base-flipping strategy distinct from that of AlkD. A crystal structure resembling a catalytic intermediate complex shows how AlkC uses unique HLR and immunoglobulin-like domains to induce a sharp kink in the DNA, exposing the damaged nucleobase to active site residues that project into the DNA. This active site can accommodate and excise N3-methylcytosine (3mC) and N1-methyladenine (1mA), which are also repaired by AlkB-catalyzed oxidative demethylation, providing a potential alternative mechanism for repair of these lesions in bacteria.

  4. Interactions between Al₁₂X (X = Al, C, N and P) nanoparticles and DNA nucleobases/base pairs: implications for nanotoxicity.

    Science.gov (United States)

    Jin, Peng; Chen, Yongsheng; Zhang, Shengbai B; Chen, Zhongfang

    2012-02-01

    The interactions between neutral Al(12)X (I(h)) (X = Al, C, N and P) nanoparticles and DNA nucleobases, namely adenine (A), thymine (T), guanine (G) and cytosine (C), as well as the Watson-Crick base pairs (BPs) AT and GC, were investigated by means of density functional theory computations. The Al(12)X clusters can tightly bind to DNA bases and BPs to form stable complexes with negative binding Gibbs free energies at room temperature, and considerable charge transfers occur between the bases/BPs and the Al(12)X clusters. These strong interactions, which are also expected for larger Al nanoparticles, may have potentially adverse impacts on the structure and stability of DNA and thus cause its dysfunction.

  5. Polarizable Force Field for DNA Based on the Classical Drude Oscillator: I. Refinement Using Quantum Mechanical Base Stacking and Conformational Energetics.

    Science.gov (United States)

    Lemkul, Justin A; MacKerell, Alexander D

    2017-05-09

    Empirical force fields seek to relate the configuration of a set of atoms to its energy, thus yielding the forces governing its dynamics, using classical physics rather than more expensive quantum mechanical calculations that are computationally intractable for large systems. Most force fields used to simulate biomolecular systems use fixed atomic partial charges, neglecting the influence of electronic polarization, instead making use of a mean-field approximation that may not be transferable across environments. Recent hardware and software developments make polarizable simulations feasible, and to this end, polarizable force fields represent the next generation of molecular dynamics simulation technology. In this work, we describe the refinement of a polarizable force field for DNA based on the classical Drude oscillator model by targeting quantum mechanical interaction energies and conformational energy profiles of model compounds necessary to build a complete DNA force field. The parametrization strategy employed in the present work seeks to correct weak base stacking in A- and B-DNA and the unwinding of Z-DNA observed in the previous version of the force field, called Drude-2013. Refinement of base nonbonded terms and reparametrization of dihedral terms in the glycosidic linkage, deoxyribofuranose rings, and important backbone torsions resulted in improved agreement with quantum mechanical potential energy surfaces. Notably, we expand on previous efforts by explicitly including Z-DNA conformational energetics in the refinement.

  6. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  7. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  8. The Effect of Basepair Mismatch on DNA Strand Displacement

    OpenAIRE

    Broadwater, D. W. Bo; Kim, Harold D.

    2016-01-01

    DNA strand displacement is a key reaction in DNA homologous recombination and DNA mismatch repair and is also heavily utilized in DNA-based computation and locomotion. Despite its ubiquity in science and engineering, sequence-dependent effects of displacement kinetics have not been extensively characterized. Here, we measured toehold-mediated strand displacement kinetics using single-molecule fluorescence in the presence of a single base pair mismatch. The apparent displacement rate varied si...

  9. Multiscale empirical interpolation for solving nonlinear PDEs

    KAUST Repository

    Calo, Victor M.

    2014-12-01

    In this paper, we propose a multiscale empirical interpolation method for solving nonlinear multiscale partial differential equations. The proposed method combines empirical interpolation techniques and local multiscale methods, such as the Generalized Multiscale Finite Element Method (GMsFEM). To solve nonlinear equations, the GMsFEM is used to represent the solution on a coarse grid with multiscale basis functions computed offline. Computing the GMsFEM solution involves calculating the system residuals and Jacobians on the fine grid. We use empirical interpolation concepts to evaluate these residuals and Jacobians of the multiscale system with a computational cost which is proportional to the size of the coarse-scale problem rather than the fully-resolved fine scale one. The empirical interpolation method uses basis functions which are built by sampling the nonlinear function we want to approximate a limited number of times. The coefficients needed for this approximation are computed in the offline stage by inverting an inexpensive linear system. The proposed multiscale empirical interpolation techniques: (1) divide computing the nonlinear function into coarse regions; (2) evaluate contributions of nonlinear functions in each coarse region taking advantage of a reduced-order representation of the solution; and (3) introduce multiscale proper-orthogonal-decomposition techniques to find appropriate interpolation vectors. We demonstrate the effectiveness of the proposed methods on several nonlinear multiscale PDEs that are solved with Newton's methods and fully-implicit time marching schemes. Our numerical results show that the proposed methods provide a robust framework for solving nonlinear multiscale PDEs on a coarse grid with bounded error and significant computational cost reduction.
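
    The offline selection of interpolation points in empirical interpolation can be sketched with the standard DEIM greedy procedure. The generic numpy sketch below is an assumption about the flavour of algorithm involved, not the GMsFEM-specific construction from the paper.

        import numpy as np

        def deim_indices(basis):
            """Discrete empirical interpolation (DEIM) point selection.

            basis: (n, m) array whose columns are snapshots/POD modes of the
            nonlinear function. Returns m interpolation indices so that the
            nonlinearity only has to be evaluated at those rows online.
            """
            n, m = basis.shape
            idx = [int(np.argmax(np.abs(basis[:, 0])))]
            for l in range(1, m):
                u = basis[:, l]
                # Interpolate u with the previous modes at the selected rows,
                # then pick the row where the interpolation error is largest.
                c = np.linalg.solve(basis[idx, :l], u[idx])
                r = u - basis[:, :l] @ c
                idx.append(int(np.argmax(np.abs(r))))
            return np.array(idx)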

  10. DNA bases assembled on the Au(110)/electrolyte interface: A combined experimental and theoretical study

    DEFF Research Database (Denmark)

    Salvatore, Princia; Nazmutdinov, Renat R.; Ulstrup, Jens

    2015-01-01

    Among the low-index single-crystal gold surfaces, the Au(110) surface is the most active toward molecular adsorption and the one with fewest electrochemical adsorption data reported. Cyclic voltammetry (CV), electrochemically controlled scanning tunneling microscopy (EC-STM), and density functional......, accompanied by a pair of strong voltammetry peaks in the double-layer region in acid solutions. Adsorption of the DNA bases gives featureless voltammograms with lower double-layer capacitance, suggesting that all the bases are chemisorbed on the Au(110) surface. Further investigation of the surface structures...... of the adlayers of the four DNA bases by EC-STM disclosed lifting of the Au(110) reconstruction, specific molecular packing in dense monolayers, and pH dependence of the A and G adsorption. DFT computations based on a cluster model for the Au(110) surface were performed to investigate the adsorption energy...

  11. Structure of a stacked anthraquinone–DNA complex

    Science.gov (United States)

    De Luchi, Daniela; Usón, Isabel; Wright, Glenford; Gouyette, Catherine; Subirana, Juan A.

    2010-01-01

    The crystal structure of the telomeric sequence d(UBrAGG) interacting with an anthraquinone derivative has been solved by MAD. In all previously studied complexes of intercalating drugs, the drug is usually sandwiched between two DNA base pairs. Instead, the present structure looks like a crystal of stacked anthraquinone molecules in which isolated base pairs are intercalated. Unusual base pairs are present in the structure, such as G·G and A·UBr reverse Watson–Crick base pairs. PMID:20823516

  12. The Effect of Problem-Solving Instruction on the Programming Self-efficacy and Achievement of Introductory Computer Science Students

    Science.gov (United States)

    Maddrey, Elizabeth

    Research in academia and industry continues to identify a decline in enrollment in computer science. One major component of this decline in enrollment is a shortage of female students. The primary reasons for the gender gap presented in the research include lack of computer experience prior to their first year in college, misconceptions about the field, negative cultural stereotypes, lack of female mentors and role models, subtle discriminations in the classroom, and lack of self-confidence (Pollock, McCoy, Carberry, Hundigopal, & You, 2004). Male students are also leaving the field due to misconceptions about the field, negative cultural stereotypes, and a lack of self-confidence. Analysis of first year attrition revealed that one of the major challenges faced by students of both genders is a lack of problem-solving skills (Beaubouef, Lucas & Howatt, 2001; Olsen, 2005; Paxton & Mumey, 2001). The purpose of this study was to investigate whether specific, non-mathematical problem-solving instruction as part of introductory programming courses significantly increased computer programming self-efficacy and achievement of students. The results of this study showed that students in the experimental group had significantly higher achievement than students in the control group. While this shows statistical significance, due to the effect size and disordinal nature of the data between groups, care has to be taken in its interpretation. The study did not show significantly higher programming self-efficacy among the experimental students. There was not enough data collected to statistically analyze the effect of the treatment on self-efficacy and achievement by gender. However, differences in means were observed between the gender groups, with females in the experimental group demonstrating a higher than average degree of self-efficacy when compared with males in the experimental group and both genders in the control group. These results suggest that the treatment from this

  13. Programmable chemical controllers made from DNA

    Science.gov (United States)

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2013-10-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language' and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents.
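
    The consensus algorithm mentioned at the end is commonly formulated as the approximate-majority chemical reaction network. A minimal ODE sketch of that network follows; the rate constant, step size, and initial counts are illustrative assumptions, not parameters from the paper.

        import numpy as np

        def approximate_majority(x0, y0, b0=0.0, k=1e-3, dt=0.01, steps=20000):
            """Deterministic simulation of the approximate-majority CRN
                X + Y -> 2B,  B + X -> 2X,  B + Y -> 2Y,
            a consensus scheme of the kind the DNA architecture can compile.
            Explicit Euler integration of the mass-action ODEs.
            """
            x, y, b = float(x0), float(y0), float(b0)
            for _ in range(steps):
                r1, r2, r3 = k * x * y, k * b * x, k * b * y
                x += dt * (-r1 + r2)
                y += dt * (-r1 + r3)
                b += dt * (2 * r1 - r2 - r3)
            return x, y, b

        # Initial majority X wins: nearly all molecules end up as X.
        print(approximate_majority(x0=60, y0=40))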

  14. The Language Factor in Elementary Mathematics Assessments: Computational Skills and Applied Problem Solving in a Multidimensional IRT Framework

    Science.gov (United States)

    Hickendorff, Marian

    2013-01-01

    The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…

  15. A Faster Algorithm for Solving One-Clock Priced Timed Games

    DEFF Research Database (Denmark)

    Hansen, Thomas Dueholm; Ibsen-Jensen, Rasmus; Miltersen, Peter Bro

    2013-01-01

    previously known time bound for solving one-clock priced timed games was 2^(O(n^2+m)), due to Rutkowski. For our improvement, we introduce and study a new algorithm for solving one-clock priced timed games, based on the sweep-line technique from computational geometry and the strategy iteration paradigm from......One-clock priced timed games is a class of two-player, zero-sum, continuous-time games that was defined and thoroughly studied in previous works. We show that one-clock priced timed games can be solved in time m 12^n n^(O(1)), where n is the number of states and m is the number of actions. The best

  16. A Faster Algorithm for Solving One-Clock Priced Timed Games

    DEFF Research Database (Denmark)

    Hansen, Thomas Dueholm; Ibsen-Jensen, Rasmus; Miltersen, Peter Bro

    2012-01-01

    previously known time bound for solving one-clock priced timed games was 2^(O(n^2+m)), due to Rutkowski. For our improvement, we introduce and study a new algorithm for solving one-clock priced timed games, based on the sweep-line technique from computational geometry and the strategy iteration paradigm from......One-clock priced timed games is a class of two-player, zero-sum, continuous-time games that was defined and thoroughly studied in previous works. We show that one-clock priced timed games can be solved in time m 12^n n^(O(1)), where n is the number of states and m is the number of actions. The best...

  17. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Lasserre, J.B.; Laurent, M.; Mourrain, B.; Rostalski, P.; Trébuchet, P.

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite

  18. Chess games: a model for RNA based computation.

    Science.gov (United States)

    Cukras, A R; Faulhammer, D; Lipton, R J; Landweber, L F

    1999-10-01

    Here we develop the theory of RNA computing and a method for solving the 'knight problem' as an instance of a satisfiability (SAT) problem. Using only biological molecules and enzymes as tools, we developed an algorithm for solving the knight problem (3 x 3 chess board) using a 10-bit combinatorial pool and sequential RNase H digestions. The results of preliminary experiments presented here reveal that the protocol recovers far more correct solutions than expected at random, but the persistence of errors still presents the greatest challenge.
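
    The underlying SAT instance can be reproduced in silico by exhaustive enumeration, which mirrors what the RNase H digestions select for in the wet lab. A small Python sketch that treats each square of the 3 x 3 board as one bit is shown below; the paper's own 10-bit encoding may differ from this 9-bit simplification.

        from itertools import product

        def knight_attacks(a, b):
            """True if squares a=(r1,c1) and b=(r2,c2) are a knight's move apart."""
            dr, dc = abs(a[0] - b[0]), abs(a[1] - b[1])
            return {dr, dc} == {1, 2}

        squares = [(r, c) for r in range(3) for c in range(3)]

        def valid_boards():
            """Enumerate all 2^9 knight placements on a 3 x 3 board and keep those
            in which no knight attacks another -- the satisfying assignments that
            the digestion steps isolate from the combinatorial pool."""
            for bits in product([0, 1], repeat=9):
                occupied = [sq for sq, bit in zip(squares, bits) if bit]
                if all(not knight_attacks(p, q)
                       for i, p in enumerate(occupied) for q in occupied[i + 1:]):
                    yield occupied

        print(sum(1 for _ in valid_boards()))  # 94 of the 512 assignments are legal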

  19. Improve Problem Solving Skills through Adapting Programming Tools

    Science.gov (United States)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    There are numerous ways for engineers and students to become better problem-solvers. The use of command line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improved problem solving skills through using a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and plotting of functions and data. MatLab can be used as an interactive command line or a sequence of commands that can be saved in a file as a script or named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU project and a free computer program for performing numerical computations, is comparable to MatLab. MatLab visual and command programming are presented here.

  20. Trusted computing strengthens cloud authentication.

    Science.gov (United States)

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology which is designed to provide the commercial necessities, solve the IT management issues, and run the appropriate applications. Another entry on the list of cloud functions which has been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trust Multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are great technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. In addition, the proposed model has been simulated in the .Net environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model.

  1. Trusted Computing Strengthens Cloud Authentication

    Directory of Open Access Journals (Sweden)

    Eghbal Ghazizadeh

    2014-01-01

    Full Text Available Cloud computing is a new generation of technology which is designed to provide the commercial necessities, solve the IT management issues, and run the appropriate applications. Another entry on the list of cloud functions which has been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trust Multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are great technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. In addition, the proposed model has been simulated in the .Net environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model.

  2. Trusted Computing Strengthens Cloud Authentication

    Science.gov (United States)

    2014-01-01

    Cloud computing is a new generation of technology which is designed to provide the commercial necessities, solve the IT management issues, and run the appropriate applications. Another entry on the list of cloud functions which has been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trust Multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are great technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. In addition, the proposed model has been simulated in the .Net environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model. PMID:24701149

  3. Evaluation of the Effectiveness of a Tablet Computer Application (App) in Helping Students with Visual Impairments Solve Mathematics Problems

    Science.gov (United States)

    Beal, Carole R.; Rosenblum, L. Penny

    2018-01-01

    Introduction: The authors examined a tablet computer application (iPad app) for its effectiveness in helping students studying prealgebra to solve mathematical word problems. Methods: Forty-three visually impaired students (that is, those who are blind or have low vision) completed eight alternating mathematics units presented using their…

  4. The Elementary School Students’ Mathematical Problem Solving Based on Reading Abilities

    Science.gov (United States)

    Wulandari, R. D.; Lukito, A.; Khabibah, S.

    2018-01-01

    The aim of this research is to describe third-grade elementary school students' mathematical problem-solving skills based on their reading abilities. This research is a descriptive research with a qualitative approach. This research was conducted at elementary school Kebraon II Surabaya in the second semester of the 2016-2017 academic year. The participants of this research consist of third-grade students with different reading abilities: independent level, instructional level, and frustration level. The participants of this research were selected with a purposive sampling technique. The data of this study were collected using narrative text readings, the Ekwall and Shanker Informal Reading Inventory, a problem-solving task, and interview guidelines. The collected data were evaluated using a descriptive analysis method. Once the study had been completed, it was concluded that problem-solving skills varied according to reading abilities: students at the independent and instructional levels could solve the problem, whereas students at the frustration level could not, because they could not interpret the problem well.

  5. A Novel Computational Method for Detecting DNA Methylation Sites with DNA Sequence Information and Physicochemical Properties.

    Science.gov (United States)

    Pan, Gaofeng; Jiang, Limin; Tang, Jijun; Guo, Fei

    2018-02-08

    DNA methylation is an important biochemical process, and it has a close connection with many types of cancer. Research about DNA methylation can help us to understand the regulation mechanism and epigenetic reprogramming. Therefore, it becomes very important to recognize the methylation sites in the DNA sequence. In the past several decades, many computational methods-especially machine learning methods-have been developed since the high-throughput sequencing technology became widely used in research and industry. In order to accurately identify whether or not a nucleotide residue is methylated under the specific DNA sequence context, we propose a novel method that overcomes the shortcomings of previous methods for predicting methylation sites. We use k-gram, multivariate mutual information, discrete wavelet transform, and pseudo amino acid composition to extract features, and train a sparse Bayesian learning model to do DNA methylation prediction. Five criteria-area under the receiver operating characteristic curve (AUC), Matthew's correlation coefficient (MCC), accuracy (ACC), sensitivity (SN), and specificity-are used to evaluate the prediction results of our method. On the benchmark dataset, we could reach 0.8632 on AUC, 0.8017 on ACC, 0.5558 on MCC, and 0.7268 on SN. Additionally, the best results on two scBS-seq profiled mouse embryonic stem cells datasets were 0.8896 and 0.9511 by AUC, respectively. When compared with other outstanding methods, our method surpassed them on the accuracy of prediction. The improvement of AUC by our method compared to other methods was at least 0.0399. For the convenience of other researchers, our code has been uploaded to a file hosting service, and can be downloaded from: https://figshare.com/s/0697b692d802861282d3.
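
    One of the four feature groups, the k-gram frequencies, is easy to sketch together with a sparse linear classifier. The toy example below uses L1-penalised logistic regression as a stand-in for the sparse Bayesian learning model actually trained in the paper; the sequences and labels are made up.

        from itertools import product
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def kgram_features(seq, k=2, alphabet="ACGT"):
            """Frequency vector of all k-grams in a DNA window (only one of the
            paper's feature groups; the other descriptors are omitted)."""
            kmers = ["".join(p) for p in product(alphabet, repeat=k)]
            counts = dict.fromkeys(kmers, 0)
            for i in range(len(seq) - k + 1):
                kmer = seq[i:i + k]
                if kmer in counts:
                    counts[kmer] += 1
            total = max(len(seq) - k + 1, 1)
            return np.array([counts[m] / total for m in kmers])

        # Toy windows centred on a candidate cytosine; label 1 = methylated.
        windows = ["ACGTACGTACG", "CGCGCGCGCGC", "ATATATATATA", "GCGCATGCGCA"]
        labels = [1, 1, 0, 0]
        X = np.vstack([kgram_features(w) for w in windows])
        # L1-penalised logistic regression as a sparse stand-in classifier.
        clf = LogisticRegression(penalty="l1", solver="liblinear").fit(X, labels)
        print(clf.predict([kgram_features("CGCGTACGCGC")]))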

  6. A Novel Computational Method for Detecting DNA Methylation Sites with DNA Sequence Information and Physicochemical Properties

    Directory of Open Access Journals (Sweden)

    Gaofeng Pan

    2018-02-01

    Full Text Available DNA methylation is an important biochemical process, and it has a close connection with many types of cancer. Research about DNA methylation can help us to understand the regulation mechanism and epigenetic reprogramming. Therefore, it becomes very important to recognize the methylation sites in the DNA sequence. In the past several decades, many computational methods—especially machine learning methods—have been developed since the high-throughput sequencing technology became widely used in research and industry. In order to accurately identify whether or not a nucleotide residue is methylated under the specific DNA sequence context, we propose a novel method that overcomes the shortcomings of previous methods for predicting methylation sites. We use k-gram, multivariate mutual information, discrete wavelet transform, and pseudo amino acid composition to extract features, and train a sparse Bayesian learning model to do DNA methylation prediction. Five criteria—area under the receiver operating characteristic curve (AUC), Matthew’s correlation coefficient (MCC), accuracy (ACC), sensitivity (SN), and specificity—are used to evaluate the prediction results of our method. On the benchmark dataset, we could reach 0.8632 on AUC, 0.8017 on ACC, 0.5558 on MCC, and 0.7268 on SN. Additionally, the best results on two scBS-seq profiled mouse embryonic stem cells datasets were 0.8896 and 0.9511 by AUC, respectively. When compared with other outstanding methods, our method surpassed them on the accuracy of prediction. The improvement of AUC by our method compared to other methods was at least 0.0399. For the convenience of other researchers, our code has been uploaded to a file hosting service, and can be downloaded from: https://figshare.com/s/0697b692d802861282d3.

  7. Applying Agrep to r-NSA to solve multiple sequences approximate matching.

    Science.gov (United States)

    Ni, Bing; Wong, Man-Hon; Lam, Chi-Fai David; Leung, Kwong-Sak

    2014-01-01

    This paper addresses the approximate matching problem in a database consisting of multiple DNA sequences, where the proposed approach applies Agrep to a new truncated suffix array, r-NSA. The construction time of the structure is linear in the database size, and indexing a substring in the structure takes constant time. The number of characters processed in applying Agrep is analysed theoretically, and the theoretical upper-bound closely approximates the empirical number of characters, which is obtained through enumerating the characters in the actual structure built. Experiments are carried out using (synthetic) random DNA sequences, as well as (real) genome sequences including Hepatitis-B Virus and X-chromosome. Experimental results show that, compared to the straightforward approach that applies Agrep to multiple sequences individually, the proposed approach solves the matching problem in a much shorter time. The speed-up of our approach depends on the sequence patterns, and for highly similar homologous genome sequences, which are the common cases in real-life genomes, it can be up to several orders of magnitude.
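
    Agrep-style approximate matching is built on the bitap (shift-and) algorithm. A minimal Python sketch restricted to substitutions is given below; full agrep, and the paper's use of it, also handle insertions and deletions.

        def bitap_search(text, pattern, k=1):
            """Wu-Manber/agrep-style bitap search allowing up to k mismatches
            (substitutions only). Returns end positions of matches in `text`."""
            m = len(pattern)
            # Bit mask per character: bit i is set if pattern[i] == char.
            masks = {}
            for i, ch in enumerate(pattern):
                masks[ch] = masks.get(ch, 0) | (1 << i)
            accept = 1 << (m - 1)

            r = [0] * (k + 1)       # r[d]: prefixes matched with <= d mismatches
            hits = []
            for pos, ch in enumerate(text):
                char_mask = masks.get(ch, 0)
                prev = 0            # shifted state carried up from level d-1
                for d in range(k + 1):
                    old = r[d]
                    r[d] = (((old << 1) | 1) & char_mask) | prev
                    prev = (old << 1) | 1   # a substitution at this position
                if r[k] & accept:
                    hits.append(pos)        # match ending at this text index
            return hits

        print(bitap_search("ACGTTTGCATACGA", "TACG", k=1))  # [12]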

  8. Some Applications of Algebraic System Solving

    Science.gov (United States)

    Roanes-Lozano, Eugenio

    2011-01-01

    Technology and, in particular, computer algebra systems, allows us to change both the way we teach mathematics and the mathematical curriculum. Curiously enough, unlike what happens with linear system solving, algebraic system solving is not widely known. The aim of this paper is to show that, although the theory lying behind the "exact…

  9. MD study of pyrimidine base damage on DNA and its recognition by repair enzyme

    International Nuclear Information System (INIS)

    Pinak, M.

    2000-01-01

    Molecular dynamics (MD) simulation was used to study two specific damages of pyrimidine bases in DNA. Pyrimidine bases are major targets either of free radicals induced by ionizing radiation in the environment surrounding DNA or of UV radiation. The thymine dimer (TD) is a UV-induced damage in which two neighboring thymines in one strand are joined by covalent bonds between their C(5)-C(5) and C(6)-C(6) atoms. Thymine glycol (TG) is an ionizing-radiation-induced damage in which a free water radical adds to the unsaturated C(5)-C(6) bond of thymine. Both damages are experimentally suggested to be mutagenic and carcinogenic unless properly repaired by repair enzymes. In the MD of TD, a strong kink is detected around the TD site that is not observed in native DNA. In addition, a different value of the electrostatic energy is observed at the TD site - negative (about -10 kcal/mol), in contrast to the nearly neutral value at a native thymine site. These structural changes and the specific electrostatic energy seem to be important for proper recognition of the TD-damaged site, formation of the DNA-enzyme complex, and thus for subsequent repair of DNA. In TG-damaged DNA there is a major structural distortion at the TG site, mainly an increased distance between TG and the C5' of the adjacent nucleotide. This enlarged gap between the neighboring nucleotides may prevent the insertion of the complementary base during replication, causing the replication process to stop. To what extent this structural feature, together with the energy properties of TG, contributes to the proper recognition of TG by the repair enzyme Endonuclease III is the subject of further computational MD study. (author)

  10. Controlling charge current through a DNA based molecular transistor

    Energy Technology Data Exchange (ETDEWEB)

    Behnia, S., E-mail: s.behnia@sci.uut.ac.ir; Fathizadeh, S.; Ziaei, J.

    2017-01-05

    Molecular electronics is complementary to silicon-based electronics and may provide electronic functions that are difficult to obtain with conventional technology. We consider a DNA based molecular transistor and study its transport properties. The appropriate DNA sequence to serve as the central chain of the molecular transistor and the functional interval for the applied voltages are obtained. The I–V characteristic diagram shows the rectifier behavior as well as the negative differential resistance phenomenon of the DNA transistor. We observe nearly periodic behavior in the current flowing through the DNA. It is reported that there is a critical gate voltage for each applied bias, above which the electrical current is always positive. - Highlights: • Modeling a DNA based molecular transistor and studying its transport properties. • Choosing the appropriate DNA sequence using quantum chaos tools. • Choosing the functional interval for voltages via the inverse participation ratio tool. • Detecting the rectifier and negative differential resistance behavior of DNA.

  11. DNA-Based Enzyme Reactors and Systems

    Directory of Open Access Journals (Sweden)

    Veikko Linko

    2016-07-01

    Full Text Available During recent years, the possibility to create custom biocompatible nanoshapes using DNA as a building material has rapidly emerged. Further, these rationally designed DNA structures could be exploited in positioning pivotal molecules, such as enzymes, with nanometer-level precision. This feature could be used in the fabrication of artificial biochemical machinery that is able to mimic the complex reactions found in living cells. Currently, DNA-enzyme hybrids can be used to control (multi-)enzyme cascade reactions and to regulate the enzyme functions and the reaction pathways. Moreover, sophisticated DNA structures can be utilized in encapsulating active enzymes and delivering the molecular cargo into cells. In this review, we focus on the latest enzyme systems based on novel DNA nanostructures: enzyme reactors, regulatory devices and carriers that can find uses in various biotechnological and nanomedical applications.

  12. An Overview of the Prediction of Protein DNA-Binding Sites

    Directory of Open Access Journals (Sweden)

    Jingna Si

    2015-03-01

    Full Text Available Interactions between proteins and DNA play an important role in many essential biological processes such as DNA replication, transcription, splicing, and repair. The identification of amino acid residues involved in DNA-binding sites is critical for understanding the mechanism of these biological activities. In the last decade, numerous computational approaches have been developed to predict protein DNA-binding sites based on protein sequence and/or structural information, which play an important role in complementing experimental strategies. At this time, approaches can be divided into three categories: sequence-based DNA-binding site prediction, structure-based DNA-binding site prediction, and homology modeling and threading. In this article, we review existing research on computational methods to predict protein DNA-binding sites, which includes data sets, various residue sequence/structural features, machine learning methods for comparison and selection, evaluation methods, performance comparison of different tools, and future directions in protein DNA-binding site prediction. In particular, we detail the meta-analysis of protein DNA-binding sites. We also propose specific implications that are likely to result in novel prediction methods, increased performance, or practical applications.

  13. Domain decomposition method for solving elliptic problems in unbounded domains

    International Nuclear Information System (INIS)

    Khoromskij, B.N.; Mazurkevich, G.E.; Zhidkov, E.P.

    1991-01-01

    Computational aspects of the box domain decomposition (DD) method for solving boundary value problems in an unbounded domain are discussed. A new variant of the DD-method for elliptic problems in unbounded domains is suggested. It is based on the partitioning of an unbounded domain adapted to the given asymptotic decay of an unknown function at infinity. The comparison of computational expenditures is given for boundary integral method and the suggested DD-algorithm. 29 refs.; 2 figs.; 2 tabs
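
    The flavour of domain decomposition discussed here can be illustrated on a bounded toy problem. The sketch below runs a two-subdomain overlapping alternating-Schwarz iteration for a 1D Poisson equation; it is only a schematic analogue, since the paper's partitioning is adapted to the asymptotic decay of the solution at infinity.

        import numpy as np

        def schwarz_poisson(f, n=101, overlap=10, iters=50):
            """Overlapping alternating-Schwarz sketch for -u'' = f on [0, 1]
            with homogeneous Dirichlet data and two subdomains."""
            x = np.linspace(0.0, 1.0, n)
            h = x[1] - x[0]
            u = np.zeros(n)
            mid = n // 2
            left = (0, mid + overlap)          # index ranges of the subdomains
            right = (mid - overlap, n - 1)

            def solve_sub(lo, hi):
                m = hi - lo - 1                # interior unknowns of the subdomain
                A = (np.diag(2.0 * np.ones(m)) - np.diag(np.ones(m - 1), 1)
                     - np.diag(np.ones(m - 1), -1)) / h**2
                b = f(x[lo + 1:hi]).copy()
                b[0] += u[lo] / h**2           # boundary values from the current iterate
                b[-1] += u[hi] / h**2
                u[lo + 1:hi] = np.linalg.solve(A, b)

            for _ in range(iters):
                solve_sub(*left)
                solve_sub(*right)
            return x, u

        x, u = schwarz_poisson(lambda t: np.pi**2 * np.sin(np.pi * t))
        print(np.max(np.abs(u - np.sin(np.pi * x))))   # small discretisation error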

  14. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi, E-mail: orimoto.yuuichi.888@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Aoki, Yuriko [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.
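
    The finite-field (FF) treatment estimates (hyper-)polarizabilities by numerically differentiating the dipole moment with respect to an applied field. A generic central-difference sketch is shown below; the convention, field step, and toy dipole function are illustrative, while the paper couples the same idea to ab initio dipoles from the elongation method.

        import numpy as np

        def ff_polarizabilities(dipole, field=1e-3):
            """Finite-field estimates of alpha, beta, gamma from a callable
            dipole(F) along one axis, assuming the Taylor expansion
            mu(F) = mu0 + alpha*F + (beta/2)*F**2 + (gamma/6)*F**3 + ...
            """
            mu = {s: dipole(s * field) for s in (-2, -1, 0, 1, 2)}
            alpha = (mu[1] - mu[-1]) / (2 * field)
            beta = (mu[1] - 2 * mu[0] + mu[-1]) / field**2
            gamma = (mu[2] - 2 * mu[1] + 2 * mu[-1] - mu[-2]) / (2 * field**3)
            return alpha, beta, gamma

        # Toy dipole with known coefficients alpha=2, beta=3, gamma=4.
        print(ff_polarizabilities(lambda F: 1 + 2 * F + 1.5 * F**2 + (4 / 6) * F**3))
        # approximately (2.0, 3.0, 4.0)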

  15. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    International Nuclear Information System (INIS)

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-01-01

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.

  16. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  17. Structural basis for the inhibition of human alkyladenine DNA glycosylase (AAG) by 3,N4-ethenocytosine-containing DNA.

    Science.gov (United States)

    Lingaraju, Gondichatnahalli M; Davis, C Ainsley; Setser, Jeremy W; Samson, Leona D; Drennan, Catherine L

    2011-04-15

    Reactive oxygen and nitrogen species, generated by neutrophils and macrophages in chronically inflamed tissues, readily damage DNA, producing a variety of potentially genotoxic etheno base lesions; such inflammation-related DNA damage is now known to contribute to carcinogenesis. Although the human alkyladenine DNA glycosylase (AAG) can specifically bind DNA containing either 1,N(6)-ethenoadenine (εA) lesions or 3,N(4)-ethenocytosine (εC) lesions, it can only excise εA lesions. AAG binds very tightly to DNA containing εC lesions, forming an abortive protein-DNA complex; such binding not only shields εC from repair by other enzymes but also inhibits AAG from acting on other DNA lesions. To understand the structural basis for inhibition, we have characterized the binding of AAG to DNA containing εC lesions and have solved a crystal structure of AAG bound to a DNA duplex containing the εC lesion. This study provides the first structure of a DNA glycosylase in complex with an inhibitory base lesion that is induced endogenously and that is also induced upon exposure to environmental agents such as vinyl chloride. We identify the primary cause of inhibition as a failure to activate the nucleotide base as an efficient leaving group and demonstrate that the higher binding affinity of AAG for εC versus εA is achieved through formation of an additional hydrogen bond between Asn-169 in the active site pocket and the O(2) of εC. This structure provides the basis for the design of AAG inhibitors currently being sought as an adjuvant for cancer chemotherapy.

  18. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  19. Electrochemical DNA Hybridization Sensors Based on Conducting Polymers

    Science.gov (United States)

    Rahman, Md. Mahbubur; Li, Xiao-Bo; Lopa, Nasrin Siraj; Ahn, Sang Jung; Lee, Jae-Joon

    2015-01-01

    Conducting polymers (CPs) are a group of polymeric materials that have attracted considerable attention because of their unique electronic, chemical, and biochemical properties. This is reflected in their use in a wide range of potential applications, including light-emitting diodes, anti-static coating, electrochromic materials, solar cells, chemical sensors, biosensors, and drug-release systems. Electrochemical DNA sensors based on CPs can be used in numerous areas related to human health. This review summarizes the recent progress made in the development and use of CP-based electrochemical DNA hybridization sensors. We discuss the distinct properties of CPs with respect to their use in the immobilization of probe DNA on electrode surfaces, and we describe the immobilization techniques used for developing DNA hybridization sensors together with the various transduction methods employed. In the concluding part of this review, we present some of the challenges faced in the use of CP-based DNA hybridization sensors, as well as a future perspective. PMID:25664436

  20. Electrochemical DNA Hybridization Sensors Based on Conducting Polymers

    Directory of Open Access Journals (Sweden)

    Md. Mahbubur Rahman

    2015-02-01

    Full Text Available Conducting polymers (CPs) are a group of polymeric materials that have attracted considerable attention because of their unique electronic, chemical, and biochemical properties. This is reflected in their use in a wide range of potential applications, including light-emitting diodes, anti-static coating, electrochromic materials, solar cells, chemical sensors, biosensors, and drug-release systems. Electrochemical DNA sensors based on CPs can be used in numerous areas related to human health. This review summarizes the recent progress made in the development and use of CP-based electrochemical DNA hybridization sensors. We discuss the distinct properties of CPs with respect to their use in the immobilization of probe DNA on electrode surfaces, and we describe the immobilization techniques used for developing DNA hybridization sensors together with the various transduction methods employed. In the concluding part of this review, we present some of the challenges faced in the use of CP-based DNA hybridization sensors, as well as a future perspective.

  1. Biologically important conformational features of DNA as interpreted by quantum mechanics and molecular mechanics computations of its simple fragments.

    Science.gov (United States)

    Poltev, V; Anisimov, V M; Dominguez, V; Gonzalez, E; Deriabina, A; Garcia, D; Rivas, F; Polteva, N A

    2018-02-01

    Deciphering the mechanism of functioning of DNA as the carrier of genetic information requires identifying inherent factors determining its structure and function. Following this path, our previous DFT studies attributed the origin of unique conformational characteristics of right-handed Watson-Crick duplexes (WCDs) to the conformational profile of deoxydinucleoside monophosphates (dDMPs) serving as the minimal repeating units of DNA strand. According to those findings, the directionality of the sugar-phosphate chain and the characteristic ranges of dihedral angles of energy minima combined with the geometric differences between purines and pyrimidines determine the dependence on base sequence of the three-dimensional (3D) structure of WCDs. This work extends our computational study to complementary deoxydinucleotide-monophosphates (cdDMPs) of non-standard conformation, including those of Z-family, Hoogsteen duplexes, parallel-stranded structures, and duplexes with mispaired bases. For most of these systems, except Z-conformation, computations closely reproduce experimental data within the tolerance of characteristic limits of dihedral parameters for each conformation family. Computation of cdDMPs with Z-conformation reveals that their experimental structures do not correspond to the internal energy minimum. This finding establishes the leading role of external factors in formation of the Z-conformation. Energy minima of cdDMPs of non-Watson-Crick duplexes demonstrate different sequence-dependence features than those known for WCDs. The obtained results provide evidence that the biologically important regularities of 3D structure distinguish WCDs from duplexes having non-Watson-Crick nucleotide pairing.

  2. Recent progress on DNA based walkers.

    Science.gov (United States)

    Pan, Jing; Li, Feiran; Cha, Tae-Gon; Chen, Haorong; Choi, Jong Hyun

    2015-08-01

    DNA based synthetic molecular walkers are reminiscent of biological protein motors. They are powered by hybridization with fuel strands, environment induced conformational transitions, and covalent chemistry of oligonucleotides. Recent developments in experimental techniques enable direct observation of individual walkers with high temporal and spatial resolution. The functionalities of state-of-the-art DNA walker systems can thus be analyzed for various applications. Herein we review recent progress on DNA walker principles and characterization methods, and evaluate various aspects of their functions for future applications. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Development of a problem solving evaluation instrument; untangling of specific problem solving assets

    Science.gov (United States)

    Adams, Wendy Kristine

    The purpose of my research was to produce a problem solving evaluation tool for physics. To do this it was necessary to gain a thorough understanding of how students solve problems. Although physics educators highly value problem solving and have put extensive effort into understanding successful problem solving, there is currently no efficient way to evaluate problem solving skill. Attempts have been made in the past; however, knowledge of the principles required to solve the subject problem is so absolutely critical that it completely overshadows any other skills students may use when solving a problem. The work presented here is unique because the evaluation tool removes the requirement that the student already have a grasp of physics concepts. It is also unique because I evaluated a wide range of people on a wide range of tasks. This is an important design feature that helps make things emerge more clearly. This dissertation includes an extensive literature review of problem solving in physics, math, education and cognitive science as well as descriptions of studies involving student use of interactive computer simulations, the design and validation of a beliefs about physics survey and finally the design of the problem solving evaluation tool. I have successfully developed and validated a problem solving evaluation tool that identifies 44 separate assets (skills) necessary for solving problems. Rigorous validation studies, including work with an independent interviewer, show these assets identified by this content-free evaluation tool are the same assets that students use to solve problems in mechanics and quantum mechanics. Understanding this set of component assets will help teachers and researchers address problem solving within the classroom.

  4. DNA nanostructure-based drug delivery nanosystems in cancer therapy.

    Science.gov (United States)

    Wu, Dandan; Wang, Lei; Li, Wei; Xu, Xiaowen; Jiang, Wei

    2017-11-25

    DNA as a novel biomaterial can be used to fabricate different kinds of DNA nanostructures based on the principle of GC/AT complementary base pairing. Studies have shown that DNA nanostructures are effective drug carriers that can overcome major obstacles in cancer therapy such as systemic toxicity and unsatisfactory drug efficacy. Thus, different types of DNA nanostructure-based drug delivery nanosystems have been designed in cancer therapy. To improve treatment efficacy, they have also been developed into more functional drug delivery nanosystems. In recent years, some important progress has been made. The objective of this review is to make a retrospect and summary of these different kinds of DNA nanostructure-based drug delivery nanosystems and their latest progress: (1) active targeting; (2) multidrug co-delivery; (3) construction of stimuli-responsive/intelligent nanosystems. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    Science.gov (United States)

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experimental results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
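    As a rough illustration of the autoregressive idea only (not the authors' ARLSimpute code), the sketch below fits an AR(p) model to a single expression profile by least squares and fills a missing time point from its p predecessors; the order p, the fitting scheme and the toy profile are assumptions made for this example.

      import numpy as np

      def ar_impute(series, p=2):
          """Fill NaNs in a 1-D expression profile using an AR(p) model fitted by least squares."""
          x = series.astype(float)
          for i in range(len(x)):
              if np.isnan(x[i]) and i >= p and not np.isnan(x[i - p:i]).any():
                  rows, targets = [], []
                  for j in range(p, i):          # collect fully observed lag windows seen so far
                      window = x[j - p:j]
                      if not (np.isnan(window).any() or np.isnan(x[j])):
                          rows.append(window)
                          targets.append(x[j])
                  if len(rows) > p:              # enough data to fit the AR coefficients
                      coef, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
                      x[i] = x[i - p:i] @ coef   # predict the missing value from its lags
          return x

      profile = np.array([1.00, 1.20, 1.10, 1.30, 1.25, np.nan, 1.40, 1.50])
      print(ar_impute(profile))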

  6. Optimal control of gene mutation in DNA replication.

    Science.gov (United States)

    Yu, Juanyi; Li, Jr-Shin; Tarn, Tzyh-Jong

    2012-01-01

    We propose a molecular-level control system view of the gene mutations in DNA replication from the finite field concept. By treating DNA sequences as state variables, chemical mutagens and radiation as control inputs, one cell cycle as a step increment, and the measurements of the resulting DNA sequence as outputs, we derive system equations for both deterministic and stochastic discrete-time, finite-state systems of different scales. Defining the cost function as a summation of the costs of applying mutagens and the off-trajectory penalty, we solve the deterministic and stochastic optimal control problems by dynamic programming algorithm. In addition, given that the system is completely controllable, we find that the global optimum of both base-to-base and codon-to-codon deterministic mutations can always be achieved within a finite number of steps.
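    The following sketch is only a generic finite-horizon dynamic program of the kind described above, not the authors' finite-field formulation: the states, controls, transition table and costs are invented for illustration, with a per-step cost equal to the mutagen cost plus an off-trajectory penalty.

      # Toy finite-state control problem: drive the codon "AAA" to the target "GAG" in T cell cycles.
      states = ["AAA", "AAG", "GAG"]
      controls = ["none", "m1"]                     # "m1" stands for a hypothetical mutagen
      T, target = 3, "GAG"

      def step(s, u):
          # Hypothetical deterministic transition table (illustration only).
          table = {("AAA", "m1"): "AAG", ("AAG", "m1"): "GAG"}
          return table.get((s, u), s)

      def cost(s, u):
          return (1.0 if u != "none" else 0.0) + (0.5 if s != target else 0.0)

      # Backward dynamic programming: V[t][s] is the minimal cost-to-go from state s at cycle t.
      V = {T: {s: (0.0 if s == target else 10.0) for s in states}}
      policy = {}
      for t in reversed(range(T)):
          V[t], policy[t] = {}, {}
          for s in states:
              u_best, v_best = min(((u, cost(s, u) + V[t + 1][step(s, u)]) for u in controls),
                                   key=lambda p: p[1])
              V[t][s], policy[t][s] = v_best, u_best

      s = "AAA"
      for t in range(T):
          u = policy[t][s]
          print("cycle", t, "state", s, "apply", u)
          s = step(s, u)
      print("final state:", s, "minimal total cost:", V[0]["AAA"])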

  7. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    Science.gov (United States)

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of the deformation analysis. It is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and the intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a graphics processing unit. In this work, we present and compare two different designs based on the matrix-free and conventional preconditioned conjugate gradients algorithms for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
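    A minimal NumPy sketch of the matrix-free conjugate gradient kernel that such a GPU design accelerates is given below; the operator is supplied as a mat-vec routine instead of an assembled matrix. The 1-D Laplacian used here is only a stand-in for an FEM stiffness operator, and the function names and tolerances are made up.

      import numpy as np

      def cg_matrix_free(apply_A, b, tol=1e-8, max_iter=1000):
          """Conjugate gradients with the SPD operator given only as a mat-vec routine."""
          x = np.zeros_like(b)
          r = b - apply_A(x)
          p = r.copy()
          rs = r @ r
          for _ in range(max_iter):
              Ap = apply_A(p)
              alpha = rs / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p
              rs = rs_new
          return x

      def laplace_1d(v):
          # Stand-in operator: 1-D Laplacian with homogeneous Dirichlet ends (SPD).
          w = 2.0 * v
          w[1:] -= v[:-1]
          w[:-1] -= v[1:]
          return w

      b = np.ones(100)
      x = cg_matrix_free(laplace_1d, b)
      print("residual norm:", np.linalg.norm(laplace_1d(x) - b))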

  8. Computational physics problem solving with Python

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2015-01-01

    The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python progr

  9. Problem Solving-Based Experiment untuk Meningkatkan Keterampilan Penalaran Ilmiah Mahasiswa Fisika

    Directory of Open Access Journals (Sweden)

    Muhamad Gina Nugraha

    2017-12-01

    Full Text Available Abstract As one of the foundations in the development of technology, physics must be supported by experimental activities that are able to develop a scientist's skills, such as scientific reasoning skills. Experiments with cookbook methods that have been conducted in various experimental activities are considered unable to maximize the potential of students because they do not provide wide opportunities for students to explore. One of the solutions to develop the scientific reasoning skills of physics students is the problem solving-based experiment approach. The research was conducted with a one-group pretest-posttest design on 20 physics students as the research sample. The research instruments used are the scientific reasoning test developed by Lawson, known as the Lawson Classroom Test of Scientific Reasoning (LCTSR), and a student worksheet instrument (LKM) containing problems from daily life and questions about: tools and materials, prediction, exploration, measurement, analysis and conclusions. The results show that all measured aspects of scientific reasoning, i.e. 1) conservation of matter and volume, 2) proportional thinking, 3) identification and control of variables, 4) probabilistic thinking, 5) correlative thinking, and 6) hypothetic-deductive thinking, have increased. Based on the results it can be concluded that the problem solving-based experiment can improve the scientific reasoning skills of physics students. Keywords: Problem solving, experiment, scientific reasoning skills Abstract Physics, as one of the foundations of science in the development of technology, must be supported by experimental activities that are able to nurture a scientist's skills, among them scientific reasoning skills for responding to natural phenomena. Experiments with the cookbook method, which have proliferated in various experimental activities, are seen as unable to maximize students' potential because they do not provide wide opportunities for

  10. Solving ptychography with a convex relaxation

    Science.gov (United States)

    Horstmeyer, Roarke; Chen, Richard Y.; Ou, Xiaoze; Ames, Brendan; Tropp, Joel A.; Yang, Changhuei

    2015-05-01

    Ptychography is a powerful computational imaging technique that transforms a collection of low-resolution images into a high-resolution sample reconstruction. Unfortunately, algorithms that currently solve this reconstruction problem lack stability, robustness, and theoretical guarantees. Recently, convex optimization algorithms have improved the accuracy and reliability of several related reconstruction efforts. This paper proposes a convex formulation of the ptychography problem. This formulation has no local minima, it can be solved using a wide range of algorithms, it can incorporate appropriate noise models, and it can include multiple a priori constraints. The paper considers a specific algorithm, based on low-rank factorization, whose runtime and memory usage are near-linear in the size of the output image. Experiments demonstrate that this approach offers a 25% lower background variance on average than alternating projections, the ptychographic reconstruction algorithm that is currently in widespread use.

  11. Computational fluid dynamics on a massively parallel computer

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon

    1989-01-01

    A finite difference code was implemented for the compressible Navier-Stokes equations on the Connection Machine, a massively parallel computer. The code is based on the ARC2D/ARC3D program and uses the implicit factored algorithm of Beam and Warming. The code uses odd-even elimination to solve linear systems. Timings and computation rates are given for the code, and a comparison is made with a Cray XMP.

  12. The Effects of Computer Programming on High School Students' Reasoning Skills and Mathematical Self-Efficacy and Problem Solving

    Science.gov (United States)

    Psycharis, Sarantos; Kallia, Maria

    2017-01-01

    In this paper we investigate whether computer programming has an impact on high school students' reasoning skills, problem solving and self-efficacy in Mathematics. The quasi-experimental design was adopted to implement the study. The sample of the research comprised 66 high school students separated into two groups, the experimental and the…

  13. DNA fragments assembly based on nicking enzyme system.

    Directory of Open Access Journals (Sweden)

    Rui-Yan Wang

    Full Text Available A couple of DNA ligation-independent cloning (LIC) methods have been reported to meet various requirements in metabolic engineering and synthetic biology. The principle of LIC is the assembly of multiple overlapping DNA fragments by single-stranded (ss) DNA overlaps annealing. Here we present a method to generate single-stranded DNA overlaps based on Nicking Endonucleases (NEases) for LIC, the method was termed NE-LIC. Factors related to cloning efficiency were optimized in this study. This NE-LIC allows generating 3'-end or 5'-end ss DNA overlaps of various lengths for fragments assembly. We demonstrated that the 10 bp/15 bp overlaps had the highest DNA fragments assembling efficiency, while 5 bp/10 bp overlaps showed the highest efficiency when T4 DNA ligase was added. Its advantage over Sequence and Ligation Independent Cloning (SLIC) and Uracil-Specific Excision Reagent (USER) was obvious. The mechanism can be applied to many other LIC strategies. Finally, the NEases based LIC (NE-LIC) was successfully applied to assemble a pathway of six gene fragments responsible for synthesizing microbial poly-3-hydroxybutyrate (PHB).

  14. A novel constraint for thermodynamically designing DNA sequences.

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    Full Text Available Biotechnological and biomolecular advances have introduced novel uses for DNA such as DNA computing, storage, and encryption. For these applications, DNA sequence design requires maximizing desired (and minimizing undesired) hybridizations, the formation of a double-stranded DNA product from two single DNA strands. Here, we propose a novel constraint to design DNA sequences based on thermodynamic properties. Existing constraints for DNA design are based on the Hamming distance, a constraint that does not address the thermodynamic properties of the DNA sequence. Using a unique, improved genetic algorithm, we designed DNA sequence sets which satisfy different distance constraints and employ a free energy gap based on a minimum free energy (MFE) to gauge DNA sequences based on set thermodynamic properties. When compared to the best constraints of the Hamming distance, our method yielded better thermodynamic qualities. We then used our improved genetic algorithm to obtain lower-bound DNA sequence sets. Here, we discuss the effects of novel constraint parameters on the free energy gap.
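    For orientation, the sketch below checks the classical Hamming-distance-style constraints that the paper uses as a baseline (the thermodynamic/MFE constraint itself would require an energy model and is not reproduced here); the word set and the distance threshold d are arbitrary examples.

      from itertools import combinations

      COMP = str.maketrans("ACGT", "TGCA")

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def reverse_complement(s):
          return s.translate(COMP)[::-1]

      def satisfies_constraints(words, d=3):
          """Require every pair to differ in >= d positions, directly and against reverse complements."""
          for a, b in combinations(words, 2):
              if hamming(a, b) < d or hamming(a, reverse_complement(b)) < d:
                  return False
          # Each word should also differ from its own reverse complement (limits self-hybridization).
          return all(hamming(w, reverse_complement(w)) >= d for w in words)

      candidate_set = ["ACGTACGT", "TTGCAATC", "CAGGTTAG"]
      print(satisfies_constraints(candidate_set, d=3))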

  15. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    Science.gov (United States)

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  16. Identification of species based on DNA barcode using k-mer feature vector and Random forest classifier.

    Science.gov (United States)

    Meher, Prabina Kumar; Sahu, Tanmaya Kumar; Rao, A R

    2016-11-05

    DNA barcoding is a molecular diagnostic method that allows automated and accurate identification of species based on a short and standardized fragment of DNA. To this end, an attempt has been made in this study to develop a computational approach for identifying a species by comparing its barcode with the barcode sequences of known species present in the reference library. Each barcode sequence was first mapped onto a numeric feature vector based on k-mer frequencies, and then the Random forest methodology was employed on the transformed dataset for species identification. The proposed approach outperformed similarity-based, tree-based and diagnostic-based approaches and was found comparable with existing supervised-learning-based approaches in terms of species identification success rate when compared using real and simulated datasets. Based on the proposed approach, an online web interface SPIDBAR has also been developed and made freely available at http://cabgrid.res.in:8080/spidbar/ for species identification by taxonomists. Copyright © 2016 Elsevier B.V. All rights reserved.
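    A minimal scikit-learn sketch of the same idea (not the authors' SPIDBAR pipeline) is shown below, assuming scikit-learn is available; the toy barcodes, k value and forest size are placeholders, since real barcodes are typically much longer COI fragments.

      import numpy as np
      from itertools import product
      from sklearn.ensemble import RandomForestClassifier

      def kmer_vector(seq, k=3):
          """Map a DNA barcode onto a vector of k-mer frequencies."""
          kmers = ["".join(p) for p in product("ACGT", repeat=k)]
          counts = dict.fromkeys(kmers, 0)
          for i in range(len(seq) - k + 1):
              sub = seq[i:i + k]
              if sub in counts:
                  counts[sub] += 1
          total = max(1, len(seq) - k + 1)
          return np.array([counts[m] / total for m in kmers])

      # Toy reference library of (barcode, species) pairs.
      library = [("ACGTACGTAGCTAGCTAGGCTA", "species_A"),
                 ("ACGTTCGTAGCTAGCTAGGCTA", "species_A"),
                 ("TTTTGGGGCCCCAAAATTGGCC", "species_B"),
                 ("TTTTGGGACCCCAAAATTGGCC", "species_B")]
      X = np.array([kmer_vector(s) for s, _ in library])
      y = [sp for _, sp in library]

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      query = "ACGTACGTAGCTTGCTAGGCTA"
      print(clf.predict([kmer_vector(query)])[0])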

  17. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed; they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)

  18. Balancing Expression and Structure in Game Design: Developing Computational Participation Using Studio-Based Design Pedagogy

    Science.gov (United States)

    DeVane, Ben; Steward, Cody; Tran, Kelly M.

    2016-01-01

    This article reports on a project that used a game-creation tool to introduce middle-school students ages 10 to 13 to problem-solving strategies similar to those in computer science through the lens of studio-based design arts. Drawing on historic paradigms in design pedagogy and contemporary educational approaches in the digital arts to teach…

  19. Algorithm for solving the linear Cauchy problem for large systems of ordinary differential equations with the use of parallel computations

    Energy Technology Data Exchange (ETDEWEB)

    Moryakov, A. V., E-mail: sailor@orc.ru [National Research Centre Kurchatov Institute (Russian Federation)

    2016-12-15

    An algorithm for solving the linear Cauchy problem for large systems of ordinary differential equations is presented. The algorithm for systems of first-order differential equations is implemented in the EDELWEISS code with the possibility of parallel computations on supercomputers employing the MPI (Message Passing Interface) standard for the data exchange between parallel processes. The solution is represented by a series of orthogonal polynomials on the interval [0, 1]. The algorithm is characterized by simplicity and the possibility to solve nonlinear problems with a correction of the operator in accordance with the solution obtained in the previous iterative process.

  20. DNA-based random number generation in security circuitry.

    Science.gov (United States)

    Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C

    2010-06-01

    DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
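    As a small, hedged illustration of how such sequences could be screened, the sketch below applies the NIST SP 800-22 frequency (monobit) test to bits derived from a DNA sequence; the purine/pyrimidine bit mapping and the 0.01 significance threshold are assumptions for this example, not the authors' protocol.

      import math

      def dna_to_bits(seq):
          # Assumed encoding for illustration: purines (A, G) -> 1, pyrimidines (C, T) -> 0.
          return [1 if b in "AG" else 0 for b in seq if b in "ACGT"]

      def monobit_p_value(bits):
          """NIST SP 800-22 frequency (monobit) test: p-value from the normalized +/-1 sum."""
          n = len(bits)
          s = sum(2 * b - 1 for b in bits)
          s_obs = abs(s) / math.sqrt(n)
          return math.erfc(s_obs / math.sqrt(2))

      sequence = "ATGCGTACCAGTTAGCCATAGGCTTAACGT" * 10
      p = monobit_p_value(dna_to_bits(sequence))
      print("monobit p-value:", p, "-> pass" if p >= 0.01 else "-> fail")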

  1. CTD: a computer program to solve the three dimensional multi-group diffusion equation in X, Y, Z, and triangular Z geometries

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, J K

    1973-05-01

    CTD is a computer program written in Fortran 4 to solve the multi-group diffusion theory equations in X, Y, Z and triangular Z geometries. A power print-out, neutron balance and breeding gain are also produced. 4 references. (auth)

  2. A set of numerical meteorological models for solving basic and some special problems in the Boris Kidric Institute

    International Nuclear Information System (INIS)

    Grscic, Z.

    1989-01-01

    Models for solving transport and dispersion problems of radioactive pollutants through the atmosphere are briefly presented. These models are the basis for solving basic and some special problems, such as estimating the effective and physical heights of radioactive sources, computing the distribution of radioactive concentration from multiple sources, etc. (author)

  3. Normal-Mode Analysis of Circular DNA at the Base-Pair Level. 2. Large-Scale Configurational Transformation of a Naturally Curved Molecule.

    Science.gov (United States)

    Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K

    2005-01-01

    Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the

  4. Photonic band structures solved by a plane-wave-based transfer-matrix method.

    Science.gov (United States)

    Li, Zhi-Yuan; Lin, Lan-Lan

    2003-04-01

    Transfer-matrix methods adopting a plane-wave basis have been routinely used to calculate the scattering of electromagnetic waves by general multilayer gratings and photonic crystal slabs. In this paper we show that this technique, when combined with Bloch's theorem, can be extended to solve the photonic band structure for 2D and 3D photonic crystal structures. Three different eigensolution schemes to solve the traditional band diagrams along high-symmetry lines in the first Brillouin zone of the crystal are discussed. Optimal rules for the Fourier expansion over the dielectric function and electromagnetic fields with discontinuities occurring at the boundary of different material domains have been employed to accelerate the convergence of numerical computation. Application of this method to an important class of 3D layer-by-layer photonic crystals reveals the superior convergency of this different approach over the conventional plane-wave expansion method.

  5. Photonic band structures solved by a plane-wave-based transfer-matrix method

    International Nuclear Information System (INIS)

    Li Zhiyuan; Lin Lanlan

    2003-01-01

    Transfer-matrix methods adopting a plane-wave basis have been routinely used to calculate the scattering of electromagnetic waves by general multilayer gratings and photonic crystal slabs. In this paper we show that this technique, when combined with Bloch's theorem, can be extended to solve the photonic band structure for 2D and 3D photonic crystal structures. Three different eigensolution schemes to solve the traditional band diagrams along high-symmetry lines in the first Brillouin zone of the crystal are discussed. Optimal rules for the Fourier expansion over the dielectric function and electromagnetic fields with discontinuities occurring at the boundary of different material domains have been employed to accelerate the convergence of numerical computation. Application of this method to an important class of 3D layer-by-layer photonic crystals reveals the superior convergency of this different approach over the conventional plane-wave expansion method

  6. Identification of unique repeated patterns, location of mutation in DNA finger printing using artificial intelligence technique.

    Science.gov (United States)

    Mukunthan, B; Nagaveni, N

    2014-01-01

    In genetic engineering, the conventional techniques and algorithms employed by forensic scientists to assist in the identification of individuals on the basis of their respective DNA profiles involve complex computational steps and mathematical formulae; moreover, identifying the location of a mutation in a genomic sequence in the laboratory is still a demanding task. This novel approach provides the ability to solve problems that do not have an algorithmic solution or whose available solutions are too complex to be found. The blend of bioinformatics and neural network techniques results in an efficient DNA pattern analysis algorithm with high prediction accuracy.
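    The authors' neural-network method is not reproduced here; the sketch below only illustrates the two underlying sub-tasks in plain Python, locating repeated k-mers in a sequence and flagging point-mutation positions between an aligned reference and sample, with made-up sequences and a made-up k.

      from collections import defaultdict

      def repeated_kmers(seq, k=6, min_count=2):
          """Return k-mers occurring at least min_count times, with their start positions."""
          positions = defaultdict(list)
          for i in range(len(seq) - k + 1):
              positions[seq[i:i + k]].append(i)
          return {kmer: pos for kmer, pos in positions.items() if len(pos) >= min_count}

      def mutation_sites(reference, sample):
          """Positions where two equal-length aligned sequences differ (point mutations)."""
          return [i for i, (a, b) in enumerate(zip(reference, sample)) if a != b]

      ref    = "ATGGCTAGCTAGGATCCGATCGATGGCTAG"
      sample = "ATGGCTAGCTAGGATCAGATCGATGGCTAG"
      print(repeated_kmers(ref))
      print("mutations at:", mutation_sites(ref, sample))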

  7. Nonequilibrium Phase Transitions Associated with DNA Replication

    Science.gov (United States)

    2011-02-11

    (DNA polymerases) catalyzing the growth of a DNA primer strand (the nascent chain of nucleotides complementary to the template strand) based on Watson-Crick base pairing; the fraction (error rate) of incorporated monomers that are not the correct Watson-Crick complementary base of the template can be obtained from the model. Hyung-June Woo and Anders Wallqvist, Biotechnology High Performance Computing

  8. Quasiparticle properties of DNA bases from GW calculations in a Wannier basis

    Science.gov (United States)

    Qian, Xiaofeng; Marzari, Nicola; Umari, Paolo

    2009-03-01

    The quasiparticle GW-Wannier (GWW) approach [1] has been recently developed to overcome the size limitations of conventional planewave GW calculations. By taking advantage of the localization properties of the maximally-localized Wannier functions and choosing a small polarization basis set, we reduce the number of Bloch wavefunction products required for the evaluation of dynamical polarizabilities, which in turn greatly reduces memory requirements and computational cost. We apply GWW to study the quasiparticle properties of different DNA bases and base pairs, and solvation effects on the energy gap, demonstrating in the process the key advantages of this approach. [1] P. Umari, G. Stenuit, and S. Baroni, cond-mat/0811.1453

  9. Mapping Base Modifications in DNA by Transverse-Current Sequencing

    Science.gov (United States)

    Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.

    2018-02-01

    Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.

  10. Scilab software as an alternative low-cost computing in solving the linear equations problem

    Science.gov (United States)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used both in teaching and research. These packages consist of licensed (proprietary) and open source (non-proprietary) software. One of the reasons to use such a package is the complexity of mathematical functions (i.e., linear problems). Also, the number of variables in a linear or non-linear function has increased. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in the mathematics area (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the additional numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab software was used to propose some activities related to mathematical models. In this experiment, four numerical methods, namely Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition, were implemented. The results of this study show that routines or procedures for these numerical methods were created and explored using Scilab, and that such routines could serve as teaching material for the course.
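    As a NumPy/SciPy counterpart (not Scilab) of one of the four routines mentioned, the sketch below solves a small made-up 3x3 system with an LU decomposition, assuming SciPy is available; a Scilab session would use its analogous built-in operations.

      import numpy as np
      from scipy.linalg import lu_factor, lu_solve

      # Small made-up system A x = b, solved via LU decomposition (one of the four
      # methods discussed: Gaussian elimination, Gauss-Jordan, inverse matrix, LU).
      A = np.array([[4.0, -2.0, 1.0],
                    [3.0,  6.0, -4.0],
                    [2.0,  1.0,  8.0]])
      b = np.array([1.0, 2.0, 3.0])

      lu, piv = lu_factor(A)          # PA = LU factorization
      x = lu_solve((lu, piv), b)      # forward and back substitution
      print(x, np.allclose(A @ x, b))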

  11. Solving-Problems and Hypermedia Systems

    Directory of Open Access Journals (Sweden)

    Ricardo LÓPEZ FERNÁNDEZ

    2009-06-01

    Full Text Available Problem solving and transfer constitute two related, essential nuclei in cognitive research and in mathematics education. It is no accident that, from the very beginning of research on applying computer science to the teaching of mathematics, cybernetic models were developed that simulated problem-solving processes and transfer contexts (GPS, 1969, and IDEA (Interactive Decision Envisioning Aid), Pea, BrunerCohen, Webster & Mellen, 1987). The present article analyzes what the new hypermedia technologies can contribute to development in this respect, providing applications suitable for implementing learning processes for heuristic thinking and for the capacity of «transfer». From our perspective, and from the experience we have developed in this field, carrying out an analysis of the theories on problem solving requires a prior interpretation of the central aspects of the theories of problem solving and transfer, starting from the classical theories on information processing. In this sense, both the dual-memory theory and the more recent theory of J. Anderson (1993), based on mechanisms of activation of information nodes, allow a suggestive interpretation of the mental mechanisms that operate in heuristic processes. On the basis of this analysis, the present article develops a theoretical interpretation of the function of hypermedia-based supports, advancing the definition of a necessary theoretical body, taking into account that ongoing practical experimentation consistently points to the efficiency and effectiveness of hypermedia support as a communication mechanism in heuristic learning processes.

  12. Prompting in Web-Based Environments: Supporting Self-Monitoring and Problem Solving Skills in College Students

    Science.gov (United States)

    Kauffman, Douglas F.; Ge, Xun; Xie, Kui; Chen, Ching-Huei

    2008-01-01

    This study explored metacognition and how automated instructional support in the form of problem-solving and self-reflection prompts influenced students' capacity to solve complex problems in a Web-based learning environment. Specifically, we examined the independent and interactive effects of problem-solving prompts and reflection prompts on…

  13. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are mature and widely applied in engineering. To achieve high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the common processing and numerical calculations. Then, to improve commonality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and the complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The result proves that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability and high efficiency.
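    A minimal NumPy sketch of the kind of time-frequency domain computation such an architecture accelerates is given below: Wiener-style deconvolution built from two-dimensional FFT/IFFT and element-wise complex arithmetic. The box blur, the regularization constant k and the random test image are assumptions for illustration only.

      import numpy as np

      def wiener_deconvolve(blurred, psf, k=0.01):
          """Frequency-domain restoration: X = conj(H) * Y / (|H|^2 + k), via 2-D FFT/IFFT."""
          H = np.fft.fft2(psf, s=blurred.shape)
          Y = np.fft.fft2(blurred)
          X = np.conj(H) * Y / (np.abs(H) ** 2 + k)
          return np.real(np.fft.ifft2(X))

      # Made-up test: circularly blur a random "image" with a 3x3 box PSF, then restore it.
      rng = np.random.default_rng(0)
      image = rng.random((64, 64))
      psf = np.ones((3, 3)) / 9.0
      blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
      restored = wiener_deconvolve(blurred, psf, k=1e-3)
      print("mean restoration error:", np.abs(restored - image).mean())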

  14. An Application of Computer Vision Systems to Solve the Problem of Unmanned Aerial Vehicle Control

    Directory of Open Access Journals (Sweden)

    Aksenov Alexey Y.

    2014-09-01

    Full Text Available The paper considers an approach for applying computer vision systems to solve the problem of unmanned aerial vehicle control. Processing of the images obtained through the onboard camera is required for absolute positioning of the aerial platform (automatic landing and take-off, hovering, etc.). The proposed method combines the advantages of existing systems and gives the ability to perform hovering over a given point, exact take-off and landing. The limitations of the implemented methods are determined, and an algorithm is proposed to combine them in order to improve the efficiency.

  15. Computational electromagnetics and model-based inversion a modern paradigm for eddy-current nondestructive evaluation

    CERN Document Server

    Sabbagh, Harold A; Sabbagh, Elias H; Aldrin, John C; Knopp, Jeremy S

    2013-01-01

    Computational Electromagnetics and Model-Based Inversion: A Modern Paradigm for Eddy Current Nondestructive Evaluation describes the natural marriage of the computer to eddy-current NDE. Three distinct topics are emphasized in the book: (a) fundamental mathematical principles of volume-integral equations as a subset of computational electromagnetics, (b) mathematical algorithms applied to signal-processing and inverse scattering problems, and (c) applications of these two topics to problems in which real and model data are used. By showing how mathematics and the computer can solve problems more effectively than current analog practices, this book defines the modern technology of eddy-current NDE. This book will be useful to advanced students and practitioners in the fields of computational electromagnetics, electromagnetic inverse-scattering theory, nondestructive evaluation, materials evaluation and biomedical imaging. Users of eddy-current NDE technology in industries as varied as nuclear power, aerospace,...

  16. Direct method of solving finite difference nonlinear equations for multicomponent diffusion in a gas centrifuge

    International Nuclear Information System (INIS)

    Potemki, Valeri G.; Borisevich, Valentine D.; Yupatov, Sergei V.

    1996-01-01

    This paper describes the next evolution step in the development of the direct method for solving systems of Nonlinear Algebraic Equations (SNAE). These equations arise from the finite difference approximation of the original nonlinear partial differential equations (PDE). The method has been extended to SNAE with three variables. The solution of the SNAE is based on the Reiterating General Singular Value Decomposition of rectangular matrix pencils (RGSVD-algorithm). In contrast to the computer algebra algorithm in integer arithmetic based on the reduction to the Groebner basis, this algorithm works in floating-point arithmetic and realizes the reduction to the Kronecker form. The possibilities of the method are illustrated by the example of solving the one-dimensional diffusion equation for a 3-component model isotope mixture in a gas centrifuge. The implicit scheme for the finite difference equations, without simplifying the nonlinear properties of the original equations, is realized. The technique offered provides convergence to the solution in a single run. The Toolbox SNAE is developed in the framework of the high performance numeric computation and visualization software MATLAB. It includes more than 30 modules in the MATLAB language for solving SNAE with two and three variables. (author)

  17. In vivo formation and repair of DNA double-strand breaks after computed tomography examinations

    OpenAIRE

    Löbrich, Markus; Rief, Nicole; Kühne, Martin; Heckmann, Martina; Fleckenstein, Jochen; Rübe, Christian; Uder, Michael

    2005-01-01

    Ionizing radiation can lead to a variety of deleterious effects in humans, most importantly to the induction of cancer. DNA double-strand breaks (DSBs) are among the most significant genetic lesions introduced by ionizing radiation that can initiate carcinogenesis. We have enumerated γ-H2AX foci as a measure for DSBs in lymphocytes from individuals undergoing computed tomography examination of the thorax and/or the abdomen. The number of DSBs induced by computed tomography examination was fou...

  18. Taylor's series method for solving the nonlinear point kinetics equations

    International Nuclear Information System (INIS)

    Nahla, Abdallah A.

    2011-01-01

    Highlights: → Taylor's series method for the nonlinear point kinetics equations is applied. → The general order of derivatives is derived for this system. → The stability of Taylor's series method is studied. → Taylor's series method is A-stable for negative reactivity. → Taylor's series method is an accurate computational technique. - Abstract: Taylor's series method for solving the point reactor kinetics equations with multiple groups of delayed neutrons in the presence of Newtonian temperature feedback reactivity is applied and programmed in FORTRAN. This system is a set of coupled stiff nonlinear ordinary differential equations. The numerical method is based on the different order derivatives of the neutron density, the precursor concentrations of the i-th group of delayed neutrons and the reactivity. The r-th order derivatives are derived. The stability of Taylor's series method is discussed. Three sets of applications: step, ramp and temperature feedback reactivities are computed. Taylor's series method is an accurate computational technique and is stable for negative step, negative ramp and temperature feedback reactivities. This method is more useful than the traditional methods for solving the nonlinear point kinetics equations.
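    For the special case of constant reactivity and no temperature feedback the point kinetics equations are linear, y' = A y, and a Taylor's series step reduces to the truncated sum of h^k A^k y / k!. The sketch below illustrates only that special case with one delayed-neutron group and made-up parameters; the paper's general nonlinear treatment with feedback is not reproduced.

      import numpy as np

      # Point kinetics with one delayed-neutron group and constant reactivity rho:
      #   dn/dt = (rho - beta)/Lambda * n + lam * C
      #   dC/dt = beta/Lambda * n - lam * C
      beta, lam, Lam, rho = 0.0075, 0.08, 1e-4, -0.003   # made-up illustrative parameters
      A = np.array([[(rho - beta) / Lam, lam],
                    [beta / Lam, -lam]])

      def taylor_step(y, h, order=8):
          """One Taylor's series step for y' = A y: y(t+h) = sum_k h^k A^k y / k!."""
          term, out = y.copy(), y.copy()
          for k in range(1, order + 1):
              term = (h / k) * (A @ term)
              out += term
          return out

      y = np.array([1.0, beta / (lam * Lam)])    # steady state of the critical reactor at n = 1
      for _ in range(1000):
          y = taylor_step(y, h=1e-3)
      print("neutron density after 1 s:", y[0])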

  19. Problem Solving with General Semantics.

    Science.gov (United States)

    Hewson, David

    1996-01-01

    Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)

  20. Metal-mediated DNA base pairing: alternatives to hydrogen-bonded Watson-Crick base pairs.

    Science.gov (United States)

    Takezawa, Yusuke; Shionoya, Mitsuhiko

    2012-12-18

    With its capacity to store and transfer the genetic information within a sequence of monomers, DNA forms its central role in chemical evolution through replication and amplification. This elegant behavior is largely based on highly specific molecular recognition between nucleobases through the specific hydrogen bonds in the Watson-Crick base pairing system. While the native base pairs have been amazingly sophisticated through the long history of evolution, synthetic chemists have devoted considerable efforts to create alternative base pairing systems in recent decades. Most of these new systems were designed based on the shape complementarity of the pairs or the rearrangement of hydrogen-bonding patterns. We wondered whether metal coordination could serve as an alternative driving force for DNA base pairing and why hydrogen bonding was selected on Earth in the course of molecular evolution. Therefore, we envisioned an alternative design strategy: we replaced hydrogen bonding with another important scheme in biological systems, metal-coordination bonding. In this Account, we provide an overview of the chemistry of metal-mediated base pairing including basic concepts, molecular design, characteristic structures and properties, and possible applications of DNA-based molecular systems. We describe several examples of artificial metal-mediated base pairs, such as Cu(2+)-mediated hydroxypyridone base pair, H-Cu(2+)-H (where H denotes a hydroxypyridone-bearing nucleoside), developed by us and other researchers. To design the metallo-base pairs we carefully chose appropriate combinations of ligand-bearing nucleosides and metal ions. As expected from their stronger bonding through metal coordination, DNA duplexes possessing metallo-base pairs exhibited higher thermal stability than natural hydrogen-bonded DNAs. Furthermore, we could also use metal-mediated base pairs to construct or induce other high-order structures. These features could lead to metal-responsive functional

  1. Spreadsheet-based program for alignment of overlapping DNA sequences.

    Science.gov (United States)

    Anbazhagan, R; Gabrielson, E

    1999-06-01

    Molecular biology laboratories frequently face the challenge of aligning small overlapping DNA sequences derived from a long DNA segment. Here, we present a short program that can be used to adapt Excel spreadsheets as a tool for aligning DNA sequences, regardless of their orientation. The program runs on any Windows or Macintosh operating system computer with Excel 97 or Excel 98. The program is available for use as an Excel file, which can be downloaded from the BioTechniques Web site. Upon execution, the program opens a specially designed customized workbook and is capable of identifying overlapping regions between two sequence fragments and displaying the sequence alignment. It also performs a number of specialized functions such as recognition of restriction enzyme cutting sites and CpG island mapping without costly specialized software.
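    The Excel program itself is not reproduced here; the short Python sketch below only illustrates the core operation, finding the longest exact suffix-prefix overlap between two fragments in either orientation, with made-up fragments and an arbitrary minimum overlap length.

      COMP = str.maketrans("ACGT", "TGCA")

      def revcomp(s):
          return s.translate(COMP)[::-1]

      def best_overlap(a, b, min_len=5):
          """Longest exact overlap where a suffix of a matches a prefix of b (or of revcomp(b))."""
          best = (0, None)
          for candidate, label in ((b, "forward"), (revcomp(b), "reverse-complement")):
              for k in range(min(len(a), len(candidate)), min_len - 1, -1):
                  if a.endswith(candidate[:k]):
                      if k > best[0]:
                          best = (k, label)
                      break
          return best

      frag1 = "ATGCCGTAGGCTTACGGATC"
      frag2 = "TTACGGATCAAGTCCGGTAA"
      print(best_overlap(frag1, frag2))   # expect a 9-base overlap in the forward orientation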

  2. Computational Principle and Performance Evaluation of Coherent Ising Machine Based on Degenerate Optical Parametric Oscillator Network

    Directory of Open Access Journals (Sweden)

    Yoshitaka Haribara

    2016-04-01

    Full Text Available We present the operational principle of a coherent Ising machine (CIM) based on a degenerate optical parametric oscillator (DOPO) network. A quantum theory of the CIM is formulated, and the computational ability of the CIM is evaluated by numerical simulation based on c-number stochastic differential equations. We also discuss the advanced CIM with quantum measurement-feedback control and various problems which can be solved by the CIM.
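    The toy simulation below is only in the spirit of the c-number DOPO-network model (it is not the authors' formulation and omits the measurement-feedback scheme): each in-phase amplitude follows da_i/dt = (p - 1 - a_i^2) a_i + xi * sum_j J_ij a_j plus noise, and the signs of the final amplitudes are read out as Ising spins. The pump, coupling strength, noise level and the 4-spin antiferromagnetic ring are all made up.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy 4-spin antiferromagnetic ring: J[i, j] = -1 for nearest neighbours.
      J = np.array([[ 0, -1,  0, -1],
                    [-1,  0, -1,  0],
                    [ 0, -1,  0, -1],
                    [-1,  0, -1,  0]], dtype=float)

      def simulate_cim(J, pump=1.1, xi=0.1, dt=0.01, steps=5000, noise=0.01):
          """Euler-Maruyama integration of a simplified c-number DOPO-network model."""
          n = len(J)
          a = 0.001 * rng.standard_normal(n)          # in-phase amplitudes near the vacuum
          for _ in range(steps):
              drift = (pump - 1.0 - a ** 2) * a + xi * (J @ a)
              a += dt * drift + np.sqrt(dt) * noise * rng.standard_normal(n)
          return np.sign(a)

      spins = simulate_cim(J)
      energy = -0.5 * spins @ J @ spins               # Ising energy of the read-out configuration
      print("spins:", spins, "Ising energy:", energy)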

  3. Analytical Devices Based on Direct Synthesis of DNA on Paper.

    Science.gov (United States)

    Glavan, Ana C; Niu, Jia; Chen, Zhen; Güder, Firat; Cheng, Chao-Min; Liu, David; Whitesides, George M

    2016-01-05

    This paper addresses a growing need in clinical diagnostics for parallel, multiplex analysis of biomarkers from small biological samples. It describes a new procedure for assembling arrays of ssDNA and proteins on paper. This method starts with the synthesis of DNA oligonucleotides covalently linked to paper and proceeds to assemble microzones of DNA-conjugated paper into arrays capable of simultaneously capturing DNA, DNA-conjugated protein antigens, and DNA-conjugated antibodies. The synthesis of ssDNA oligonucleotides on paper is convenient and effective, with 32% of the oligonucleotides cleaved and eluted from the paper substrate being full-length by HPLC for a 32-mer. These ssDNA arrays can be used to detect fluorophore-linked DNA oligonucleotides in solution and, as the basis for DNA-directed assembly of arrays of DNA-conjugated capture antibodies on paper, to detect protein antigens by sandwich ELISAs. Paper-anchored ssDNA arrays with different sequences can be used to assemble paper-based devices capable of detecting DNA and antibodies in the same device and enable simple microfluidic paper-based devices.

  4. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this departure point.
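
    As a concrete illustration of why natural frequencies can simplify the numerical work, the sketch below solves one textbook-style diagnostic problem twice, once in probability format and once in natural-frequency format. The numbers are classic illustrative values, not data from the cited study.

```python
# One Bayesian textbook-style problem computed in two formats.
# The numbers (1% prevalence, 80% sensitivity, 9.6% false-positive rate)
# are illustrative, not taken from the cited study.

# Probability format: Bayes' rule.
prior, sens, fpr = 0.01, 0.80, 0.096
posterior = (sens * prior) / (sens * prior + fpr * (1 - prior))
print(f"P(disease | positive) = {posterior:.3f}")

# Natural-frequency format: imagine 1000 people.
n = 1000
sick = n * prior                       # 10 people have the disease
true_pos = sick * sens                 # 8 of them test positive
false_pos = (n - sick) * fpr           # ~95 healthy people also test positive
print(f"natural frequencies: {true_pos:.0f} / ({true_pos:.0f} + {false_pos:.0f})"
      f" = {true_pos / (true_pos + false_pos):.3f}")
```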

  5. A modified artificial bee colony based on chaos theory for solving non-convex emission/economic dispatch

    International Nuclear Information System (INIS)

    Shayeghi, H.; Ghasemi, A.

    2014-01-01

    Highlights: • This paper presents a multi-objective CIABC based on CLS theory for solving the EED problem. • The EED problem is formulated as a non-convex multi-objective optimization problem. • Three test systems, including practical constraints, are considered to demonstrate its efficiency. • The results show significant improvement over those reported in the literature. - Abstract: In this paper, a modified ABC algorithm based on chaos theory, namely CIABC, is developed and applied to a multi-objective EED problem to minimize three conflicting objective functions with non-smooth and non-convex generator fuel cost characteristics while satisfying the operational constraints. The proposed method uses a Chaotic Local Search (CLS) to enhance the self-searching ability of the original ABC algorithm in finding feasible optimal solutions of the EED problem. Many linear and nonlinear constraints, such as generation limits, transmission line loss, security constraints and non-smooth cost functions, are considered as dynamic operational constraints. Moreover, a method based on fuzzy set theory is employed to extract one of the Pareto-optimal solutions as the best compromise solution. The proposed multi-objective evolutionary method has been applied to the standard IEEE 30-bus system with six generators, to a fourteen-generator system, and to a system of 40 thermal generating units, as small, medium and large test power systems, respectively. The numerical results obtained with the proposed method are compared, in tables and figures, with other evolutionary algorithms from the literature. The results indicate that the proposed CIABC algorithm surpasses the other available methods in terms of computational efficiency and solution quality.
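
    To make the chaotic local search (CLS) ingredient concrete, here is a minimal sketch of a logistic-map-driven refinement step around the current best solution, of the kind often grafted onto ABC variants. The objective function, bounds and parameters are illustrative assumptions, not the CIABC implementation of the record.

```python
# Minimal chaotic local search (CLS) step of the kind used to refine the best
# solution in chaos-enhanced ABC variants. Illustrative only.
import numpy as np

def sphere(x):                     # placeholder objective (minimization)
    return float(np.sum(x**2))

def chaotic_local_search(best, f, lo, hi, iters=50, radius=0.1, seed=0.7):
    """Perturb `best` with logistic-map chaos inside a shrinking neighborhood."""
    z = np.full_like(best, seed)   # chaotic variables in (0, 1)
    x_best, f_best = best.copy(), f(best)
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)                    # logistic map, r = 4
        trial = x_best + radius * (hi - lo) * (2.0 * z - 1.0)
        trial = np.clip(trial, lo, hi)
        if f(trial) < f_best:                      # greedy acceptance
            x_best, f_best = trial, f(trial)
        radius *= 0.98                             # shrink the search radius
    return x_best, f_best

x0 = np.array([0.8, -0.5, 0.3])
xb, fb = chaotic_local_search(x0, sphere, lo=-1.0, hi=1.0)
print(xb, fb)
```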

  6. Oxidative DNA base modifications as factors in carcinogenesis

    International Nuclear Information System (INIS)

    Olinski, R.; Jaruga, P.; Zastawny, T.H.

    1998-01-01

    Reactive oxygen species can cause extensive DNA modifications including modified bases. Some of the DNA base damage has been found to possess premutagenic properties. Therefore, if not repaired, it can contribute to carcinogenesis. We have found elevated amounts of modified bases in cancerous and precancerous tissues as compared with normal tissues. Most of the agents used in anticancer therapy are paradoxically responsible for induction of secondary malignancies and some of them may generate free radicals. The results of our experiments provide evidence that exposure of cancer patients to therapeutic doses of ionizing radiation and anticancer drugs causes base modifications in genomic DNA of lymphocytes. Some of these base damages could lead to mutagenesis in critical genes and ultimately to secondary cancers such as leukemias. This may point to an important role of oxidative base damage in cancer initiation. Alternatively, the increased level of the modified base products may contribute to genetic instability and metastatic potential of tumor cells. (author)

  7. A DNA Structure-Based Bionic Wavelet Transform and Its Application to DNA Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Fei Chen

    2003-01-01

    Full Text Available DNA sequence analysis is of great significance for increasing our understanding of genomic functions. An important task facing us is the exploration of hidden structural information stored in the DNA sequence. This paper introduces a DNA structure-based adaptive wavelet transform (WT), the bionic wavelet transform (BWT), for DNA sequence analysis. The symbolic DNA sequence can be separated into four channels of indicator sequences. An adaptive symbol-to-number mapping, determined from the structural features of the DNA sequence, was introduced into the WT. It can adjust the weight value of each channel to maximise the useful energy distribution of the whole BWT output. The performance of the proposed BWT was examined by analysing synthetic and real DNA sequences. Results show that the BWT performs better than the traditional WT in yielding a greater energy distribution. This new BWT method should be useful for the detection of latent structural features in future DNA sequence analysis.
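
    The "four channels of indicator sequences" mentioned above is the standard binary (Voss-style) representation of a symbolic DNA string; a minimal sketch is given below. The adaptive channel weighting of the BWT itself is not reproduced.

```python
# Split a symbolic DNA sequence into four binary indicator channels
# (Voss representation), the input representation mentioned in the record.
# The adaptive channel weighting of the BWT itself is not reproduced here.
import numpy as np

def indicator_channels(seq: str) -> dict:
    seq = seq.upper()
    return {base: np.array([1 if s == base else 0 for s in seq], dtype=float)
            for base in "ACGT"}

channels = indicator_channels("ATGCGCGATATTTACG")
for base, u in channels.items():
    print(base, u.astype(int))
```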

  8. Thai Language Sentence Similarity Computation Based on Syntactic Structure and Semantic Vector

    Science.gov (United States)

    Wang, Hongbin; Feng, Yinhan; Cheng, Liang

    2018-03-01

    Sentence similarity computation plays an increasingly important role in text mining, Web page retrieval, machine translation, speech recognition and question answering systems. Thai is a resource-scarce language: unlike Chinese, it lacks resources such as HowNet and CiLin, so research on Thai sentence similarity faces particular challenges. To address this problem, this paper proposes a novel method to compute the similarity of Thai sentences based on syntactic structure and semantic vectors. The method first uses Part-of-Speech (POS) dependencies to calculate the syntactic structure similarity of the two sentences, and then uses word vectors to calculate their semantic similarity. Finally, the two measures are combined to calculate the overall similarity of the two Thai sentences. The proposed method considers not only semantics but also sentence syntactic structure. The experimental results show that this method is feasible for Thai sentence similarity computation.
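
    A minimal sketch of the two-part score described above, combining a crude syntactic component with a cosine similarity over averaged word vectors. The toy vectors, POS tags and the weighting parameter are placeholders, not the paper's parser or resources.

```python
# Combine a syntactic-structure score and a semantic (word-vector) score into
# one sentence similarity, as sketched in the record. Vectors, POS tags and the
# weight alpha are made-up placeholders, not the paper's resources.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def semantic_sim(tokens_a, tokens_b, vectors):
    va = np.mean([vectors[t] for t in tokens_a], axis=0)
    vb = np.mean([vectors[t] for t in tokens_b], axis=0)
    return cosine(va, vb)

def syntactic_sim(pos_a, pos_b):
    """Crude proxy: overlap of (POS, next-POS) dependency-like bigrams."""
    bg_a = set(zip(pos_a, pos_a[1:]))
    bg_b = set(zip(pos_b, pos_b[1:]))
    return len(bg_a & bg_b) / max(len(bg_a | bg_b), 1)

def sentence_sim(tokens_a, pos_a, tokens_b, pos_b, vectors, alpha=0.5):
    return alpha * syntactic_sim(pos_a, pos_b) + \
           (1 - alpha) * semantic_sim(tokens_a, tokens_b, vectors)

# Tiny toy vocabulary (2-d vectors for readability).
vectors = {"w1": np.array([1.0, 0.1]), "w2": np.array([0.2, 1.0]),
           "w3": np.array([0.9, 0.2]), "w4": np.array([0.1, 0.9])}
print(sentence_sim(["w1", "w2"], ["NOUN", "VERB"],
                   ["w3", "w4"], ["NOUN", "VERB"], vectors))
```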

  9. An image encryption scheme based on the MLNCML system using DNA sequences

    Science.gov (United States)

    Zhang, Ying-Qian; Wang, Xing-Yuan; Liu, Jia; Chi, Ze-Lin

    2016-07-01

    We propose a new image encryption scheme based on the spatiotemporal chaos of the Mixed Linear-Nonlinear Coupled Map Lattices (MLNCML). This spatiotemporal chaotic system has more cryptographic features in its dynamics than the system of Coupled Map Lattices (CML). In the proposed scheme, we employ the strategy of DNA computing and a one-time-pad encryption policy, which can enhance the sensitivity to the plaintext and resist differential attacks, brute-force attacks, statistical attacks and plaintext attacks. Simulation results and theoretical analysis indicate that the proposed scheme has high security.
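
    To illustrate the "DNA computing plus one-time-pad" ingredient in isolation, the toy sketch below encodes bytes as DNA bases and XORs them with a chaotic keystream. The MLNCML lattice, the permutation/diffusion stages and the parameter choices of the actual scheme are not reproduced, and this toy is not cryptographically secure.

```python
# Toy DNA-coded stream cipher: bytes -> DNA bases, XOR with a logistic-map
# keystream. The MLNCML lattice and permutation/diffusion stages of the actual
# scheme are NOT reproduced; this is illustrative only and not secure.
BASES = "ACGT"                       # 2 bits per base: A=00, C=01, G=10, T=11

def bytes_to_dna(data: bytes) -> str:
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def dna_to_bytes(dna: str) -> bytes:
    out = bytearray()
    for i in range(0, len(dna), 4):
        b = 0
        for ch in dna[i:i + 4]:
            b = (b << 2) | BASES.index(ch)
        out.append(b)
    return bytes(out)

def keystream(n: int, x0=0.37, r=3.99) -> bytes:
    x, ks = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)        # logistic map as a stand-in chaotic source
        ks.append(int(x * 256) & 0xFF)
    return bytes(ks)

plain = b"pixel data"
ks = keystream(len(plain))
cipher_dna = bytes_to_dna(bytes(p ^ k for p, k in zip(plain, ks)))
recovered = bytes(c ^ k for c, k in zip(dna_to_bytes(cipher_dna), ks))
print(cipher_dna)
print(recovered)                     # b'pixel data'
```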

  10. Small molecules, inhibitors of DNA-PK, targeting DNA repair and beyond

    Directory of Open Access Journals (Sweden)

    David eDavidson

    2013-01-01

    Full Text Available Many current chemotherapies function by damaging genomic DNA in rapidly dividing cells, ultimately leading to cell death. This therapeutic approach differentially targets cancer cells, which generally display rapid cell division compared to normal tissue cells. However, although these treatments are initially effective in arresting tumor growth and reducing tumor burden, resistance and disease progression eventually occur. A major mechanism underlying this resistance is increased levels of cellular DNA repair. Most cells have complex mechanisms in place to repair DNA damage that occurs due to environmental exposures or normal metabolic processes. These systems, initially overwhelmed when faced with chemotherapy-induced DNA damage, become more efficient under constant selective pressure, and as a result chemotherapies become less effective. Thus, inhibiting DNA repair pathways using target-specific small molecule inhibitors may overcome cellular resistance to DNA-damaging chemotherapies. Non-homologous end joining (NHEJ), a major mechanism for the repair of double-strand breaks (DSBs) in DNA, is regulated in part by the serine/threonine kinase, DNA-dependent protein kinase (DNA-PK). The DNA-PK holoenzyme acts as a scaffold protein, tethering broken DNA ends and recruiting other repair molecules. It also has enzymatic activity that may be involved in DNA damage signaling. Because of its central role in the repair of DSBs, DNA-PK has been the focus of a number of small molecule studies. In these studies, specific DNA-PK inhibitors have shown efficacy in synergizing chemotherapies in vitro. However, compounds currently known to specifically inhibit DNA-PK are limited by poor pharmacokinetics: these compounds have poor solubility and high metabolic lability in vivo, leading to short serum half-lives. Future improvement in DNA-PK inhibition will likely be achieved by designing new molecules based on the recently reported crystallographic structure of DNA

  11. Enhanced base excision repair capacity in carotid atherosclerosis may protect nuclear DNA but not mitochondrial DNA

    DEFF Research Database (Denmark)

    Skarpengland, Tonje; B. Dahl, Tuva; Skjelland, Mona

    2016-01-01

    Lesional and systemic oxidative stress has been implicated in the pathogenesis of atherosclerosis, potentially leading to accumulation of DNA base lesions within atherosclerotic plaques. Although base excision repair (BER) is a major pathway counteracting oxidative DNA damage, our knowledge on BER...

  12. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    Science.gov (United States)

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-08-01

    In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving teaching materials and assess their effectiveness in improving students' ability to solve problems in university-level physics. Firstly, we analyze the effect of using simulation-based materials in the development of students' skills in employing procedures that are typically used in the scientific method of problem-solving. We found that a significant percentage of the experimental students used expert-type scientific procedures such as qualitative analysis of the problem, making hypotheses, and analysis of results. At the end of the course, only a minority of the students persisted with habits based solely on mathematical equations. Secondly, we compare the effectiveness in terms of problem-solving of the experimental group students with the students who are taught conventionally. We found that the implementation of the problem-solving strategy improved experimental students' results regarding obtaining a correct solution from the academic point of view, in standard textbook problems. Thirdly, we explore students' satisfaction with simulation-based problem-solving teaching materials and we found that the majority appear to be satisfied with the methodology proposed and took on a favorable attitude to learning problem-solving. The research was carried out among first-year Engineering Degree students.

  13. Computer-aided design of DNA origami structures.

    Science.gov (United States)

    Selnihhin, Denis; Andersen, Ebbe Sloth

    2015-01-01

    The DNA origami method enables the creation of complex nanoscale objects that can be used to organize molecular components and to function as reconfigurable mechanical devices. Of relevance to synthetic biology, DNA origami structures can be delivered to cells where they can perform complicated sense-and-act tasks, and can be used as scaffolds to organize enzymes for enhanced synthesis. The design of DNA origami structures is a complicated matter and is most efficiently done using dedicated software packages. This chapter describes a procedure for designing DNA origami structures using a combination of state-of-the-art software tools. First, we introduce the basic method for calculating crossover positions between DNA helices and the standard crossover patterns for flat, square, and honeycomb DNA origami lattices. Second, we provide a step-by-step tutorial for the design of a simple DNA origami biosensor device, from schematic idea to blueprint creation and to 3D modeling and animation, and explain how careful modeling can facilitate later experimentation in the laboratory.

  14. Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-Rich DNA, and nuclear DNA analyses

    Science.gov (United States)

    Freeman, S.; Pham, M.; Rodriguez, R.J.

    1993-01-01

    Molecular genotyping of Colletotrichum species based on arbitrarily primed PCR, A + T-rich DNA, and nuclear DNA analyses. Experimental Mycology 17, 309-322. Isolates of Colletotrichum were grouped into 10 separate species based on arbitrarily primed PCR (ap-PCR), A + T-rich DNA (AT-DNA) and nuclear DNA banding patterns. In general, the grouping of Colletotrichum isolates by these molecular approaches corresponded to that obtained by classical taxonomic identification; however, some exceptions were observed. PCR amplification of genomic DNA using four different primers allowed for reliable differentiation between isolates of the 10 species. HaeIII digestion patterns of AT-DNA also distinguished between species of Colletotrichum by generating species-specific band patterns. In addition, hybridization of the repetitive DNA element (GcpR1) to genomic DNA identified a unique set of PstI-digested nuclear DNA fragments in each of the 10 species of Colletotrichum tested. Multiple isolates of C. acutatum, C. coccodes, C. fragariae, C. lindemuthianum, C. magna, C. orbiculare, C. graminicola from maize, and C. graminicola from sorghum showed 86-100% intraspecies similarity based on ap-PCR and AT-DNA analyses. Interspecies similarity determined by ap-PCR and AT-DNA analyses varied between 0 and 33%. Three distinct banding patterns were detected in isolates of C. gloeosporioides from strawberry. Similarly, three different banding patterns were observed among isolates of C. musae from diseased banana.

  15. A nuclear DNA-based species determination and DNA quantification assay for common poultry species.

    Science.gov (United States)

    Ng, J; Satkoski, J; Premasuthan, A; Kanthaswamy, S

    2014-12-01

    DNA testing for food authentication and quality control requires sensitive species-specific quantification of nuclear DNA from complex and unknown biological sources. We have developed a multiplex assay based on TaqMan® real-time quantitative PCR (qPCR) for species-specific detection and quantification of chicken (Gallus gallus), duck (Anas platyrhynchos), and turkey (Meleagris gallopavo) nuclear DNA. The multiplex assay is able to accurately detect very low quantities of species-specific DNA from single or multispecies sample mixtures; its minimum effective quantification range is 5 to 50 pg of starting DNA material. In addition to its use in food fraudulence cases, we have validated the assay using simulated forensic sample conditions to demonstrate its utility in forensic investigations. Despite treatment with potent inhibitors such as hematin and humic acid, and degradation of template DNA by DNase, the assay was still able to robustly detect and quantify DNA from each of the three poultry species in mixed samples. The efficient species determination and accurate DNA quantification will help reduce fraudulent food labeling and facilitate downstream DNA analysis for genetic identification and traceability.

  16. Domain decomposition method for solving the neutron diffusion equation

    International Nuclear Information System (INIS)

    Coulomb, F.

    1989-03-01

    The aim of this work is to study methods for solving the neutron diffusion equation; we are interested in methods based on a classical finite element discretization that are well suited for use on parallel computers. Domain decomposition methods seem to meet this need. This study deals with a decomposition of the domain. A theoretical study is carried out for Lagrange finite elements and some examples are given; in the case of mixed dual finite elements, the study is based on examples [fr

  17. From nonspecific DNA-protein encounter complexes to the prediction of DNA-protein interactions.

    Directory of Open Access Journals (Sweden)

    Mu Gao

    2009-03-01

    Full Text Available DNA-protein interactions are involved in many essential biological activities. Because there is no simple mapping code between DNA base pairs and protein amino acids, the prediction of DNA-protein interactions is a challenging problem. Here, we present a novel computational approach for predicting DNA-binding protein residues and DNA-protein interaction modes without knowing the specific DNA target sequence. Given the structure of a DNA-binding protein, the method first generates an ensemble of complex structures obtained by rigid-body docking with a nonspecific canonical B-DNA. Representative models are subsequently selected through clustering and ranking by their DNA-protein interfacial energy. Analysis of these encounter-complex models suggests that the recognition sites for specific DNA binding are usually favorable interaction sites for the nonspecific DNA probe and that nonspecific DNA-protein interaction modes exhibit some similarity to specific DNA-protein binding modes. Although the method requires as input the knowledge that the protein binds DNA, in benchmark tests it achieves better performance in identifying DNA-binding sites than three previously established methods, which are based on sophisticated machine-learning techniques. We further apply our method to protein structures predicted through modeling and demonstrate that our method performs satisfactorily on protein models whose root-mean-square Cα deviation from the native structure is up to 5 Å. This study provides valuable structural insights into how a specific DNA-binding protein interacts with a nonspecific DNA sequence. The similarity between specific and nonspecific DNA-protein interaction modes may reflect an important sampling step in the search for specific DNA targets by a DNA-binding protein.

  18. Students’ Mathematical Problem-Solving Abilities Through The Application of Learning Models Problem Based Learning

    Science.gov (United States)

    Nasution, M. L.; Yerizon, Y.; Gusmiyanti, R.

    2018-04-01

    One of the purposes of mathematics learning is to develop problem-solving abilities. Problem-solving ability is acquired through experience with non-routine questions. Improving students' mathematical problem-solving abilities requires an appropriate strategy in learning activities, one of which is the problem-based learning (PBL) model. Thus, the purpose of this research is to determine whether the mathematical problem-solving abilities of students who learn with PBL are better than those of students taught through conventional learning. This research was a quasi-experiment with a static group design, and the population was class XI MIA students of SMAN 1 Lubuk Alung. The experimental class was XI MIA 5 and the control class was XI MIA 6. The instrument for the final test of students' mathematical problem solving used an essay format. The final test data were analyzed with a t-test. The result is that the mathematical problem-solving abilities of students who learned with PBL were better than those of students taught through conventional learning. This is seen from the higher percentage achieved, for each indicator of mathematical problem solving, by the group of students who learned with PBL.

  19. X based interactive computer graphics applications for aerodynamic design and education

    Science.gov (United States)

    Benson, Thomas J.; Higgs, C. Fred, III

    1995-01-01

    Six computer application packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open- and closed-loop wind tunnels.

  20. Mono- and Di-Alkylation Processes of DNA Bases by Nitrogen Mustard Mechlorethamine.

    Science.gov (United States)

    Larrañaga, Olatz; de Cózar, Abel; Cossío, Fernando P

    2017-12-06

    The reactivity of nitrogen mustard mechlorethamine (mec) with purine bases towards formation of monoalkylated (G-mec and A-mec) and dialkylated (AA-mec, GG-mec and AG-mec) adducts has been studied using density functional theory (DFT). To gain a complete overview of DNA-alkylation processes, direct chloride substitution and formation through activated aziridinium species were considered as possible reaction paths for adduct formation. Our results confirm that DNA alkylation by mec occurs via aziridinium intermediates instead of direct substitution. Consideration of explicit water molecules in conjunction with the polarizable continuum model (PCM) was shown to be an adequate computational approach for a proper representation of the system. Moreover, Runge-Kutta numerical kinetic simulations including the possible bisadducts have been performed. These simulations predicted a product ratio of 83:17 for the GG-mec and AG-mec diadducts, respectively. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
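
    The kinetic simulation mentioned at the end can be reproduced in spirit with a few lines of ODE integration. The simplified network below (a monoadduct partitioning into GG- and AG-type diadducts) and its rate constants are placeholders chosen only so that the toy reproduces the reported 83:17 split; they are not the DFT-derived values of the study.

```python
# Toy competing-reaction kinetics of the kind used to predict diadduct ratios.
# The simplified network and the rate constants are illustrative placeholders,
# NOT the DFT-derived values of the cited study.
from scipy.integrate import solve_ivp

k_gg, k_ag = 2.0, 0.4   # monoadduct -> GG-type vs AG-type diadduct (assumed)

def rhs(t, y):
    mono, gg, ag = y
    return [-(k_gg + k_ag) * mono, k_gg * mono, k_ag * mono]

sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 0.0, 0.0], method="RK45", rtol=1e-8)
mono, gg, ag = sol.y[:, -1]
total = gg + ag
print(f"GG : AG = {100 * gg / total:.0f} : {100 * ag / total:.0f}")
```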

  1. Ultraviolet enhancement of DNA base release by bleomycin

    International Nuclear Information System (INIS)

    Kakinuma, J.; Tanabe, M.; Orii, H.

    1984-01-01

    The effect of UV irradiation on the base-releasing activity of bleomycin was studied on a bleomycin A2-DNA reaction mixture in the presence of Fe(II) and 2-mercaptoethanol. This effect was measured by the release of free bases from calf thymus DNA with high-performance liquid chromatography. UV irradiation enhanced the DNA base-releasing activity of bleomycin and simultaneously caused the disappearance of the fluorescence emission maximum at 355 nm assigned to the bithiazole rings and an increase in the intensity of a peak at 400 nm. UV irradiation at 295 nm, the UV absorption maximum of bleomycin, is the most effective in releasing free bases and in changing fluorescence emission patterns. From these results, we suggest that some alterations in the bithiazole group of the bleomycin molecule were initiated by UV irradiation and contributed to the increased base-releasing activity of bleomycin through an as yet unexplained mechanism, presumably through bleomycin dimer formation. (orig.)

  2. Performance of the majority voting rule in solving the density classification problem in high dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Gomez Soto, Jose Manuel [Unidad Academica de Matematicas, Universidad Autonoma de Zacatecas, Calzada Solidaridad entronque Paseo a la Bufa, Zacatecas, Zac. (Mexico); Fuks, Henryk, E-mail: jmgomezgoo@gmail.com, E-mail: hfuks@brocku.ca [Department of Mathematics, Brock University, St. Catharines, ON (Canada)

    2011-11-04

    The density classification problem (DCP) is one of the most widely studied problems in the theory of cellular automata. After it was shown that the DCP cannot be solved perfectly, the research in this area has been focused on finding better rules that could solve the DCP approximately. In this paper, we argue that the majority voting rule in high dimensions can achieve high performance in solving the DCP, and that its performance increases with dimension. We support this conjecture with arguments based on the mean-field approximation and direct computer simulations. (paper)

  3. Performance of the majority voting rule in solving the density classification problem in high dimensions

    International Nuclear Information System (INIS)

    Gomez Soto, Jose Manuel; Fuks, Henryk

    2011-01-01

    The density classification problem (DCP) is one of the most widely studied problems in the theory of cellular automata. After it was shown that the DCP cannot be solved perfectly, the research in this area has been focused on finding better rules that could solve the DCP approximately. In this paper, we argue that the majority voting rule in high dimensions can achieve high performance in solving the DCP, and that its performance increases with dimension. We support this conjecture with arguments based on the mean-field approximation and direct computer simulations. (paper)
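
    A minimal direct simulation of the majority voting rule on a periodic three-dimensional lattice, of the kind referred to above; the lattice size, initial density and number of steps are arbitrary illustrative choices.

```python
# Majority voting cellular automaton on a periodic 3D lattice, applied to the
# density classification task. Lattice size, density and step count are
# arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
L, steps, rho0 = 20, 60, 0.55            # side length, time steps, initial density
grid = (rng.random((L, L, L)) < rho0).astype(np.int8)

for _ in range(steps):
    total = grid.astype(np.int16)         # von Neumann neighborhood + self (7 cells)
    for axis in range(3):
        total += np.roll(grid, 1, axis=axis) + np.roll(grid, -1, axis=axis)
    grid = (total > 3).astype(np.int8)    # strict majority of the 7 cells

final = grid.mean()
print(f"initial density {rho0} -> final density {final:.3f}")
print("converged to all-ones:", final == 1.0, "| all-zeros:", final == 0.0)
```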

  4. Developing Instructional Mathematical Physics Book Based on Inquiry Approach to Improve Students’ Mathematical Problem Solving Ability

    Directory of Open Access Journals (Sweden)

    Syarifah Fadillah

    2017-03-01

    Full Text Available The problem addressed in this research is how to develop a mathematical physics instructional book based on an inquiry approach, together with its supporting documents, to improve students' mathematical problem-solving ability. The purpose of this research is to provide mathematical physics instruction based on an inquiry approach and its supporting documents (semester learning activity plan, lesson plan and mathematical problem-solving test) to improve students' mathematical problem-solving ability. The development of the textbook follows the ADDIE model, comprising analysis, design, development, implementation, and evaluation. The validation result from the expert team shows that the textbook and its supporting documents are valid. The test results for the mathematical problem-solving instrument show that all test questions are valid and reliable. The result of incorporating the textbook into the teaching and learning process revealed that the mathematical problem-solving ability of students using the inquiry-based mathematical physics instructional book was better than that of students who used the regular book.

  5. Silver(I)-Mediated Base Pairs in DNA Sequences Containing 7-Deazaguanine/Cytosine: towards DNA with Entirely Metallated Watson-Crick Base Pairs.

    Science.gov (United States)

    Méndez-Arriaga, José M; Maldonado, Carmen R; Dobado, José A; Galindo, Miguel A

    2018-03-26

    DNA sequences comprising noncanonical 7-deazaguanine (7CG) and canonical cytosine (C) are capable of forming Watson-Crick base pairs via hydrogen bonds as well as silver(I)-mediated base pairs by coordination to central silver(I) ions. Duplexes I and II containing 7CG and C have been synthesized and characterized. The incorporation of silver(I) ions into these duplexes has been studied by means of temperature-dependent UV spectroscopy, circular dichroism, and DFT calculations. The results suggest the formation of DNA molecules comprising contiguous metallated 7CG-Ag(I)-C Watson-Crick base pairs that preserve the original B-type conformation. Furthermore, additional studies performed on duplex III indicated that, in the presence of Ag(I) ions, 7CG-C and 7CA-T Watson-Crick base pairs (7CA, 7-deazaadenine; T, thymine) can be converted to metallated 7CG-Ag(I)-C and 7CA-Ag(I)-T base pairs inside the same DNA molecule whilst maintaining its initial double-helix conformation. These findings are very important for the development of customized silver-DNA nanostructures based on a Watson-Crick complementarity pattern. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers in which the laser cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and classification for nonlinear channel equalization. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, mitigating possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even in the presence of noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
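
    For orientation, the sketch below implements a generic delay-based reservoir with time-multiplexed virtual nodes and a linear readout, trained for one-step-ahead prediction of a toy series. It is a software caricature of the delay-reservoir concept, not a model of the semiconductor ring laser studied in the record.

```python
# Generic delay-based reservoir computer with virtual nodes and a linear
# readout, trained for one-step-ahead prediction of a toy time series.
# This is a software caricature of the concept, not a model of the SRL device.
import numpy as np

rng = np.random.default_rng(0)
N, beta, eta = 50, 0.4, 0.9           # virtual nodes, feedback gain, input gain
mask = rng.uniform(-1, 1, N)          # fixed random input mask

# Toy input: noisy sine; task: predict the next sample.
T = 1200
u = np.sin(0.2 * np.arange(T)) + 0.05 * rng.standard_normal(T)

states = np.zeros((T, N))
x = np.zeros(N)                       # virtual-node states along the delay line
for t in range(T):
    for i in range(N):                # sequential update mimics time multiplexing
        prev = x[i - 1] if i > 0 else x[N - 1]
        x[i] = np.tanh(beta * prev + eta * mask[i] * u[t])
    states[t] = x

# Ridge-regression readout trained on a middle segment of the series.
train_in, train_out = states[100:1000], u[101:1001]
lam = 1e-4
W = np.linalg.solve(train_in.T @ train_in + lam * np.eye(N), train_in.T @ train_out)

test_in, test_out = states[1000:T - 1], u[1001:T]
pred = test_in @ W
nmse = np.mean((pred - test_out) ** 2) / np.var(test_out)
print("test NMSE:", round(float(nmse), 4))
```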

  7. Proceedings of the workshop. Recognition of DNA damage as onset of successful repair. Computational and experimental approaches

    International Nuclear Information System (INIS)

    Pinak, Miroslav

    2002-03-01

    This workshop was held at the Tokai Research Establishment, Japan Atomic Energy Research Institute, on the 18th and 19th of December 2001. The Laboratory of Radiation Risk Analysis of JAERI organized the workshop. The main subject of the workshop was DNA damage and its repair. The presented works described leading experimental as well as computational approaches, focusing mainly on the formation of DNA damage, its proliferation, enzymatic recognition and repair, and finally imaging and detection of lesions on a DNA molecule. Nineteen of the presented papers are indexed individually. (J.P.N.)

  8. Model-based clustering of DNA methylation array data: a recursive-partitioning algorithm for high-dimensional data arising as a mixture of beta distributions

    Directory of Open Access Journals (Sweden)

    Wiemels Joseph

    2008-09-01

    Full Text Available Abstract. Background: Epigenetics is the study of heritable changes in gene function that cannot be explained by changes in DNA sequence. One of the most commonly studied epigenetic alterations is cytosine methylation, which is a well-recognized mechanism of epigenetic gene silencing and often occurs at tumor suppressor gene loci in human cancer. Arrays are now being used to study DNA methylation at a large number of loci; for example, the Illumina GoldenGate platform assesses DNA methylation at 1505 loci associated with over 800 cancer-related genes. Model-based cluster analysis is often used to identify DNA methylation subgroups in data, but it is unclear how to cluster DNA methylation data from arrays in a scalable and reliable manner. Results: We propose a novel model-based recursive-partitioning algorithm to navigate clusters in a beta mixture model. We present simulations showing that the method is more reliable than competing nonparametric clustering approaches, and is at least as reliable as conventional mixture model methods. We also show that our proposed method is more computationally efficient than conventional mixture model approaches. We demonstrate our method on normal tissue samples and show that the clusters are associated with tissue type as well as age. Conclusion: Our proposed recursively partitioned mixture model is an effective and computationally efficient method for clustering DNA methylation data.
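
    A minimal two-component beta mixture fit in the spirit of the model described above, using an EM-style loop with weighted method-of-moments updates as a simplification; the recursive partitioning and the exact M-step of the cited method are not reproduced.

```python
# EM-style fit of a two-component beta mixture to methylation "beta values"
# in (0, 1). The M-step uses weighted method-of-moments updates as a
# simplification; the recursive partitioning of the cited method is omitted.
import numpy as np
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(0)
x = np.concatenate([rng.beta(2, 10, 300), rng.beta(8, 2, 200)])  # synthetic data

def mom(m, v):
    """Beta parameters from a (weighted) mean and variance."""
    common = m * (1 - m) / max(v, 1e-6) - 1
    return max(m * common, 1e-3), max((1 - m) * common, 1e-3)

pi, params = np.array([0.5, 0.5]), [(1.0, 5.0), (5.0, 1.0)]   # initial guesses
for _ in range(100):
    # E-step: responsibilities of each component for each observation.
    dens = np.vstack([pi[k] * beta_dist.pdf(x, *params[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0, keepdims=True)
    # M-step: mixing weights and method-of-moments beta parameters.
    pi = resp.mean(axis=1)
    params = []
    for k in range(2):
        w = resp[k] / resp[k].sum()
        m = float(w @ x)
        v = float(w @ (x - m) ** 2)
        params.append(mom(m, v))

print("mixing weights  :", np.round(pi, 3))
print("component (a, b):", [tuple(np.round(p, 2)) for p in params])
```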

  9. Learning-based computing techniques in geoid modeling for precise height transformation

    Science.gov (United States)

    Erol, B.; Erol, S.

    2013-03-01

    Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained by employing properly distributed benchmarks having both GNSS and leveling observations and using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study evaluates learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural network (WNN) approach in geoid surface approximation. These algorithms were developed in parallel with advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new in dealing with the precise modeling problem of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, ANFIS and WNN revealed higher prediction accuracies compared to the ANN and MPRE methods. Besides prediction capability, these methods were also compared and discussed from a practical point of view in the conclusions.

  10. Solving global optimization problems on GPU cluster

    Energy Technology Data Exchange (ETDEWEB)

    Barkalov, Konstantin; Gergel, Victor; Lebedev, Ilya [Lobachevsky State University of Nizhni Novgorod, Gagarin Avenue 23, 603950 Nizhni Novgorod (Russian Federation)

    2016-06-08

    The paper contains the results of investigation of a parallel global optimization algorithm combined with a dimension reduction scheme. This allows solving multidimensional problems by means of reducing to data-independent subproblems with smaller dimension solved in parallel. The new element implemented in the research consists in using several graphic accelerators at different computing nodes. The paper also includes results of solving problems of well-known multiextremal test class GKLS on Lobachevsky supercomputer using tens of thousands of GPU cores.

  11. A method of non-contact reading code based on computer vision

    Science.gov (United States)

    Zhang, Chunsen; Zong, Xiaoyu; Guo, Bingxuan

    2018-03-01

    To guarantee the security of computer information exchange between internal and external networks (trusted and untrusted networks), a non-contact code-reading method based on machine vision is proposed, which differs from existing physical network-isolation methods. Using computer monitors, a camera and other equipment, the information to be exchanged is processed through image coding, generation of a standard image, display and capture of the actual image, calculation of the homography matrix, image distortion correction, and decoding after calibration, achieving secure, non-contact, one-way transmission between the internal and external networks. The effectiveness of the proposed method is verified by experiments on real computer text data; a data transfer speed of 24 kb/s is achieved. The experiments show that the algorithm offers high security, high speed and low information loss, which can meet the daily needs of confidentiality departments to update data effectively and reliably, and solves the difficulty of computer information exchange between secret and non-secret networks, with distinctive originality, practicability and practical research value.
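
    The homography and rectification step at the heart of this pipeline can be sketched with standard OpenCV calls, as below; the input file name and corner coordinates are placeholders for values that would come from calibration-marker detection.

```python
# Sketch of the homography-based screen-capture correction step: map the four
# detected corners of the displayed code image back to a fronto-parallel
# rectangle before decoding. The file name and corner coordinates are
# placeholders for values that would come from marker detection.
import cv2
import numpy as np

frame = cv2.imread("captured_frame.png")          # camera photo of the monitor

# Corners of the displayed code region as detected in the camera frame (px).
src = np.array([[102, 87], [912, 65], [934, 668], [88, 690]], dtype=np.float32)
# Where those corners should map in the rectified image.
w, h = 800, 600
dst = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32)

H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
rectified = cv2.warpPerspective(frame, H, (w, h))
cv2.imwrite("rectified.png", rectified)           # ready for decoding
```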

  12. Logical NAND and NOR Operations Using Algorithmic Self-assembly of DNA Molecules

    Science.gov (United States)

    Wang, Yanfeng; Cui, Guangzhao; Zhang, Xuncai; Zheng, Yan

    DNA self-assembly is the most advanced and versatile system that has been experimentally demonstrated for programmable construction of patterned systems on the molecular scale. It has been demonstrated that simple binary arithmetic and logical operations can be computed by the process of self-assembly of DNA tiles. Here we report a one-dimensional algorithmic self-assembly of DNA triple-crossover molecules that can be used to execute five steps of logical NAND and NOR operations on a string of binary bits. To achieve this, abstract tiles were translated into DNA tiles based on triple-crossover motifs. Serving as input for the computation, long single-stranded DNA molecules were used to nucleate growth of tiles into algorithmic crystals. Our method shows that engineered DNA self-assembly can be treated as a bottom-up design technique capable of realizing DNA computer organization and architecture.

  13. Making DNA Fingerprints.

    Science.gov (United States)

    Nunley, Kathie F.

    1996-01-01

    Presents an activity to simulate electrophoresis using everyday items. Uses adding machine paper to construct a set of DNA fingerprints that can be used to solve crime cases designed by students in any biology class. (JRH)

  14. ALGORITHMIC LOGIC TO SOLVE COMPUTATIONAL PROGRAMMING PROBLEMS: A DIDACTIC PROPOSAL / LÓGICA ALGORÍTMICA PARA LA RESOLUCIÓN DE PROBLEMAS DE PROGRAMACIÓN COMPUTACIONAL: UNA PROPUESTA DIDÁCTICA

    Directory of Open Access Journals (Sweden)

    Yaritza Tardo Fernández

    2013-02-01

    Full Text Available The eminently social, cultural and technological character of the computer programming problem-solving process, together with the complexity and difficulties detected in its teaching, has heightened concern for studying the processes of communication, transmission and understanding of computer programming and has attracted the attention of a wide scientific community, in line with the growing development that programming is reaching at present. For this reason, the objective of this paper is to reveal, from a didactic point of view, the integrating axes of an algorithmic logic that resolves the contradiction revealed in the formative process between mathematical modeling and its algorithmic systematization, in order to foster the efficient performance of Computer Science and Computer Engineering professionals. On this basis, a new didactic proposal is put forward, consisting of an algorithmic logic in which the essential processes that should be carried out to solve computer programming problems are specified and explained. Based on these theoretical foundations, we conclude that these processes constitute didactic moments required to resolve the aforementioned contradiction.

  15. DNA Photo Lithography with Cinnamate-based Photo-Bio-Nano-Glue

    Science.gov (United States)

    Feng, Lang; Li, Minfeng; Romulus, Joy; Sha, Ruojie; Royer, John; Wu, Kun-Ta; Xu, Qin; Seeman, Nadrian; Weck, Marcus; Chaikin, Paul

    2013-03-01

    We present a technique to make patterned functional surfaces, using a cinnamate photo cross-linker and photolithography. We have designed and modified a complementary set of single DNA strands to incorporate a pair of opposing cinnamate molecules. On exposure to 360nm UV, the cinnamate makes a highly specific covalent bond permanently linking only the complementary strands containing the cinnamates. We have studied this specific and efficient crosslinking with cinnamate-containing DNA in solution and on particles. UV addressability allows us to pattern surfaces functionally. The entire surface is coated with a DNA sequence A incorporating cinnamate. DNA strands A'B with one end containing a complementary cinnamated sequence A' attached to another sequence B, are then hybridized to the surface. UV photolithography is used to bind the A'B strand in a specific pattern. The system is heated and the unbound DNA is washed away. The pattern is then observed by thermo-reversibly hybridizing either fluorescently dyed B' strands complementary to B, or colloids coated with B' strands. Our techniques can be used to reversibly and/or permanently bind, via DNA linkers, an assortment of molecules, proteins and nanostructures. Potential applications range from advanced self-assembly, such as templated self-replication schemes recently reported, to designed physical and chemical patterns, to high-resolution multi-functional DNA surfaces for genetic detection or DNA computing.

  16. The last Viking King: a royal maternity case solved by ancient DNA analysis

    DEFF Research Database (Denmark)

    Dissing, Jørgen; Binladen, Jonas; Hansen, Anders

    2006-01-01

    Estridsen to haplogroup H; Estrid's sequence differed from that of Sven at two positions in HVR-1, 16093T-->C and 16304T-->C, indicating that she belongs to subgroup H5a. Given the maternal inheritance of mtDNA, offspring will have the same mtDNA sequence as their mother with the exception of rare cases...... doubts among historians whether the woman entombed was indeed Estrid. To shed light on this problem, we have extracted and analysed mitochondrial DNA (mtDNA) from pulp of teeth from each of the two royals. Four overlapping DNA-fragments covering about 400bp of hypervariable region 1 (HVR-1) of the D...

  17. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze–thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonification and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA) we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time different degradation impact of the two
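
    The decay-constant idea can be illustrated with a short fit: apparent copy numbers from the serial (increasing-amplicon) qPCR are regressed as an exponential in amplicon length, giving a λ for each genome, which can then be used to adjust a measured mtDNA/nDNA ratio. All numbers below are invented for illustration, and the correction is only a sketch of one possible approach.

```python
# Illustration of the serial-qPCR idea: fit an exponential decay of apparent
# copies versus amplicon length to estimate a degradation constant lambda for
# each genome. All numbers are invented; they are not data from the cited study.
import numpy as np

amplicon_bp = np.array([80, 150, 250, 400, 600])
# Apparent copy numbers for a partially degraded sample (invented):
nuclear = np.array([9800, 9100, 7900, 6300, 4500])
mito    = np.array([52000, 45000, 35000, 24000, 14000])

def decay_constant(lengths, copies):
    """Slope of ln(copies) vs length; lambda in 1/bp."""
    slope, _ = np.polyfit(lengths, np.log(copies), 1)
    return -slope

lam_n = decay_constant(amplicon_bp, nuclear)
lam_mt = decay_constant(amplicon_bp, mito)
print(f"lambda_nDNA  = {lam_n:.2e} per bp")
print(f"lambda_mtDNA = {lam_mt:.2e} per bp")

# A degradation-corrected mtDNA/nDNA ratio at a reference amplicon length L:
L = 150
raw_ratio = mito[1] / nuclear[1]
corrected = raw_ratio * np.exp((lam_mt - lam_n) * L)
print(f"raw ratio {raw_ratio:.1f} -> corrected {corrected:.1f}")
```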

  18. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonification and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA) we present an approach to possibly correct measurements in

  19. The development and evaluation of a web-based programme to support problem-solving skills following brain injury.

    Science.gov (United States)

    Powell, Laurie Ehlhardt; Wild, Michelle R; Glang, Ann; Ibarra, Summer; Gau, Jeff M; Perez, Amanda; Albin, Richard W; O'Neil-Pirozzi, Therese M; Wade, Shari L; Keating, Tom; Saraceno, Carolyn; Slocumb, Jody

    2017-10-24

    Cognitive impairments following brain injury, including difficulty with problem solving, can pose significant barriers to successful community reintegration. Problem-solving strategy training is well supported in the cognitive rehabilitation literature. However, limitations in insurance reimbursement have resulted in fewer services to train such skills to mastery and to support generalization of those skills into everyday environments. The purpose of this project was to develop and evaluate an integrated, web-based programme, ProSolv, which uses a small number of coaching sessions to support problem solving in everyday life following brain injury. We used participatory action research to guide the iterative development, usability testing, and within-subject pilot testing of the ProSolv programme. The finalized programme was then evaluated in a between-subjects group study and a non-experimental single case study. Results were mixed across studies. Participants demonstrated that it was feasible to learn and use the ProSolv programme for support in problem solving. They highly recommended the programme to others and singled out the importance of the coach. Limitations in app design were cited as a major reason for infrequent use of the app outside of coaching sessions. Results provide mixed evidence regarding the utility of web-based mobile apps, such as ProSolv, to support problem solving following brain injury. Implications for Rehabilitation: People with cognitive impairments following brain injury often struggle with problem solving in everyday contexts. Research supports problem-solving skills training following brain injury. Assistive technology for cognition (smartphones, selected apps) offers a means of supporting problem solving for this population. This project demonstrated the feasibility of a web-based programme to address this need.

  20. DNA Based Electrochromic and Photovoltaic Cells

    Science.gov (United States)

    2012-01-01

    Pawlicka, Agnieszka (final report, 01-09-2009 to 30-11-2011, grant FA 9550-09-1-0647). DNA is an abundant natural product with very good biodegradation properties and can be used to obtain gel polymer electrolytes (GPEs) with high

  1. The meshless local Petrov-Galerkin method based on moving Kriging interpolation for solving the time fractional Navier-Stokes equations.

    Science.gov (United States)

    Thamareerat, N; Luadsong, A; Aschariyaphotha, N

    2016-01-01

    In this paper, we present a numerical scheme used to solve the nonlinear time fractional Navier-Stokes equations in two dimensions. We first employ the meshless local Petrov-Galerkin (MLPG) method based on a local weak formulation to form the system of discretized equations and then we will approximate the time fractional derivative interpreted in the sense of Caputo by a simple quadrature formula. The moving Kriging interpolation which possesses the Kronecker delta property is applied to construct shape functions. This research aims to extend and develop further the applicability of the truly MLPG method to the generalized incompressible Navier-Stokes equations. Two numerical examples are provided to illustrate the accuracy and efficiency of the proposed algorithm. Very good agreement between the numerically and analytically computed solutions can be observed in the verification. The present MLPG method has proved its efficiency and reliability for solving the two-dimensional time fractional Navier-Stokes equations arising in fluid dynamics as well as several other problems in science and engineering.
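
    The "simple quadrature formula" for the Caputo derivative that such schemes typically rely on is the L1 approximation; a sketch of that formula on a uniform grid is given below (the paper's exact discretization may differ).

```latex
% L1-type quadrature commonly used for the Caputo derivative of order
% 0 < \alpha < 1 on a uniform grid t_n = n \Delta t (the cited paper's exact
% discretization may differ).
\[
  \frac{\partial^{\alpha} u}{\partial t^{\alpha}}(t_n)
  \;\approx\;
  \frac{\Delta t^{-\alpha}}{\Gamma(2-\alpha)}
  \sum_{k=0}^{n-1} b_k \left( u^{\,n-k} - u^{\,n-k-1} \right),
  \qquad
  b_k = (k+1)^{1-\alpha} - k^{1-\alpha}.
\]
```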

  2. Design of tailor-made chemical blend using a decomposition-based computer-aided approach

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Manan, Z.A.

    2011-01-01

    Computer aided techniques form an efficient approach to solve chemical product design problems such as the design of blended liquid products (chemical blending). In chemical blending, one tries to find the best candidate, which satisfies the product targets defined in terms of desired product...... methodology for blended liquid products that identifies a set of feasible chemical blends. The blend design problem is formulated as a Mixed Integer Nonlinear Programming (MINLP) model where the objective is to find the optimal blended gasoline or diesel product subject to types of chemicals...... and their compositions and a set of desired target properties of the blended product as design constraints. This blend design problem is solved using a decomposition approach, which eliminates infeasible and/or redundant candidates gradually through a hierarchy of (property) model based constraints. This decomposition...
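
    A minimal sketch of the decomposition idea: candidate blends are screened against target-property bounds using cheap linear mixing rules before any rigorous optimization is attempted. The component properties, targets and mixing rules below are illustrative assumptions, not the models of the cited methodology.

```python
# Decomposition-style screening of blend candidates: cheap linear mixing rules
# and property bounds eliminate infeasible candidates before any rigorous
# MINLP optimization. Component data, targets and mixing rules are illustrative.
from itertools import product

# (density kg/L, RON, vapour pressure kPa) -- invented component properties.
components = {"A": (0.74, 92.0, 60.0),
              "B": (0.79, 98.0, 45.0),
              "C": (0.72, 88.0, 75.0)}
targets = {"density": (0.72, 0.78), "RON": (95.0, None), "vp": (None, 65.0)}

def blend_props(fracs):
    """Volume-average mixing rule (a deliberate simplification)."""
    keys = list(components)
    return tuple(sum(f * components[k][i] for f, k in zip(fracs, keys))
                 for i in range(3))

def feasible(props):
    for value, (lo, hi) in zip(props, targets.values()):
        if lo is not None and value < lo:
            return False
        if hi is not None and value > hi:
            return False
    return True

candidates = [(a / 10, b / 10, (10 - a - b) / 10)
              for a, b in product(range(11), range(11)) if a + b <= 10]
survivors = [c for c in candidates if feasible(blend_props(c))]
print(f"{len(survivors)} of {len(candidates)} candidate blends pass the screen")
for c in survivors[:5]:
    print(c, [round(p, 2) for p in blend_props(c)])
```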

  3. The implementation of CP1 computer code in the Honeywell Bull computer in Brazilian Nuclear Energy Commission (CNEN)

    International Nuclear Information System (INIS)

    Couto, R.T.

    1987-01-01

    The implementation of the CP1 computer code on the Honeywell Bull computer at the Brazilian Nuclear Energy Commission (CNEN) is presented. CP1 is a computer code used to solve the point kinetics equations with Doppler feedback from the system temperature variation, based on the Newton cooling equation. (E.G.) [pt
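
    The physics referred to, point reactor kinetics with Doppler reactivity feedback driven by a Newton-cooling temperature equation, can be written down in a few lines; the sketch below uses one effective delayed-neutron group and invented parameter values, and is not the CP1 code itself.

```python
# Point reactor kinetics (one effective delayed-neutron group) with Doppler
# reactivity feedback and a Newton-cooling temperature equation, in the spirit
# of the model described in the record. All parameter values are invented;
# this is not the CP1 code.
from scipy.integrate import solve_ivp

beta, Lam, lam = 0.0065, 1e-4, 0.08      # delayed fraction, generation time, decay
alpha_D = -2e-5                          # Doppler coefficient (1/K), assumed
K_heat, h_cool, T0 = 50.0, 0.5, 300.0    # heating and cooling constants, assumed
rho_ext = 0.002                          # external reactivity step (absolute)

def rhs(t, y):
    n, c, T = y                          # power, precursor conc., fuel temperature
    rho = rho_ext + alpha_D * (T - T0)   # net reactivity with Doppler feedback
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    dT = K_heat * n - h_cool * (T - T0)  # Newton cooling toward coolant at T0
    return [dn, dc, dT]

y0 = [1.0, beta / (Lam * lam), T0]       # unit power, equilibrium precursors
sol = solve_ivp(rhs, (0.0, 60.0), y0, method="LSODA", max_step=0.01)
print(f"power after 60 s: {sol.y[0, -1]:.2f} (relative units)")
print(f"fuel temperature: {sol.y[2, -1]:.1f} K")
```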

  4. A Rewritable, Random-Access DNA-Based Storage System.

    Science.gov (United States)

    Yazdi, S M Hossein Tabatabaei; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica

    2015-09-18

    We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.
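
    To make the "random access via addressing" idea concrete, here is a toy encoder that maps data blocks to nucleotide strings and prefixes each with an address sequence that a primer could target. The base mapping, address table and block handling are illustrative and omit the constrained coding, error handling and editing machinery of the actual system.

```python
# Toy DNA storage encoder with block addressing: each data block is converted
# to nucleotides and prefixed with an address sequence that a primer could
# target for random access. The mapping and addresses are illustrative and
# omit the constrained coding / error handling of the real system.
BASES = "ACGT"

def to_dna(data: bytes) -> str:
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def from_dna(dna: str) -> bytes:
    vals = [BASES.index(c) for c in dna]
    return bytes((vals[i] << 6) | (vals[i+1] << 4) | (vals[i+2] << 2) | vals[i+3]
                 for i in range(0, len(vals), 4))

ADDRESSES = {0: "ATCGATCGGTAC", 1: "GGATCCTTAAGC"}   # made-up primer targets

def encode(blocks: dict) -> dict:
    return {i: ADDRESSES[i] + to_dna(data) for i, data in blocks.items()}

def random_access(pool: dict, block_id: int) -> bytes:
    strand = pool[block_id]                          # stand-in for PCR selection
    return from_dna(strand[len(ADDRESSES[block_id]):])

pool = encode({0: b"Hello, ", 1: b"DNA storage!"})
print(random_access(pool, 1))                        # b'DNA storage!'
```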

  5. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied to islanding detection of distributed generation. Moreover, the paper compares the accuracies of computational intelligence based techniques with those of existing techniques to provide a handful of information for industry and utility researchers to determine the best method for their respective systems.

  6. Solving the flexible job shop problem by hybrid metaheuristics-based multiagent model

    Science.gov (United States)

    Nouri, Houssem Eddine; Belkahla Driss, Olfa; Ghédira, Khaled

    2018-03-01

    The flexible job shop scheduling problem (FJSP) is a generalization of the classical job shop scheduling problem that allows an operation to be processed on one machine out of a set of alternative machines. The FJSP is an NP-hard problem consisting of two sub-problems, which are the assignment and the scheduling problems. In this paper, we propose a hybrid metaheuristics-based clustered holonic multiagent model to solve the FJSP. First, a neighborhood-based genetic algorithm (NGA) is applied by a scheduler agent for a global exploration of the search space. Second, a local search technique is used by a set of cluster agents to guide the search in promising regions of the search space and to improve the quality of the NGA final population. The efficiency of our approach is explained by the flexible selection of the promising parts of the search space by the clustering operator after the genetic algorithm process, and by applying the intensification technique of tabu search, which allows the search to restart from a set of elite solutions to attain new dominant scheduling solutions. Computational results are presented using four sets of well-known benchmark instances from the literature. New upper bounds are found, showing the effectiveness of the presented approach.

  7. Computer-based learning: games as an instructional strategy.

    Science.gov (United States)

    Blake, J; Goodman, J

    1999-01-01

    Games are a creative teaching strategy that enhances learning and problem solving. Gaming strategies are being used by the authors to make learning interesting, stimulating and fun. This article focuses on the development and implementation of computer games as an instructional strategy. Positive outcomes have resulted from the use of games in the classroom.

  8. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    James (Jong Hyuk) Park

    2016-09-01

    Full Text Available Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security; practical technologies, applications and services for Ubi-comp including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; switching generators with resistance against algebraic and side channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions, information hiding and watermarking, secret sharing, message authentication, detection and modeling of cyber attacks with Petri Nets; and quantum flows for secret key distribution, etc.

  9. Solving fault diagnosis problems linear synthesis techniques

    CERN Document Server

    Varga, Andreas

    2017-01-01

    This book addresses fault detection and isolation topics from a computational perspective. Unlike most existing literature, it bridges the gap between the existing well-developed theoretical results and the realm of reliable computational synthesis procedures. The model-based approach to fault detection and diagnosis has been the subject of ongoing research for the past few decades. While the theoretical aspects of fault diagnosis on the basis of linear models are well understood, most of the computational methods proposed for the synthesis of fault detection and isolation filters are not satisfactory from a numerical standpoint. Several features make this book unique in the fault detection literature: Solution of standard synthesis problems in the most general setting, for both continuous- and discrete-time systems, regardless of whether they are proper or not; consequently, the proposed synthesis procedures can solve a specific problem whenever a solution exists Emphasis on the best numerical algorithms to ...

  10. An Integrated Conceptual Environment based on Collective Intelligence and Distributed Artificial Intelligence for Connecting People on Problem Solving

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2012-12-01

    Full Text Available This paper aims to analyze the different forms of intelligence within organizations in a systemic and inclusive vision, in order to conceptualize an integrated environment based on Distributed Artificial Intelligence (DAI) and Collective Intelligence (CI). In this way we effectively shift from the classical approaches of connecting people with people using collaboration tools (which allow people to work together, such as videoconferencing or email, groupware in virtual space, forums, workflow), and of connecting people with a series of content management knowledge (taxonomies and document classification, ontologies or thesauri, search engines, portals), to the current approaches of connecting people with the (automatic) use of operational knowledge to solve problems and make decisions based on intellectual cooperation. The best way to use collective intelligence is based on knowledge mobilization and semantic technologies. We must not let computers imitate people, but have them support people in thinking and developing their ideas within a group. CI helps people to think together, while DAI tries to support people so as to limit human error. Within an organization, to manage CI is to combine instruments like Semantic Technologies (STs), knowledge mobilization methods for developing Knowledge Management (KM) strategies, and the processes that promote connection and collaboration between individual minds in order to achieve collective objectives, to perform a task or to solve increasingly complex economic problems.

  11. PROBLEM SOLVING IN SCHOOL MATHEMATICS BASED ON HEURISTIC STRATEGIES

    Directory of Open Access Journals (Sweden)

    NOVOTNÁ, Jarmila

    2014-03-01

    Full Text Available The paper describes one of the ways of developing pupils’ creative approach to problem solving. The described experiment is part of longitudinal research focusing on the improvement of pupils’ culture of problem solving. It deals with the solving of problems using the following heuristic strategies: Analogy, Guess – check – revise, Systematic experimentation, Problem reformulation, Solution drawing, Way back and Use of graphs of functions. Most attention is paid to the question of whether short-term work, in this case only over the period of three months, can result in an improvement of pupils’ abilities to solve problems whose solving algorithms are easily accessible. It also answers the question of which strategies pupils will prefer and with what results. The experiment shows that even short-term work can bear positive results as far as pupils’ approach to problem solving is concerned.

  12. Solving the inverse heat conduction problem using NVLink capable Power architecture

    Directory of Open Access Journals (Sweden)

    Sándor Szénási

    2017-11-01

    Full Text Available The accurate knowledge of Heat Transfer Coefficients is essential for the design of precise heat transfer operations. The determination of these values requires Inverse Heat Transfer Calculations, which are usually based on heuristic optimisation techniques, like Genetic Algorithms or Particle Swarm Optimisation. The main bottleneck of these heuristics is the high computational demand of the cost function calculation, which is usually based on heat transfer simulations producing the thermal history of the workpiece at given locations. This Direct Heat Transfer Calculation is a highly parallelisable process, making it feasible to implement an efficient GPU kernel for this purpose. This paper presents a novel step forward: based on the special requirements of the heuristics solving the inverse problem (executing hundreds of simulations in a parallel fashion at the end of each iteration), it is possible to gain a higher level of parallelism using multiple graphics accelerators. The results show that this implementation (running on 4 GPUs) is about 120 times faster than a traditional CPU implementation using 20 cores. The latest developments in the area of GPU-based high-performance computing were also analysed, such as the new NVLink connection between the host and the devices, which aims to overcome the long-standing data transfer handicap of GPU programming.
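
    A minimal CPU sketch of the parallel cost-function evaluation idea, assuming a toy lumped-capacitance model in place of the full heat-conduction simulation; the GPU/NVLink implementation of the paper is not reproduced, and all names and parameters are illustrative.

        # Sketch of parallel cost-function evaluation for an inverse heat transfer search.
        # A toy lumped-capacitance model stands in for the direct simulation; the real
        # work in the paper runs full heat-conduction simulations on multiple GPUs.
        import numpy as np
        from multiprocessing import Pool

        T_ENV, T0, DT, STEPS = 25.0, 900.0, 0.1, 200
        TIMES = np.arange(STEPS) * DT

        def simulate(htc: float) -> np.ndarray:
            """Thermal history of a lumped body cooled with heat transfer coefficient htc."""
            return T_ENV + (T0 - T_ENV) * np.exp(-htc * TIMES)

        MEASURED = simulate(0.35) + np.random.default_rng(0).normal(0.0, 1.0, STEPS)

        def cost(htc: float) -> float:
            return float(np.sum((simulate(htc) - MEASURED) ** 2))

        if __name__ == "__main__":
            candidates = np.linspace(0.05, 1.0, 64)        # e.g. one population of a GA/PSO
            with Pool() as pool:                           # evaluate all candidates in parallel
                costs = pool.map(cost, candidates)
            print("best HTC estimate:", candidates[int(np.argmin(costs))])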

  13. How stable are the mutagenic tautomers of DNA bases?

    Directory of Open Access Journals (Sweden)

    Brovarets’ O. O.

    2010-02-01

    Full Text Available Aim. To determine the lifetime of the mutagenic tautomers of DNA base pairs through the investigation of the physicochemical mechanisms of their intramolecular proton transfer. Methods. Non-empirical quantum chemistry, analysis of the electron density by means of Bader’s atoms in molecules (AIM) theory, and physicochemical kinetics were used. Results. The physicochemical character of the transition state of the intramolecular tautomerisation of DNA bases was investigated, and the lifetime of the mutagenic tautomers was calculated. Conclusions. The lifetime of the mutagenic tautomers of the DNA bases exceeds the typical time of DNA replication in the cell (~10^3 s) by 3–10 orders of magnitude. This fact confirms that the postulate on which the Watson-Crick tautomeric hypothesis of spontaneous transitions is grounded is adequate. The absence of intramolecular H-bonds in the canonical and mutagenic tautomeric forms determines their high stability.

  14. Communication: Electron ionization of DNA bases

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, M. A.; Krishnakumar, E., E-mail: ekkumar@tifr.res.in

    2016-04-28

    No reliable experimental data exist for the partial and total electron ionization cross sections for DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.

  15. Finite element method for solving Kohn-Sham equations based on self-adaptive tetrahedral mesh

    International Nuclear Information System (INIS)

    Zhang Dier; Shen Lihua; Zhou Aihui; Gong Xingao

    2008-01-01

    A finite element (FE) method with a self-adaptive mesh-refinement technique is developed for solving the density functional Kohn-Sham equations. The FE method adopts local piecewise polynomial basis functions, which produce sparsely structured Hamiltonian matrices. The method is well suited to parallel implementation without using the Fourier transform. In addition, the self-adaptive mesh-refinement technique can control the computational accuracy and efficiency with optimal mesh density in different regions.
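
    For reference, the Kohn-Sham equations being discretised have the standard form below (atomic units); in the FE approach each orbital is expanded in the local piecewise polynomial basis on the adaptive mesh.

        \Bigl[-\tfrac{1}{2}\nabla^2 + v_{\mathrm{eff}}(\mathbf{r})\Bigr]\psi_i(\mathbf{r}) = \varepsilon_i\,\psi_i(\mathbf{r}), \qquad
        v_{\mathrm{eff}}(\mathbf{r}) = v_{\mathrm{ext}}(\mathbf{r}) + \int\frac{\rho(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,d\mathbf{r}' + v_{\mathrm{xc}}[\rho](\mathbf{r}), \qquad
        \rho(\mathbf{r}) = \sum_{i\,\mathrm{occ}} |\psi_i(\mathbf{r})|^2 .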

  16. Outsourcing Set Intersection Computation Based on Bloom Filter for Privacy Preservation in Multimedia Processing

    Directory of Open Access Journals (Sweden)

    Hongliang Zhu

    2018-01-01

    Full Text Available With the development of cloud computing, its advantages of low cost and high computational ability meet the demands of the complicated computations involved in multimedia processing. Outsourcing computation to the cloud enables users with limited computing resources to store and process distributed multimedia application data without installing multimedia application software on local computer terminals, but the main problem is how to protect the security of user data in untrusted public cloud services. In recent years, privacy-preserving outsourcing computation has become one of the most common methods to solve the security problems of cloud computing. However, existing schemes cannot meet the needs of large numbers of nodes and dynamic topologies. In this paper, we introduce a novel privacy-preserving outsourcing computation method which combines the GM homomorphic encryption scheme and a Bloom filter to solve this problem, and propose a new privacy-preserving outsourcing set intersection computation protocol. Results show that the new protocol resolves the privacy-preserving outsourcing set intersection computation problem without increasing the complexity or the false positive probability. Besides, the number of participants, the size of the input secret sets, and the online time of participants are not limited.
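
    A minimal sketch of the Bloom-filter ingredient only, in Python; the GM homomorphic encryption layer and the actual privacy-preserving protocol of the paper are not shown, and all names and parameters are illustrative.

        # Minimal Bloom filter sketch (the encoding ingredient only; the paper combines
        # it with GM homomorphic encryption, which is not shown here).
        import hashlib

        class BloomFilter:
            def __init__(self, m_bits: int = 1024, k_hashes: int = 4):
                self.m, self.k = m_bits, k_hashes
                self.bits = [0] * m_bits

            def _positions(self, item: str):
                for i in range(self.k):
                    digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                    yield int(digest, 16) % self.m

            def add(self, item: str):
                for pos in self._positions(item):
                    self.bits[pos] = 1

            def might_contain(self, item: str) -> bool:
                return all(self.bits[pos] for pos in self._positions(item))

        # A (non-private) set-intersection sketch: one party tests its items against
        # the other party's filter; false positives occur with small probability.
        client = BloomFilter()
        for x in ["alice.jpg", "bob.mp4", "carol.png"]:
            client.add(x)
        server_set = ["bob.mp4", "dave.avi"]
        print([x for x in server_set if client.might_contain(x)])   # ['bob.mp4'] (w.h.p.)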

  17. PCR-based cDNA library construction: general cDNA libraries at the level of a few cells.

    OpenAIRE

    Belyavsky, A; Vinogradova, T; Rajewsky, K

    1989-01-01

    A procedure for the construction of general cDNA libraries is described which is based on the amplification of total cDNA in vitro. The first cDNA strand is synthesized from total RNA using an oligo(dT)-containing primer. After oligo(dG) tailing the total cDNA is amplified by PCR using two primers complementary to oligo(dA) and oligo(dG) ends of the cDNA. For insertion of the cDNA into a vector a controlled trimming of the 3' ends of the cDNA by Klenow enzyme was used. Starting from 10 J558L ...

  18. PCR-based detection of a rare linear DNA in cell culture

    Directory of Open Access Journals (Sweden)

    Saveliev Sergei V.

    2002-01-01

    Full Text Available The described method allows for detection of rare linear DNA fragments generated during genomic deletions. The predicted limit of the detection is one DNA molecule per 10^7 or more cells. The method is based on anchor PCR and involves gel separation of the linear DNA fragment and chromosomal DNA before amplification. The detailed chemical structure of the ends of the linear DNA can be defined with the use of additional PCR-based protocols. The method was applied to study the short-lived linear DNA generated during programmed genomic deletions in a ciliate. It can be useful in studies of spontaneous DNA deletions in cell culture or for tracking intracellular modifications at the ends of transfected DNA during gene therapy trials.

  19. PCR-based detection of a rare linear DNA in cell culture.

    Science.gov (United States)

    Saveliev, Sergei V.

    2002-11-11

    The described method allows for detection of rare linear DNA fragments generated during genomic deletions. The predicted limit of the detection is one DNA molecule per 10(7) or more cells. The method is based on anchor PCR and involves gel separation of the linear DNA fragment and chromosomal DNA before amplification. The detailed chemical structure of the ends of the linear DNA can be defined with the use of additional PCR-based protocols. The method was applied to study the short-lived linear DNA generated during programmed genomic deletions in a ciliate. It can be useful in studies of spontaneous DNA deletions in cell culture or for tracking intracellular modifications at the ends of transfected DNA during gene therapy trials.

  20. Identification of Forensic Samples via Mitochondrial DNA in the Undergraduate Biochemistry Laboratory

    Science.gov (United States)

    Millard, Julie T.; Pilon, André M.

    2003-04-01

    A recent forensic approach for identification of unknown biological samples is mitochondrial DNA (mtDNA) sequencing. We describe a laboratory exercise suitable for an undergraduate biochemistry course in which the polymerase chain reaction is used to amplify a 440 base pair hypervariable region of human mtDNA from a variety of "crime scene" samples (e.g., teeth, hair, nails, cigarettes, envelope flaps, toothbrushes, and chewing gum). Amplification is verified via agarose gel electrophoresis and then samples are subjected to cycle sequencing. Sequence alignments are made via the program CLUSTAL W, allowing students to compare samples and solve the "crime."

  1. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  2. Solving a multi-objective manufacturing cell scheduling problem with the consideration of warehouses using a simulated annealing based procedure

    Directory of Open Access Journals (Sweden)

    Adrián A. Toncovich

    2019-01-01

    Full Text Available The competition manufacturing companies face has driven the development of novel and efficient methods that enhance the decision-making process. In this work, a specific flow shop scheduling problem of practical interest in industry is presented and formalized using a mathematical programming model. The problem considers a manufacturing system arranged as a work cell that takes into account the transport operations of raw material and final products between the manufacturing cell and warehouses. For solving this problem, we present a multiobjective metaheuristic strategy based on simulated annealing, the Pareto Archived Simulated Annealing (PASA). We tested this strategy on two kinds of benchmark problem sets proposed by the authors. The first group is composed of small-sized problems. On these tests, PASA was able to obtain optimal or near-optimal solutions in significantly short computing times. In order to complete the analysis, we compared these results to the exact Pareto front of the instances obtained with the augmented ε-constraint method. Then, we also tested the algorithm on a set of larger problems to evaluate its performance in more extensive search spaces. We performed this assessment through an analysis of the hypervolume metric. Both sets of tests showed the competitiveness of the Pareto Archived Simulated Annealing in efficiently solving this problem and obtaining good quality solutions while using reasonable computational resources.
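
    The core of any Pareto-archived annealing scheme is the archive update that keeps only non-dominated solutions; a generic sketch of that step (not the authors' full PASA) is given below, with hypothetical objective vectors such as (makespan, transport cost).

        # Generic Pareto-archive maintenance, the core ingredient of Pareto Archived
        # Simulated Annealing: a candidate enters the archive if no archived solution
        # dominates it, and dominated members are evicted. Illustrative only.
        def dominates(a, b):
            """True if objective vector a dominates b (minimisation)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def update_archive(archive, candidate):
            if any(dominates(member, candidate) for member in archive):
                return archive, False                 # candidate rejected
            archive = [m for m in archive if not dominates(candidate, m)]
            archive.append(candidate)
            return archive, True                      # candidate joins the current front

        archive = []
        for objectives in [(10, 5), (8, 7), (9, 4), (12, 3), (11, 6)]:   # (makespan, transport cost)
            archive, accepted = update_archive(archive, objectives)
        print(sorted(archive))    # non-dominated set: [(8, 7), (9, 4), (12, 3)]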

  3. A discriminatory function for prediction of protein-DNA interactions based on alpha shape modeling.

    Science.gov (United States)

    Zhou, Weiqiang; Yan, Hong

    2010-10-15

    Protein-DNA interaction has significant importance in many biological processes. However, the underlying principle of the molecular recognition process is still largely unknown. As more high-resolution 3D structures of protein-DNA complex are becoming available, the surface characteristics of the complex become an important research topic. In our work, we apply an alpha shape model to represent the surface structure of the protein-DNA complex and developed an interface-atom curvature-dependent conditional probability discriminatory function for the prediction of protein-DNA interaction. The interface-atom curvature-dependent formalism captures atomic interaction details better than the atomic distance-based method. The proposed method provides good performance in discriminating the native structures from the docking decoy sets, and outperforms the distance-dependent formalism in terms of the z-score. Computer experiment results show that the curvature-dependent formalism with the optimal parameters can achieve a native z-score of -8.17 in discriminating the native structure from the highest surface-complementarity scored decoy set and a native z-score of -7.38 in discriminating the native structure from the lowest RMSD decoy set. The interface-atom curvature-dependent formalism can also be used to predict apo version of DNA-binding proteins. These results suggest that the interface-atom curvature-dependent formalism has a good prediction capability for protein-DNA interactions. The code and data sets are available for download on http://www.hy8.com/bioinformatics.htm kenandzhou@hotmail.com.

  4. The Ins and Outs of DNA Fingerprinting the Infectious Fungi

    Science.gov (United States)

    Soll, David R.

    2000-01-01

    DNA fingerprinting methods have evolved as major tools in fungal epidemiology. However, no single method has emerged as the method of choice, and some methods perform better than others at different levels of resolution. In this review, requirements for an effective DNA fingerprinting method are proposed and procedures are described for testing the efficacy of a method. In light of the proposed requirements, the most common methods now being used to DNA fingerprint the infectious fungi are described and assessed. These methods include restriction fragment length polymorphisms (RFLP), RFLP with hybridization probes, randomly amplified polymorphic DNA and other PCR-based methods, electrophoretic karyotyping, and sequencing-based methods. Procedures for computing similarity coefficients, generating phylogenetic trees, and testing the stability of clusters are then described. To facilitate the analysis of DNA fingerprinting data, computer-assisted methods are described. Finally, the problems inherent in the collection of test and control isolates are considered, and DNA fingerprinting studies of strain maintenance during persistent or recurrent infections, microevolution in infecting strains, and the origin of nosocomial infections are assessed in light of the preceding discussion of the ins and outs of DNA fingerprinting. The intent of this review is to generate an awareness of the need to verify the efficacy of each DNA fingerprinting method for the level of genetic relatedness necessary to answer the epidemiological question posed, to use quantitative methods to analyze DNA fingerprint data, to use computer-assisted DNA fingerprint analysis systems to analyze data, and to file data in a form that can be used in the future for retrospective and comparative studies. PMID:10756003

  5. Improved chaos-based video steganography using DNA alphabets

    Directory of Open Access Journals (Sweden)

    Nirmalya Kar

    2018-03-01

    Full Text Available DNA-based steganography plays a vital role in the field of privacy and secure communication. Here, we propose a DNA-properties-based mechanism to send data hidden inside a video file. Initially, the video file is converted into image frames. Random frames are then selected and data is hidden in these at random locations by using the Least Significant Bit substitution method. We analyze the proposed architecture in terms of peak signal-to-noise ratio as well as mean squared error measured between the original and steganographic files, averaged over all video frames. The results show minimal degradation of the steganographic video file. Keywords: Chaotic map, DNA, Linear congruential generator, Video steganography, Least significant bit
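
    A minimal sketch of plain LSB substitution on a single 8-bit frame, assuming numpy; the chaotic-map location selection and DNA-alphabet encoding used in the paper are not reproduced, and all names are illustrative.

        # LSB-substitution sketch on one 8-bit grayscale "frame" (numpy array).
        # The paper additionally selects frames/pixels with chaotic maps and encodes
        # the payload with DNA alphabets; only the basic LSB embedding is shown here.
        import numpy as np

        def embed(frame: np.ndarray, payload: bytes) -> np.ndarray:
            bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
            flat = frame.flatten().copy()
            if bits.size > flat.size:
                raise ValueError("payload too large for this frame")
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits     # overwrite least significant bit
            return flat.reshape(frame.shape)

        def extract(frame: np.ndarray, n_bytes: int) -> bytes:
            bits = frame.flatten()[:8 * n_bytes] & 1
            return np.packbits(bits).tobytes()

        frame = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
        stego = embed(frame, b"secret")
        assert extract(stego, 6) == b"secret"
        print("max pixel change:", int(np.abs(stego.astype(int) - frame.astype(int)).max()))   # 1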

  6. Effectiveness of an Online Social Constructivist Mathematical Problem Solving Course for Malaysian Pre-Service Teachers

    Directory of Open Access Journals (Sweden)

    Kim-Leong Lai

    2009-07-01

    Full Text Available This study assessed the effectiveness of an online mathematical problem solving course designed using a social constructivist approach for pre-service teachers. Thirty-seven pre-service teachers at the Batu Lintang Teacher Institute, Sarawak, Malaysia were randomly selected to participate in the study. The participants were required to complete the course online without the typical face-to-face classes and they were also required to solve authentic mathematical problems in small groups of 4-5 participants based on the Polya’s Problem Solving Model via asynchronous online discussions. Quantitative and qualitative methods such as questionnaires and interviews were used to evaluate the effects of the online learning course. Findings showed that a majority of the participants were satisfied with their learning experiences in the course. There were no significant changes in the participants’ attitudes toward mathematics, while the participants’ skills in problem solving for “understand the problem” and “devise a plan” steps based on the Polya Model were significantly enhanced, though no improvement was apparent for “carry out the plan” and “review”. The results also showed that there were significant improvements in the participants’ critical thinking skills. Furthermore, participants with higher initial computer skills were also found to show higher performance in mathematical problem solving as compared to those with lower computer skills. However, there were no significant differences in the participants’ achievements in the course based on gender. Generally, the online social constructivist mathematical problem solving course is beneficial to the participants and ought to be given the attention it deserves as an alternative to traditional classes. Nonetheless, careful considerations need to be made in the designing and implementing of online courses to minimize problems that participants might encounter while

  7. Self-Regulation and Problem Solving Ability in 7E-Learning Cycle Based Goal Orientation

    Science.gov (United States)

    Mulyono; Noor, N. L.

    2017-04-01

    Differences in goal orientation between mastery goals and performance goals can be a cause of high or low self-regulation and problem-solving ability. To overcome this, the 7E-learning cycle was applied, in which students learn and develop ways to optimise their reasoning through the learning phases elicit, engage, explore, explain, elaborate, evaluate, and extend. This study aimed to test the effectiveness of learning with the 7E-learning cycle and to describe self-regulation and mathematical problem solving based on goal orientation after its implementation. The study used a mixed-methods design; the subjects were grade XII science students of MA NU Nurul Ulum Jekulo Kudus, divided by goal orientation into a mastery-goal group and a performance-goal group. The independent variable was the learning model, while the dependent variables were problem solving and self-regulation. Data were collected using scales, interviews and tests, and processed with a proportion test, t-test, paired-samples t-test, and normalized gain. The results show that both problem-solving ability and self-regulation under the 7E-learning cycle are better than under the traditional model: there is an increase in self-regulation through the 7E-learning cycle of 0.4 (medium) and an increase in problem-solving ability of 0.79 (high). Based on the qualitative analysis, the self-regulation and problem-solving ability of the mastery-goal group after implementation of the 7E-learning cycle are better than those of the performance-goal group. It is suggested that the 7E-learning cycle be implemented to improve self-regulation and problem-solving ability, as well as to direct and foster mastery goals in students during the learning process.

  8. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    Full Text Available The article is devoted to the issues of statistical analysis of the results of computer-based testing for the evaluation of the educational achievements of students. The issues are relevant because computer-based testing in Russian universities has become an important method for evaluating the educational achievements of students and the quality of the modern educational process. The use of modern methods and programs for the statistical analysis of the results of computer-based testing and the assessment of the quality of the developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, forming queries, and generating reports, lists, and matrices of answers for the statistical analysis of the quality of test items. The methodology, experience and some results of its usage by university teachers are described in the article. Related topics, such as test development, models, algorithms, technologies, and software for large-scale computer-based testing, have been discussed by the authors in their previous publications, which are presented in the reference list.
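
    The authors' “StatInfo” program is not shown here; as a generic illustration of the kind of item statistics such tools report, the sketch below computes classical item difficulty and point-biserial discrimination from a hypothetical students-by-items matrix of 0/1 scores.

        # Generic item-analysis sketch: classical difficulty and point-biserial
        # discrimination from a students x items matrix of 0/1 scores. This is not
        # the authors' "StatInfo" program, only the kind of statistics it reports.
        import numpy as np

        scores = np.array([[1, 1, 0, 1],     # each row: one student's answers
                           [1, 0, 0, 1],
                           [0, 1, 1, 1],
                           [1, 1, 1, 1],
                           [0, 0, 0, 0]])

        difficulty = scores.mean(axis=0)     # proportion of correct answers per item
        totals = scores.sum(axis=1)          # each student's total score
        discrimination = np.array([np.corrcoef(scores[:, j], totals)[0, 1]
                                   for j in range(scores.shape[1])])

        for j, (p, r) in enumerate(zip(difficulty, discrimination)):
            print(f"item {j + 1}: difficulty={p:.2f}, discrimination={r:+.2f}")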

  9. The use of computer based instructions to enhance Rwandan ...

    African Journals Online (AJOL)

    This study intended to investigate into the extent to which computers and Internet that are being availed to schools in Rwanda can be used to enhance teachers' ICT competency and continuous professional development. In order to attain this ultimate aim, researchers undertook a Problem Solving and Theory Testing ...

  10. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
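
    As an illustration of the Monte Carlo sampling idea only (not Benders decomposition or importance sampling), the sketch below optimises a toy two-stage decision by sample-average approximation over a grid of first-stage choices; all numbers are hypothetical.

        # Sample-average treatment of a toy two-stage decision: choose a first-stage
        # order quantity x before demand is known, then collect recourse revenue.
        # The paper solves far larger linear programs with decomposition and
        # importance sampling; this sketch only illustrates the sampling idea.
        import numpy as np

        rng = np.random.default_rng(42)
        demand_samples = rng.normal(loc=100.0, scale=20.0, size=5000)   # stochastic parameter

        COST, PRICE, SALVAGE = 4.0, 9.0, 1.0

        def expected_profit(x: float) -> float:
            sold = np.minimum(x, demand_samples)
            leftover = np.maximum(x - demand_samples, 0.0)
            return float(np.mean(PRICE * sold + SALVAGE * leftover - COST * x))

        candidates = np.arange(60, 161)                 # grid of first-stage decisions
        best = max(candidates, key=expected_profit)
        print("best first-stage decision:", best,
              "expected profit:", round(expected_profit(best), 2))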

  11. DNA hybridization sensor based on pentacene thin film transistor.

    Science.gov (United States)

    Kim, Jung-Min; Jha, Sandeep Kumar; Chand, Rohit; Lee, Dong-Hoon; Kim, Yong-Sang

    2011-01-15

    A DNA hybridization sensor using pentacene thin film transistors (TFTs) is an excellent candidate for disposable sensor applications due to their low-cost fabrication process and fast detection. We fabricated pentacene TFTs on glass substrate for the sensing of DNA hybridization. The ss-DNA (polyA/polyT) or ds-DNA (polyA/polyT hybrid) were immobilized directly on the surface of the pentacene, producing a dramatic change in the electrical properties of the devices. The electrical characteristics of devices were studied as a function of DNA immobilization, single-stranded vs. double-stranded DNA, DNA length and concentration. The TFT device was further tested for detection of λ-phage genomic DNA using probe hybridization. Based on these results, we propose that a "label-free" detection technique for DNA hybridization is possible through direct measurement of electrical properties of DNA-immobilized pentacene TFTs. Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  13. The Effect of Learning Environments Based on Problem Solving on Students' Achievements of Problem Solving

    Science.gov (United States)

    Karatas, Ilhan; Baki, Adnan

    2013-01-01

    Problem solving is recognized as an important life skill involving a range of processes including analyzing, interpreting, reasoning, predicting, evaluating and reflecting. For that reason educating students as efficient problem solvers is an important role of mathematics education. Problem solving skill is the centre of mathematics curriculum.…

  14. Computer program to solve two-dimensional shock-wave interference problems with an equilibrium chemically reacting air model

    Science.gov (United States)

    Glass, Christopher E.

    1990-08-01

    The computer program EASI, an acronym for Equilibrium Air Shock Interference, was developed to calculate the inviscid flowfield, the maximum surface pressure, and the maximum heat flux produced by six shock wave interference patterns on a 2-D, cylindrical configuration. Thermodynamic properties of the inviscid flowfield are determined using either an 11-specie, 7-reaction equilibrium chemically reacting air model or a calorically perfect air model. The inviscid flowfield is solved using the integral form of the conservation equations. Surface heating calculations at the impingement point for the equilibrium chemically reacting air model use variable transport properties and specific heat. However, for the calorically perfect air model, heating rate calculations use a constant Prandtl number. Sample calculations of the six shock wave interference patterns, a listing of the computer program, and flowcharts of the programming logic are included.

  15. Triple-helix molecular switch-based aptasensors and DNA sensors.

    Science.gov (United States)

    Bagheri, Elnaz; Abnous, Khalil; Alibolandi, Mona; Ramezani, Mohammad; Taghdisi, Seyed Mohammad

    2018-07-15

    Utilization of traditional analytical techniques is limited because they are generally time-consuming and require high consumption of reagents, complicated sample preparation and expensive equipment. Therefore, it is of great interest to achieve sensitive, rapid and simple detection methods. It is believed that nucleic acids assays, especially aptamers, are very important in modern life sciences for target detection and biological analysis. Aptamers and DNA-based sensors have been widely used for the design of various sensors owing to their unique features. In recent years, triple-helix molecular switch (THMS)-based aptasensors and DNA sensors have been broadly utilized for the detection and analysis of different targets. The THMS relies on the formation of DNA triplex via Watson-Crick and Hoogsteen base pairings under optimal conditions. This review focuses on recent progresses in the development and applications of electrochemical, colorimetric, fluorescence and SERS aptasensors and DNA sensors, which are based on THMS. Also, the advantages and drawbacks of these methods are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. DNA-informed breeding of rosaceous crops: promises, progress and prospects

    Science.gov (United States)

    Peace, Cameron P

    2017-01-01

    Crops of the Rosaceae family provide valuable contributions to rural economies and human health and enjoyment. Sustained solutions to production challenges and market demands can be met with genetically improved new cultivars. Traditional rosaceous crop breeding is expensive and time-consuming and would benefit from improvements in efficiency and accuracy. Use of DNA information is becoming conventional in rosaceous crop breeding, contributing to many decisions and operations, but only after past decades of solved challenges and generation of sufficient resources. Successes in deployment of DNA-based knowledge and tools have arisen when the ‘chasm’ between genomics discoveries and practical application is bridged systematically. Key steps are establishing breeder desire for use of DNA information, adapting tools to local breeding utility, identifying efficient application schemes, accessing effective services in DNA-based diagnostics and gaining experience in integrating DNA information into breeding operations and decisions. DNA-informed germplasm characterization for revealing identity and relatedness has benefitted many programs and provides a compelling entry point to reaping benefits of genomics research. DNA-informed germplasm evaluation for predicting trait performance has enabled effective reallocation of breeding resources when applied in pioneering programs. DNA-based diagnostics is now expanding from specific loci to genome-wide considerations. Realizing the full potential of this expansion will require improved accuracy of predictions, multi-trait DNA profiling capabilities, streamlined breeding information management systems, strategies that overcome plant-based features that limit breeding progress and widespread training of current and future breeding personnel and allied scientists. PMID:28326185

  17. Perturbed soliton excitations in inhomogeneous DNA

    International Nuclear Information System (INIS)

    Daniel, M.; Vasumathi, V.

    2005-05-01

    We study the nonlinear dynamics of an inhomogeneous DNA double helical chain under the dynamic plane-base rotator model by considering the angular rotation of bases in a plane normal to the helical axis. The DNA dynamics in this case is found to be governed by a perturbed sine-Gordon equation when taking into account the interstrand hydrogen bonding energy and the intrastrand inhomogeneous stacking energy and making an analogy with the Heisenberg model Hamiltonian for an inhomogeneous anisotropic spin ladder with ferromagnetic legs and antiferromagnetic rung coupling. In the homogeneous limit the dynamics is governed by the kink-antikink soliton of the sine-Gordon equation, which represents the formation of the open state configuration in the DNA double helix. The effect of inhomogeneity in the stacking energy, in the form of localized and periodic variations, on the formation of open states in DNA is studied under perturbation. The perturbed soliton is obtained using a multiple scale soliton perturbation theory by solving the associated linear eigenvalue problem and constructing the complete set of eigenfunctions. The inhomogeneity in the stacking energy is found to modulate the width and speed of the soliton depending on the nature of the inhomogeneity. It also introduces fluctuations in the form of a train of pulses or periodic oscillations in the open state configuration (author)
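
    For reference, the unperturbed sine-Gordon equation and its kink (open-state) solution have the standard dimensionless form below; the inhomogeneous stacking contribution enters as the perturbation term epsilon F, whose explicit form is given in the paper.

        \frac{\partial^2\phi}{\partial t^2} - \frac{\partial^2\phi}{\partial x^2} + \sin\phi = \epsilon F(x,\phi), \qquad
        \phi_{\mathrm{kink}}(x,t) = 4\arctan\exp\!\left(\pm\frac{x - vt}{\sqrt{1 - v^2}}\right).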

  18. Solving the Curriculum Sequencing Problem with DNA Computing Approach

    Science.gov (United States)

    Debbah, Amina; Ben Ali, Yamina Mohamed

    2014-01-01

    In the e-learning systems, a learning path is known as a sequence of learning materials linked to each other to help learners achieve their learning goals. As it is impossible to have the same learning path that suits different learners, the Curriculum Sequencing problem (CS) consists of the generation of a personalized learning path for each…

  19. The current state of eukaryotic DNA base damage and repair.

    Science.gov (United States)

    Bauer, Nicholas C; Corbett, Anita H; Doetsch, Paul W

    2015-12-02

    DNA damage is a natural hazard of life. The most common DNA lesions are base, sugar, and single-strand break damage resulting from oxidation, alkylation, deamination, and spontaneous hydrolysis. If left unrepaired, such lesions can become fixed in the genome as permanent mutations. Thus, evolution has led to the creation of several highly conserved, partially redundant pathways to repair or mitigate the effects of DNA base damage. The biochemical mechanisms of these pathways have been well characterized and the impact of this work was recently highlighted by the selection of Tomas Lindahl, Aziz Sancar and Paul Modrich as the recipients of the 2015 Nobel Prize in Chemistry for their seminal work in defining DNA repair pathways. However, how these repair pathways are regulated and interconnected is still being elucidated. This review focuses on the classical base excision repair and strand incision pathways in eukaryotes, considering both Saccharomyces cerevisiae and humans, and extends to some important questions and challenges facing the field of DNA base damage repair. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

  1. A Structural Equation Model to Analyse the Antecedents to Students' Web-Based Problem-Solving Performance

    Science.gov (United States)

    Hwang, Gwo-Jen; Kuo, Fan-Ray

    2015-01-01

    Web-based problem-solving, a compound ability of critical thinking, creative thinking, reasoning thinking and information-searching abilities, has been recognised as an important competence for elementary school students. Some researchers have reported the possible correlations between problem-solving competence and information searching ability;…

  2. Teaching genetics using hands-on models, problem solving, and inquiry-based methods

    Science.gov (United States)

    Hoppe, Stephanie Ann

    Teaching genetics can be challenging because of the difficulty of the content and the misconceptions students might hold. This thesis focused on using hands-on model activities, problem solving, and inquiry-based teaching/learning methods in order to increase student understanding in an introductory biology class in the area of genetics. Various activities using these three methods were implemented in the classes to address any misconceptions and increase student learning of the difficult concepts. The activities that were implemented were shown to be successful based on pre-post assessment score comparison. The students were assessed on the subjects of inheritance patterns, meiosis, and protein synthesis and demonstrated growth in all of the areas. It was found that hands-on models, problem solving, and inquiry-based activities were more successful than traditional lecture styles in helping students learn concepts in genetics, and that the students were more engaged.

  3. Radiobiological significance of DNA repair

    International Nuclear Information System (INIS)

    Kuzin, A.M.

    1978-01-01

    A short outline is given of the history of the problem of repairing radiation injuries, specifically its molecular mechanisms. The most urgent problems which currently confront researchers are noted: further study of the role of DNA repair in post-radiation recovery, the search for ways to activate and suppress DNA repair, investigations into the balance of activities of the various repair enzymes, and the problem of errors in the structure of the repaired DNA. An important role is attached to investigations of DNA repair in solving a number of practical problems.

  4. A control volume based finite difference method for solving the equilibrium equations in terms of displacements

    DEFF Research Database (Denmark)

    Hattel, Jesper; Hansen, Preben

    1995-01-01

    This paper presents a novel control volume based FD method for solving the equilibrium equations in terms of displacements, i.e. the generalized Navier equations. The method is based on the widely used cv-FDM solution of heat conduction and fluid flow problems involving a staggered grid formulati....... The resulting linear algebraic equations are solved by line-Gauss-Seidel....
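
    For reference, the equilibrium (generalized Navier) equations in terms of displacements have the standard isotropic linear-elastic form, with Lamé constants lambda and mu, displacement field u and body force f:

        \mu\,\nabla^2\mathbf{u} + (\lambda + \mu)\,\nabla(\nabla\!\cdot\!\mathbf{u}) + \mathbf{f} = \mathbf{0}.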

  5. Problem-Based Learning: Student Engagement, Learning and Contextualized Problem-Solving. Occasional Paper

    Science.gov (United States)

    Mossuto, Mark

    2009-01-01

    The adoption of problem-based learning as a teaching method in the advertising and public relations programs offered by the Business TAFE (Technical and Further Education) School at RMIT University is explored in this paper. The effect of problem-based learning on student engagement, student learning and contextualised problem-solving was…

  6. A constriction factor based particle swarm optimisation algorithm to solve the economic dispatch problem including losses

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven; Montakhab, Mohammad; Nouri, Hassan

    2011-07-15

    Economic dispatch (ED) is one of the most important problems to be solved in power generation, as fractional percentage fuel reductions represent significant cost savings. ED seeks to optimise the power generated by each generating unit in a system in order to find the minimum operating cost at a required load demand, whilst ensuring both equality and inequality constraints are met. For the process of optimisation, a model must be created for each generating unit. The particle swarm optimisation technique is an evolutionary computation technique and one of the most powerful methods for solving global optimisation problems. The aim of this paper is to add a constriction factor to the particle swarm optimisation algorithm (CFBPSO). Results show that the algorithm is very good at solving the ED problem; because CFBPSO must be able to work in a practical environment, valve point effects and transmission losses should be included in future work.
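
    A minimal sketch of the constriction-factor velocity update (Clerc-Kennedy form, chi of about 0.7298 for c1 = c2 = 2.05) applied to a toy quadratic-cost dispatch without valve-point effects or losses; the unit data and penalty handling are hypothetical, not the paper's.

        # Constriction-factor PSO sketch for a toy economic dispatch (quadratic costs,
        # no valve-point effect or losses; a penalty enforces the demand balance).
        import numpy as np

        rng = np.random.default_rng(7)
        A = np.array([0.008, 0.009, 0.007])      # cost = a*P^2 + b*P + c for each unit
        B = np.array([7.0, 6.3, 6.8])
        C = np.array([200.0, 180.0, 140.0])
        PMIN, PMAX, DEMAND = 50.0, 200.0, 450.0

        def cost(P):
            fuel = np.sum(A * P**2 + B * P + C, axis=-1)
            return fuel + 1e4 * np.abs(np.sum(P, axis=-1) - DEMAND)   # demand-balance penalty

        c1 = c2 = 2.05
        phi = c1 + c2
        chi = 2.0 / abs(2.0 - phi - np.sqrt(phi**2 - 4.0 * phi))      # ~0.7298 constriction factor

        n, dim = 30, 3
        pos = rng.uniform(PMIN, PMAX, (n, dim))
        vel = np.zeros((n, dim))
        pbest, pbest_cost = pos.copy(), cost(pos)
        gbest = pbest[np.argmin(pbest_cost)].copy()

        for _ in range(300):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
            pos = np.clip(pos + vel, PMIN, PMAX)
            costs = cost(pos)
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
            gbest = pbest[np.argmin(pbest_cost)].copy()

        print("dispatch:", np.round(gbest, 1), "total cost:", round(float(cost(gbest)), 1))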

  7. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, which we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  8. Analysis of students’ creative thinking level in problem solving based on national council of teachers of mathematics

    Science.gov (United States)

    Hobri; Suharto; Rifqi Naja, Ahmad

    2018-04-01

    This research aims to determine students’ creative thinking level in problem solving based on NCTM indicators in the subject of functions. The research type is descriptive with a qualitative approach. The data collection methods used were tests and interviews. The creative thinking level in problem solving based on NCTM indicators consists of (1) Make a mathematical model from a contextual problem and solve the problem, (2) Solve the problem using various possible alternatives, (3) Find new alternative(s) to solve the problem, (4) Determine the most efficient and effective alternative for that problem, (5) Review and correct mistake(s) in the process of problem solving. The results showed that 10 students were categorized at the very satisfying level, 23 students at the satisfying level and 1 student at the less satisfying level. Students at the very satisfying level met all indicators, students at the satisfying level met the first, second, fourth, and fifth indicators, while students at the less satisfying level met only the first and fifth indicators.

  9. Facilitating case reuse during problem solving in algebra-based physics

    Science.gov (United States)

    Mateycik, Frances Ann

    This research project investigates students' development of problem solving schemata while using strategies that facilitate the process of using solved examples to assist with a new problem (case reuse). Focus group learning interviews were used to explore students' perceptions and understanding of several problem solving strategies. Individual clinical interviews were conducted and quantitative examination data were collected to assess students' conceptual understanding, knowledge organization, and problem solving performance on a variety of problem tasks. The study began with a short one-time treatment of two independent, research-based strategies chosen to facilitate case reuse. Exploration of students' perceptions and use of the strategies led investigators to select one of the two strategies to be implemented over a full semester of focus group interviews. The strategy chosen was structure mapping. Structure maps are defined as visual representations of quantities and their associations. They were created by experts to model the appropriate mental organization of knowledge elements for a given physical concept. Students were asked to use these maps as they were comfortable while problem solving. Data obtained from this phase of our study (Phase I) offered no evidence of improved problem-solving schemata. The 11-contact-hour study was barely sufficient time for students to become comfortable using the maps. A set of simpler strategies was selected for their more explicit facilitation of analogical reasoning, and these were used together during two more semester-long focus group treatments (Phase II and Phase III of this study). These strategies included the use of a step-by-step process aimed at reducing cognitive load associated with mathematical procedure, direct reflection of principles involved in a given set of problems, and the direct comparison of problem pairs designed to be void of surface similarities (similar objects or object orientations) and sharing

  10. Computational electromagnetic-aerodynamics

    CERN Document Server

    Shang, Joseph J S

    2016-01-01

    Presents numerical algorithms, procedures, and techniques required to solve engineering problems relating to the interactions between electromagnetic fields, fluid flow, and interdisciplinary technology for aerodynamics, electromagnetics, chemical-physics kinetics, and plasmadynamics. This book addresses modeling and simulation science and technology for studying ionized gas phenomena in engineering applications. Computational Electromagnetic-Aerodynamics is organized into ten chapters. Chapters one to three introduce the fundamental concepts of plasmadynamics, chemical-physics of ionization, classical magnetohydrodynamics, and their extensions to plasma-based flow control actuators, high-speed flows of interplanetary re-entry, and ion thrusters in space exploration. Chapters four to six explain numerical algorithms and procedures for solving Maxwell's equation in the time domain for computational electromagnetics, plasma wave propagation, and the time-dependent compressible Navier-Stokes equation for aerodyn...

  11. Upgrade of the computer-based information systems on USNRC simulators

    International Nuclear Information System (INIS)

    Griffin, J.I.

    1998-01-01

    In late 1995, the U.S. Nuclear Regulatory Commission (USNRC) began a project to upgrade the computer-based information systems on its BWR/6 and B&W Simulators. The existing display generation hardware was very old and in need of replacement due to difficulty in obtaining spare parts and technical support. In addition, the display systems used currently each require a SEL 32/55 computer system, which is also obsolete, running the Real Time Monitor (RTM) operating system. An upgrade of the display hardware and display generation systems not only solves the problem of obsolescence of that equipment but also allows removal of the 32/55 systems. These computers are used only to support the existing display generation systems. Shortly after purchase of the replacement equipment, it was learned that the vendor was no longer going to support the methodology. Instead of implementing an unsupported concept, it was decided to implement the display systems upgrades using the Picasso-3 UIMS (User Interface Management System) and the purchased hardware. This paper describes the upgraded display systems for the BWR/6 and B&W Simulators, including the design concept, display development, hardware requirements, the simulator interface software, and problems encountered. (author)

  12. Base excision repair deficient mice lacking the Aag alkyladenine DNA glycosylase.

    NARCIS (Netherlands)

    B.P. Engelward (Bevin); G. Weeda (Geert); M.D. Wyatt; J.L.M. Broekhof (Jose'); J. de Wit (Jan); I. Donker (Ingrid); J.M. Allan (James); B. Gold (Bert); J.H.J. Hoeijmakers (Jan); L.D. Samson (Leona)

    1997-01-01

    3-methyladenine (3MeA) DNA glycosylases remove 3MeAs from alkylated DNA to initiate the base excision repair pathway. Here we report the generation of mice deficient in the 3MeA DNA glycosylase encoded by the Aag (Mpg) gene. Alkyladenine DNA glycosylase turns out to be the major DNA

  13. Development of syntax of intuition-based learning model in solving mathematics problems

    Science.gov (United States)

    Yeni Heryaningsih, Nok; Khusna, Hikmatul

    2018-01-01

    The aim of the research was to produce a syntax for an Intuition Based Learning (IBL) model for solving mathematics problems that is valid, practical and effective in improving mathematics students’ achievement. The subjects of the research were 2 classes of grade XI students of SMAN 2 Sragen, Central Java. The type of research was Research and Development (R&D). The development process adopted the Plomp and Borg & Gall development models, consisting of a preliminary investigation step, a design step, a realization step, and an evaluation and revision step. The development steps were as follows: (1) collected information and studied theories in the preliminary investigation step, covering intuition, learning model development, student conditions, and topic analysis; (2) designed a syntax that could bring up intuition in solving mathematics problems, with several phases that could bring up intuition (a Preparation phase, an Incubation phase, an Illumination phase and a Verification phase), and then designed the research instruments; (3) realized the designed syntax of the Intuition Based Learning model as the first draft; (4) submitted the first draft to the validators for validation; (5) tested the syntax of the IBL model in classrooms to determine its effectiveness; (6) conducted a Focus Group Discussion (FGD) to evaluate the results of testing the syntax model in the classrooms, and then revised the syntax of the IBL model. The result of the research was a syntax of the IBL model for solving mathematics problems that is valid, practical and effective. The syntax of the IBL model in the classroom was: (1) opening with apperception, motivation and building students’ positive perceptions, (2) the teacher explains the material generally, (3) group discussion about the material, (4) the teacher gives students mathematics problems, (5) doing exercises individually to solve mathematics problems with steps that could bring up students’ intuition: Preparations, Incubation, Illumination, and

  14. Solving the Schroedinger equation using Smolyak interpolants

    International Nuclear Information System (INIS)

    Avila, Gustavo; Carrington, Tucker Jr.

    2013-01-01

    In this paper, we present a new collocation method for solving the Schroedinger equation. Collocation has the advantage that it obviates integrals. All previous collocation methods have, however, the crucial disadvantage that they require solving a generalized eigenvalue problem. By combining Lagrange-like functions with a Smolyak interpolant, we devise a collocation method that does not require solving a generalized eigenvalue problem. We exploit the structure of the grid to develop an efficient algorithm for evaluating the matrix-vector products required to compute energy levels and wavefunctions. Energies systematically converge as the number of points and basis functions are increased
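
    The record above describes a Smolyak-based collocation scheme. As a much simpler illustration of the general idea that collocation on a grid turns the Schroedinger equation into an ordinary (not generalized) matrix eigenvalue problem, the sketch below solves a 1-D harmonic oscillator with a sinc-DVR (Colbert-Miller) kinetic-energy matrix; the grid size, box width and potential are invented for the example, and this is not the Smolyak algorithm of the cited paper.

    ```python
    # Minimal 1-D grid/collocation-style eigensolver (sinc-DVR of Colbert & Miller),
    # NOT the Smolyak scheme of the cited paper. Units: hbar = m = 1; V(x) = 0.5*x**2.
    import numpy as np

    n, xmax = 101, 10.0                      # grid size and half-width (assumed)
    x = np.linspace(-xmax, xmax, n)
    h = x[1] - x[0]

    # Kinetic-energy matrix on the uniform grid (Colbert-Miller sinc-DVR formula).
    i = np.arange(n)
    dij = i[:, None] - i[None, :]
    T = np.where(dij == 0,
                 np.pi**2 / 3.0,
                 2.0 * (-1.0) ** dij / np.where(dij == 0, 1, dij) ** 2) / (2.0 * h**2)

    # The potential is diagonal at the grid points, so no integrals are needed and
    # the eigenproblem is a standard (not generalized) one.
    H = T + np.diag(0.5 * x**2)
    energies = np.linalg.eigvalsh(H)[:5]
    print(energies)                          # expect roughly 0.5, 1.5, 2.5, 3.5, 4.5
    ```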

  15. Solving Inverse Kinematics – A New Approach to the Extended Jacobian Technique

    Directory of Open Access Journals (Sweden)

    M. Šoch

    2005-01-01

    Full Text Available This paper presents a brief summary of current numerical algorithms for solving the Inverse Kinematics problem. Then a new approach based on the Extended Jacobian technique is compared with the current Jacobian Inversion method. The presented method is intended for use in the field of computer graphics for animation of articulated structures. 
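
    As a hedged illustration of the family of Jacobian-based methods the paper surveys, the sketch below runs iterative inverse kinematics for a planar two-link arm using the Jacobian pseudo-inverse; the Extended Jacobian technique itself would augment the Jacobian with rows encoding a secondary objective, which is not reproduced here. Link lengths, the target point and the step gain are assumptions made for the example.

    ```python
    # Iterative inverse kinematics for a planar 2-link arm via the Jacobian
    # pseudo-inverse. All numbers (link lengths, target, gain) are illustrative.
    import numpy as np

    L1, L2 = 1.0, 0.8                        # link lengths (assumed)

    def fk(q):
        """Forward kinematics: joint angles -> end-effector position."""
        return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

    def jacobian(q):
        """2x2 analytic Jacobian d(position)/d(angles)."""
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    def solve_ik(target, q0, steps=200, gain=0.5):
        q = np.array(q0, dtype=float)
        for _ in range(steps):
            err = target - fk(q)
            if np.linalg.norm(err) < 1e-8:
                break
            q += gain * np.linalg.pinv(jacobian(q)) @ err   # pseudo-inverse update
        return q

    q = solve_ik(np.array([1.2, 0.9]), q0=[0.3, 0.3])
    print(q, fk(q))          # fk(q) should be close to the target [1.2, 0.9]
    ```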

  16. Solvated protein–DNA docking using HADDOCK

    International Nuclear Information System (INIS)

    Dijk, Marc van; Visscher, Koen M.; Kastritis, Panagiotis L.; Bonvin, Alexandre M. J. J.

    2013-01-01

    Interfacial water molecules play an important role in many aspects of protein–DNA specificity and recognition. Yet they have been mostly neglected in the computational modeling of these complexes. We present here a solvated docking protocol that allows explicit inclusion of water molecules in the docking of protein–DNA complexes and demonstrate its feasibility on a benchmark of 30 high-resolution protein–DNA complexes containing crystallographically-determined water molecules at their interfaces. Our protocol is capable of reproducing the solvation pattern at the interface and recovers hydrogen-bonded water-mediated contacts in many of the benchmark cases. Solvated docking leads to an overall improvement in the quality of the generated protein–DNA models for cases with limited conformational change of the partners upon complex formation. The applicability of this approach is demonstrated on real cases by docking a representative set of 6 complexes using unbound protein coordinates, model-built DNA and knowledge-based restraints. As HADDOCK supports the inclusion of a variety of NMR restraints, solvated docking is also applicable for NMR-based structure calculations of protein–DNA complexes.

  17. Solvated protein-DNA docking using HADDOCK

    Energy Technology Data Exchange (ETDEWEB)

    Dijk, Marc van; Visscher, Koen M.; Kastritis, Panagiotis L.; Bonvin, Alexandre M. J. J., E-mail: a.m.j.j.bonvin@uu.nl [Utrecht University, Bijvoet Center for Biomolecular Research, Faculty of Science-Chemistry (Netherlands)

    2013-05-15

    Interfacial water molecules play an important role in many aspects of protein-DNA specificity and recognition. Yet they have been mostly neglected in the computational modeling of these complexes. We present here a solvated docking protocol that allows explicit inclusion of water molecules in the docking of protein-DNA complexes and demonstrate its feasibility on a benchmark of 30 high-resolution protein-DNA complexes containing crystallographically-determined water molecules at their interfaces. Our protocol is capable of reproducing the solvation pattern at the interface and recovers hydrogen-bonded water-mediated contacts in many of the benchmark cases. Solvated docking leads to an overall improvement in the quality of the generated protein-DNA models for cases with limited conformational change of the partners upon complex formation. The applicability of this approach is demonstrated on real cases by docking a representative set of 6 complexes using unbound protein coordinates, model-built DNA and knowledge-based restraints. As HADDOCK supports the inclusion of a variety of NMR restraints, solvated docking is also applicable for NMR-based structure calculations of protein-DNA complexes.

  18. Using an Interactive Web-Based Learning NMR Spectroscopy as a Means to Improve Problem Solving Skills for Undergraduates

    International Nuclear Information System (INIS)

    Supasorn, Saksri; Vibuljun, Sunantha; Panijpan, Bhinyo; Rajviroongit, Shuleewan

    2005-10-01

    An interactive web-based learning NMR spectroscopy course was developed to improve and facilitate students' learning and their achievement of the learning objectives concerning multiplicity, chemical shift, and problem solving. The course emphasizes NMR problem solving, and therefore also emphasizes the concepts of multiplicity and chemical shift as the basic concepts needed to practice problem solving. Most of the animations and pictures in this web-based course were newly created and simplified to explain processes and principles in NMR spectroscopy. With meaningful animations and pictures, simplified English, step-by-step problem solving, and interactive tests, it can serve as a self-learning web site used at the student's convenience

  19. Recovery Based Nanowire Field-Effect Transistor Detection of Pathogenic Avian Influenza DNA

    Science.gov (United States)

    Lin, Chih-Heng; Chu, Chia-Jung; Teng, Kang-Ning; Su, Yi-Jr; Chen, Chii-Dong; Tsai, Li-Chu; Yang, Yuh-Shyong

    2012-02-01

    Fast and accurate diagnosis is critical in infectious disease surveillance and management. We propose a DNA recovery system that can easily be adapted to a DNA chip or DNA biosensor for fast identification and confirmation of a target DNA. The method is based on re-hybridization of the DNA target with a recovery DNA to free the DNA probe. A functionalized silicon nanowire field-effect transistor (SiNW FET) was demonstrated to monitor such specific DNA-DNA interactions, using hemagglutinin 1 (H1) DNA of a highly pathogenic avian influenza (AI) virus strain as the target. Specific electrical changes were observed in real time for AI virus DNA sensing and device recovery when the nanowire surface of the SiNW FET was modified with a complementary capture DNA probe. The recovery-based SiNW FET biosensor can be further developed for fast identification and confirmation of a variety of influenza virus strains and other infectious diseases.

  20. Dreams and creative problem-solving.

    Science.gov (United States)

    Barrett, Deirdre

    2017-10-01

    Dreams have produced art, music, novels, films, mathematical proofs, designs for architecture, telescopes, and computers. Dreaming is essentially our brain thinking in another neurophysiologic state-and therefore it is likely to solve some problems on which our waking minds have become stuck. This neurophysiologic state is characterized by high activity in brain areas associated with imagery, so problems requiring vivid visualization are also more likely to get help from dreaming. This article reviews great historical dreams and modern laboratory research to suggest how dreams can aid creativity and problem-solving. © 2017 New York Academy of Sciences.

  1. Induced Polarization Influences the Fundamental Forces in DNA Base Flipping

    OpenAIRE

    Lemkul, Justin A.; Savelyev, Alexey; MacKerell, Alexander D.

    2014-01-01

    Base flipping in DNA is an important process involved in genomic repair and epigenetic control of gene expression. The driving forces for these processes are not fully understood, especially in the context of the underlying dynamics of the DNA and solvent effects. We studied double-stranded DNA oligomers that have been previously characterized by imino proton exchange NMR using both additive and polarizable force fields. Our results highlight the importance of induced polarization on the base...

  2. Knowledge acquisition from natural language for expert systems based on classification problem-solving methods

    Science.gov (United States)

    Gomez, Fernando

    1989-01-01

    It is shown how certain kinds of domain independent expert systems based on classification problem-solving methods can be constructed directly from natural language descriptions by a human expert. The expert knowledge is not translated into production rules. Rather, it is mapped into conceptual structures which are integrated into long-term memory (LTM). The resulting system is one in which problem-solving, retrieval and memory organization are integrated processes. In other words, the same algorithm and knowledge representation structures are shared by these processes. As a result of this, the system can answer questions, solve problems or reorganize LTM.
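
    As a toy illustration of classification problem solving, that is, matching observed findings against stored concept descriptions, the sketch below ranks a few invented concepts by feature overlap; the concepts and features are not taken from the cited work, and the memory organization described there is not reproduced.

    ```python
    # Toy classification problem solving: observed findings are matched against
    # concept descriptions held in a long-term-memory-like store. The concepts
    # and features below are invented purely for illustration.
    CONCEPTS = {
        "viral_infection":     {"fever", "fatigue", "normal_white_cell_count"},
        "bacterial_infection": {"fever", "fatigue", "high_white_cell_count"},
        "allergy":             {"sneezing", "itchy_eyes"},
    }

    def classify(findings):
        """Rank concepts by the fraction of their features present in the findings."""
        scores = {
            name: len(features & findings) / len(features)
            for name, features in CONCEPTS.items()
        }
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(classify({"fever", "fatigue", "high_white_cell_count"}))
    ```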

  3. Detailed screening of the soil faunal diversity using a tiered DNA metabarcoding approach

    DEFF Research Database (Denmark)

    Groot, G.A. de; Geisen, S.; Costa, D.

    is not a realistic proposition. DNA-based approaches, especially high-throughput DNA metabarcoding assays, potentially solve this issue, but the development of such methods targeting soil fauna lags far behind that of soil microbes. Within the EU FP7-project EcoFINDERS, we developed and tested a framework...... analyzed to obtain high resolution data for six different groups: mites, collembola, enchytraeids, nematodes, earthworms and protists. New primer sets, as well as reference barcode datasets were established for several of them. Here, we show the results of two test runs based on 454 pyrosequencing....... In the first run, artificially created DNA pools of known composition were analysed to test to which extent the taxonomic composition could successfully be retrieved. Preliminary results show that for all groups the majority of species in the DNA pool were recovered by the metabarcoding approach. By comparing...

  4. DNA cross-linking by dehydromonocrotaline lacks apparent base sequence preference.

    Science.gov (United States)

    Rieben, W Kurt; Coulombe, Roger A

    2004-12-01

    Pyrrolizidine alkaloids (PAs) are ubiquitous plant toxins, many of which, upon oxidation by hepatic mixed-function oxidases, become reactive bifunctional pyrrolic electrophiles that form DNA-DNA and DNA-protein cross-links. The anti-mitotic, toxic, and carcinogenic action of PAs is thought to be caused, at least in part, by these cross-links. We wished to determine whether the activated PA pyrrole dehydromonocrotaline (DHMO) exhibits base sequence preferences when cross-linked to a set of model duplex poly A-T 14-mer oligonucleotides with varying internal and/or end 5'-d(CG), 5'-d(GC), 5'-d(TA), 5'-d(CGCG), or 5'-d(GCGC) sequences. DHMO-DNA cross-links were assessed by electrophoretic mobility shift assay (EMSA) of 32P endlabeled oligonucleotides and by HPLC analysis of cross-linked DNAs enzymatically digested to their constituent deoxynucleosides. The degree of DNA cross-links depended upon the concentration of the pyrrole, but not on the base sequence of the oligonucleotide target. Likewise, HPLC chromatograms of cross-linked and digested DNAs showed no discernible sequence preference for any nucleotide. Added glutathione, tyrosine, cysteine, and aspartic acid, but not phenylalanine, threonine, serine, lysine, or methionine competed with DNA as alternate nucleophiles for cross-linking by DHMO. From these data it appears that DHMO exhibits no strong base preference when forming cross-links with DNA, and that some cellular nucleophiles can inhibit DNA cross-link formation.

  5. Problem solving and Program design using the TI-92

    NARCIS (Netherlands)

    Ir.ing. Ton Marée; ir Martijn van Dongen

    2000-01-01

    This textbook is intended for a basic course in problem solving and program design needed by scientists and engineers using the TI-92. The TI-92 is an extremely powerful problem solving tool that can help you manage complicated problems quickly. We assume no prior knowledge of computers or

  6. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    Science.gov (United States)

    Murni, Bustamam, A.; Ernastuti, Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of matrix-vector multiplication in real-world problems often involves large matrices of arbitrary size, so parallelization is needed to speed up a computation that would otherwise take a long time. The graph partitioning techniques discussed in previous studies cannot be used to parallelize matrix-vector multiplication for matrices of arbitrary size, because graph partitioning assumes a square, symmetric matrix. Hypergraph partitioning techniques overcome this shortcoming of graph partitioning. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (compute unified device architecture) is a parallel computing platform and programming model created by NVIDIA and implemented on the GPU (graphics processing unit).
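
    As a minimal CPU-side illustration of the data decomposition behind partitioned sparse matrix-vector multiplication, the sketch below splits a rectangular CSR matrix into contiguous row blocks and multiplies each block independently. A real hypergraph partitioner would instead choose the row assignment to minimize communication volume, and the authors' CUDA kernels are not reproduced here; the matrix size, density and number of parts are assumptions.

    ```python
    # Partitioned sparse matrix-vector multiplication, CPU sketch only.
    # Rows are split into naive contiguous blocks to illustrate the decomposition;
    # each block could be mapped to one GPU thread block or device in practice.
    import numpy as np
    import scipy.sparse as sp

    rng = np.random.default_rng(0)
    A = sp.random(1000, 700, density=0.01, format="csr", random_state=0)  # rectangular
    x = rng.standard_normal(700)

    def partitioned_spmv(A, x, parts=4):
        """Multiply each row block independently and concatenate the results."""
        bounds = np.linspace(0, A.shape[0], parts + 1, dtype=int)
        pieces = [A[lo:hi] @ x for lo, hi in zip(bounds[:-1], bounds[1:])]
        return np.concatenate(pieces)

    assert np.allclose(partitioned_spmv(A, x), A @ x)
    ```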

  7. A QM/MM refinement of an experimental DNA structure with metal-mediated base pairs.

    Science.gov (United States)

    Kumbhar, Sadhana; Johannsen, Silke; Sigel, Roland K O; Waller, Mark P; Müller, Jens

    2013-10-01

    A series of hybrid quantum mechanical/molecular mechanical (QM/MM) calculations was performed on models of a DNA duplex with artificial silver(I)-mediated imidazole base pairs. The optimized structures were compared to the original experimental NMR structure (Nat. Chem. 2 (2010) 229-234). The metal⋯metal distances are significantly shorter (~0.5Å) in the QM/MM model than in the original NMR structure. As a result, argentophilic interactions are feasible between the silver(I) ions of neighboring metal-mediated base pairs. Using the computationally determined metal⋯metal distances, a re-refined NMR solution structure of the DNA duplex was obtained. In this new NMR structure, all experimental constraints remain fulfilled. The new NMR structure shows less deviation from the regular B-type conformation than the original one. This investigation shows that the application of QM/MM models to generate additional constraints to be used during NMR structural refinements represents an elegant approach to obtaining high-resolution NMR structures. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    Science.gov (United States)

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

    DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301
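
    The core deconvolution idea, that observed immunoprecipitation enrichment at a probe reflects a coupling-weighted sum of methylation levels at nearby CpGs, can be illustrated with a toy regularized least-squares estimate. This is a deliberate simplification of Batman's Bayesian treatment, and every number below (positions, coupling decay, noise level, regularization) is invented.

    ```python
    # Toy deconvolution of MeDIP-like signal: probe enrichment is modeled as a
    # coupling-weighted sum of unknown CpG methylation levels, recovered here by
    # ridge-regularized least squares (a simplification of a Bayesian treatment).
    import numpy as np

    rng = np.random.default_rng(1)
    probe_pos = np.arange(0, 1000, 25)             # probe centers in bp (assumed)
    cpg_pos = np.sort(rng.integers(0, 1000, 40))   # CpG positions (assumed)
    true_m = rng.uniform(0, 1, cpg_pos.size)       # true methylation levels

    # Coupling matrix: each CpG contributes to nearby probes with a distance decay.
    dist = np.abs(probe_pos[:, None] - cpg_pos[None, :])
    C = np.exp(-dist / 150.0)
    C[dist > 500] = 0.0

    signal = C @ true_m + rng.normal(0, 0.05, probe_pos.size)   # noisy enrichment

    # Ridge estimate (MAP under a Gaussian prior), clipped to the valid [0, 1] range.
    lam = 1.0
    m_hat = np.linalg.solve(C.T @ C + lam * np.eye(cpg_pos.size), C.T @ signal)
    m_hat = np.clip(m_hat, 0.0, 1.0)
    print(np.corrcoef(true_m, m_hat)[0, 1])        # should show positive correlation
    ```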

  9. Web-Based Undergraduate Chemistry Problem-Solving: The Interplay of Task Performance, Domain Knowledge and Web-Searching Strategies

    Science.gov (United States)

    She, Hsiao-Ching; Cheng, Meng-Tzu; Li, Ta-Wei; Wang, Chia-Yu; Chiu, Hsin-Tien; Lee, Pei-Zon; Chou, Wen-Chi; Chuang, Ming-Hua

    2012-01-01

    This study investigates the effect of Web-based Chemistry Problem-Solving, with the attributes of Web-searching and problem-solving scaffolds, on undergraduate students' problem-solving task performance. In addition, the nature and extent of Web-searching strategies students used and its correlation with task performance and domain knowledge also…

  10. High-resolution NMR studies of chimeric DNA-RNA-DNA duplexes, heteronomous base pairing, and continuous base stacking at junctions

    International Nuclear Information System (INIS)

    Chou, Shanho; Flynn, P.; Wang, A.; Reid, B.

    1991-01-01

    Two symmetrical DNA-RNA-DNA duplex chimeras, d(CGCG)r(AAUU)d(CGCG) (designated rAAUU) and d(CGCG)r(UAUA)d(CGCG) (designated rUAUA), and a nonsymmetrical chimeric duplex, d(CGTT)r(AUAA)d(TGCG)/d(CGCA)r(UUAU)d(AACG) (designated rAUAA), as well as their pure DNA analogues, containing dU instead of T, have been synthesized by solid-phase phosphoramidite methods and studied by high-resolution NMR techniques. The 1D imino proton NOE spectra of these d-r-d chimeras indicate normal Watson-Crick hydrogen bonding and base stacking at the junction region. Preliminary qualitative NOESY, COSY, and chemical shift data suggest that the internal RNA segment contains C3'-endo (A-type) sugar conformations except for the first RNA residues (position 5 and 17) following the 3' end of the DNA block, which, unlike the other six ribonucleotides, exhibit detectable H1'-H2' J coupling. The nucleosides of the two flanking DNA segments appear to adopt a fairly normal C2'-endo B-DNA conformation except at the junction with the RNA blocks (residues 4 and 16), where the last DNA residue appears to adopt an intermediate sugar conformation. The data indicate that A-type and B-type conformations can coexist in a single short continuous nucleic acid duplex, but these results differ somewhat from previous theoretical model studies

  11. Physics: Quantum problems solved through games

    Science.gov (United States)

    Maniscalco, Sabrina

    2016-04-01

    Humans are better than computers at performing certain tasks because of their intuition and superior visual processing. Video games are now being used to channel these abilities to solve problems in quantum physics. See Letter p.210

  12. BioWires: Conductive DNA Nanowires in a Computationally-Optimized, Synthetic Biological Platform for Nanoelectronic Fabrication

    Science.gov (United States)

    Vecchioni, Simon; Toomey, Emily; Capece, Mark C.; Rothschild, Lynn; Wind, Shalom

    2017-01-01

    DNA is an ideal template for a biological nanowire-it has a linear structure several atoms thick; it possesses addressable nucleobase geometry that can be precisely defined; and it is massively scalable into branched networks. Until now, the drawback of DNA as a conducting nanowire has been, simply put, its low conductance. To address this deficiency, we extensively characterize a chemical variant of canonical DNA that exploits the affinity of natural cytosine bases for silver ions. We successfully construct chains of single silver ions inside double-stranded DNA, confirm the basic dC-Ag+-dC bond geometry and kinetics, and show length-tunability dependent on mismatch distribution, ion availability and enzyme activity. An analysis of the absorbance spectra of natural DNA and silver-binding, poly-cytosine DNA demonstrates the heightened thermostability of the ion chain and its resistance to aqueous stresses such as precipitation, dialysis and forced reduction. These chemically critical traits lend themselves to an increase in electrical conductivity of over an order of magnitude for 11-base silver-paired duplexes over natural strands when assayed by STM break junction. We further construct and implement a genetic pathway in the E. coli bacterium for the biosynthesis of highly ionizable DNA sequences. Toward future circuits, we construct a model of transcription network architectures to determine the most efficient and robust connectivity for cell-based fabrication, and we perform sequence optimization with a genetic algorithm to identify oligonucleotides robust to changes in the base-pairing energy landscape. We propose that this system will serve as a synthetic biological fabrication platform for more complex DNA nanotechnology and nanoelectronics with applications to deep space and low resource environments.
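
    As a toy illustration of the sequence-optimization step mentioned above, the sketch below runs a small genetic algorithm over fixed-length DNA strings with a made-up fitness that simply rewards cytosine content (a crude stand-in for silver-ionizable dC sites); it is not the authors' algorithm, fitness function or base-pairing energy model.

    ```python
    # Toy genetic algorithm over DNA sequences with an invented fitness function.
    import random

    BASES = "ACGT"
    LENGTH, POP, GENERATIONS, MUT_RATE = 20, 60, 80, 0.05

    def fitness(seq):
        # Stand-in objective: fraction of cytosines in the sequence.
        return seq.count("C") / len(seq)

    def mutate(seq):
        return "".join(random.choice(BASES) if random.random() < MUT_RATE else b
                       for b in seq)

    def crossover(a, b):
        cut = random.randrange(1, LENGTH)
        return a[:cut] + b[cut:]

    random.seed(0)
    population = ["".join(random.choice(BASES) for _ in range(LENGTH)) for _ in range(POP)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[: POP // 2]                     # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    best = max(population, key=fitness)
    print(best, fitness(best))
    ```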

  13. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
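
    Base callers in this setting report per-base Phred quality scores, defined by the general convention Q = -10 * log10(p_error), so Q20 corresponds to a 1% chance of a wrong base call. The small helper below converts between error probability and quality score; it illustrates the convention only and is not specific to any ABI instrument or software.

    ```python
    # Phred quality score convention used by base-calling software:
    #     Q = -10 * log10(p_error)
    import math

    def phred_quality(p_error):
        return -10.0 * math.log10(p_error)

    def error_probability(q):
        return 10.0 ** (-q / 10.0)

    print(phred_quality(0.01))      # 20.0  -> 1% error probability
    print(error_probability(30))    # 0.001 -> Q30 means 1 error in 1000 calls
    ```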

  14. cgDNA: a software package for the prediction of sequence-dependent coarse-grain free energies of B-form DNA.

    Science.gov (United States)

    Petkevičiūtė, D; Pasi, M; Gonzalez, O; Maddocks, J H

    2014-11-10

    cgDNA is a package for the prediction of sequence-dependent configuration-space free energies for B-form DNA at the coarse-grain level of rigid bases. For a fragment of any given length and sequence, cgDNA calculates the configuration of the associated free energy minimizer, i.e. the relative positions and orientations of each base, along with a stiffness matrix, which together govern differences in free energies. The model predicts non-local (i.e. beyond base-pair step) sequence dependence of the free energy minimizer. Configurations can be input or output in either the Curves+ definition of the usual helical DNA structural variables, or as a PDB file of coordinates of base atoms. We illustrate the cgDNA package by comparing predictions of free energy minimizers from (a) the cgDNA model, (b) time-averaged atomistic molecular dynamics (or MD) simulations, and (c) NMR or X-ray experimental observation, for (i) the Dickerson-Drew dodecamer and (ii) three oligomers containing A-tracts. The cgDNA predictions are rather close to those of the MD simulations, but many orders of magnitude faster to compute. Both the cgDNA and MD predictions are in reasonable agreement with the available experimental data. Our conclusion is that cgDNA can serve as a highly efficient tool for studying structural variations in B-form DNA over a wide range of sequences. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
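
    In rigid-base coarse-grain models of this kind, the free energy of a configuration w is, up to an additive constant, a quadratic form built from the minimizer and the stiffness matrix, U(w) = 1/2 (w - w_hat)^T K (w - w_hat). The sketch below evaluates such a form for a tiny invented three-dimensional example; it does not call the actual cgDNA package or use its parameter set.

    ```python
    # Quadratic coarse-grain free energy from a minimizer plus a stiffness matrix:
    #     U(w) = 0.5 * (w - w_hat)^T K (w - w_hat)   (up to an additive constant).
    # The 3-dimensional example is invented and unrelated to real cgDNA parameters.
    import numpy as np

    w_hat = np.array([0.1, -0.2, 0.05])          # free-energy minimizer (assumed)
    K = np.array([[4.0, 0.5, 0.0],               # symmetric positive-definite
                  [0.5, 3.0, 0.2],               # stiffness matrix (assumed)
                  [0.0, 0.2, 5.0]])

    def free_energy(w):
        d = np.asarray(w) - w_hat
        return 0.5 * d @ K @ d

    print(free_energy(w_hat))                    # 0.0 at the minimizer
    print(free_energy([0.3, 0.1, -0.1]))         # positive away from it
    ```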

  15. Mechanistic Basis for the Bypass of a Bulky DNA Adduct Catalyzed by a Y-Family DNA Polymerase

    Science.gov (United States)

    Vyas, Rajan; Efthimiopoulos, Georgia; Tokarsky, E. John; Malik, Chanchal K.; Basu, Ashis K.; Suo, Zucai

    2015-01-01

    1-Nitropyrene (1-NP), an environmental pollutant, induces DNA damage in vivo and is considered to be carcinogenic. The DNA adducts formed by the 1-NP metabolites stall replicative DNA polymerases but are presumably bypassed by error-prone Y-family DNA polymerases at the expense of replication fidelity and efficiency in vivo. Our running start assays confirmed that a site-specifically placed 8-(deoxyguanosin-N2-yl)-1-aminopyrene (dG1,8), one of the DNA adducts derived from 1-NP, can be bypassed by Sulfolobus solfataricus DNA polymerase IV (Dpo4), although this representative Y-family enzyme was paused strongly by the lesion. Pre-steady-state kinetic assays were employed to determine the low nucleotide incorporation fidelity and establish a minimal kinetic mechanism for the dG1,8 bypass by Dpo4. To reveal a structural basis for dCTP incorporation opposite dG1,8, we solved the crystal structures of the complexes of Dpo4 and DNA containing a templating dG1,8 lesion in the absence or presence of dCTP. The Dpo4·DNA-dG1,8 binary structure shows that the aminopyrene moiety of the lesion stacks against the primer/template junction pair, while its dG moiety projects into the cleft between the Finger and Little Finger domains of Dpo4. In the Dpo4·DNA-dG1,8·dCTP ternary structure, the aminopyrene moiety of the dG1,8 lesion is sandwiched between the nascent and junction base pairs, while its base is present in the major groove. Moreover, dCTP forms a Watson–Crick base pair with dG, two nucleotides upstream from the dG1,8 site, creating a complex for "-2" frameshift mutation. Mechanistically, these crystal structures provide additional insight into the aforementioned minimal kinetic mechanism. PMID:26327169

  16. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic-programming-based parallel algorithm adapted to on-board heterogeneous computers for simulation-based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions, called multi-splines, as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation-based trajectory optimization problems. These computers can be considered micro high-performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation-based approach can potentially give highly accurate results, since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and its use of OpenCL.
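
    As a hedged illustration of stage-wise dynamic programming for discrete trajectory optimization, the sketch below computes a cost-to-go table backwards over a small grid of lateral offsets and then reconstructs the optimal path forwards; the cost model and grid are invented, and the paper's multi-spline search space, simulator-based performance measure and OpenCL parallelism are not reproduced.

    ```python
    # Stage-wise dynamic programming for discrete trajectory optimization:
    # backward pass builds a cost-to-go table, forward pass reads off the path.
    # The cost model and grid below are invented for illustration only.
    import numpy as np

    STAGES, LANES = 30, 21
    lateral = np.linspace(-1.0, 1.0, LANES)              # candidate lateral offsets

    def step_cost(y_from, y_to, stage):
        desired = 0.6 * np.sin(2 * np.pi * stage / STAGES)          # made-up target
        return (y_to - desired) ** 2 + 4.0 * (y_to - y_from) ** 2   # tracking + smoothness

    # Backward pass: cost-to-go table and best-successor policy.
    cost_to_go = np.zeros((STAGES + 1, LANES))
    policy = np.zeros((STAGES, LANES), dtype=int)
    for s in range(STAGES - 1, -1, -1):
        for i, y in enumerate(lateral):
            costs = [step_cost(y, lateral[j], s) + cost_to_go[s + 1, j]
                     for j in range(LANES)]
            policy[s, i] = int(np.argmin(costs))
            cost_to_go[s, i] = min(costs)

    # Forward pass: reconstruct the optimal trajectory starting from the middle lane.
    idx, trajectory = LANES // 2, []
    for s in range(STAGES):
        idx = policy[s, idx]
        trajectory.append(lateral[idx])
    print(np.round(trajectory, 2))
    ```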

  17. Whose DNA is this?

    DEFF Research Database (Denmark)

    Taroni, Franco; Biedermann, Alex; Vuille, Joëlle

    2013-01-01

    This communication seeks to draw the attention of researchers and practitioners dealing with forensic DNA profiling analyses to the following question: is a scientist's report, offering support to a hypothesis according to which a particular individual is the source of DNA detected during...... evoked during the international conference "The hidden side of DNA profiles. Artifacts, errors and uncertain evidence" held in Rome (April 27th to 28th, 2012). Indeed, despite the fact that this conference brought together some of the world's leading forensic DNA specialists, it appeared clearly...... talk considerably different languages. It thus is fundamental to address this issue of communication about results of forensic DNA analyses, and open a dialogue with practicing non-scientists at large who need to make meaningful use of scientific results to approach and help solve judicial cases...

  18. Students’ Relational Understanding in Quadrilateral Problem Solving Based on Adversity Quotient

    Science.gov (United States)

    Safitri, A. N.; Juniati, D.; Masriyah

    2018-01-01

    This research used a qualitative approach and aims to describe students' relational understanding in solving a mathematics problem, viewed from the Adversity Quotient (AQ) aspect. The subjects were three 7th-grade junior high school students, selected according to AQ category: quitter, camper, and climber. Data were collected through problem-solving tasks and interviews. The results showed that (1) at the stage of understanding the problem, the subjects were able to state and write down what was known and asked, and to mention the concepts associated with the quadrilateral problem; (2) the three subjects devised a plan by linking concepts relating to quadrilateral problems; (3) the three subjects were able to solve the problem; and (4) the three subjects were able to look back at their answers. All three subjects were able to understand the problem, devise a plan, carry out the plan, and look back. However, the quitter and camper subjects were not able to give reasons for the steps they had taken.

  19. Nanochannel Device with Embedded Nanopore: a New Approach for Single-Molecule DNA Analysis and Manipulation

    Science.gov (United States)

    Zhang, Yuning; Reisner, Walter

    2013-03-01

    Nanopore and nanochannel based devices are robust methods for biomolecular sensing and single DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with embedded pore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a pore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We demonstrate that we can optically detect successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. In particular, we show that in equilibrium DNA will not escape through an embedded sub-persistence length nanopore, suggesting that the pore could be used as a nanoscale window through which to interrogate a nanochannel-extended DNA molecule. Furthermore, electrical measurements through the nanopore are performed, indicating that DNA sensing is feasible using the nanochannel-nanopore device.

  20. Solving the neutron diffusion equation on combinatorial geometry computational cells for reactor physics calculations

    International Nuclear Information System (INIS)

    Azmy, Y. Y.

    2004-01-01

    An approach is developed for solving the neutron diffusion equation on combinatorial geometry computational cells, that is, computational cells composed by combinatorial operations on simple-shaped component cells. The only constraint on the component cells from which the combinatorial cells are assembled is that they possess a legitimate discretization of the underlying diffusion equation. We use the Finite Difference (FD) approximation of the x, y-geometry diffusion equation in this work. Performing the same combinatorial operations involved in composing the combinatorial cell on these discrete-variable equations yields equations that employ new discrete variables defined only on the combinatorial cell's volume and faces. The only approximation involved in this process, beyond the truncation error committed in discretizing the diffusion equation over each component cell, is a consistent-order Legendre series expansion. Preliminary results for simple configurations establish the accuracy of the combinatorial geometry solution compared to straight FD as the system dimensions decrease. Furthermore, numerical results validate the consistent Legendre-series expansion order by illustrating the second-order accuracy of the combinatorial geometry solution, the same as standard FD. Nevertheless, the magnitude of the error for the new approach is larger than FD's, since it incorporates the additional truncated series approximation. (authors)
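
    As a minimal illustration of the underlying finite-difference discretization (not of the combinatorial-cell composition itself), the sketch below solves the one-group diffusion equation -D*laplacian(phi) + sigma_a*phi = S on a single homogeneous rectangular cell with a zero-flux boundary, using the standard 5-point stencil; all material constants and grid parameters are invented.

    ```python
    # One-group, 2-D finite-difference neutron diffusion on a single homogeneous
    # rectangular cell:  -D * laplacian(phi) + sigma_a * phi = S,  phi = 0 outside.
    # Constants are invented; the combinatorial-cell composition is not implemented.
    import numpy as np

    nx, ny, h = 40, 40, 0.5          # grid and mesh spacing in cm (assumed)
    D, sigma_a, S = 1.0, 0.05, 1.0   # diffusion coefficient, absorption, source (assumed)

    n = nx * ny
    A = np.zeros((n, n))
    b = np.full(n, S)

    def idx(i, j):
        return i * ny + j

    for i in range(nx):
        for j in range(ny):
            k = idx(i, j)
            A[k, k] = 4.0 * D / h**2 + sigma_a          # 5-point stencil diagonal
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < nx and 0 <= jj < ny:
                    A[k, idx(ii, jj)] = -D / h**2
                # neighbours outside the cell are treated as phi = 0 (zero flux)

    phi = np.linalg.solve(A, b).reshape(nx, ny)
    print(phi.max(), phi[nx // 2, ny // 2])   # peak flux occurs near the cell centre
    ```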