Li, Kenli; Zou, Shuting; Xv, Jin
2008-01-01
Elliptic curve cryptographic algorithms convert input data into an unrecognizable encrypted form and convert that data back into its original decrypted form. The security of this form of encryption hinges on the difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method for finding solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms, a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n), are described. The biological operation time of all of these algorithms is polynomial with respect to n. In light of this analysis, public-key cryptography might be less secure than assumed. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
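For a concrete reference point, the field operations these DNA-based algorithms parallelize (addition, multiplication, and inversion over GF(2^n)) can be sketched classically in a few lines. This is a minimal sketch with n = 8 and the AES irreducible polynomial; the function names and the choice of field are ours, not the paper's.

```python
# Arithmetic in GF(2^8) with the AES irreducible polynomial
# x^8 + x^4 + x^3 + x + 1. Illustrative only.
IRRED = 0b100011011

def gf_add(a, b):
    # Addition in GF(2^n) is bitwise XOR (carry-free).
    return a ^ b

def gf_mul(a, b, irred=IRRED, n=8):
    # Shift-and-add multiplication, reducing modulo the irreducible polynomial.
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        if a & (1 << n):
            a ^= irred
        b >>= 1
    return result

def gf_inv(a, n=8):
    # Inverse via Fermat's little theorem: a^(2^n - 2) = a^(-1) in GF(2^n)*.
    result, base, exp = 1, a, (1 << n) - 2
    while exp:
        if exp & 1:
            result = gf_mul(result, base)
        base = gf_mul(base, base)
        exp >>= 1
    return result
```

For example, `gf_mul(0x53, 0xCA)` yields `0x01`, the inverse pair used in the AES specification.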
Fast parallel DNA-based algorithms for molecular computation: the set-partition problem.
Chang, Weng-Long
2007-12-01
This paper demonstrates that basic biological operations can be used to solve the set-partition problem. In order to achieve this, we propose three DNA-based algorithms, a signed parallel adder, a signed parallel subtractor and a signed parallel comparator, that formally verify our designed molecular solutions for solving the set-partition problem.
Chang, Weng-Long
2012-03-01
Assume that n is a positive integer. If there is an integer M such that M^2 ≡ C (mod n), i.e., the congruence has a solution, then C is said to be a quadratic congruence (mod n). If the congruence has no solution, then C is said to be a quadratic noncongruence (mod n). The task of solving this problem is central to many important applications, the most obvious being cryptography. In this article, we describe a DNA-based algorithm for solving quadratic congruences and factoring integers. In addition to this novel contribution, we also show the utility of our encoding scheme and of the algorithm's submodules. We demonstrate how a variety of arithmetic, shift, and comparison operations, namely bitwise and full addition, subtraction, left shifting, and comparison, can be performed using strands of DNA.
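The defining property above, whether some M satisfies M^2 ≡ C (mod n), can be checked by exhaustive search; this exponential-in-bit-length cost is precisely what makes the problem cryptographically interesting. A brute-force sketch (the function name is ours):

```python
def has_square_root_mod(C, n):
    # True iff some M in 0..n-1 satisfies M^2 ≡ C (mod n),
    # i.e. C is a "quadratic congruence (mod n)" in the article's terms.
    # Brute force: O(n) work, exponential in the bit length of n.
    return any((M * M) % n == C % n for M in range(n))
```

For instance, 2 has a square root mod 7 (3^2 = 9 ≡ 2), while 3 does not.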
Fast parallel molecular algorithms for DNA-based computation: factoring integers.
Chang, Weng-Long; Guo, Minyi; Ho, Michael Shan-Hui
2005-06-01
The RSA public-key cryptosystem is an algorithm that converts input data into an unrecognizable encrypted form and converts the unrecognizable data back into its original decrypted form. The security of the RSA public-key cryptosystem is based on the difficulty of factoring the product of two large prime numbers. This paper demonstrates how to factor the product of two large prime numbers, a breakthrough in basic biological operations using a molecular computer. To achieve this, we propose three DNA-based algorithms, a parallel subtractor, a parallel comparator, and parallel modular arithmetic, that formally verify our designed molecular solutions for factoring the product of two large prime numbers. Furthermore, this work indicates that public-key cryptosystems are perhaps insecure, and it also presents clear evidence of the ability of molecular computing to perform complicated mathematical operations.
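For scale, the simplest classical baseline the molecular approach competes with is trial division, whose running time grows exponentially in the bit length of the modulus N = p·q. A toy sketch (our own naming; real RSA moduli are far beyond this approach):

```python
def factor(N):
    # Trial division: return (p, q) with p*q == N and p <= q,
    # or None if N is prime. O(sqrt(N)) divisions.
    d = 2
    while d * d <= N:
        if N % d == 0:
            return d, N // d
        d += 1
    return None
```

For example, `factor(10403)` recovers the prime pair (101, 103).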
Solving computationally expensive engineering problems
Leifsson, Leifur; Yang, Xin-She
2014-01-01
Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...
AI tools in computer based problem solving
Beane, Arthur J.
1988-01-01
The use of computers to solve value oriented, deterministic, algorithmic problems, has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.
Using Computer Simulations in Chemistry Problem Solving
Avramiotis, Spyridon; Tsaparlis, Georgios
2013-01-01
This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group, equalized by pair groups (n[subscript Exp] = n[subscript Ctrl] = 78), research design was used. The students had no previous experience of chemical practical work. Student…
Statistical length of DNA based on AFM image measured by a computer
International Nuclear Information System (INIS)
Chen Xinqing; Qiu Xijun; Zhang Yi; Hu Jun; Wu Shiying; Huang Yibo; Ai Xiaobai; Li Minqian
2001-01-01
Taking advantage of image processing technology, the contour length of a DNA molecule was measured automatically by a computer. Based on the AFM image of DNA, the DNA topography was traced as a curve, and the DNA length was then measured automatically by the inserting mode. The experimental length of a naturally deposited DNA (180.4 ± 16.4 nm) was consistent with the theoretical length (185.0 nm). Compared to other methods, the present approach had the advantages of precision and automation. Stretched DNA was also measured; its experimental length (343.6 ± 20.7 nm) was much longer than the theoretical length (307.0 nm). This result indicated that the stretching process had a distinct effect on the DNA length. However, the method provided here avoided the DNA-stretching effect.
Validation of DNA-based identification software by computation of pedigree likelihood ratios.
Slooten, K
2011-08-01
Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, among which inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Engineering and Computing Portal to Solve Environmental Problems
Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.
2018-01-01
This paper describes architecture and services of the Engineering and Computing Portal, which is considered to be a complex solution that provides access to high-performance computing resources, enables to carry out computational experiments, teach parallel technologies and solve computing tasks, including technogenic safety ones.
Computational physics problem solving with Python
Landau, Rubin H; Bordeianu, Cristian C
2015-01-01
The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python progr
Comprehension and computation in Bayesian problem solving
Directory of Open Access Journals (Sweden)
Eric D. Johnson
2015-07-01
Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
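A natural-frequency rendering of a classic Bayesian word problem makes the contrast with normalized formats concrete. The numbers below are illustrative (mammography-style), not taken from the article:

```python
# Out of 1000 screened people: 10 have the disease, of whom 8 test positive;
# of the 990 without it, 95 also test positive. With natural frequencies,
# P(disease | positive) is a simple subset ratio:
true_pos = 8
false_pos = 95
p_disease_given_pos = true_pos / (true_pos + false_pos)  # 8 / 103

# The same answer via Bayes' rule on normalized probabilities,
# which requires multiplying and renormalizing:
prior, sens, fpr = 0.01, 0.8, 95 / 990
p_bayes = prior * sens / (prior * sens + (1 - prior) * fpr)
```

Both routes give 8/103 ≈ 0.078; the frequency format replaces the product-and-normalize step with a single count comparison.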
A Cognitive Model for Problem Solving in Computer Science
Parham, Jennifer R.
2009-01-01
According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…
Emotion Oriented Programming: Computational Abstractions for AI Problem Solving
Darty , Kevin; Sabouret , Nicolas
2012-01-01
In this paper, we present a programming paradigm for AI problem solving based on computational concepts drawn from Affective Computing. It is believed that emotions participate in human adaptability and reactivity, and in behaviour selection in complex and dynamic environments. We propose to define a mechanism inspired by this observation for general AI problem solving. To this purpose, we synthesize emotions as programming abstractions that represent the perception ...
Solving the Stokes problem on a massively parallel computer
DEFF Research Database (Denmark)
Axelsson, Owe; Barker, Vincent A.; Neytcheva, Maya
2001-01-01
…boundary value problem for each velocity component, are solved by the conjugate gradient method with a preconditioning based on the algebraic multi-level iteration (AMLI) technique. The velocity is found from the computed pressure. The method is optimal in the sense that the computational work is proportional to the number of unknowns. Further, it is designed to exploit a massively parallel computer with distributed memory architecture. Numerical experiments on a Cray T3E computer illustrate the parallel performance of the method.
Solving satisfiability problems by the ground-state quantum computer
International Nuclear Information System (INIS)
Mao Wenjin
2005-01-01
A quantum algorithm is proposed to solve satisfiability (SAT) problems with a ground-state quantum computer. The scaling of the energy gap of the ground-state quantum computer is analyzed for the 3-bit exact cover problem. The time cost of this algorithm on general SAT problems is discussed.
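For orientation, the classical brute-force search that the ground-state encoding is meant to replace can be sketched as a tiny SAT checker (the clause encoding and names are ours, not the paper's):

```python
from itertools import product

def solve_sat(clauses, n):
    # clauses: tuples of signed 1-based literals, e.g. (1, -2) means x1 OR NOT x2.
    # Exhaustively try all 2^n assignments; return a satisfying one or None.
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return bits
    return None
```

The 2^n enumeration is the exponential wall; the proposed quantum computer encodes the same constraints in a Hamiltonian whose ground state represents the solution.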
Experimental quantum computing to solve systems of linear equations.
Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei
2013-06-07
Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
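The classical counterpart of the reported experiment, solving a 2×2 system A·x = b directly, is a few lines of arithmetic. A sketch via Cramer's rule (our own naming); the quantum algorithm (HHL) instead prepares a state proportional to x in time of order log(N) under its stated assumptions:

```python
def solve_2x2(A, b):
    # Solve A x = b for a 2x2 matrix A via Cramer's rule.
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("singular matrix")
    x1 = (b[0] * a22 - a12 * b[1]) / det
    x2 = (a11 * b[1] - b[0] * a21) / det
    return (x1, x2)
```

This direct approach scales polynomially in N for general systems, which is the gap the quantum algorithm targets.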
6th International Conference on Soft Computing for Problem Solving
Bansal, Jagdish; Das, Kedar; Lal, Arvind; Garg, Harish; Nagar, Atulya; Pant, Millie
2017-01-01
This two-volume book gathers the proceedings of the Sixth International Conference on Soft Computing for Problem Solving (SocProS 2016), offering a collection of research papers presented during the conference at Thapar University, Patiala, India. Providing a veritable treasure trove for scientists and researchers working in the field of soft computing, it highlights the latest developments in the broad area of “Computational Intelligence” and explores both theoretical and practical aspects using fuzzy logic, artificial neural networks, evolutionary algorithms, swarm intelligence, soft computing, computational intelligence, etc.
Applying natural evolution for solving computational problems - Lecture 1
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists for solving computational problems. In a similar way to how humans and animals have evolved along millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundaments of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for researching in such field, will be also explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
Applying natural evolution for solving computational problems - Lecture 2
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists for solving computational problems. In a similar way to how humans and animals have evolved along millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundaments of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for researching in such field, will be also explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
Internet computer coaches for introductory physics problem solving
Xu Ryan, Qing
The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the educational system, national studies have shown that the majority of students emerge from such courses having made little progress toward developing good problem-solving skills. The Physics Education Research Group at the University of Minnesota has been developing Internet computer coaches to help students become more expert-like problem solvers. During the Fall 2011 and Spring 2013 semesters, the coaches were introduced into large sections (200+ students) of the calculus-based introductory mechanics course at the University of Minnesota. This dissertation will address the research background of the project, including the pedagogical design of the coaches and the assessment of problem solving. The methodological framework of conducting experiments will be explained. The data collected from the large-scale experimental studies will be discussed from the following aspects: the usage and usability of these coaches; the usefulness perceived by students; and the usefulness measured by final exam and problem solving rubric. It will also address the implications drawn from this study, including using this data to direct future coach design and difficulties in conducting authentic assessment of problem solving.
Second International Conference on Soft Computing for Problem Solving
Nagar, Atulya; Deep, Kusum; Pant, Millie; Bansal, Jagdish; Ray, Kanad; Gupta, Umesh
2014-01-01
The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2012), held at JK Lakshmipat University, Jaipur, India. This book provides the latest developments in the area of soft computing and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining, etc. The objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.
Third International Conference on Soft Computing for Problem Solving
Deep, Kusum; Nagar, Atulya; Bansal, Jagdish
2014-01-01
The present book is based on the research papers presented in the 3rd International Conference on Soft Computing for Problem Solving (SocProS 2013), held as a part of the golden jubilee celebrations of the Saharanpur Campus of IIT Roorkee, at the Noida Campus of Indian Institute of Technology Roorkee, India. This book is divided into two volumes and covers a variety of topics including mathematical modelling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining etc. Particular emphasis is laid on soft computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems, which are otherwise difficult to solve by the usual and traditional methods. The book is directed ...
Using Computer Symbolic Algebra to Solve Differential Equations.
Mathews, John H.
1989-01-01
This article illustrates that mathematical theory can be incorporated into the process to solve differential equations by a computer algebra system, muMATH. After an introduction to functions of muMATH, several short programs for enhancing the capabilities of the system are discussed. Listed are six references. (YP)
Exploring hadronic physics by solving QCD with a teraflops computer
International Nuclear Information System (INIS)
Negele, J.
1993-01-01
Quantum chromodynamics, the theory believed to govern the nucleons, mesons, and other strongly interacting particles making up most of the known mass of the universe, is such a challenging, nonlinear many-body problem that it has never been solved using conventional analytical techniques. This talk describes how this theory can be solved numerically on a space-time lattice, shows what has already been understood about the structure of hadrons and the quark-gluon phase transition, and describes an exciting initiative to build a dedicated Teraflops computer capable of performing 10^12 operations per second to make fundamental advances in QCD.
4th International Conference on Soft Computing for Problem Solving
Deep, Kusum; Pant, Millie; Bansal, Jagdish; Nagar, Atulya
2015-01-01
This two volume book is based on the research papers presented at the 4th International Conference on Soft Computing for Problem Solving (SocProS 2014) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and healthcare, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.
Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I
2013-01-01
To elucidate details of DNA-water interactions, we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two molecular mechanics (MM) force fields and the correlated ab initio method MP2/6-31G(d,p) of quantum mechanics (QM) have been compared with one another and with experimental data. The calculations demonstrated a qualitative agreement between the geometry characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base, forming hydrogen bonds with both. Nevertheless, the relative depth of some minima and the peculiarities of mutual water-base positions in these minima depend on the method used. The analysis revealed that some differences between the methods are insignificant while others matter for the description of DNA hydration. The calculations via MM methods enabled us to reproduce quantitatively all the experimental data on the enthalpies of complex formation of a single water molecule with the set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in crystals of DNA duplex fragments, while some of these data cannot be rationalized by the QM calculations.
Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills
Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt
2017-01-01
This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses. PMID:29238314
Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills
Directory of Open Access Journals (Sweden)
Stephen T. Polyak
2017-11-01
This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses.
5th International Conference on Soft Computing for Problem Solving
Deep, Kusum; Bansal, Jagdish; Nagar, Atulya; Das, Kedar
2016-01-01
This two volume book is based on the research papers presented at the 5th International Conference on Soft Computing for Problem Solving (SocProS 2015) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.
Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent Executive Computer Communication
Lyman, John; Conaway, Carla J.
1991-06-01
Interim report, University of California at Los Angeles. Related proceedings: The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh.
Solving a Hamiltonian Path Problem with a bacterial computer
Baumgardner, Jordan; Acker, Karen; Adefuye, Oyinade; Crowley, Samuel Thomas; DeLoache, Will; Dickson, James O; Heard, Lane; Martens, Andrew T; Morton, Nickolaus; Ritter, Michelle; Shoecraft, Amber; Treece, Jessica; Unzicker, Matthew; Valencia, Amanda; Waters, Mike; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Eckdahl, Todd T
2009-01-01
Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node directed graph. This proof
Solving a Hamiltonian Path Problem with a bacterial computer
Directory of Open Access Journals (Sweden)
Treece, Jessica
2009-07-01
Full Text Available Abstract Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node
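The exhaustive search that the bacterial computer carries out in parallel can be sketched in miniature by enumerating node orderings of a small directed graph. A minimal Python illustration follows; the three-node graph and its edges are hypothetical stand-ins, since the record does not list the actual edges used in the experiment.

```python
from itertools import permutations

def hamiltonian_paths(nodes, edges):
    """Return every ordering of nodes in which each consecutive pair is an edge."""
    return [p for p in permutations(nodes)
            if all((p[i], p[i + 1]) in edges for i in range(len(p) - 1))]

# Hypothetical three-node directed graph (the actual edges are not given here).
nodes = ("A", "B", "C")
edges = {("A", "B"), ("B", "C"), ("A", "C")}
print(hamiltonian_paths(nodes, edges))  # [('A', 'B', 'C')]
```

Each bacterium plays the role of one permutation in this loop; the shuffling recombination system generates the orderings at random rather than systematically.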
Wang, Fuan; Willner, Bilha; Willner, Itamar
2014-01-01
The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels", such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.
Internet Computer Coaches for Introductory Physics Problem Solving
Ryan, Qing Xu
2013-01-01
The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving skills throughout the…
Molecular computing towards a novel computing architecture for complex problem solving
Chang, Weng-Long
2014-01-01
This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving any given problem from start to finish. The book starts with an introduction to the computational aspects of digital computers and molecular computing, data representation in molecular computing, molecular operations of molecular computing and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single-precision and double-precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations...
Solving Dynamic Battlespace Movement Problems Using Dynamic Distributed Computer Networks
National Research Council Canada - National Science Library
Bradford, Robert
2000-01-01
.... The thesis designs a system using this architecture that invokes operations research network optimization algorithms to solve problems involving movement of people and equipment over dynamic road networks...
The Role of the Goal in Solving Hard Computational Problems: Do People Really Optimize?
Carruthers, Sarah; Stege, Ulrike; Masson, Michael E. J.
2018-01-01
The role that the mental, or internal, representation plays when people are solving hard computational problems has largely been overlooked to date, despite the reality that this internal representation drives problem solving. In this work we investigate how performance on versions of two hard computational problems differs based on what internal…
Computer as a Medium for Overcoming Misconceptions in Solving Inequalities
Abramovich, Sergei; Ehrlich, Amos
2007-01-01
Inequalities are considered among the most useful tools of investigation in pure and applied mathematics; yet their didactical aspects have not received much attention in mathematics education research until recently. An important aspect of teaching problem solving at the secondary level deals with the notion of equivalence of algebraic…
A heterogeneous computing environment to solve the 768-bit RSA challenge
Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz
2010-01-01
In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.
Parallel computation for solving the tridiagonal linear system of equations
International Nuclear Information System (INIS)
Ishiguro, Misako; Harada, Hiroo; Fujii, Minoru; Fujimura, Toichiro; Nakamura, Yasuhiro; Nanba, Katsumi.
1981-09-01
Recently, applications of parallel computation to scientific calculations have increased, driven by the need for high-speed calculation in large-scale programs. At the JAERI computing center, an array processor, the FACOM 230-75 APU, has been installed to study the applicability of parallel computation to nuclear codes. Using the APU, we performed numerical experiments on methods for solving tridiagonal linear systems of equations, an important problem in scientific calculations. Drawing on recent papers on parallel methods, we investigated eight of them: the Gauss elimination method, the parallel Gauss method, the accelerated parallel Gauss method, the Jacobi method, the recursive doubling method, the cyclic reduction method, the Chebyshev iteration method, and the conjugate gradient method. The computing time and accuracy of the methods were compared on the basis of the numerical experiments. As a result, the cyclic reduction method is found to be the best in both computing time and accuracy, with the Gauss elimination method second. (author)
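As a point of reference for the serial baseline among the eight methods, Gauss elimination specialized to tridiagonal systems (often called the Thomas algorithm) can be sketched in a few lines. This is an illustrative Python version, not the FACOM APU code studied in the report:

```python
def solve_tridiagonal(sub, diag, sup, rhs):
    """Thomas algorithm (tridiagonal Gauss elimination), O(n) serial work.

    sub  - subdiagonal   (length n-1)
    diag - main diagonal (length n)
    sup  - superdiagonal (length n-1)
    rhs  - right-hand side (length n)
    """
    n = len(diag)
    c = [0.0] * (n - 1)          # modified superdiagonal
    d = [0.0] * n                # modified right-hand side
    c[0] = sup[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):        # forward elimination
        m = diag[i] - sub[i - 1] * c[i - 1]
        if i < n - 1:
            c[i] = sup[i] / m
        d[i] = (rhs[i] - sub[i - 1] * d[i - 1]) / m
    x = d[:]                     # back substitution
    for i in range(n - 2, -1, -1):
        x[i] -= c[i] * x[i + 1]
    return x

# 3x3 example whose exact solution is [1, 1, 1]:
print(solve_tridiagonal([1, 1], [2, 2, 2], [1, 1], [3, 4, 3]))
```

The parallel contenders in the paper (cyclic reduction, recursive doubling) restructure exactly this elimination so that independent equations can be processed simultaneously.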
Comparison of evolutionary computation algorithms for solving bi ...
Indian Academy of Sciences (India)
failure probability. Multiobjective Evolutionary Computation algorithms (MOEAs) are well-suited for multiobjective task scheduling in a heterogeneous environment. The two multiobjective evolutionary algorithms, the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with.
Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story
Gunbas, N.
2015-01-01
The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…
USING CLOUD COMPUTING IN SOLVING THE PROBLEMS OF LOGIC
Directory of Open Access Journals (Sweden)
Pavlo V. Mykytenko
2017-02-01
Full Text Available The article provides an overview of the most popular cloud services, in particular those that include complete office suites, describes their basic functional characteristics, and highlights the advantages and disadvantages of cloud services in the educational process. A comparative analysis was made of the spreadsheets in the office suites of cloud services such as Zoho Office Suite, Microsoft Office 365 and Google Docs. On the basis of this research and its findings, the cloud services best suited for use in the educational process are suggested. The possibility of using spreadsheets in the study of logic is considered, from creating formulas that implement logical operations to building tools that automate the problem-solving process.
Lee, Young-Jin
2017-01-01
Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
Solving wood chip transport problems with computer simulation.
Dennis P. Bradley; Sharon A. Winsauer
1976-01-01
Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
Solving the 0/1 Knapsack Problem by a Biomolecular DNA Computer
Directory of Open Access Journals (Sweden)
Hassan Taghipour
2013-01-01
Full Text Available Solving some mathematical problems, such as NP-complete problems, with conventional silicon-based computers is problematic and takes a very long time. DNA computing is an alternative method of computing which uses DNA molecules for computing purposes. DNA computers have massive degrees of parallel processing capability, which is of particular interest for solving NP-complete and hard combinatorial problems. NP-complete problems such as the knapsack problem and other hard combinatorial problems can be easily solved by DNA computers in a very short time compared to conventional silicon-based computers. Sticker-based DNA computing is one of the methods of DNA computing. In this paper, sticker-based DNA computing was used to solve the 0/1 knapsack problem. First, a biomolecular solution space was constructed using appropriate DNA memory complexes. Then, by applying a sticker-based parallel algorithm using biological operations, the knapsack problem was solved in polynomial time.
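The sticker model's advantage is that the test tube holds every candidate subset at once, whereas a silicon machine must sweep the same solution space sequentially. A small Python sketch of that exhaustive sweep (illustrative only, not the paper's biomolecular encoding):

```python
from itertools import product

def knapsack_01(weights, values, capacity):
    """Score every 0/1 selection vector, mirroring how the DNA memory
    complexes encode all subsets in parallel; returns (best_value, best_bits)."""
    best = (0, (0,) * len(weights))
    for bits in product((0, 1), repeat=len(weights)):   # 2^n candidate subsets
        w = sum(b * wt for b, wt in zip(bits, weights))
        v = sum(b * val for b, val in zip(bits, values))
        if w <= capacity and v > best[0]:
            best = (v, bits)
    return best

print(knapsack_01([2, 3, 4], [3, 4, 5], capacity=5))  # (7, (1, 1, 0))
```

The loop is exponential in n on one processor; the point of the sticker model is that the "loop body" runs on all 2^n strands simultaneously.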
Computer programs for solving systems of nonlinear equations
International Nuclear Information System (INIS)
Asaoka, Takumi
1978-03-01
Computer programs to find a solution, usually the one closest to some initial guess, of a system of simultaneous nonlinear equations are provided for real functions of real arguments. These are based on quasi-Newton methods or projection methods, which are briefly reviewed in the present report. Benchmark tests were performed on these subroutines to characterize them. Among the programs not requiring analytical forms of the derivatives of the Jacobian matrix, we have dealt with NS01A of Powell, NS03A of Reid for systems with a sparse Jacobian, and NONLIN of Brown. Of these three quasi-Newton subroutines, NONLIN is shown to be the most useful because of its stable algorithm and short computation time. On the other hand, among the subroutines for which the derivatives of the Jacobian must be supplied analytically, we have tested INTECH, a quasi-Newton method based on Boggs' algorithm; PROJA of Georg and Keller, based on the projection method; and an option of NS03A. The results show that INTECH, which treats variables appearing only linearly in the functions separately, takes the shortest computation time on the whole, while the projection method requires further research to find an optimal algorithm. (auth.)
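The flavor of such subroutines can be conveyed by a bare-bones Newton iteration on a two-equation system with an analytically supplied Jacobian, the situation of INTECH and PROJA. The toy system and solver below are illustrative assumptions, not any of the benchmarked codes:

```python
def newton_2x2(F, J, x, y, tol=1e-12, max_iter=50):
    """Newton's method for F(x, y) = 0 with a 2x2 analytic Jacobian J."""
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            break
        a, b, c, d = J(x, y)            # Jacobian entries [[a, b], [c, d]]
        det = a * d - b * c
        # Solve J * delta = -F by Cramer's rule.
        dx = (-f1 * d + f2 * b) / det
        dy = (-f2 * a + f1 * c) / det
        x, y = x + dx, y + dy
    return x, y

# Toy system: x^2 + y^2 = 4 and x*y = 1, started near a root.
F = lambda x, y: (x * x + y * y - 4.0, x * y - 1.0)
J = lambda x, y: (2 * x, 2 * y, y, x)
x, y = newton_2x2(F, J, 2.0, 0.5)
print(x * x + y * y, x * y)  # close to 4 and 1
```

Quasi-Newton codes such as NONLIN replace the analytic `J` with an approximation built from successive function values, which is what the benchmark contrasts.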
Solving overvoltage protection problems by means of an analogue computer
Energy Technology Data Exchange (ETDEWEB)
Stephanides, N
1964-03-21
The importance of improving overvoltage protection and reducing insulation levels for voltages of 525 and 765 kV is fully realized. A digital computer may be used to determine, according to the Bergeron procedure, the voltage distribution at different points of a given network, but this procedure is very time-consuming. An analogue simulation is described which, by giving an instantaneous display of the overvoltage wave on the screen of a cathode ray oscillograph, is better suited to the overvoltage protection study and also satisfies the conditions related to wave reproducibility. The method of simulating inductors, capacitors, and lightning arrestors (by using transistors) is shown, and special emphasis is put on the surge generator analogue, for which thyratron tubes are used in order to obtain a linearly rising front of the impulse testing wave. The results obtained are accurate within 1 to 2% as compared with calculated values. Ten figures and seven references are given.
Chen, Chiu-Jung; Liu, Pei-Lin
2007-01-01
This study evaluated the effects of a personalized computer-assisted mathematics problem-solving program on the performance and attitude of Taiwanese fourth grade students. The purpose of this study was to determine whether the personalized computer-assisted program improved student performance and attitude over the nonpersonalized program.…
Computer-Presented Organizational/Memory Aids as Instruction for Solving Pico-Fomi Problems.
Steinberg, Esther R.; And Others
1985-01-01
Describes investigation of effectiveness of computer-presented organizational/memory aids (matrix and verbal charts controlled by computer or learner) as instructional technique for solving Pico-Fomi problems, and the acquisition of deductive inference rules when such aids are present. Results indicate chart use control should be adapted to…
Computer Problem-Solving Coaches for Introductory Physics: Design and Usability Studies
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-01-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how…
Data science in R a case studies approach to computational reasoning and problem solving
Nolan, Deborah
2015-01-01
Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompasses practical topics pertaining to data processing, including: Non-standar
Nonlinear evolution equations and solving algebraic systems: the importance of computer algebra
International Nuclear Information System (INIS)
Gerdt, V.P.; Kostov, N.A.
1989-01-01
In the present paper we study the application of computer algebra to solve the nonlinear polynomial systems which arise in investigation of nonlinear evolution equations. We consider several systems which are obtained in classification of integrable nonlinear evolution equations with uniform rank. Other polynomial systems are related with the finding of algebraic curves for finite-gap elliptic potentials of Lame type and generalizations. All systems under consideration are solved using the method based on construction of the Groebner basis for corresponding polynomial ideals. The computations have been carried out using computer algebra systems. 20 refs
Engineering Courses on Computational Thinking Through Solving Problems in Artificial Intelligence
Directory of Open Access Journals (Sweden)
Piyanuch Silapachote
2017-09-01
Full Text Available Computational thinking sits at the core of every engineering and computing related discipline. It has increasingly emerged as its own subject at all levels of education. It is a powerful cornerstone for cognitive development, creative problem solving, algorithmic thinking and design, and programming. How to effectively teach computational thinking skills poses real challenges and creates opportunities. Targeting entering computer science and engineering undergraduates, we resourcefully integrate elements from artificial intelligence (AI) into introductory computing courses. In addition to conveying the essence of computational thinking, practical exercises in AI inspire collaborative problem solving that draws on abstraction, logical reasoning, and critical and analytical thinking. Problems in machine intelligence systems intrinsically connect students to algorithm-oriented computing and essential mathematical foundations. Beyond knowledge representation, AI fosters a gentle introduction to data structures and algorithms. Because the focus is on engaging mental tools, a computer is never a necessity. Neither coding nor programming is ever required. Instead, students enjoy constructivist classrooms designed to always be active, flexible, and highly dynamic. Learning to learn and reflecting on their cognitive experiences, they rigorously construct knowledge by collectively solving exciting puzzles, competing in strategic games, and participating in intellectual discussions.
A homotopy method for solving Riccati equations on a shared memory parallel computer
International Nuclear Information System (INIS)
Zigic, D.; Watson, L.T.; Collins, E.G. Jr.; Davis, L.D.
1993-01-01
Although there are numerous algorithms for solving Riccati equations, there still remains a need for algorithms which can operate efficiently on large problems and on parallel machines. This paper gives a new homotopy-based algorithm for solving Riccati equations on a shared memory parallel computer. The central part of the algorithm is the computation of the kernel of the Jacobian matrix, which is essential for the corrector iterations along the homotopy zero curve. Using a Schur decomposition the tensor product structure of various matrices can be efficiently exploited. The algorithm allows for efficient parallelization on shared memory machines
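For orientation, a standard non-homotopy baseline for the algebraic Riccati equation A'X + XA - XB R⁻¹B'X + Q = 0 is Newton-Kleinman iteration, in which each step is a Lyapunov solve; writing that solve through the Kronecker product echoes the tensor-product structure mentioned above. A NumPy sketch, assuming a stabilizing initial gain K0 is known (this is not the paper's homotopy algorithm):

```python
import numpy as np

def care_newton_kleinman(A, B, Q, R, K0, iters=25):
    """Solve A'X + XA - XB R^-1 B'X + Q = 0 by Newton-Kleinman iteration.
    Each step solves the Lyapunov equation M'X + XM = -(Q + K'RK), M = A - BK,
    in vectorized form: (I (x) M' + M' (x) I) vec(X) = -vec(C)."""
    n = A.shape[0]
    I = np.eye(n)
    K = K0
    Rinv = np.linalg.inv(R)
    for _ in range(iters):
        M = A - B @ K
        C = Q + K.T @ R @ K
        lhs = np.kron(I, M.T) + np.kron(M.T, I)
        X = np.linalg.solve(lhs, -C.reshape(-1, order="F")).reshape(n, n, order="F")
        K = Rinv @ B.T @ X
    return X

# Scalar example: a = b = q = r = 1 gives x^2 - 2x - 1 = 0, i.e. x = 1 + sqrt(2).
A = np.array([[1.0]]); B = np.array([[1.0]])
Q = np.array([[1.0]]); R = np.array([[1.0]])
X = care_newton_kleinman(A, B, Q, R, K0=np.array([[2.0]]))
print(X[0, 0])  # close to 2.41421356
```

The dense Kronecker solve costs O(n⁶) and is only for illustration; the paper's point is precisely that exploiting the tensor structure (e.g., via a Schur decomposition) avoids forming this system explicitly.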
Computer science. Heads-up limit hold'em poker is solved.
Bowling, Michael; Burch, Neil; Johanson, Michael; Tammelin, Oskari
2015-01-09
Poker is a family of games that exhibit imperfect information, where players do not have full knowledge of past events. Whereas many perfect-information games have been solved (e.g., Connect Four and checkers), no nontrivial imperfect-information game played competitively by humans has previously been solved. Here, we announce that heads-up limit Texas hold'em is now essentially weakly solved. Furthermore, this computation formally proves the common wisdom that the dealer in the game holds a substantial advantage. This result was enabled by a new algorithm, CFR(+), which is capable of solving extensive-form games orders of magnitude larger than previously possible. Copyright © 2015, American Association for the Advancement of Science.
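CFR(+) itself is beyond a short sketch, but its regret-minimization foundation can be illustrated with plain regret matching in self-play on rock-paper-scissors, where the time-averaged strategies approach the game's unique Nash equilibrium. An illustrative Python toy, not the authors' solver:

```python
def regret_matching_rps(iters=100000):
    """Self-play regret matching on rock-paper-scissors with full-information,
    expected-utility updates. In zero-sum games the time-averaged strategy of
    a no-regret learner approaches a Nash equilibrium, here (1/3, 1/3, 1/3)."""
    U = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]        # row player's payoff matrix
    reg1, reg2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]   # asymmetric start
    sum1 = [0.0, 0.0, 0.0]                          # cumulative row strategy

    def strategy(reg):
        pos = [max(r, 0.0) for r in reg]
        s = sum(pos)
        return [p / s for p in pos] if s > 0 else [1.0 / 3.0] * 3

    for _ in range(iters):
        s1, s2 = strategy(reg1), strategy(reg2)
        ev1 = [sum(s2[b] * U[a][b] for b in range(3)) for a in range(3)]
        ev2 = [sum(s1[a] * -U[a][b] for a in range(3)) for b in range(3)]
        v1 = sum(s1[a] * ev1[a] for a in range(3))  # row's expected payoff
        for a in range(3):
            reg1[a] += ev1[a] - v1
            reg2[a] += ev2[a] + v1                  # column's value is -v1
            sum1[a] += s1[a]
    return [x / iters for x in sum1]

print(regret_matching_rps())  # each component close to 1/3
```

CFR(+) applies this idea at every information set of the extensive-form game and, with its modified regret update, made a game with roughly 10¹⁴ decision points tractable.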
Computer problem-solving coaches for introductory physics: Design and usability studies
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-06-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.
The benefits of computer-generated feedback for mathematics problem solving.
Fyfe, Emily R; Rittle-Johnson, Bethany
2016-07-01
The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving. Copyright © 2016 Elsevier Inc. All rights reserved.
Proceedings of the International Conference on Soft Computing for Problem Solving
Nagar, Atulya; Pant, Millie; Bansal, Jagdish
2012-01-01
The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.
Ergul, Ozgur
2014-01-01
The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book: Presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on the parallel computation, and a number of application examplesCovers solutions of electromagnetic problems involving dielectric objects and perfectly-conducting objectsDiscusses applications including scattering from airborne targets, scattering from red
A Computer Algebra Approach to Solving Chemical Equilibria in General Chemistry
Kalainoff, Melinda; Lachance, Russ; Riegner, Dawn; Biaglow, Andrew
2012-01-01
In this article, we report on a semester-long study of the incorporation into our general chemistry course, of advanced algebraic and computer algebra techniques for solving chemical equilibrium problems. The method presented here is an alternative to the commonly used concentration table method for describing chemical equilibria in general…
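The algebraic core of such equilibrium problems is a polynomial in the unknown extent of reaction; for a weak acid HA with Ka = x²/(C₀ − x), the physically meaningful root of the quadratic x² + Ka·x − Ka·C₀ = 0 gives [H⁺]. A quick numeric check in Python, using textbook values that are not taken from the article:

```python
import math

def h_plus(Ka, C0):
    """Positive root of x^2 + Ka*x - Ka*C0 = 0, i.e. Ka = x^2 / (C0 - x)."""
    return (-Ka + math.sqrt(Ka * Ka + 4.0 * Ka * C0)) / 2.0

# 0.10 M acetic acid with Ka = 1.8e-5 (standard textbook values):
x = h_plus(1.8e-5, 0.10)
print(x)               # [H+] around 1.33e-3 M
print(-math.log10(x))  # pH around 2.88
```

A computer algebra system generalizes this step to coupled equilibria, where the concentration-table method would otherwise require nested approximations.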
Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems
Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen
1999-01-01
We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent
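The building block of this approach, a minimum s-t cut (by max-flow/min-cut duality, a maximum flow), can be illustrated with a plain Edmonds-Karp implementation; the small graph and its capacities below are invented for the example and are not from the paper:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly augment along shortest residual paths.
    cap is a dict-of-dicts of arc capacities; the returned value equals
    the minimum s-t cut by max-flow/min-cut duality."""
    res = {u: dict(vs) for u, vs in cap.items()}   # residual capacities
    for u, vs in cap.items():
        for v in vs:
            res.setdefault(v, {}).setdefault(u, 0)  # add reverse arcs
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:               # BFS for a shortest path
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        bottleneck, v = float("inf"), t            # find bottleneck capacity
        while parent[v] is not None:
            bottleneck = min(bottleneck, res[parent[v]][v])
            v = parent[v]
        v = t
        while parent[v] is not None:               # push flow along the path
            u = parent[v]
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
            v = u
        flow += bottleneck

cap = {"s": {"a": 3, "b": 2}, "a": {"t": 2, "b": 1}, "b": {"t": 3}, "t": {}}
print(max_flow(cap, "s", "t"))  # 5
```

In the paper's scheme, one such polynomial-time cut computation evaluates the Lagrangian bound for a given set of multipliers.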
EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.
Jarvis, John J.; And Others
Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…
Effect of Computer-Presented Organizational/Memory Aids on Problem Solving Behavior.
Steinberg, Esther R.; And Others
This research studied the effects of computer-presented organizational/memory aids on problem solving behavior. The aids were either matrix or verbal charts shown on the display screen next to the problem. The 104 college student subjects were randomly assigned to one of the four conditions: type of chart (matrix or verbal chart) and use of charts…
USE OF DEMONSTRATION COMPUTER MODELS IN SOLVING PROBLEMS ON CONSTRUCTING A SECTION OF A CYLINDER
Directory of Open Access Journals (Sweden)
Inna O. Gulivata
2010-10-01
Full Text Available The relevance of the material presented in the article lies in the use of effective methods of illustrating geometric material to develop students' spatial imagination. As one way to improve problem solving, we propose illustrating the investigated objects with demonstration computer models (DCM) created in the PowerPoint software environment. The technique of applying DCM while solving problems on constructing a section of a cylinder makes it possible to build an effective learning process and promotes the formation of students' spatial representations, taking into account their individual characteristics and the principles of differentiated instruction.
Ontology Design for Solving Computationally-Intensive Problems on Heterogeneous Architectures
Directory of Open Access Journals (Sweden)
Hossam M. Faheem
2018-02-01
Full Text Available Viewing a computationally-intensive problem as a self-contained challenge with its own hardware, software and scheduling strategies is an approach that should be investigated. We might suggest assigning heterogeneous hardware architectures to solve a problem, while parallel computing paradigms may play an important role in writing efficient code to solve the problem; moreover, the scheduling strategies may be examined as a possible solution. Depending on the problem complexity, finding the best possible solution using an integrated infrastructure of hardware, software and scheduling strategy can be a complex job. Developing and using ontologies and reasoning techniques play a significant role in reducing the complexity of identifying the components of such integrated infrastructures. Undertaking reasoning and inferencing regarding the domain concepts can help to find the best possible solution through a combination of hardware, software and scheduling strategies. In this paper, we present an ontology and show how we can use it to solve computationally-intensive problems from various domains. As a potential use for the idea, we present examples from the bioinformatics domain. Validation by using problems from the Elastic Optical Network domain has demonstrated the flexibility of the suggested ontology and its suitability for use with any other computationally-intensive problem domain.
Kuncoro, K. S.; Junaedi, I.; Dwijanto
2018-03-01
This study aimed to reveal the effectiveness of Project Based Learning with a Resource Based Learning approach in a computer-aided program, and analyzed problem-solving abilities in terms of problem-solving steps based on Polya's stages. The research method used was a mixed method with a sequential explanatory design. The subjects of this research were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with a computer program, was less precise in writing the final conclusion, and could not reflect on the problem-solving process using Polya's steps. The W-BPS (Weak Bottom Problem Solving) subject met almost none of the problem-solving indicators and could not precisely construct the initial table of the solution, so the completion phase with Polya's steps was constrained.
Excited state dynamics of DNA bases
Czech Academy of Sciences Publication Activity Database
Kleinermanns, K.; Nachtigallová, Dana; de Vries, M. S.
2013-01-01
Roč. 32, č. 2 (2013), s. 308-342 ISSN 0144-235X R&D Projects: GA ČR GAP208/12/1318 Grant - others:National Science Foundation(US) CHE-0911564; NASA (US) NNX12AG77G; Deutsche Forschungsgemeinschaft(DE) SFB 663; Deutsche Forschungsgemeinschaft(DE) KI 531-29 Institutional support: RVO:61388963 Keywords : DNA bases * nucleobases * excited state * dynamics * computations * gas phase * conical intersections Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 4.920, year: 2013
Awange, Joseph L
2004-01-01
While preparing and teaching 'Introduction to Geodesy I and II' to undergraduate students at Stuttgart University, we noticed a gap which motivated the writing of the present book: almost every topic that we taught required some skills in algebra, and in particular, computer algebra! From positioning to transformation problems inherent in geodesy and geoinformatics, knowledge of algebra and application of computer algebra software were required. In preparing this book, therefore, we have attempted to put together basic concepts of abstract algebra which underpin the techniques for solving algebraic problems. Algebraic computational algorithms useful for solving problems which require exact solutions to nonlinear systems of equations are presented and tested on various problems. Though the present book focuses mainly on the two fields, the concepts and techniques presented herein are nonetheless applicable to other fields where algebraic computational problems might be encountered. In Engineering, for example, network densification and robo...
Solving linear systems in FLICA-4, thermohydraulic code for 3-D transient computations
International Nuclear Information System (INIS)
Allaire, G.
1995-01-01
FLICA-4 is a computer code, developed at the CEA (France), devoted to steady-state and transient thermal-hydraulic analysis of nuclear reactor cores, for small problems (around 100 mesh cells) as well as large ones (more than 100,000), on either standard workstations or vector supercomputers. As in other time-implicit codes, the most time- and memory-consuming part of FLICA-4 is the routine dedicated to solving the linear system (whose size is of the order of the number of cells). Therefore, the efficiency of the code is crucially influenced by the optimization of the algorithms used in assembling and solving linear systems: direct methods such as Gauss (or LU) decomposition for moderate-size problems, and iterative methods such as the preconditioned conjugate gradient for large problems. 6 figs., 13 refs
Solving large sets of coupled equations iteratively by vector processing on the CYBER 205 computer
International Nuclear Information System (INIS)
Tolsma, L.D.
1985-01-01
The set of coupled linear second-order differential equations which has to be solved for the quantum-mechanical description of inelastic scattering of atomic and nuclear particles can be rewritten as an equivalent set of coupled integral equations. When suitable functions are used as piecewise analytic reference solutions, the integrals that arise in this set can be evaluated analytically. The set of integral equations can be solved iteratively. For the results mentioned, an inward-outward iteration scheme has been applied. A concept for vectorizing coupled-channel Fortran programs, based on this integral method, is presented for use on the Cyber 205 computer. It turns out that, for two heavy-ion nuclear scattering test cases, this vector algorithm gives an overall speed-up of about a factor of 2 to 3 compared to a highly optimized scalar algorithm on a one-vector-pipeline computer
An improved computational version of the LTSN method to solve transport problems in a slab
International Nuclear Information System (INIS)
Cardona, Augusto V.; Oliveira, Jose Vanderlei P. de; Vilhena, Marco Tullio de; Segatto, Cynthia F.
2008-01-01
In this work, we present an improved computational version of the LTSN method to solve transport problems in a slab. The key feature relies on the reordering of the set of SN equations. This procedure reduces by a factor of two the task of evaluating the eigenvalues of the matrix associated with the SN approximations. We present numerical simulations and comparisons with those of the classical LTSN approach. (author)
Alam Khan, Najeeb; Razzaq, Oyoon Abdul
2016-03-01
In the present work a wavelets approximation method is employed to solve fuzzy boundary value differential equations (FBVDEs). Essentially, a truncated Legendre wavelets series together with the Legendre wavelets operational matrix of derivatives is utilized to convert an FBVDE into a simple computational problem by reducing it to a system of fuzzy algebraic linear equations. The capability of the scheme is investigated on second-order FBVDEs considered under generalized H-differentiability. Solutions are represented graphically, showing the competency and accuracy of this method.
Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations
Southern, J.A.; Plank, G.; Vigmond, E.J.; Whiteley, J.P.
2009-01-01
The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level, the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time while still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study, the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counterintuitive as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks, it is shown that the coupled method is up to 80% faster than the conventional uncoupled method, and that parallel performance is better for the larger coupled problem.
International Nuclear Information System (INIS)
Kirk, B.L.; Azmy, Y.Y.
1992-01-01
In this paper the one-group, steady-state neutron diffusion equation in two-dimensional Cartesian geometry is solved using the nodal integral method. The discrete variable equations comprise loosely coupled sets of equations representing the nodal balance of neutrons, as well as neutron current continuity along rows or columns of computational cells. An iterative algorithm that is more suitable for solving large problems concurrently is derived based on the decomposition of the spatial domain and is accelerated using successive overrelaxation. This algorithm is very well suited for parallel computers, especially since the spatial domain decomposition occurs naturally, so that the number of iterations required for convergence does not depend on the number of processors participating in the calculation. Implementation of the authors' algorithm on the Intel iPSC/2 hypercube and Sequent Balance 8000 parallel computer is presented, and measured speedup and efficiency for test problems are reported. The results suggest that the efficiency of the hypercube quickly deteriorates when many processors are used, while the Sequent Balance retains very high efficiency for a comparable number of participating processors. This leads to the conjecture that message-passing parallel computers are not as well suited for this algorithm as shared-memory machines
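The successive overrelaxation acceleration named in this abstract can be made concrete on a generic five-point diffusion stencil. The sketch below is illustrative only: the grid size, right-hand side and relaxation factor are assumed, and this is not the authors' nodal integral discretization or their parallel decomposition.

```python
# Illustrative SOR sweep for a 2D Poisson problem -u_xx - u_yy = f on the
# unit square with zero Dirichlet boundary; a generic stand-in for the
# accelerated iteration described in the abstract, not the nodal method.

def sor_poisson(f, n, omega=1.7, tol=1e-8, max_iter=10000):
    h = 1.0 / (n + 1)
    u = [[0.0] * (n + 2) for _ in range(n + 2)]   # includes boundary zeros
    for it in range(max_iter):
        diff = 0.0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                # Gauss-Seidel value from the five-point stencil
                new = 0.25 * (u[i-1][j] + u[i+1][j] + u[i][j-1] + u[i][j+1]
                              + h * h * f(i * h, j * h))
                diff = max(diff, abs(new - u[i][j]))
                # overrelaxation step
                u[i][j] = (1 - omega) * u[i][j] + omega * new
        if diff < tol:
            return u, it + 1
    return u, max_iter
```

The in-place sweep makes each update use the freshest neighbor values, which is what makes the red-black reordering attractive for the parallel decomposition the paper studies.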
EDDYMULT: a computing system for solving eddy current problems in a multi-torus system
International Nuclear Information System (INIS)
Nakamura, Yukiharu; Ozeki, Takahisa
1989-03-01
A new computing system EDDYMULT based on the finite element circuit method has been developed to solve actual eddy current problems in a multi-torus system, which consists of many torus-conductors and various kinds of axisymmetric poloidal field coils. The EDDYMULT computing system can deal three-dimensionally with the modal decomposition of eddy current in a multi-torus system, the transient phenomena of eddy current distributions and the resultant magnetic field. Therefore, users can apply the computing system to the solution of the eddy current problems in a tokamak fusion device, such as the design of poloidal field coil power supplies, the mechanical stress design of the intensive electromagnetic loading on device components and the control analysis of plasma position. The present report gives a detailed description of the EDDYMULT system as a user's manual: 1) theory, 2) structure of the code system, 3) input description, 4) problem restrictions, 5) description of the subroutines, etc. (author)
Solving Large-Scale Computational Problems Using Insights from Statistical Physics
Energy Technology Data Exchange (ETDEWEB)
Selman, Bart [Cornell University
2012-02-29
Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.
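The constraint-satisfaction formulation described above can be fixed in notation with a tiny backtracking solver. This is an assumed illustration of the exponential assignment space that methods like survey propagation are designed to avoid, not an implementation of SP itself; the dictionary-based encoding is a convenience choice.

```python
# Minimal backtracking search over variable-value assignments for a CSP
# given as {var: domain} plus (variables, predicate) constraints.

def solve_csp(domains, constraints, partial=None):
    """Return a satisfying assignment dict, or None if none exists."""
    partial = dict(partial or {})
    unassigned = [v for v in domains if v not in partial]
    if not unassigned:
        return partial
    var = unassigned[0]
    for val in domains[var]:
        partial[var] = val
        # check every constraint whose variables are all assigned so far
        ok = all(pred(*[partial[v] for v in vs])
                 for vs, pred in constraints
                 if all(v in partial for v in vs))
        if ok:
            result = solve_csp(domains, constraints, partial)
            if result is not None:
                return result
        del partial[var]
    return None
```

For instance, 3-coloring a triangle is the CSP with three variables, domains {0,1,2}, and pairwise inequality constraints; the solver finds an assignment by exhaustive search, which is exactly the exponential cost SP sidesteps on large instances.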
Solving Coupled Gross--Pitaevskii Equations on a Cluster of PlayStation 3 Computers
Edwards, Mark; Heward, Jeffrey; Clark, C. W.
2009-05-01
At Georgia Southern University we have constructed an 8+1-node cluster of Sony PlayStation 3 (PS3) computers with the intention of using this computing resource to solve problems related to the behavior of ultra-cold atoms in general, with a particular emphasis on studying Bose-Bose and Bose-Fermi mixtures confined in optical lattices. As a first project using this computing resource, we have implemented a parallel solver for the coupled time-dependent, one-dimensional Gross-Pitaevskii (TDGP) equations. These equations govern the behavior of dual-species bosonic mixtures. We chose the split-operator/FFT method to solve the coupled 1D TDGP equations. The fast Fourier transform component of this solver can be readily parallelized on the PS3 CPU, known as the Cell Broadband Engine (CellBE). Each CellBE chip contains a single 64-bit PowerPC Processor Element, known as the PPE, and eight "Synergistic Processor Elements", known as the SPEs. We report on this algorithm and compare its performance to a non-parallel solver as applied to modeling evaporative cooling in dual-species bosonic mixtures.
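The split-operator/FFT scheme mentioned can be sketched for a single 1D Gross-Pitaevskii equation. A naive O(N^2) discrete Fourier transform stands in for the FFT so the example is dependency-free, and the grid, time step and interaction strength g are illustrative assumptions rather than the authors' dual-species setup.

```python
# Split-operator sketch for i dpsi/dt = [-(1/2) d^2/dx^2 + V + g|psi|^2] psi
# on a periodic grid: half-step in the potential/nonlinear factor, full
# kinetic step in Fourier space, second half-step. Naive DFT for clarity.
import cmath
import math

def dft(a, sign):
    n = len(a)
    return [sum(a[k] * cmath.exp(sign * 2j * math.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def split_step(psi, dx, dt, g, v, steps):
    n = len(psi)
    # wavenumbers for a periodic box of length n*dx
    k = [2 * math.pi * (j if j < n // 2 else j - n) / (n * dx)
         for j in range(n)]
    for _ in range(steps):
        psi = [p * cmath.exp(-0.5j * dt * (v[j] + g * abs(p) ** 2))
               for j, p in enumerate(psi)]
        f = dft(psi, -1)
        f = [f[j] * cmath.exp(-0.5j * dt * k[j] ** 2) for j in range(n)]
        psi = [z / n for z in dft(f, +1)]
        psi = [p * cmath.exp(-0.5j * dt * (v[j] + g * abs(p) ** 2))
               for j, p in enumerate(psi)]
    return psi
```

Each factor is a pure phase in its own basis, so the norm of the wavefunction is conserved to machine precision, a useful sanity check for any split-step implementation.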
Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian
2015-10-23
The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m < n), such that minimum cost or maximum profit is obtained. It is a vitally important NP-complete problem in operation management and applied mathematics, having numerous real life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and get the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation.
An Application of Computer Vision Systems to Solve the Problem of Unmanned Aerial Vehicle Control
Directory of Open Access Journals (Sweden)
Aksenov Alexey Y.
2014-09-01
Full Text Available The paper considers an approach for applying computer vision systems to solve the problem of unmanned aerial vehicle control. Processing of images obtained through the onboard camera is required for absolute positioning of the aerial platform (automatic landing and take-off, hovering, etc.). The proposed method combines the advantages of existing systems and gives the ability to perform hovering over a given point and exact take-off and landing. The limitations of the implemented methods are determined, and an algorithm is proposed to combine them in order to improve efficiency.
Experimental realization of a one-way quantum computer algorithm solving Simon's problem.
Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G
2014-11-14
We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem, a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
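For contrast with the quantum algorithm, the classical side of the query gap can be sketched as a brute-force collision search. The min-based oracle below is an assumed toy construction satisfying Simon's promise f(x) = f(y) iff y = x XOR s; it is not part of the experiment described above.

```python
# Classical brute-force solution of Simon's problem: query f until a
# collision f(x) = f(y) with x != y is found; then s = x XOR y.
# Exponential in n, versus O(n) quantum queries in Simon's algorithm.
from itertools import product

def find_period(f, n):
    seen = {}
    for bits in product((0, 1), repeat=n):
        x = int(''.join(map(str, bits)), 2)
        y = f(x)
        if y in seen and seen[y] != x:
            return seen[y] ^ x      # collision reveals the hidden string
        seen[y] = x
    return 0                        # f is one-to-one, so s = 0

def make_oracle(n, s):
    # A valid Simon oracle: send each pair {x, x^s} to its smaller member.
    return lambda x: min(x, x ^ s)
```

In the worst case the classical search examines on the order of 2^(n/2) inputs before a collision appears, which is the gap the one-way quantum implementation probes.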
Bass, Gideon; Tomlin, Casey; Kumar, Vaibhaw; Rihaczek, Pete; Dulny, Joseph, III
2018-04-01
NP-hard optimization problems scale very rapidly with problem size, becoming unsolvable with brute force methods, even with supercomputing resources. Typically, such problems have been approximated with heuristics. However, these methods still take a long time and are not guaranteed to find an optimal solution. Quantum computing offers the possibility of significant speed-up and improved solution quality. Current quantum annealing (QA) devices are designed to solve difficult optimization problems, but they are limited by hardware size and qubit connectivity restrictions. We present a novel heterogeneous computing stack that combines QA and classical machine learning, allowing the use of QA on problems larger than the hardware limits of the quantum device. We report experiments on a real-world problem formulated as the weighted k-clique problem. Through this experiment, we provide insight into the state of quantum machine learning.
Wang, Zhaocai; Huang, Dongmei; Meng, Huajun; Tang, Chengpei
2013-10-01
The minimum spanning tree (MST) problem is to find a minimum-weight set of edges connecting all the vertices of a given undirected graph. It is a vitally important NP-complete problem in graph theory and applied mathematics, having numerous real life applications. Moreover, in previous studies DNA molecular operations were usually used to solve NP-complete head-to-tail path search problems, and rarely NP-hard problems with multi-lateral path solutions, such as the minimum spanning tree problem. In this paper, we present a new fast DNA algorithm for solving the MST problem using DNA molecular operations. For an undirected graph with n vertices and m edges, we reasonably design flexible-length DNA strands representing the vertices and edges, take appropriate steps, and get the solutions of the MST problem in the proper length range and O(3m+n) time complexity. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation. Results of computer simulation experiments show that the proposed method updates some of the best known values in very short time and provides better solution accuracy than existing algorithms. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
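As a conventional sequential baseline for the same problem, Kruskal's greedy algorithm with union-find can be sketched as follows; the edge-list input format is an assumption for illustration and has no connection to the DNA encoding above.

```python
# Kruskal's algorithm: sort edges by weight, then add each edge whose
# endpoints lie in different components, tracked with union-find.

def kruskal(n, edges):
    """edges: list of (weight, u, v) over vertices 0..n-1.
    Returns (total weight, list of chosen (u, v) edges)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    tree, total = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            tree.append((u, v))
            total += w
    return total, tree
```

For a connected graph the result always has exactly n-1 edges, which gives a quick correctness check when comparing against other MST solvers.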
Aryal, Bijaya
2016-03-01
We have studied the impacts of web-based Computer Coaches on educational outputs and outcomes. This presentation will describe the technical and conceptual framework related to the Coaches and discuss undergraduate students' favorability of the Coaches. Moreover, its impacts on students' physics problem solving performance and on their conceptual understanding of physics will be reported. We used a qualitative research technique to collect and analyze interview data from 19 undergraduate students who used the Coaches in the interview setting. The empirical results show that the favorability and efficacy of the Computer Coaches differ considerably across students of different educational backgrounds, preparation levels, attitudes and epistemologies about physics learning. The interview data shows that female students tend to have more favorability supporting the use of the Coach. Likewise, our assessment suggests that female students seem to benefit more from the Coaches in their problem solving performance and in conceptual learning of physics. Finally, the analysis finds evidence that the Coach has potential for increasing efficiency in usage and for improving students' educational outputs and outcomes under its customized usage. This work was partially supported by the Center for Educational Innovation, Office of the Senior Vice President for Academic Affairs and Provost, University of Minnesota.
Desktop Grid Computing with BOINC and its Use for Solving the RND telecommunication Problem
International Nuclear Information System (INIS)
Vega-Rodriguez, M. A.; Vega-Perez, D.; Gomez-Pulido, J. A.; Sanchez-Perez, J. M.
2007-01-01
An important problem in mobile/cellular technology is trying to cover a certain geographical area using the smallest number of radio antennas while achieving the largest coverage rate. This is the well-known telecommunication problem identified as Radio Network Design (RND). This optimization problem can be solved by bio-inspired algorithms, among other options. In this work we use the PBIL (Population-Based Incremental Learning) algorithm, which has been little studied in this field but with which we have obtained very good results. PBIL is based on genetic algorithms and competitive learning (typical in neural networks), being a population evolution model based on probabilistic models. Due to the high number of configuration parameters of PBIL, and because we want to test the RND problem with numerous variants, we have used grid computing with BOINC (Berkeley Open Infrastructure for Network Computing). In this way, we have been able to execute thousands of experiments in a few days using around 100 computers at the same time. In this paper we present the most interesting results from our work. (Author)
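The PBIL probability-vector evolution model mentioned can be sketched on a toy bit-counting objective; the population size, learning rate and objective below are illustrative assumptions, not the RND configuration distributed over BOINC in the paper.

```python
# Minimal PBIL: sample a population from a probability vector, then
# shift the vector toward the best sample of each generation.
import random

def pbil(n_bits, fitness, pop=50, lr=0.1, gens=200, seed=1):
    random.seed(seed)
    p = [0.5] * n_bits                      # one probability per bit
    best, best_fit = None, float('-inf')
    for _ in range(gens):
        sample = [[1 if random.random() < pj else 0 for pj in p]
                  for _ in range(pop)]
        leader = max(sample, key=fitness)
        f = fitness(leader)
        if f > best_fit:
            best, best_fit = leader, f
        # learning step: move the probability vector toward the leader
        p = [(1 - lr) * pj + lr * bj for pj, bj in zip(p, leader)]
    return best, best_fit
```

Unlike a classic genetic algorithm, no crossover or explicit population is carried between generations; the probability vector itself is the evolving model, which is what makes PBIL parameter sweeps easy to farm out across grid nodes.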
Scilab software as an alternative low-cost computing in solving the linear equations problem
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used both in teaching and research. These packages include licensed (proprietary) and open source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); moreover, the number of variables in linear and non-linear functions has increased. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab was used to propose activities related to the mathematical models. In this experiment, four numerical methods, Gaussian Elimination, Gauss-Jordan, Inverse Matrix, and Lower-Upper (LU) Decomposition, were implemented. The results showed that routines for these numerical methods were created and explored using Scilab procedures, and that these routines can then be exploited as teaching material for the course.
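One of the four methods listed, Gaussian elimination with partial pivoting, can be sketched in plain Python as a language-neutral counterpart to the Scilab routines described; the dense list-of-lists representation is an illustrative choice, not the paper's code.

```python
# Gaussian elimination with partial pivoting followed by back
# substitution, solving A x = b for a small dense system.

def gauss_solve(a, b):
    """a: list of rows, b: right-hand side list. Inputs are copied."""
    n = len(b)
    a = [row[:] for row in a]
    b = b[:]
    for col in range(n):
        # partial pivoting: bring the largest entry onto the diagonal
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        # eliminate the entries below the pivot
        for r in range(col + 1, n):
            m = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= m * a[col][c]
            b[r] -= m * b[col]
    # back substitution on the resulting upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(a[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / a[r][r]
    return x
```

In Scilab the same system would be solved in one line with the backslash operator, which is precisely the pedagogical contrast the paper exploits between hand-built routines and built-in solvers.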
Thomas, Thaddeus P.; Anderson, Donald D.; Willis, Andrew R.; Liu, Pengcheng; Frank, Matthew C.; Marsh, J. Lawrence; Brown, Thomas D.
2011-01-01
Reconstructing highly comminuted articular fractures poses a difficult surgical challenge, akin to solving a complicated three-dimensional (3D) puzzle. Pre-operative planning using CT is critically important, given the desirability of less invasive surgical approaches. The goal of this work is to advance 3D puzzle-solving methods toward use as a pre-operative tool for reconstructing these complex fractures. Methodology for generating typical fragmentation/dispersal patterns was developed. Five identical replicas of human distal tibia anatomy were machined from blocks of high-density polyetherurethane foam (a bone fragmentation surrogate) and were fractured using an instrumented drop tower. Pre- and post-fracture geometries were obtained using laser scans and CT. A semi-automatic virtual reconstruction computer program aligned fragment native (non-fracture) surfaces to a pre-fracture template. The tibias were precisely reconstructed, with alignment accuracies ranging from 0.03-0.4 mm. This novel technology has potential to significantly enhance surgical techniques for reconstructing comminuted intra-articular fractures, as illustrated for a representative clinical case. PMID:20924863
Solving black box computation problems using expert knowledge theory and methods
International Nuclear Information System (INIS)
Booker, Jane M.; McNamara, Laura A.
2004-01-01
The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation
Directory of Open Access Journals (Sweden)
Zhaocai Wang
2015-10-01
Full Text Available The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m < n), such that minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP) complete problem in operation management and applied mathematics, having numerous real life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and get the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation.
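To make the UAP objective concrete, a brute-force baseline for one common formulation (each job to exactly one individual, every individual receiving at least one job) can be sketched. The formulation and cost-matrix layout are assumptions for illustration; the sketch is exponential in n and serves only to fix the cost model, not to compete with the DNA algorithm on large instances.

```python
# Exhaustive search over all assignments of n jobs to m < n individuals,
# keeping only those where every individual gets at least one job, and
# returning the minimum-cost assignment.
from itertools import product

def uap_brute_force(cost):
    """cost[i][j]: cost of individual i doing job j (m rows, n columns)."""
    m, n = len(cost), len(cost[0])
    best, best_assign = float('inf'), None
    for assign in product(range(m), repeat=n):   # individual per job
        if len(set(assign)) < m:                 # someone left idle
            continue
        c = sum(cost[i][j] for j, i in enumerate(assign))
        if c < best:
            best, best_assign = c, assign
    return best, best_assign
```

The search space has m^n assignments, which is what motivates both classical heuristics and the massively parallel DNA encoding described in the abstract.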
Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin
2015-01-01
We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…
DNA based radiological dosimetry technology
International Nuclear Information System (INIS)
Diaz Quijada, Gerardo A.; Roy, Emmanuel; Veres, Teodor; Dumoulin, Michel M.; Vachon, Caroline; Blagoeva, Rosita; Pierre, Martin
2008-01-01
Full text: The purpose of this project is to develop a personal and wearable dosimeter using a highly innovative approach based on the specific recognition of DNA damage with a polymer hybrid. Our biosensor will be sensitive to breaks in nucleic acid macromolecules and relevant to mixed-field radiation. The dosimeter proposed will be small, field deployable and will sense damage at the DNA level for all radiation types. The generalized concept for the novel DNA-based radiological dosimeter: 1) a single- or double-stranded oligonucleotide is immobilized on a surface; 2) single-stranded DNA has a higher cross-section for fragmentation; 3) double-stranded DNA is more biologically relevant; 4) radiation induces fragmentation; 5) ultra-sensitive detection of fragments provides the radiation dose. Successful efforts have been made towards a proof-of-concept personal wearable DNA-based dosimeter that is appropriate for mixed-field radiation. The covalent immobilization of oligonucleotides on large areas of plastic surfaces has been demonstrated and corroborated spectroscopically. The surface concentration of DNA was determined to be 8 x 10^10 molecules/cm^2 from a Ce(IV)-catalyzed hydrolysis study of a fluorescently labelled oligonucleotide. Current efforts are being directed at studying radiation-induced fragmentation of DNA followed by its ultra-sensitive detection via a novel method. In addition, proof-of-concept wearable personal devices and a detection platform are presently being fabricated. (author)
International Nuclear Information System (INIS)
Azmy, Y. Y.
2004-01-01
An approach is developed for solving the neutron diffusion equation on combinatorial geometry computational cells, that is, computational cells composed through combinatorial operations involving simple-shaped component cells. The only constraint on the component cells from which the combinatorial cells are assembled is that they possess a legitimate discretization of the underlying diffusion equation. We use the Finite Difference (FD) approximation of the x,y-geometry diffusion equation in this work. Performing the same combinatorial operations involved in composing the combinatorial cell on these discrete-variable equations yields equations that employ new discrete variables defined only on the combinatorial cell's volume and faces. The only approximation involved in this process, beyond the truncation error committed in discretizing the diffusion equation over each component cell, is a consistent-order Legendre series expansion. Preliminary results for simple configurations establish the accuracy of the combinatorial geometry solution compared to straight FD as the system dimensions decrease. Furthermore, numerical results validate the consistent Legendre-series expansion order by illustrating the second-order accuracy of the combinatorial geometry solution, the same as standard FD. Nevertheless, the magnitude of the error for the new approach is larger than FD's, since it incorporates the additional truncated series approximation. (authors)
Özyurt, Özcan
2015-01-01
Problem solving is an indispensable part of engineering. Improving critical thinking dispositions for solving engineering problems is one of the objectives of engineering education. In this sense, knowing critical thinking and problem solving skills of engineering students is of importance for engineering education. This study aims to determine…
Hill, Mary C.
1990-01-01
This report documents PCG2: a numerical code to be used with the U.S. Geological Survey modular three-dimensional, finite-difference, ground-water flow model. PCG2 uses the preconditioned conjugate-gradient method to solve the equations produced by the model for hydraulic head. Linear or nonlinear flow conditions may be simulated. PCG2 includes two preconditioning options: modified incomplete Cholesky preconditioning, which is efficient on scalar computers; and polynomial preconditioning, which requires less computer storage and, with modifications that depend on the computer used, is most efficient on vector computers. Convergence of the solver is determined using both head-change and residual criteria. Nonlinear problems are solved using Picard iterations. This documentation provides a description of the preconditioned conjugate-gradient method and the two preconditioners, detailed instructions for linking PCG2 to the modular model, sample data inputs, a brief description of PCG2, and a FORTRAN listing.
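The preconditioned conjugate-gradient iteration that PCG2 implements can be sketched with a simple Jacobi (diagonal) preconditioner standing in for the modified incomplete Cholesky and polynomial options described in the report; dense list-of-lists matrices are used for brevity, whereas the real code works on the sparse seven-point finite-difference operator.

```python
# Preconditioned conjugate gradient for a symmetric positive-definite
# system A x = b, with M = diag(A) as the preconditioner.

def pcg(a, b, tol=1e-10, max_iter=200):
    n = len(b)
    matvec = lambda v: [sum(a[i][j] * v[j] for j in range(n))
                        for i in range(n)]
    x = [0.0] * n
    r = b[:]                                  # residual b - A x0
    z = [r[i] / a[i][i] for i in range(n)]    # apply M^-1 (Jacobi)
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        ap = matvec(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        if max(abs(ri) for ri in r) < tol:    # residual criterion
            break
        z = [r[i] / a[i][i] for i in range(n)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x
```

Swapping the Jacobi solve for an incomplete Cholesky or polynomial application changes only the two `z = ...` lines, which is why PCG2 can offer both options behind the same iteration.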
Use of a genetic algorithm to solve two-fluid flow problems on an NCUBE multiprocessor computer
International Nuclear Information System (INIS)
Pryor, R.J.; Cline, D.D.
1992-01-01
A method of solving the two-phase fluid flow equations using a genetic algorithm on a NCUBE multiprocessor computer is presented. The topics discussed are the two-phase flow equations, the genetic representation of the unknowns, the fitness function, the genetic operators, and the implementation of the algorithm on the NCUBE computer. The efficiency of the implementation is investigated using a pipe blowdown problem. Effects of varying the genetic parameters and the number of processors are presented
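A minimal generational genetic algorithm with tournament selection, one-point crossover and bit-flip mutation can be sketched on a toy bit-counting fitness. The papers above apply this operator structure to unknowns of the two-phase flow equations, which is far beyond this illustration; all parameters below are assumed.

```python
# Toy generational GA: binary tournaments pick parents, one-point
# crossover builds a child, and each bit flips with probability p_mut.
import random

def ga(n_bits, fitness, pop_size=40, gens=100, p_mut=0.02, seed=2):
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(gens):
        def pick():                          # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (random.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Each fitness evaluation is independent, which is what makes the population embarrassingly parallel and a natural fit for message-passing machines like the NCUBE studied in the papers.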
Use of a genetic algorithm to solve two-fluid flow problems on an NCUBE multiprocessor computer
International Nuclear Information System (INIS)
Pryor, R.J.; Cline, D.D.
1993-01-01
A method of solving the two-phase fluid flow equations using a genetic algorithm on a NCUBE multiprocessor computer is presented. The topics discussed are the two-phase flow equations, the genetic representation of the unknowns, the fitness function, the genetic operators, and the implementation of the algorithm on the NCUBE computer. The efficiency of the implementation is investigated using a pipe blowdown problem. Effects of varying the genetic parameters and the number of processors are presented. (orig.)
Hickendorff, Marian
2013-01-01
The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…
Beal, Carole R.; Rosenblum, L. Penny
2018-01-01
Introduction: The authors examined a tablet computer application (iPad app) for its effectiveness in helping students studying prealgebra to solve mathematical word problems. Methods: Forty-three visually impaired students (that is, those who are blind or have low vision) completed eight alternating mathematics units presented using their…
Energy Technology Data Exchange (ETDEWEB)
Moryakov, A. V., E-mail: sailor@orc.ru [National Research Centre Kurchatov Institute (Russian Federation)
2016-12-15
An algorithm for solving the time-dependent transport equation in the P_mS_n group approximation with the use of parallel computations is presented. The algorithm is implemented in the LUCKY-TD code for supercomputers employing the MPI standard for the data exchange between parallel processes.
Fonger, Nicole L.; Davis, Jon D.; Rohwer, Mary Lou
2018-01-01
This research addresses the issue of how to support students' representational fluency--the ability to create, move within, translate across, and derive meaning from external representations of mathematical ideas. The context of solving linear equations in a combined computer algebra system (CAS) and paper-and-pencil classroom environment is…
Backtrack Programming: A Computer-Based Approach to Group Problem Solving.
Scott, Michael D.; Bodaken, Edward M.
Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…
Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.
2016-12-01
Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who choose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new
Energy Technology Data Exchange (ETDEWEB)
Moryakov, A. V., E-mail: sailor@orc.ru [National Research Centre Kurchatov Institute (Russian Federation)
2016-12-15
An algorithm for solving the linear Cauchy problem for large systems of ordinary differential equations is presented. The algorithm for systems of first-order differential equations is implemented in the EDELWEISS code with the possibility of parallel computations on supercomputers employing the MPI (Message Passing Interface) standard for the data exchange between parallel processes. The solution is represented by a series of orthogonal polynomials on the interval [0, 1]. The algorithm is characterized by simplicity and the possibility to solve nonlinear problems with a correction of the operator in accordance with the solution obtained in the previous iterative process.
International Nuclear Information System (INIS)
Azmy, Y.Y.; Kirk, B.L.
1990-01-01
Modern parallel computer architectures offer an enormous potential for reducing CPU and wall-clock execution times of large-scale computations commonly performed in various applications in science and engineering. Recently, several authors have reported their efforts in developing and implementing parallel algorithms for solving the neutron diffusion equation on a variety of shared- and distributed-memory parallel computers. Testing of these algorithms for a variety of two- and three-dimensional meshes showed significant speedup of the computation. Even for very large problems (i.e., three-dimensional fine meshes) executed concurrently on a few nodes in serial (nonvector) mode, however, the measured computational efficiency is very low (40 to 86%). In this paper, the authors present a highly efficient (∼85 to 99.9%) algorithm for solving the two-dimensional nodal diffusion equations on the Sequent Balance 8000 parallel computer. Also presented is a model for the performance, represented by the efficiency, as a function of problem size and the number of participating processors. The model is validated through several tests and then extrapolated to larger problems and more processors to predict the performance of the algorithm in more computationally demanding situations
PEAKS: Computer code for solving partly overlapped photopeak in gamma spectrometry
International Nuclear Information System (INIS)
Jerez Vergueria, Sergio; Jerez Vergueria, Pablo
1996-01-01
The paper describes the main elements of the code according to purposes and contents. The PEAKS code is a useful tool of comfortable and easy handling for solving partly overlapped photopeaks in gamma spectrometry with a NaI(Tl) detector
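A toy version of resolving two partly overlapped photopeaks: if the two centroids and the common width are taken as known, the spectrum is linear in the two amplitudes, which then follow from a 2x2 linear least-squares (normal equations) fit. This is a simplified sketch, not the PEAKS algorithm itself, which must also handle the continuum and unknown peak parameters; all numbers are illustrative.

```python
import math

def unmix_two_peaks(spectrum, c1, c2, sigma):
    """Split a spectrum that is the sum of two overlapped Gaussian photopeaks
    with known centroids c1, c2 and common width sigma into its two
    amplitudes via a 2x2 normal-equations least-squares fit."""
    g1 = [math.exp(-0.5 * ((x - c1) / sigma) ** 2) for x in range(len(spectrum))]
    g2 = [math.exp(-0.5 * ((x - c2) / sigma) ** 2) for x in range(len(spectrum))]
    s11 = sum(a * a for a in g1)
    s12 = sum(a * b for a, b in zip(g1, g2))
    s22 = sum(b * b for b in g2)
    t1 = sum(a * y for a, y in zip(g1, spectrum))
    t2 = sum(b * y for b, y in zip(g2, spectrum))
    det = s11 * s22 - s12 * s12
    return (s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det

# synthetic spectrum: peaks of amplitudes 100 and 60 at channels 40 and 48
spec = [100 * math.exp(-0.5 * ((x - 40) / 5) ** 2) +
        60 * math.exp(-0.5 * ((x - 48) / 5) ** 2) for x in range(100)]
a1, a2 = unmix_two_peaks(spec, 40, 48, 5)
```

Because the synthetic spectrum exactly matches the model, the fit recovers the amplitudes to floating-point precision; with real NaI(Tl) data the residual measures how well the two-peak hypothesis fits.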
Towards high-performance symbolic computing using MuPAD as a problem solving environment
Sorgatz, A
1999-01-01
This article discusses the approach of developing MuPAD into an open and parallel problem solving environment for mathematical applications. It introduces the key technologies, domains and dynamic modules, and describes the current state of macro parallelism, which covers three fields of parallel programming: message passing, network variables and work groups. First parallel algorithms and examples of using the prototype of the MuPAD problem solving environment are demonstrated. (12 refs).
Development of Procedures to Assess Problem-Solving Competence in Computing Engineering
Pérez, Jorge; Vizcarro, Carmen; García, Javier; Bermúdez, Aurelio; Cobos, Ruth
2017-01-01
In the context of higher education, a competence may be understood as the combination of skills, knowledge, attitudes, values, and abilities that underpin effective and/or superior performance in a professional area. The aim of the work reported here was to design a set of procedures to assess a transferable competence, i.e., problem solving, that…
Lin, John Jr-Hung; Lin, Sunny S. J.
2014-01-01
The present study investigated (a) whether the perceived cognitive load was different when geometry problems with various levels of configuration comprehension were solved and (b) whether eye movements in comprehending geometry problems showed sources of cognitive loads. In the first investigation, three characteristics of geometry configurations…
The Effect of Simulation Games on the Learning of Computational Problem Solving
Liu, Chen-Chung; Cheng, Yuan-Bang; Huang, Chia-Wen
2011-01-01
Simulation games are now increasingly applied to many subject domains as they allow students to engage in discovery processes, and may facilitate a flow learning experience. However, the relationship between learning experiences and problem solving strategies in simulation games still remains unclear in the literature. This study, thus, analyzed…
The Cross-Contextual Transfer of Problem Solving Strategies from Logo to Non-Computer Domains.
Swan, Karen; Black, John B.
This report investigated the relationship between learning to program LOGO and the development of problem solving skills. Subjects were 133 students in grades 4-8 who had at least 30 hours of experience with both graphics and lists programming in Logo. Students were randomly assigned to one of three contextual groupings, which received graphics,…
Schoppek, Wolfgang; Tulis, Maria
2010-01-01
The fluency of basic arithmetical operations is a precondition for mathematical problem solving. However, the training of skills plays a minor role in contemporary mathematics instruction. The authors proposed individualization of practice as a means to improve its efficiency, so that the time spent with the training of skills is minimized. As a…
An Examination of the Relationship between Computation, Problem Solving, and Reading
Cormier, Damien C.; Yeo, Seungsoo; Christ, Theodore J.; Offrey, Laura D.; Pratt, Katherine
2016-01-01
The purpose of this study is to evaluate the relationship of mathematics calculation rate (curriculum-based measurement of mathematics; CBM-M), reading rate (curriculum-based measurement of reading; CBM-R), and mathematics application and problem solving skills (mathematics screener) among students at four levels of proficiency on a statewide…
Random amplified polymorphic DNA based genetic characterization ...
African Journals Online (AJOL)
Random amplified polymorphic DNA based genetic characterization of four important species of Bamboo, found in Raigad district, Maharashtra State, India. ... Bambusoideae are differentiated from other members of the family by the presence of petiolate blades with parallel venation and stamens are three, four, six or more, ...
International Nuclear Information System (INIS)
Lee, Jin Pyo; Joo, Han Gyu
2010-01-01
In the thermo-fluid analysis code named CUPID, the linear system of pressure equations must be solved in each iteration step. The time for repeatedly solving the linear system can be quite significant because large sparse matrices of rank greater than 50,000 are involved and the diagonal dominance of the system hardly holds. Therefore parallelization of the linear system solver is essential to reduce the computing time. Meanwhile, Graphics Processing Units (GPU) have been developed as highly parallel, multi-core processors for the global demand of high quality 3D graphics. If a suitable interface is provided, parallelization using GPU can be made available to engineering computing. NVIDIA provides a Software Development Kit (SDK) named CUDA (Compute Unified Device Architecture) to code developers so that they can manage GPUs for parallelization using the C language. In this research, we implement parallel routines for the linear system solver using CUDA, and examine the performance of the parallelization. In the next section, we will describe the method of CUDA parallelization for the CUPID code, and then the performance of the CUDA parallelization will be discussed
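The kind of parallelism CUDA exposes is easiest to see in a solver whose updates are mutually independent. A Jacobi sweep has exactly that structure: each component reads only the previous iterate, so a GPU kernel can assign one thread per row. The serial Python sketch below illustrates the pattern only; it is not the CUPID solver, and the test system is illustrative.

```python
def jacobi_sweep(A, b, x):
    """One Jacobi iteration x_i <- (b_i - sum_{j!=i} A_ij x_j) / A_ii.
    Every component reads only the previous iterate x, so all n updates
    are independent -- the property a CUDA kernel exploits by mapping
    one thread to each row."""
    n = len(b)
    return [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)]

# diagonally dominant test system (Jacobi converges here)
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = [0.0, 0.0, 0.0]
for _ in range(60):
    x = jacobi_sweep(A, b, x)
```

For the weakly diagonally dominant pressure systems described above, plain Jacobi may converge slowly or not at all, which is why production codes pair the data-parallel sweep with Krylov acceleration.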
Smith, Mike U.
1991-01-01
Criticizes an article by Browning and Lehman (1988) for (1) using "gene" instead of allele, (2) misusing the word "misconception," and (3) the possible influences of the computer environment on the results of the study. (PR)
DNA-Based Applications in Nanobiotechnology
Directory of Open Access Journals (Sweden)
Khalid M. Abu-Salah
2010-01-01
Biological molecules such as deoxyribonucleic acid (DNA have shown great potential in fabrication and construction of nanostructures and devices. The very properties that make DNA so effective as genetic material also make it a very suitable molecule for programmed self-assembly. The use of DNA to assemble metals or semiconducting particles has been extended to construct metallic nanowires and functionalized nanotubes. This paper highlights some important aspects of conjugating the unique physical properties of dots or wires with the remarkable recognition capabilities of DNA which could lead to miniaturizing biological electronics and optical devices, including biosensors and probes. Attempts to use DNA-based nanocarriers for gene delivery are discussed. In addition, the ecological advantages and risks of nanotechnology including DNA-based nanobiotechnology are evaluated.
Communication: Electron ionization of DNA bases
Energy Technology Data Exchange (ETDEWEB)
Rahman, M. A.; Krishnakumar, E., E-mail: ekkumar@tifr.res.in
2016-04-28
No reliable experimental data exist for the partial and total electron ionization cross sections for DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all the four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results on fragmentation analysis of adenine support the theory of its formation in space.
DNA Based Electrochromic and Photovoltaic Cells
2012-01-01
Pawlicka, Agnieszka
DNA is an abundant natural product with very good biodegradation properties and can be used to obtain gel polymer electrolytes (GPEs) with high…
Directory of Open Access Journals (Sweden)
Tim Palmer
2015-10-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete.
Garrido, Jose
2011-01-01
… offers a solid first step into scientific and technical computing for those just getting started. … Through simple examples that are both easy to conceptualize and straightforward to express mathematically (something that isn't trivial to achieve), Garrido methodically guides readers from problem statement and abstraction through algorithm design and basic programming. His approach offers those beginning in a scientific or technical discipline something unique: a simultaneous introduction to programming and computational thinking that is very relevant to the practical application of computing.
Palmer, Tim N; O'Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for 'eureka' moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete.
Glass, Christopher E.
1990-08-01
The computer program EASI, an acronym for Equilibrium Air Shock Interference, was developed to calculate the inviscid flowfield, the maximum surface pressure, and the maximum heat flux produced by six shock wave interference patterns on a 2-D, cylindrical configuration. Thermodynamic properties of the inviscid flowfield are determined using either an 11-specie, 7-reaction equilibrium chemically reacting air model or a calorically perfect air model. The inviscid flowfield is solved using the integral form of the conservation equations. Surface heating calculations at the impingement point for the equilibrium chemically reacting air model use variable transport properties and specific heat. However, for the calorically perfect air model, heating rate calculations use a constant Prandtl number. Sample calculations of the six shock wave interference patterns, a listing of the computer program, and flowcharts of the programming logic are included.
Solving the Equation: The Variables for Women's Success in Engineering and Computing
Corbett, Christianne; Hill, Catherine
2015-01-01
During the 2014 White House Science Fair, President Barack Obama used a sports metaphor to explain why we must address the shortage of women in science, technology, engineering, and mathematics (STEM), particularly in the engineering and computing fields: "Half our team, we're not even putting on the field. We've got to change those…
Cognitive processes in solving variants of computer-based problems used in logic teaching
Eysink, Tessa H.S.; Dijkstra, S.; Kuper, Jan
2001-01-01
The effect of two instructional variables, visualisation and manipulation of objects, in learning to use the logical connective, conditional, was investigated. Instructions for 66 first-year social science students were varied in the computer-based learning environment Tarski's World, designed for
Solving the Fokker-Planck equation on a massively parallel computer
International Nuclear Information System (INIS)
Mirin, A.A.
1990-01-01
The Fokker-Planck package FPPAC has been converted to the Connection Machine 2 (CM2). For fine-mesh cases the CM2 outperforms the Cray-2 when it comes to time-integrating the difference equations. For long Legendre expansions the CM2 is also faster at computing the Fokker-Planck coefficients. 3 refs
Cultural Commonalities and Differences in Spatial Problem-Solving: A Computational Analysis
Lovett, Andrew; Forbus, Kenneth
2011-01-01
A fundamental question in human cognition is how people reason about space. We use a computational model to explore cross-cultural commonalities and differences in spatial cognition. Our model is based upon two hypotheses: (1) the structure-mapping model of analogy can explain the visual comparisons used in spatial reasoning; and (2) qualitative,…
Timmers, Caroline; Walraven, Amber; Veldkamp, Bernard P.
2015-01-01
This study examines the effect of regulation feedback in a computer-based formative assessment in the context of searching for information online. Fifty 13-year-old students completed two randomly selected assessment tasks, receiving automated regulation feedback between them. Student performance
Computational issues of solving the 1D steady gradually varied flow equation
Directory of Open Access Journals (Sweden)
Artichowicz Wojciech
2014-09-01
In this paper a problem of multiple solutions of the steady gradually varied flow equation, in the form of the ordinary differential energy equation, is discussed from the viewpoint of its numerical solution. Using the Lipschitz theorem dealing with the uniqueness of the solution of an initial value problem for an ordinary differential equation, it was shown that the steady gradually varied flow equation can have more than one solution. This fact implies that the nonlinear algebraic equation approximating the ordinary differential energy equation, which additionally coincides with the well-known standard step method usually applied for computing the flow profile, can have a variable number of roots. Consequently, more than one alternative solution corresponding to the same initial condition can be provided. Using this property it is possible to compute the water flow profile passing through the critical stage.
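The non-uniqueness described above can be demonstrated on the simplest case, the specific-energy relation for a wide rectangular channel, where one energy level is typically attained at two depths (one supercritical, one subcritical). The sketch below scans for sign changes and refines each root by bisection; the discharge and energy values are illustrative.

```python
def energy(h, q, g=9.81):
    """Specific energy (m) at depth h (m) for unit-width discharge q (m^2/s)
    in a wide rectangular channel: E = h + q^2 / (2 g h^2)."""
    return h + q * q / (2.0 * g * h * h)

def depths_for_energy(E0, q, h_min=0.01, h_max=10.0, steps=2000):
    """Find all depths with energy(h) == E0 by scanning f(h) = energy(h) - E0
    for sign changes and refining each bracket by bisection. Illustrates the
    variable number of roots of the algebraic step equation."""
    f = lambda h: energy(h, q) - E0
    roots = []
    dh = (h_max - h_min) / steps
    for i in range(steps):
        a, b = h_min + i * dh, h_min + (i + 1) * dh
        if f(a) * f(b) <= 0.0:
            for _ in range(60):                 # bisection refinement
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0.0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    return roots

roots = depths_for_energy(E0=2.0, q=2.0)
```

For these values the critical depth is (q^2/g)^(1/3) ≈ 0.74 m, and the two recovered roots bracket it: a shallow fast (supercritical) depth and a deep slow (subcritical) one.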
International Nuclear Information System (INIS)
Schaefer, A.
1979-02-01
A new code for efficient solution of the multidimensional stationary multi-group diffusion equation, to be used within a HTGR-code model, is presented. The approximation and iteration methods are described. Spatial approximation is based on the QUABOX coarse-mesh method, but iteration methods are different from QUABOX to give linear dependence of computation time on the number of energy groups. Results for various multidimensional multi-group problems, among them the THTR pebble bed reactor, are analyzed. It is shown that computational labor for a 3D case is reduced by about a factor 30 in comparison with conventional finite-difference methods. Thus 3D full-core calculations appear to be feasible for large HTGR's. (orig.) [de
An analog computer method for solving flux distribution problems in multi region nuclear reactors
Energy Technology Data Exchange (ETDEWEB)
Radanovic, L; Bingulac, S; Lazarevic, B; Matausek, M [Boris Kidric Institute of Nuclear Sciences Vinca, Beograd (Yugoslavia)
1963-04-15
The paper describes a method developed for determining criticality conditions and plotting flux distribution curves in multi region nuclear reactors on a standard analog computer. The method, which is based on the one-dimensional two-group treatment, avoids iterative procedures normally used for boundary value problems and is practically insensitive to errors in initial conditions. The amount of analog equipment required is reduced to a minimum and is independent of the number of core regions and reflectors. (author)
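The spirit of the analog approach, integrating outward from the core centre as an initial-value problem instead of iterating on boundary conditions, can be mimicked digitally. The sketch below does this for the one-group bare-slab equation, a simplification of the paper's two-group treatment; the buckling value is illustrative, and the analytic critical half-width is pi/(2B).

```python
def critical_half_width(B, dt=1e-4):
    """Integrate the one-group slab criticality equation
    phi'' + B^2 phi = 0 outward from the slab centre (phi = 1, phi' = 0),
    as an analog computer integrates in continuous time, and report where
    the flux first vanishes -- the critical half-width. No boundary-value
    iteration is needed, mirroring the method described in the paper."""
    phi, dphi, x = 1.0, 0.0, 0.0
    while phi > 0.0:
        # simultaneous explicit (Euler) update of flux and its derivative
        phi, dphi = phi + dt * dphi, dphi - dt * B * B * phi
        x += dt
    return x

a = critical_half_width(B=1.0)   # analytic answer: pi/2 ~ 1.5708
```

Because the equation is linear, the zero crossing of the flux does not depend on the (arbitrary) centre normalization, which is exactly why the initial-condition sensitivity noted in the abstract disappears.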
The NASA Computational Fluid Dynamics (CFD) program - Building technology to solve future challenges
Richardson, Pamela F.; Dwoyer, Douglas L.; Kutler, Paul; Povinelli, Louis A.
1993-01-01
This paper presents the NASA Computational Fluid Dynamics program in terms of a strategic vision and goals as well as NASA's financial commitment and personnel levels. The paper also identifies the CFD program customers and the support to those customers. In addition, the paper discusses technical emphasis and direction of the program and some recent achievements. NASA's Ames, Langley, and Lewis Research Centers are the research hubs of the CFD program while the NASA Headquarters Office of Aeronautics represents and advocates the program.
Solving Problems in Various Domains by Hybrid Models of High Performance Computations
Directory of Open Access Journals (Sweden)
Yurii Rogozhin
2014-03-01
This work presents a hybrid model of high performance computations. The model is based on a membrane system (P system) where some membranes may contain a quantum device that is triggered by the data entering the membrane. This model is supposed to take advantage of both biomolecular and quantum paradigms and to overcome some of their inherent limitations. The proposed approach is demonstrated through two selected problems: SAT and image retrieving.
Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng
2014-08-01
The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks, even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves direct retrieval of arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.
Computer-assisted mammography in clinical practice: Another set of problems to solve
International Nuclear Information System (INIS)
Gale, A.G.; Roebuck, E.J.; Worthington, B.S.
1986-01-01
To be adopted in radiological practice, computer-assisted diagnosis must address a domain of realistic complexity and have a high performance in terms of speed and reliability. Use of a microcomputer-based system of mammographic diagnoses employing discriminant function analysis resulted in significantly fewer false-positive diagnoses while producing a similar level of correct diagnoses of cancer as normal reporting. Although such a system is a valuable teaching aid, its clinical use is constrained by the problems of unambiguously codifying descriptors, data entry time, and the tendency of radiologists to override predicted diagnoses which conflict with their own
A Computational Realization of a Semi-Lagrangian Method for Solving the Advection Equation
Directory of Open Access Journals (Sweden)
Alexander Efremov
2014-01-01
A parallel implementation of a method of the semi-Lagrangian type for the advection equation on a hybrid-architecture computation system is discussed. The difference scheme with variable stencil is constructed on the basis of an integral equality between the neighboring time levels. The proposed approach allows one to avoid the Courant-Friedrichs-Lewy restriction on the relation between time step and mesh size. The theoretical results are confirmed by numerical experiments. Performance of a sequential algorithm and several parallel implementations with the OpenMP and CUDA technologies in the C language has been studied.
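The core idea, tracing each grid node back along the characteristic and interpolating the previous time level there, can be shown with linear advection on a periodic grid. The serial Python sketch below is illustrative only (it uses linear interpolation rather than the paper's variable-stencil scheme), and it deliberately runs at a Courant number of 2.5 to show the absence of a CFL restriction.

```python
import math

def semi_lagrangian_step(u, a, dt, dx):
    """One semi-Lagrangian step for u_t + a u_x = 0 on a periodic grid:
    trace each node back to its departure point x_i - a*dt and linearly
    interpolate the previous solution there. The time step is not bound
    by the CFL condition a*dt/dx <= 1."""
    n = len(u)
    new_u = [0.0] * n
    for i in range(n):
        x_dep = (i * dx - a * dt) / dx          # departure point, grid units
        j = math.floor(x_dep)
        w = x_dep - j                           # linear interpolation weight
        new_u[i] = (1.0 - w) * u[j % n] + w * u[(j + 1) % n]
    return new_u

# advect a Gaussian bump one full period at Courant number a*dt/dx = 2.5
n, dx, a = 100, 1.0, 1.0
u = [math.exp(-0.5 * ((i - 20) / 4.0) ** 2) for i in range(n)]
dt = 2.5 * dx / a                               # deliberately exceeds CFL limit
for _ in range(40):                             # 40 * 2.5 cells = one period
    u = semi_lagrangian_step(u, a, dt, dx)
```

After one full period the bump returns to its starting cell, somewhat smeared by the repeated linear interpolation; each node's update is independent, which is what the OpenMP and CUDA versions exploit.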
Maddrey, Elizabeth
Research in academia and industry continues to identify a decline in enrollment in computer science. One major component of this decline in enrollment is a shortage of female students. The primary reasons for the gender gap presented in the research include lack of computer experience prior to their first year in college, misconceptions about the field, negative cultural stereotypes, lack of female mentors and role models, subtle discriminations in the classroom, and lack of self-confidence (Pollock, McCoy, Carberry, Hundigopal, & You, 2004). Male students are also leaving the field due to misconceptions about the field, negative cultural stereotypes, and a lack of self-confidence. Analysis of first year attrition revealed that one of the major challenges faced by students of both genders is a lack of problem-solving skills (Beaubouef, Lucas & Howatt, 2001; Olsen, 2005; Paxton & Mumey, 2001). The purpose of this study was to investigate whether specific, non-mathematical problem-solving instruction as part of introductory programming courses significantly increased computer programming self-efficacy and achievement of students. The results of this study showed that students in the experimental group had significantly higher achievement than students in the control group. While this shows statistical significance, due to the effect size and disordinal nature of the data between groups, care has to be taken in its interpretation. The study did not show significantly higher programming self-efficacy among the experimental students. There was not enough data collected to statistically analyze the effect of the treatment on self-efficacy and achievement by gender. However, differences in means were observed between the gender groups, with females in the experimental group demonstrating a higher than average degree of self-efficacy when compared with males in the experimental group and both genders in the control group. These results suggest that the treatment from this
Solving Multi-Pollutant Emission Dispatch Problem Using Computational Intelligence Technique
Directory of Open Access Journals (Sweden)
Nur Azzammudin Rahmat
2016-06-01
Economic dispatch is a crucial process conducted by the utilities to correctly determine the satisfying amount of power to be generated and distributed to the consumers. During the process, the utilities also consider pollutant emission as a consequence of fossil-fuel consumption. Fossil fuels include petroleum, coal, and natural gas; each has its unique chemical composition of pollutants, i.e. sulphur oxides (SOx), nitrogen oxides (NOx) and carbon oxides (COx). This paper presents a multi-pollutant emission dispatch problem using a computational intelligence technique. In this study, a novel emission dispatch technique is formulated to determine the amount of the pollutant level. It utilizes a pre-developed optimization technique termed differential evolution immunized ant colony optimization (DEIANT) for the emission dispatch problem. The optimization results indicated a high COx level, regardless of the type of fossil fuel consumed.
TRANSLATION PROCESS AND THE USE OF COMPUTER A REPORT ON PROBLEM-SOLVING BEHAVIOUR DURING TRANSLATING
Directory of Open Access Journals (Sweden)
Engliana Engliana
2017-04-01
Emphasising the translation process, including pre- and post-editing tasks, using a text taken randomly from news on the Internet, this paper attempts to illustrate the behaviour patterns of some students currently studying English language at the university level in Jakarta. The students received texts to be translated using a computer equipped with screen-recording software aimed to record all related activities during the translation process, including the pre- and post-editing. The method involves observing the participants' behaviour during translating, focusing on the actions performed before and after using translation tool(s). The purposes of this investigation are to determine whether the students: (1) use any software and the Internet to help them; (2) use the information in the translation process; (3) apply the translation theories. The results indicate that no pre-editing task was performed prior to translation
Recent progress on DNA based walkers.
Pan, Jing; Li, Feiran; Cha, Tae-Gon; Chen, Haorong; Choi, Jong Hyun
2015-08-01
DNA based synthetic molecular walkers are reminiscent of biological protein motors. They are powered by hybridization with fuel strands, environment induced conformational transitions, and covalent chemistry of oligonucleotides. Recent developments in experimental techniques enable direct observation of individual walkers with high temporal and spatial resolution. The functionalities of state-of-the-art DNA walker systems can thus be analyzed for various applications. Herein we review recent progress on DNA walker principles and characterization methods, and evaluate various aspects of their functions for future applications. Copyright © 2014 Elsevier Ltd. All rights reserved.
Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly
2014-05-01
Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, numerical methods theory, numerical algebra and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results of the special database development for the ICS "INM RAS - Black Sea" are presented. The input information for the ICS is discussed, and some special data processing procedures are described. The results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black sea as an imitational ocean model"). References 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea"// Ecological
Psycharis, Sarantos; Kallia, Maria
2017-01-01
In this paper we investigate whether computer programming has an impact on high school student's reasoning skills, problem solving and self-efficacy in Mathematics. The quasi-experimental design was adopted to implement the study. The sample of the research comprised 66 high school students separated into two groups, the experimental and the…
Energy Technology Data Exchange (ETDEWEB)
Fletcher, J K
1973-05-01
CTD is a computer program written in Fortran 4 to solve the multi-group diffusion theory equations in X, Y, Z and triangular Z geometries. A power print-out, neutron balance and breeding gain are also produced. 4 references. (auth)
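The multi-group diffusion equations that CTD solves reduce, in the simplest one-group, one-dimensional case, to a tridiagonal linear system. The sketch below (Python rather than Fortran 4, with hypothetical cross-section values) shows that reduced problem; CTD's X, Y, Z and triangular-Z treatment is far more general.

```python
import numpy as np

# One-group, one-dimensional slab reduction of the diffusion equation
# -D*phi'' + Sa*phi = S with phi = 0 at both ends, discretized by central
# differences into a tridiagonal system (cross sections are hypothetical).
def diffusion_1d(D, Sa, S, length, n):
    h = length / (n + 1)
    A = (np.diag(np.full(n, 2.0 * D / h**2 + Sa))
         + np.diag(np.full(n - 1, -D / h**2), 1)
         + np.diag(np.full(n - 1, -D / h**2), -1))
    return np.linalg.solve(A, np.full(n, S))

phi = diffusion_1d(D=1.0, Sa=0.1, S=1.0, length=10.0, n=99)
# The flux is symmetric, peaks at the slab centre, and stays below the
# infinite-medium value S/Sa = 10 because of leakage to the boundaries.
```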
Chang, C.-J.; Chang, M.-H.; Liu, C.-C.; Chiu, B.-C.; Fan Chiang, S.-H.; Wen, C.-T.; Hwang, F.-K.; Chao, P.-Y.; Chen, Y.-L.; Chai, C.-S.
2017-01-01
Researchers have indicated that the collaborative problem-solving space afforded by the collaborative systems significantly impact the problem-solving process. However, recent investigations into collaborative simulations, which allow a group of students to jointly manipulate a problem in a shared problem space, have yielded divergent results…
Computer-aided detection of lung nodules on chest CT: issues to be solved before clinical use
International Nuclear Information System (INIS)
Goo, Jin Mo
2005-01-01
Given the increasing resolution of modern CT scanners, and the requirements for large-scale lung-screening examinations and diagnostic studies, there is an increased need for the accurate and reproducible analysis of a large number of images. Nodule detection is one of the main challenges of CT imaging, as nodules can be missed due to their small size or low relative contrast, or because they are located in an area with complex anatomy. Recent developments in computer-aided diagnosis (CAD) schemes are expected to aid radiologists in various tasks of chest imaging. In this era of multidetector row CT, the thoracic applications of greatest interest include the detection and volume measurement of lung nodules (1-7). Technology for CAD as applied to lung nodule detection on chest CT has been approved by the Food and Drug Administration and is currently commercially available. The article by Lee et al. (5) in this issue of the Korean Journal of Radiology is one of the few studies to examine the influence of a commercially available CAD system on the detection of lung nodules. In this study, some additional nodules were detected with the help of a CAD system, but at the expense of increased false positivity. The nodule detection rate of the CAD system in this study was lower than that achieved by the radiologists, and the authors maintain that the CAD system should be improved further. Compared to the use of CAD on mammograms, CAD evaluations of chest CTs remain limited to the laboratory setting. In this field, apart from the issues of detection rate and false positive detections, many obstacles must be overcome before CAD can be used in a true clinical reading environment. In this editorial, I will list some of these issues, but I emphasize now that I believe these issues will be solved by improved CAD versions in the near future.
DENA: A Configurable Microarchitecture and Design Flow for Biomedical DNA-Based Logic Design.
Beiki, Zohre; Jahanian, Ali
2017-10-01
DNA is known as the building block for storing life's code and transferring genetic features through the generations. However, it has been found that DNA strands can be used for a new type of computation that opens fascinating horizons in computational medicine. Significant contributions have addressed the design of DNA-based logic gates for medical and computational applications, but there are serious challenges in designing medium- and large-scale DNA circuits. In this paper, a new microarchitecture and corresponding design flow are proposed to facilitate the design of multistage large-scale DNA logic systems. The feasibility and efficiency of the proposed microarchitecture are evaluated by implementing a full adder and, then, its cascadability is determined by implementing a multistage 8-bit adder. Simulation results show the highlight features of the proposed design style and microarchitecture in terms of the scalability, implementation cost, and signal integrity of the DNA-based logic system compared to traditional approaches.
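The full adder and 8-bit cascade used as the paper's benchmark can be expressed, at the Boolean level, as follows; this sketch captures only the logic being implemented in DNA, not the strand-displacement chemistry.

```python
# Boolean view of the benchmark circuits: a 1-bit full adder and the
# 8-bit ripple-carry cascade built from it (logic only, no chemistry).

def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add8(x, y):
    """Add two 8-bit integers by cascading eight full adders."""
    total, carry = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total, carry   # 8-bit sum and final carry-out
```

The carry chain is exactly the cascadability being tested: each stage's carry-out feeds the next stage's carry-in.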
Sheriff, Kelli A; Boon, Richard T
2014-08-01
The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, on solving one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers for solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Kester, Liesbeth; Kirschner, Paul A.; Van Merriënboer, Jeroen
2007-01-01
This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram,
Kester, L.; Kirschner, P.A.; Merriënboer, J.J.G.
2005-01-01
This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram. It was hypothesized that learners in the integrated format would achieve better test results than the learne...
山口, 有美; 山口, 晴久
2001-01-01
In this paper, we describe comparative experiments on students' processes of solving problems involving typical school teaching-material knowledge (calculation, geometry, Kanji dictation, typewriting, drawing) in exercises, in both VDT work and desktop work, by frequency analysis of brain waves. The cognitive states during each mental task were compared using brain waves, and the α reduction rate in brain waves in each mental task (calculation, geometry, Kanji dictations, typewriting, drawin...
DEFF Research Database (Denmark)
Salvatore, Princia; Nazmutdinov, Renat R.; Ulstrup, Jens
2015-01-01
Among the low-index single-crystal gold surfaces, the Au(110) surface is the most active toward molecular adsorption and the one with the fewest electrochemical adsorption data reported. Cyclic voltammetry (CV), electrochemically controlled scanning tunneling microscopy (EC-STM), and density functional…, accompanied by a pair of strong voltammetry peaks in the double-layer region in acid solutions. Adsorption of the DNA bases gives featureless voltammograms with lower double-layer capacitance, suggesting that all the bases are chemisorbed on the Au(110) surface. Further investigation of the surface structures… of the adlayers of the four DNA bases by EC-STM disclosed lifting of the Au(110) reconstruction, specific molecular packing in dense monolayers, and pH dependence of the A and G adsorption. DFT computations based on a cluster model for the Au(110) surface were performed to investigate the adsorption energy…
International Nuclear Information System (INIS)
Bellucci, V.J.
1990-01-01
This paper describes IBM's approach to parallel computing using the IBM ES/3090 computer. Parallel processing concepts were discussed including its advantages, potential performance improvements and limitations. Particular applications and capabilities for the IBM ES/3090 were presented along with preliminary results from some utilities in the application of parallel processing to simulation of system reliability, air pollution models, and power network dynamics
Moser, Arvin; Pautler, Brent G
2016-05-15
The successful elucidation of an unknown compound's molecular structure often requires an analyst with profound knowledge and experience of advanced spectroscopic techniques, such as Nuclear Magnetic Resonance (NMR) spectroscopy and mass spectrometry. The implementation of Computer-Assisted Structure Elucidation (CASE) software in solving for unknown structures, such as isolated natural products and/or reaction impurities, can serve as both an elucidation and a teaching tool. As such, the introduction of CASE software with 112 exercises to train students, in conjunction with the traditional pen and paper approach, will strengthen their overall understanding of solving unknowns and allow exploration of various structural end points to determine the validity of the results quickly. Copyright © 2016 John Wiley & Sons, Ltd.
Gaffney, Hannah; Mansell, Warren; Edwards, Rachel; Wright, Jason
2014-11-01
Computerized self-help that has an interactive, conversational format holds several advantages, such as flexibility across presenting problems and ease of use. We designed a new program called MYLO that utilizes the principles of Method of Levels (MOL) therapy, based upon Perceptual Control Theory (PCT). We tested the efficacy of MYLO, tested whether the psychological change mechanisms described by PCT mediated its efficacy, and evaluated the effects of client expectancy. Forty-eight student participants were randomly assigned to MYLO or a comparison program, ELIZA. Participants discussed a problem they were currently experiencing with their assigned program and completed measures of distress, resolution and expectancy preintervention, postintervention and at 2-week follow-up. MYLO and ELIZA were associated with reductions in distress, depression, anxiety and stress. MYLO was considered more helpful and led to greater problem resolution. The psychological change processes predicted higher ratings of MYLO's helpfulness and reductions in distress. Positive expectancies towards computer-based problem solving correlated with MYLO's perceived helpfulness and greater problem resolution, and this was partly mediated by the psychological change processes identified. The findings provide provisional support for the acceptability of the MYLO program in a non-clinical sample, although its efficacy as an innovative computer-based aid to problem solving remains unclear. Nevertheless, the findings provide tentative early support for the mechanisms of psychological change identified within PCT and highlight the importance of client expectations in predicting engagement in computer-based self-help.
Browning, Mark; Lehman, James D.
1991-01-01
Authors respond to criticisms by Smith in the same issue and defend their use of the term "gene" and "misconception." Authors indicate that they did not believe that the use of computers significantly skewed their data concerning student errors. (PR)
How stable are the mutagenic tautomers of DNA bases?
Directory of Open Access Journals (Sweden)
Brovarets’ O. O.
2010-02-01
Full Text Available Aim. To determine the lifetime of the mutagenic tautomers of DNA base pairs through the investigation of the physicochemical mechanisms of their intramolecular proton transfer. Methods. Non-empirical quantum chemistry, the analysis of the electron density by means of Bader's atoms in molecules (AIM) theory and physicochemical kinetics were used. Results. The physicochemical character of the transition state of the intramolecular tautomerisation of DNA bases was investigated, and the lifetime of the mutagenic tautomers was calculated. Conclusions. The lifetime of the mutagenic tautomers of DNA bases exceeds the typical time of DNA replication in the cell (~10³ s) by 3–10 orders of magnitude. This fact confirms that the postulate on which the Watson-Crick tautomeric hypothesis of spontaneous transitions is grounded is adequate. The absence of intramolecular H-bonds in the canonical and mutagenic tautomeric forms determines their high stability.
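The lifetime calculation from physicochemical kinetics can be sketched with transition-state (Eyring) theory: the reverse-barrier height sets the rate constant, whose inverse is the tautomer lifetime. The barrier value below is hypothetical, chosen only to show how a lifetime of ~10⁴-10⁵ s (well above the ~10³ s replication time) arises.

```python
import math

# Transition-state (Eyring) sketch of the kinetics: a reverse barrier
# dG (kJ/mol) sets the rate constant k = (kB*T/h)*exp(-dG/(R*T)); the
# tautomer lifetime is 1/k.  The 100 kJ/mol barrier is a hypothetical value.
KB, H, R = 1.380649e-23, 6.62607015e-34, 8.314462618  # SI units

def lifetime_s(barrier_kj_mol, T=298.15):
    k = (KB * T / H) * math.exp(-barrier_kj_mol * 1e3 / (R * T))
    return 1.0 / k

tau = lifetime_s(100.0)   # on the order of 1e4-1e5 s at room temperature
```

Higher barriers or lower temperatures lengthen the lifetime exponentially, which is why the computed barriers translate into tautomers far outliving a replication cycle.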
International Nuclear Information System (INIS)
Tu, J.Y.; Easey, J.F.; Burch, W.M.
1997-01-01
In this paper, some work on computational modelling for industrial operations and processes will be presented, for example, the modelling of fly-ash flow and the associated prediction of erosion in power utility boilers. The introduction and use of new formulations of encapsulated radioisotopes, currently being researched at ANSTO, will open up further possibilities for the utilisation of radiotracer applications for a wider range of validation work, not only in industrial but also in medical investigations. Applications of the developed models to solving industrial problems will also be discussed in the paper.
A quantum theoretical study of reactions of methyldiazonium ion with DNA base pairs
International Nuclear Information System (INIS)
Shukla, P.K.; Ganapathy, Vinay; Mishra, P.C.
2011-01-01
Graphical abstract: Reactions of methyldiazonium ion at the different sites of the DNA bases in the Watson-Crick GC and AT base pairs were investigated employing density functional and second order Moller-Plesset (MP2) perturbation theories. Highlights: → Methylation of the DNA bases is important as it can cause mutation and cancer. → Methylation reactions of the GC and AT base pairs with CH3N2(+) were not studied earlier theoretically. → Experimental observations have been explained using theoretical methods. - Abstract: Methylation of the DNA bases in the Watson-Crick GC and AT base pairs by the methyldiazonium ion was investigated employing density functional and second order Moller-Plesset (MP2) perturbation theories. Methylation at the N3, N7 and O6 sites of guanine, the N1, N3 and N7 sites of adenine, the O2 and N3 sites of cytosine and the O2 and O4 sites of thymine was considered. The computed reactivities for methylation follow the order N7(guanine) > N3(adenine) > O6(guanine), which is in agreement with experiment. The base pairing in DNA is found to play a significant role with regard to the reactivities of the different sites.
High-speed DNA-based rolling motors powered by RNase H
Yehl, Kevin; Mugler, Andrew; Vivek, Skanda; Liu, Yang; Zhang, Yun; Fan, Mengzhen; Weeks, Eric R.
2016-01-01
DNA-based machines that walk by converting chemical energy into controlled motion could be of use in applications such as next-generation sensors, drug delivery platforms, and biological computing. Despite their exquisite programmability, DNA-based walkers are, however, challenging to work with due to their low fidelity and slow rates (~1 nm/min). Here, we report DNA-based machines that roll rather than walk, and consequently have a maximum speed and processivity that is three orders of magnitude greater than conventional DNA motors. The motors are made from DNA-coated spherical particles that hybridise to a surface modified with complementary RNA; motion is achieved through the addition of RNase H, which selectively hydrolyses hybridised RNA. Spherical motors move in a self-avoiding manner, whereas anisotropic particles, such as dimerised particles or rod-shaped particles, travel linearly without a track or external force. Finally, we demonstrate detection of a single-nucleotide polymorphism by measuring particle displacement using a smartphone camera. PMID:26619152
Profiling the miRNAs for Early Cancer Detection using DNA-based Logic Gates
Directory of Open Access Journals (Sweden)
Tahereh Yahya
2017-12-01
Full Text Available Abstract Background: DNA-based computing is an emerging research area that enables in-vivo computation and decision making with significant correctness. Recent papers show that the expression levels of miRNAs are related to the progression status of some diseases such as cancers, and DNA computing has been introduced as a low-cost and concise technique for detecting these biomarkers. In this paper, DNA-based logic gates are implemented in the laboratory to detect the level of miR-21 as a biomarker of cancer. Materials and Methods: First, the strands required for designing the DNA gates are synthesized. Then, the double-stranded gate is generated in the laboratory using a temperature gradient, followed by an electrophoresis process. This double strand is the computation engine for detecting the miR-21 biomarker; miR-21 serves as the input to the designed gate. Finally, the expression level of miR-21 is identified by measuring the generated fluorescence. Results: In the first stage, the proposed DNA-based logic gate is evaluated using synthesized input strands, and then it is tested on a tumor tissue. Experimental results on synthesized strands show that its detection quality/correctness is 2.5x better than conventional methods. Conclusion: Experimental results on the tumor tissues are successful and match those extracted from real-time PCR. The results also show that this method is significantly more suitable than real-time PCR in terms of time and cost.
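The detection principle, stripped of the chemistry, is that the gate's recognition strand is the reverse complement of the target miRNA and the reporter is triggered only on a match. A minimal sketch of that logic, with the input sequence shown as an illustrative stand-in rather than a verified miR-21 sequence:

```python
# Match-based YES-gate readout: the gate strand is the reverse complement
# of the target miRNA, and the reporter "fires" only on an exact match.
# The target sequence is an illustrative stand-in, not a verified miR-21 sequence.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna):
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def gate_fires(gate_strand, sample):
    return reverse_complement(sample) == gate_strand

target = "UAGCUUAUCAGACUGAUGUUGA"     # stand-in input sequence
gate = reverse_complement(target)     # recognition strand of the gate
```

In the wet-lab system, hybridization of the input to the gate displaces a quencher strand, and the fluorescence level, rather than a Boolean flag, reports the miRNA concentration.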
Corbett, Christianne; Hill, Catherine
2015-01-01
During the 2014 White House Science Fair, President Barack Obama used a sports metaphor to explain why we must address the shortage of women in science, technology, engineering, and mathematics (STEM), particularly in the engineering and computing fields: "Half our team, we're not even putting on the field. We've got to change those…
International Nuclear Information System (INIS)
Rajagopalan, S.; Jethra, A.; Khare, A.N.; Ghodgaonkar, M.D.; Srivenkateshan, R.; Menon, S.V.G.
1990-01-01
Issues relating to implementing iterative procedures, for the numerical solution of elliptic partial differential equations, on a distributed parallel computing system are discussed. Preliminary investigations show that a speed-up of about 3.85 is achievable on a four-transputer pipeline network. (author). 2 figs., 3 appendixes., 7 refs
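A representative iterative procedure of the kind pipelined across transputers is the Jacobi sweep for an elliptic problem: every interior update depends only on the previous iterate, so the grid can be partitioned across processors with only boundary exchanges. A minimal serial sketch:

```python
import numpy as np

# Jacobi sweeps for the 2-D Laplace equation: each interior update reads only
# the previous iterate, which is why the scheme maps naturally onto pipelined
# or distributed hardware.  Serial sketch on a 32x32 grid with one hot edge.
def jacobi_laplace(u, iters):
    u = u.copy()
    for _ in range(iters):
        # NumPy evaluates the right-hand side fully before assigning,
        # so this is a true Jacobi (not Gauss-Seidel) update.
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:])
    return u

n = 32
u0 = np.zeros((n, n))
u0[0, :] = 1.0                  # top edge held at 1, other edges at 0
u = jacobi_laplace(u0, 2000)
# By symmetry, the converged solution is ~0.25 at the domain centre.
```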
Gürbüz, Hasan; Evlioglu, Bengisu; Erol, Çigdem Selçukcan; Gülseçen, Hulusi; Gülseçen, Sevinç
2017-01-01
Computer-based games, as developments in information technology, seem to grow and spread rapidly. Use of these games by children and teenagers has increased. The presence of more beneficial and educational games, in contrast to violent and harmful games, is remarkable. Many scientific studies have indicated that useful (functional) games…
The role of guidance in computer-based problem solving for the development of concepts of logic
Eysink, Tessa H.S.; Dijkstra, S.; Kuper, Jan
The effect of two instructional variables, manipulation of objects and guidance, in learning to use the logical connective, conditional, was investigated. Instructions for 72 first- and second year social science students were varied in the computer-based learning environment Tarski’s World,
DNA-based technology helps people solve problems. It can be used to correctly match organ donors with recipients, identify victims of natural and man-made disasters, and detect bacteria and other organisms that may pollute air, soil, food, or water.
DNA-based asymmetric organometallic catalysis in water
Oelerich, Jens; Roelfes, Gerard
2013-01-01
Here, the first examples of DNA-based organometallic catalysis in water that give rise to high enantioselectivities are described. Copper complexes of strongly intercalating ligands were found to enable the asymmetric intramolecular cyclopropanation of alpha-diazo-beta-keto sulfones in water. Up to
Directory of Open Access Journals (Sweden)
Ichitaro Yamazaki
2015-01-01
of their low-rank properties. To compute a low-rank approximation of a dense matrix, in this paper, we study the performance of QR factorization with column pivoting or with restricted pivoting on multicore CPUs with a GPU. We first propose several techniques to reduce the postprocessing time, which is required for restricted pivoting, on a modern CPU. We then examine the potential of using a GPU to accelerate the factorization process with both column and restricted pivoting. Our performance results on two eight-core Intel Sandy Bridge CPUs with one NVIDIA Kepler GPU demonstrate that using the GPU, the factorization time can be reduced by a factor of more than two. In addition, to study the performance of our implementations in practice, we integrate them into a recently developed software StruMF which algebraically exploits such low-rank structures for solving a general sparse linear system of equations. Our performance results for solving Poisson's equations demonstrate that the proposed techniques can significantly reduce the preconditioner construction time of StruMF on the CPUs, and the construction time can be further reduced by 10%–50% using the GPU.
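A compact way to see what QR with column pivoting buys for low-rank approximation: greedy pivoting pushes the dominant columns to the front, so truncating after r steps yields a rank-r factorization. The sketch below is a textbook Businger-Golub-style Gram-Schmidt variant, not the blocked CPU/GPU kernel the paper studies.

```python
import numpy as np

def pivoted_qr(A):
    """Column-pivoted QR (Businger-Golub greedy pivoting, modified Gram-Schmidt)."""
    W = np.array(A, dtype=float)          # residual columns, orthogonalized in place
    m, n = W.shape
    r = min(m, n)
    Q, R, perm = np.zeros((m, r)), np.zeros((r, n)), np.arange(n)
    for k in range(r):
        # Pivot: bring the remaining column with largest residual norm to front.
        j = k + int(np.argmax(np.linalg.norm(W[:, k:], axis=0)))
        W[:, [k, j]], R[:, [k, j]], perm[[k, j]] = W[:, [j, k]], R[:, [j, k]], perm[[j, k]]
        R[k, k] = np.linalg.norm(W[:, k])
        if R[k, k] < 1e-12:               # numerical rank reached: stop early
            break
        Q[:, k] = W[:, k] / R[k, k]
        R[k, k + 1:] = Q[:, k] @ W[:, k + 1:]
        W[:, k + 1:] -= np.outer(Q[:, k], R[k, k + 1:])
    return Q, R, perm                     # A[:, perm] ~= Q @ R

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))   # rank-2 matrix
Q, R, perm = pivoted_qr(A)    # R[2, 2] collapses to ~0, exposing the rank
```

Production codes (e.g. LAPACK's dgeqp3) use Householder reflections and blocked updates for exactly the memory-traffic reasons the abstract discusses.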
A universal DNA-based protein detection system.
Tran, Thua N N; Cui, Jinhui; Hartman, Mark R; Peng, Songming; Funabashi, Hisakage; Duan, Faping; Yang, Dayong; March, John C; Lis, John T; Cui, Haixin; Luo, Dan
2013-09-25
Protein immune detection requires secondary antibodies which must be carefully selected in order to avoid interspecies cross-reactivity, and is therefore restricted by the limited availability of primary/secondary antibody pairs. Here we present a versatile DNA-based protein detection system using a universal adapter to interface between IgG antibodies and DNA-modified reporter molecules. As a demonstration of this capability, we successfully used DNA nano-barcodes, quantum dots, and horseradish peroxidase enzyme to detect multiple proteins using our DNA-based labeling system. Our system not only eliminates secondary antibodies but also serves as a novel method platform for protein detection with modularity, high capacity, and multiplexed capability.
Controlling charge current through a DNA based molecular transistor
Energy Technology Data Exchange (ETDEWEB)
Behnia, S., E-mail: s.behnia@sci.uut.ac.ir; Fathizadeh, S.; Ziaei, J.
2017-01-05
Molecular electronics is complementary to silicon-based electronics and may induce electronic functions which are difficult to obtain with conventional technology. We have considered a DNA based molecular transistor and studied its transport properties. The appropriate DNA sequence as a central chain in the molecular transistor and the functional interval for applied voltages are obtained. The I–V characteristic diagram shows the rectifier behavior as well as the negative differential resistance phenomenon of the DNA transistor. We have observed nearly periodic behavior in the current flowing through the DNA. It is reported that there is a critical gate voltage for each applied bias, above which the electrical current is always positive. - Highlights: • Modeling a DNA based molecular transistor and studying its transport properties. • Choosing the appropriate DNA sequence using quantum chaos tools. • Choosing the functional interval for voltages via the inverse participation ratio tool. • Detecting the rectifier and negative differential resistance behavior of DNA.
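Given sampled I–V data, negative-differential-resistance windows of the kind reported for the DNA transistor can be located by scanning for intervals where the current falls as the voltage rises. A minimal sketch on synthetic, illustrative data:

```python
# Scan sampled I-V data for negative-differential-resistance (NDR) windows,
# i.e. intervals where current drops while voltage rises.  The data below
# are synthetic, for illustration only.

def ndr_regions(volts, currents):
    return [(volts[i], volts[i + 1])
            for i in range(len(volts) - 1)
            if volts[i + 1] > volts[i] and currents[i + 1] < currents[i]]

V = [0.0, 0.5, 1.0, 1.5, 2.0]
I = [0.0, 0.8, 1.5, 0.9, 1.7]     # current dips between 1.0 V and 1.5 V
```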
Trial watch: Naked and vectored DNA-based anticancer vaccines.
Bloy, Norma; Buqué, Aitziber; Aranda, Fernando; Castoldi, Francesca; Eggermont, Alexander; Cremer, Isabelle; Sautès-Fridman, Catherine; Fucikova, Jitka; Galon, Jérôme; Spisek, Radek; Tartour, Eric; Zitvogel, Laurence; Kroemer, Guido; Galluzzi, Lorenzo
2015-05-01
One type of anticancer vaccine relies on the administration of DNA constructs encoding one or multiple tumor-associated antigens (TAAs). The ultimate objective of these preparations, which can be naked or vectored by non-pathogenic viruses, bacteria or yeast cells, is to drive the synthesis of TAAs in the context of an immunostimulatory milieu, resulting in the (re-)elicitation of a tumor-targeting immune response. In spite of encouraging preclinical results, the clinical efficacy of DNA-based vaccines employed as standalone immunotherapeutic interventions in cancer patients appears to be limited. Thus, efforts are currently being devoted to the development of combinatorial regimens that allow DNA-based anticancer vaccines to elicit clinically relevant immune responses. Here, we discuss recent advances in the preclinical and clinical development of this therapeutic paradigm.
DNA-based random number generation in security circuitry.
Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C
2010-06-01
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
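One of the NIST randomness tests that generated sequences must pass, the frequency (monobit) test, is simple enough to sketch in full: the normalized bit-sum is folded through the complementary error function to a p-value, with p ≥ 0.01 conventionally taken as a pass.

```python
import math

# NIST SP 800-22 frequency (monobit) test: S_n is the +/-1 sum of the bits and
# the p-value is erfc(|S_n| / sqrt(2*n)); sequences with p >= 0.01 pass.
def monobit_p_value(bits):
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2.0 * len(bits)))
```

The other SP 800-22 tests (runs, block frequency, and so on) probe structure that a balanced but patterned sequence, such as strict alternation, would otherwise hide from this one.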
Directory of Open Access Journals (Sweden)
V. P. Shapeev
2014-01-01
Full Text Available The method of collocations and least residuals (CLR), which was proposed previously for the numerical solution of the two-dimensional Navier–Stokes equations governing stationary flows of a viscous incompressible fluid, is extended here to the three-dimensional case. In the implemented version of the method, the solution is sought in the form of an expansion in basis solenoidal functions. At all stages of the construction of the CLR method, a computer algebra system (CAS) is applied for the derivation and verification of the formulas of the method and for their translation into arithmetic operators of the Fortran language. For accelerating the convergence of iterations, a sufficiently universal algorithm is proposed, which is simple to implement and is based on the use of Krylov subspaces. The obtained computational formulas of the CLR method were verified on the exact analytic solution of a test problem. Comparisons with published numerical results for the benchmark problem of the 3D driven cubic cavity flow show that the accuracy of the results obtained by the CLR method corresponds to the known high-accuracy solutions.
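The Krylov acceleration idea can be sketched compactly: given a current iterate, minimize the residual over the subspace spanned by r, Ar, ..., A^(m-1)r. The version below uses a dense least-squares solve for clarity rather than the Arnoldi recurrences a production GMRES would use; the test matrix is synthetic.

```python
import numpy as np

# Residual minimization over the Krylov subspace span{r, A r, ..., A^(m-1) r}:
# a compact, dense-least-squares stand-in for GMRES-style acceleration
# (production codes build an orthonormal basis with Arnoldi instead).
def krylov_correct(A, x, b, m):
    r = b - A @ x
    K = np.column_stack([np.linalg.matrix_power(A, j) @ r for j in range(m)])
    y, *_ = np.linalg.lstsq(A @ K, r, rcond=None)   # min ||r - A K y||
    return x + K @ y

rng = np.random.default_rng(1)
M = rng.standard_normal((8, 8))
A = 0.1 * (M @ M.T) + np.eye(8)   # well-conditioned SPD test matrix
b = rng.standard_normal(8)
x = krylov_correct(A, np.zeros(8), b, m=8)   # full Krylov space: near-exact solve
```

Because the subspaces are nested, enlarging m can only decrease the minimized residual, which is the monotonicity that makes such corrections a safe wrapper around a stalled fixed-point iteration.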
Magnetophoresis of flexible DNA-based dumbbell structures
Babić, B.; Ghai, R.; Dimitrov, K.
2008-02-01
Controlled movement and manipulation of magnetic micro- and nanostructures using magnetic forces can give rise to important applications in biomedicine, diagnostics, and immunology. We report controlled magnetophoresis and stretching, in aqueous solution, of a DNA-based dumbbell structure containing magnetic and diamagnetic microspheres. The velocity and stretching of the dumbbell were experimentally measured and correlated with a theoretical model based on the forces acting on individual magnetic beads or the entire dumbbell structure. The results show that precise and predictable manipulation of dumbbell structures is achievable and can potentially be applied to immunomagnetic cell separators.
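The force balance behind such a model is straightforward to sketch: the magnetic force on a bead, (V Δχ / μ0) B (dB/dx), is equated to the Stokes drag 6πηrv to obtain a terminal velocity. All parameter values below are hypothetical, chosen only for illustration.

```python
import math

# Terminal-velocity sketch for magnetophoresis: magnetic force
# F = V * dchi * B * (dB/dx) / mu0 on the bead volume, balanced against
# Stokes drag 6*pi*eta*r*v.  All parameter values are hypothetical.
MU0 = 4.0e-7 * math.pi   # vacuum permeability (T*m/A)

def bead_velocity(radius, dchi, B, dBdx, eta=1.0e-3):
    volume = (4.0 / 3.0) * math.pi * radius**3
    f_mag = volume * dchi * B * dBdx / MU0
    return f_mag / (6.0 * math.pi * eta * radius)

v = bead_velocity(radius=0.7e-6, dchi=0.2, B=0.1, dBdx=100.0)   # order 1e-4 m/s
```

The velocity scales linearly with the field gradient and with the bead's cross-sectional area, which is why larger beads dominate the motion of a mixed dumbbell.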
The current state of eukaryotic DNA base damage and repair.
Bauer, Nicholas C; Corbett, Anita H; Doetsch, Paul W
2015-12-02
DNA damage is a natural hazard of life. The most common DNA lesions are base, sugar, and single-strand break damage resulting from oxidation, alkylation, deamination, and spontaneous hydrolysis. If left unrepaired, such lesions can become fixed in the genome as permanent mutations. Thus, evolution has led to the creation of several highly conserved, partially redundant pathways to repair or mitigate the effects of DNA base damage. The biochemical mechanisms of these pathways have been well characterized and the impact of this work was recently highlighted by the selection of Tomas Lindahl, Aziz Sancar and Paul Modrich as the recipients of the 2015 Nobel Prize in Chemistry for their seminal work in defining DNA repair pathways. However, how these repair pathways are regulated and interconnected is still being elucidated. This review focuses on the classical base excision repair and strand incision pathways in eukaryotes, considering both Saccharomyces cerevisiae and humans, and extends to some important questions and challenges facing the field of DNA base damage repair. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Problem Solving with General Semantics.
Hewson, David
1996-01-01
Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)
Angeli, Charoula
2013-01-01
An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…
Aviram–Ratner rectifying mechanism for DNA base-pair sequencing through graphene nanogaps
International Nuclear Information System (INIS)
Agapito, Luis A; Gayles, Jacob; Wolowiec, Christian; Kioussis, Nicholas
2012-01-01
We demonstrate that biological molecules such as Watson–Crick DNA base pairs can behave as biological Aviram–Ratner electrical rectifiers because of the spatial separation and weak hydrogen bonding between the nucleobases. We have performed a parallel computational implementation of the ab initio non-equilibrium Green’s function (NEGF) theory to determine the electrical response of graphene—base-pair—graphene junctions. The results show an asymmetric (rectifying) current–voltage response for the cytosine–guanine base pair adsorbed on a graphene nanogap. In sharp contrast we find a symmetric response for the thymine–adenine case. We propose applying the asymmetry of the current–voltage response as a sensing criterion to the technological challenge of rapid DNA sequencing via graphene nanogaps. (paper)
Quasiparticle properties of DNA bases from GW calculations in a Wannier basis
Qian, Xiaofeng; Marzari, Nicola; Umari, Paolo
2009-03-01
The quasiparticle GW-Wannier (GWW) approach [1] has recently been developed to overcome the size limitations of conventional planewave GW calculations. By taking advantage of the localization properties of maximally-localized Wannier functions and choosing a small polarization basis set, we reduce the number of Bloch wavefunction products required for the evaluation of dynamical polarizabilities, which in turn greatly reduces memory requirements and computational cost. We apply GWW to study quasiparticle properties of different DNA bases and base pairs, and solvation effects on the energy gap, demonstrating in the process the key advantages of this approach. [1] P. Umari, G. Stenuit, and S. Baroni, cond-mat/0811.1453
A Rewritable, Random-Access DNA-Based Storage System.
Yazdi, S M Hossein Tabatabaei; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica
2015-09-18
We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.
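The random-access property described above rests on prefixing every data block with a unique address sequence that selection primers can target, so one block can be read without decoding the whole archive. A minimal sketch under assumed conventions (a 2-bit-per-base encoding, 8-nt addresses, exact prefix matching standing in for PCR selection; none of these details are the authors' actual design):

```python
# Hypothetical sketch of address-based random access in a DNA store.
# Each strand is [address | payload]; the address plays the role of a
# PCR primer-binding site. Encoding and address length are invented.

B2N = {"00": "A", "01": "C", "10": "G", "11": "T"}
N2B = {v: k for k, v in B2N.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(B2N[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(N2B[n] for n in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

ADDR_LEN = 8  # nt of address prefix per block (assumed)

def store(blocks: dict[str, bytes]) -> list[str]:
    return [addr + encode(data) for addr, data in blocks.items()]

def random_access(pool: list[str], addr: str) -> bytes:
    # "PCR selection": pull only the strand whose prefix matches.
    for strand in pool:
        if strand.startswith(addr):
            return decode(strand[ADDR_LEN:])
    raise KeyError(addr)

pool = store({"ACGTACGT": b"UIUC", "TTGGCCAA": b"MIT"})
print(random_access(pool, "ACGTACGT"))  # b'UIUC'
```

Rewriting a block then amounts to replacing the payload behind a given address, which is what the paper's DNA editing methods achieve biochemically.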
Application of DNA-based methods in forensic entomology.
Wells, Jeffrey D; Stevens, Jamie R
2008-01-01
A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.
Oxidative DNA base modifications as factors in carcinogenesis
International Nuclear Information System (INIS)
Olinski, R.; Jaruga, P.; Zastawny, T.H.
1998-01-01
Reactive oxygen species can cause extensive DNA modifications including modified bases. Some of this DNA base damage has been found to possess premutagenic properties. Therefore, if not repaired, it can contribute to carcinogenesis. We have found elevated amounts of modified bases in cancerous and precancerous tissues as compared with normal tissues. Most of the agents used in anticancer therapy are paradoxically responsible for induction of secondary malignancies, and some of them may generate free radicals. The results of our experiments provide evidence that exposure of cancer patients to therapeutic doses of ionizing radiation and anticancer drugs causes base modifications in the genomic DNA of lymphocytes. Some of this base damage could lead to mutagenesis in critical genes and ultimately to secondary cancers such as leukemias. This may point to an important role of oxidative base damage in cancer initiation. Alternatively, the increased level of the modified base products may contribute to genetic instability and metastatic potential of tumor cells. (author)
Ultraviolet enhancement of DNA base release by bleomycin
International Nuclear Information System (INIS)
Kakinuma, J.; Tanabe, M.; Orii, H.
1984-01-01
The effect of UV irradiation on the base-releasing activity of bleomycin was studied in a bleomycin A2-DNA reaction mixture in the presence of Fe(II) and 2-mercaptoethanol. This effect was measured by the release of free bases from calf thymus DNA with high-performance liquid chromatography. UV irradiation enhanced the DNA base-releasing activity of bleomycin and simultaneously caused disappearance of the fluorescence emission maximum at 355 nm assigned to bithiazole rings and an increase in the intensity of a peak at 400 nm. UV irradiation at 295 nm, the UV absorption maximum of bleomycin, is the most effective in releasing free bases and in changing fluorescence emission patterns. From these results, we suggest that some alterations in the bithiazole group of the bleomycin molecule were initiated by UV irradiation and contributed to increased base-releasing activity of bleomycin through a yet unexplained mechanism, presumably through bleomycin dimer formation. (orig.)
Abele, Stephan
2018-01-01
This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…
Energy Technology Data Exchange (ETDEWEB)
Llaurado, J. G. [Biomedical Engineering Group, Marquette University (United States); Marquette School of Medicine, Milwaukee (United States); Nuclear Medicine Service of Veterans Administration Center, Wood, WI (United States)
1971-02-15
A method commonly used for the study of the distribution of a substance among the different spaces of a biological tissue is the continuous washout (outflow) and isotope counting of fragments of tissue previously incubated with a tracer. A first-order-kinetics compartmental system can be postulated and characterized by the transport rates (k) at which the substance of interest moves across its different compartments. Direct solution from the outflow data requires knowledge of the initial conditions in, or measurement access to, each compartment; this cannot be fulfilled in most biological problems. In the course of studying ²²Na distribution in segments of arteries a digital computer simulation approach was developed to solve the system. In the belief that the approach transcends this particular application, its mathematical basis is herein presented: the movement of radioactive tracer obeys

d|q|/dt = -|k||q| + |r|   (1)

where |q| is a vector of response functions for each compartment, |k| is a square matrix of transport rate constants and |r| is a vector of input rates to the system. The solution of Eq. (1) is

|q| = e^(-|k|t) ∫₀^t e^(|k|t) |r| dt + e^(-|k|t) |q₀|   (2)

(i) For an inflow experiment, with zero initial conditions and a constant unit input rate |r_u|,

|q| = (|I| - e^(-|k|t)) |k|⁻¹ |r_u|   (3)

As t → ∞, |q_∞| = |k|⁻¹|r_u|, which, substituted into Eq. (3), gives

|q| = |q_∞| - e^(-|k|t) |q_∞|   (4)

(ii) For an outflow experiment |r| = 0 and Eq. (2) becomes |q| = e^(-|k|t) |q₀|.
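In the outflow case the solution reduces to q(t) = exp(-|k|t) q₀, which is straightforward to evaluate numerically. A sketch with an invented two-compartment rate matrix (the values are illustrative only, not fitted to any arterial data):

```python
# Numerical sketch of the outflow solution q(t) = exp(-K t) q0 for a
# two-compartment first-order system, following the sign convention of
# Eq. (1). Rate constants are invented for illustration.
import numpy as np

K = np.array([[0.5, -0.1],
              [-0.2, 0.3]])       # transport-rate matrix |k|
q0 = np.array([1.0, 2.0])          # initial tracer content per compartment

def q(t: float) -> np.ndarray:
    # exp(-K t) via eigendecomposition (K assumed diagonalizable)
    w, V = np.linalg.eig(-K)
    return (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V)) @ q0

# Total tracer decreases monotonically during washout (|r| = 0).
print([float(q(t).sum()) for t in (0.0, 1.0, 5.0)])
```

Fitting the simulated outflow curve to measured counts is what recovers the unknown rate constants without direct access to each compartment.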
Interactive problem solving using LOGO
Boecker, Heinz-Dieter; Fischer, Gerhard
2014-01-01
This book is unique in that its stress is not on the mastery of a programming language, but on the importance and value of interactive problem solving. The authors focus on several specific interest worlds: mathematics, computer science, artificial intelligence, linguistics, and games; however, their approach can serve as a model that may be applied easily to other fields as well. Those who are interested in symbolic computing will find that Interactive Problem Solving Using LOGO provides a gentle introduction from which one may move on to other, more advanced computational frameworks or more
Inference rule and problem solving
Energy Technology Data Exchange (ETDEWEB)
Goto, S
1982-04-01
Intelligent information processing offers the possibility of having man's intellectual activity executed on the computer, in which inference, in place of ordinary calculation, is used as the basic operational mechanism for information processing. Many inference rules are derived from syllogisms in formal logic. The problem of programming this inference function is referred to as problem solving. Although inference and problem solving are logically in close relation, the calculation ability of current computers is at too low a level for inference. For clarifying the relation between inference and computers, nonmonotonic logic has been considered. The paper deals with the above topics. 16 references.
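Inference as the basic operational mechanism can be illustrated with a toy forward-chaining problem solver that applies modus ponens to Horn-clause rules until no new facts emerge; the knowledge base below is invented for illustration:

```python
# Minimal forward-chaining inference: rules are Horn clauses
# (premises -> conclusion), facts are atoms represented as strings.

def forward_chain(facts: set, rules: list) -> set:
    derived = set(facts)
    changed = True
    while changed:  # apply modus ponens until a fixed point is reached
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

rules = [({"human(socrates)"}, "mortal(socrates)"),
         ({"mortal(socrates)"}, "has_end(socrates)")]
result = forward_chain({"human(socrates)"}, rules)
print(sorted(result))
```

The classical syllogism chains through both rules; logic-programming languages of the fifth-generation project generalized exactly this mechanism with unification over terms.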
A DNA-based semantic fusion model for remote sensing data.
Directory of Open Access Journals (Sweden)
Heng Sun
Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, an ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontologies to collect and organize the semantic information of data in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel, bit-wise manner, and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved: the cluster-based multi-process program reduces the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the result file recording the DNA sequences is 54.51 GB for the parallel C program, compared with 57.89 GB for the sequential Perl program, showing that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. The process depends solely on ligation reactions and screening operations instead of the ontology.
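The bit-to-base conversion step can be sketched as follows; the 0→A / 1→T mapping and the chunk size are assumptions for illustration, and the paper's cluster program distributes such chunks across worker processes rather than looping over them:

```python
# Sketch of the bit-wise conversion described above: the input is read
# bit by bit and each bit becomes one DNA base. Mapping is assumed.

def bits_to_dna(data: bytes) -> str:
    bit_map = {"0": "A", "1": "T"}  # assumed 0->A, 1->T encoding
    return "".join(bit_map[b] for byte in data for b in f"{byte:08b}")

def parallel_convert(data: bytes, chunk: int = 1024) -> str:
    # Split at byte boundaries into independent chunks; each chunk
    # could be handed to a separate process, as in the cluster program.
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    return "".join(bits_to_dna(p) for p in pieces)

print(parallel_convert(b"\x0f"))  # AAAATTTT
```

Because each chunk's output depends only on its own bytes, the conversion is embarrassingly parallel, which is what makes the reported 81,536 s → 4,937 s speedup possible.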
A DNA-based semantic fusion model for remote sensing data.
Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H
2013-01-01
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a key-generation principle applied to stream ciphering that offers total privacy. The OTP encryption scheme has been proved unbreakable in theory but is difficult to realize in practical applications, because OTP encryption requires absolute randomness of the key, which places severe constraints on its deployment. DNA cryptography is a new and promising technology in the field of information security. DNA's storage capability can be used to build one-time-pad structures, with pseudo-random number generation and indexing used to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to sender and receiver to combine the secure key, represented by a DNA sequence, with the T vector, we generate a DNA bio-hidden secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio-experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
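Once transmitted, the DNA-carried key functions as a classical one-time pad. A minimal sketch, assuming a 2-bit-per-base reading of the key sequence (the key sequence and encoding are invented; the paper's key extraction is biochemical):

```python
# One-time pad over a key derived from a DNA sequence.
# A=00, C=01, G=10, T=11 is an assumed encoding, 4 bases per key byte.

BASE_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def dna_key_to_bytes(seq: str) -> bytes:
    assert len(seq) % 4 == 0
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASE_BITS[base]
        out.append(byte)
    return bytes(out)

def otp(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data)  # the pad must cover the whole message
    return bytes(d ^ k for d, k in zip(data, key))

key = dna_key_to_bytes("ACGTTGCACCGGTTAA")  # 16 nt -> 4 key bytes
ct = otp(b"HI!!", key)
assert otp(ct, key) == b"HI!!"  # XOR is its own inverse
```

The unbreakability claim holds only under the classical OTP conditions: the key is truly random, at least as long as the message, and never reused.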
Electron accommodation dynamics in the DNA base thymine
King, Sarah B.; Stephansen, Anne B.; Yokoi, Yuki; Yandell, Margaret A.; Kunin, Alice; Takayanagi, Toshiyuki; Neumark, Daniel M.
2015-07-01
The dynamics of electron attachment to the DNA base thymine are investigated using femtosecond time-resolved photoelectron imaging of the gas phase iodide-thymine (I-T) complex. An ultraviolet pump pulse ejects an electron from the iodide and prepares an iodine-thymine temporary negative ion that is photodetached with a near-IR probe pulse. The resulting photoelectrons are analyzed with velocity-map imaging. At excitation energies ranging from -120 meV to +90 meV with respect to the vertical detachment energy (VDE) of 4.05 eV for I-T, both the dipole-bound and valence-bound negative ions of thymine are observed. A slightly longer rise time for the valence-bound state than the dipole-bound state suggests that some of the dipole-bound anions convert to valence-bound species. No evidence is seen for a dipole-bound anion of thymine at higher excitation energies, in the range of 0.6 eV above the I-T VDE, which suggests that if the dipole-bound anion acts as a "doorway" to the valence-bound anion, it only does so at excitation energies near the VDE of the complex.
Electron accommodation dynamics in the DNA base thymine
Energy Technology Data Exchange (ETDEWEB)
King, Sarah B.; Yandell, Margaret A.; Kunin, Alice [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Stephansen, Anne B. [Department of Chemistry, University of Copenhagen, Universitetsparken 5, DK-2100 København Ø (Denmark); Yokoi, Yuki; Takayanagi, Toshiyuki [Department of Chemistry, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Saitama 338-8570 (Japan); Neumark, Daniel M., E-mail: dneumark@berkeley.edu [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Chemical Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States)
2015-07-14
International Nuclear Information System (INIS)
Murfi, Hendri; Basaruddin, T.
2001-01-01
The interior point method for linear programming has attracted extraordinary interest as an alternative to the simplex method since Karmarkar presented a polynomial-time interior-point algorithm for linear programming. In implementations of this method, two factors heavily affect performance: the data structure and the method used to solve the linear equation system within the algorithm. This paper describes solving the linear equation system in a variant of the algorithm called the dual-affine scaling algorithm. We then experimentally evaluate several such methods, both direct and iterative; the experimental evaluation used Matlab.
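The linear-algebra kernel in question is the symmetric positive definite system solved at every dual-affine scaling step (of the form A D² Aᵀ y = b). A small sketch contrasting the two families evaluated, direct versus iterative, on invented data:

```python
# Direct vs. iterative solve of the SPD system M y = b, M = A D^2 A^T,
# as arises in each dual-affine scaling iteration. Data are random.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
D2 = np.diag(rng.uniform(0.5, 2.0, 6))  # squared scaling matrix
M = A @ D2 @ A.T                         # SPD normal-equations matrix
b = rng.standard_normal(3)

y_direct = np.linalg.solve(M, b)         # direct method

def conjugate_gradient(M, b, tol=1e-10, max_iter=100):
    # Iterative method suited to large sparse M; no factorization needed.
    y = np.zeros_like(b)
    r = b - M @ y
    p = r.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) <= tol:
            break
        alpha = (r @ r) / (p @ M @ p)
        y = y + alpha * p
        r_new = r - alpha * (M @ p)
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return y

y_iter = conjugate_gradient(M, b)
assert np.allclose(y_direct, y_iter, atol=1e-6)
```

Direct factorizations pay once per sparsity pattern but cost memory; iterative methods trade that for matrix-vector products, which is exactly the data-structure trade-off the abstract points at.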
Solvent effects on hydrogen bonds in Watson-Crick, mismatched, and modified DNA base pairs
Poater, Jordi; Swart, Marcel; Guerra, Celia Fonseca; Bickelhaupt, F. Matthias
2012-01-01
We have theoretically analyzed a complete series of Watson–Crick and mismatched DNA base pairs, both in gas phase and in solution. Solvation causes a weakening and lengthening of the hydrogen bonds between the DNA bases because of the stabilization of the lone pairs involved in these bonds. We have
DNA-based species detection capabilities using laser transmission spectroscopy.
Mahon, A R; Barnes, M A; Li, F; Egan, S P; Tanner, C E; Ruggiero, S T; Feder, J L; Lodge, D M
2013-01-06
Early detection of invasive species is critical for effective biocontrol to mitigate potential ecological and economic damage. Laser transmission spectroscopy (LTS) is a powerful solution offering real-time, DNA-based species detection in the field. LTS can measure the size, shape and number of nanoparticles in a solution and was used here to detect size shifts resulting from hybridization of the polymerase chain reaction product to nanoparticles functionalized with species-specific oligonucleotide probes or with the species-specific oligonucleotide probes alone. We carried out a series of DNA detection experiments using the invasive freshwater quagga mussel (Dreissena bugensis) to evaluate the capability of the LTS platform for invasive species detection. Specifically, we tested LTS sensitivity to (i) DNA concentrations of a single target species, (ii) the presence of a target species within a mixed sample of other closely related species, (iii) species-specific functionalized nanoparticles versus species-specific oligonucleotide probes alone, and (iv) amplified DNA fragments versus unamplified genomic DNA. We demonstrate that LTS is a highly sensitive technique for rapid target species detection, with detection limits in the picomolar range, capable of successful identification in multispecies samples containing target and non-target species DNA. These results indicate that the LTS DNA detection platform will be useful for field application of target species. Additionally, we find that LTS detection is effective with species-specific oligonucleotide tags alone or when they are attached to polystyrene nanobeads and with both amplified and unamplified DNA, indicating that the technique may also have versatility for broader applications.
Parallel Algorithm Solves Coupled Differential Equations
Hayashi, A.
1987-01-01
Numerical methods adapted to concurrent processing. The algorithm solves a set of coupled partial differential equations by numerical integration. Adapted to run on a hypercube computer, the algorithm separates the problem into smaller problems solved concurrently. The increase in computing speed with concurrent processing over that achievable with conventional sequential processing is appreciable, especially for large problems.
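The decomposition idea can be sketched on a 1-D diffusion problem: the grid is split into chunks whose explicit-Euler updates within a step are independent, so on a hypercube each chunk would be assigned to its own node (here the chunks are simply updated in a loop; grid size and coefficients are invented):

```python
# Domain-decomposition sketch for concurrent PDE integration:
# explicit Euler on du/dt = d2u/dx2, grid split into 4 chunks.
import numpy as np

u = np.zeros(64)
u[32] = 1.0                              # initial temperature spike
dt, dx = 0.1, 1.0                        # dt/dx^2 = 0.1 < 0.5: stable
chunks = [(0, 16), (16, 32), (32, 48), (48, 64)]

def step(u):
    lap = np.zeros_like(u)
    lap[1:-1] = u[:-2] - 2 * u[1:-1] + u[2:]
    new = u.copy()
    for lo, hi in chunks:                # each chunk is independent work
        new[lo:hi] = u[lo:hi] + dt / dx**2 * lap[lo:hi]
    return new

for _ in range(100):
    u = step(u)
print(u.sum(), u[32])                    # heat conserved, spike diffused
```

Each chunk needs only one neighboring value from each adjacent chunk per step, so communication between nodes is small relative to computation, which is why the concurrent speedup grows with problem size.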
Solving applied mathematical problems with Matlab
Xue, Dingyu
2008-01-01
Computer Mathematics Language-An Overview. Fundamentals of MATLAB Programming. Calculus Problems. MATLAB Computations of Linear Algebra Problems. Integral Transforms and Complex Variable Functions. Solutions to Nonlinear Equations and Optimization Problems. MATLAB Solutions to Differential Equation Problems. Solving Interpolations and Approximations Problems. Solving Probability and Mathematical Statistics Problems. Nontraditional Solution Methods for Mathematical Problems.
Brovarets', O O
2013-01-01
At the MP2/6-311++G(2df,pd)//B3LYP/6-311++G(d,p) level of theory it was established for the first time that the Löwdin G*.C* DNA base pair formed by the mutagenic tautomers can acquire, like the A.T Watson-Crick DNA base pair, four biologically important configurations, namely: Watson-Crick, reverse Watson-Crick, Hoogsteen and reverse Hoogsteen. This fact demonstrates a rather unexpected role of the tautomerisation of one of the Watson-Crick DNA base pairs, in particular via double proton transfer: exactly the G.C-->G*.C* tautomerisation allows the steric hindrances to the implementation of the above-mentioned configurations to be overcome. Geometric, electron-topological and energetic properties of the H-bonds that stabilise the studied pairs, as well as the energetic characteristics of the latter, are presented.
Grandey, Robert C.
The development of computer-assisted instructional lessons on the following three topics is discussed: 1) the mole concept and chemical formulas, 2) concentration of solutions and quantities from chemical equations, and 3) balancing equations for oxidation-reduction reactions. Emphasis was placed on developing computer routines which interpret…
DNA-based construction at the nanoscale: emerging trends and applications
Lourdu Xavier, P.; Chandrasekaran, Arun Richard
2018-02-01
The field of structural DNA nanotechnology has evolved remarkably—from the creation of artificial immobile junctions to the recent DNA-protein hybrid nanoscale shapes—in a span of about 35 years. It is now possible to create complex DNA-based nanoscale shapes and large hierarchical assemblies with greater stability and predictability, thanks to the development of computational tools and advances in experimental techniques. Although it started with the original goal of DNA-assisted structure determination of difficult-to-crystallize molecules, DNA nanotechnology has found its applications in a myriad of fields. In this review, we cover some of the basic and emerging assembly principles: hybridization, base stacking/shape complementarity, and protein-mediated formation of nanoscale structures. We also review various applications of DNA nanostructures, with special emphasis on some of the biophysical applications that have been reported in recent years. In the outlook, we discuss further improvements in the assembly of such structures, and explore possible future applications involving super-resolved fluorescence, single-particle cryo-electron (cryo-EM) and x-ray free electron laser (XFEL) nanoscopic imaging techniques, and in creating new synergistic designer materials.
The essential component in DNA-based information storage system: robust error-tolerating module
Directory of Open Access Journals (Sweden)
Aldrin Kay-Yuen Yim
2014-11-01
The size of digital data is ever increasing and is expected to grow to 40,000 EB by 2020, yet the estimated global information storage capacity in 2011 was less than 300 EB, indicating that most data are transient. DNA, as a very stable nano-molecule, is an ideal massive storage device for long-term data archiving. The two most notable illustrations are from Church et al. and Goldman et al., whose approaches are well-optimized for most sequencing platforms – short synthesized DNA fragments without homopolymers. Here we suggest improvements in error-handling methodology that could enable the integration of DNA-based computational processes, e.g. algorithms based on self-assembly of DNA. As a proof of concept, a picture of 438 bytes was encoded to DNA with a Low-Density Parity-Check error-correction code. We salvaged a significant portion of sequencing reads carrying mutations generated during DNA synthesis and sequencing and successfully reconstructed the entire picture. A modular programming framework, DNAcodec, with an XML-based data format is also introduced. Our experiments demonstrate the practicability of long DNA message recovery with high error tolerance, which opens the field to biocomputing and synthetic biology.
Huang, C. J.; Motard, R. L.
1978-01-01
The computing equipment in the engineering systems simulation laboratory of the Houston University Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research psychology and in chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. Research supports programs in acoustics, energy technology, systems engineering, and environment management as well as aerospace engineering.
International Nuclear Information System (INIS)
Swarts, S.G.; Smith, G.S.; Miao, L.; Wheeler, K.T.
1996-01-01
Gas chromatography/mass spectrometry (GC/MS-SIM) is an excellent technique for performing both qualitative and quantitative analysis of DNA base damage products that are formed by exposure to ionizing radiation or by the interaction of intracellular DNA with activated oxygen species. This technique commonly uses a hot formic acid hydrolysis step to degrade the DNA to individual free bases. However, due to the harsh nature of this degradation procedure, the quantitation of DNA base damage products may be adversely affected. Consequently, we examined the effects of various formic acid hydrolysis procedures on the quantitation of a number of DNA base damage products and identified several factors that can influence this quantitation. These factors included (1) the inherent acid stabilities of both the lesions and the internal standards; (2) the hydrolysis temperature; (3) the source and grade of the formic acid; and (4) the sample mass during hydrolysis. Our data also suggested that the N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) derivatization efficiency can be adversely affected, presumably by trace contaminants either in the formic acid or from the acid-activated surface of the glass derivatization vials. Where adverse effects were noted, modifications were explored in an attempt to improve the quantitation of these DNA lesions. Although experimental steps could be taken to minimize the influence of these factors on the quantitation of some base damage products, no single procedure solved the quantitation problem for all base lesions. However, a significant improvement in the quantitation was achieved if the relative molecular response factor (RMRF) values for these lesions were generated with authentic DNA base damage products that had been treated exactly like the experimental samples. (orig.)
Problem solving and inference mechanisms
Energy Technology Data Exchange (ETDEWEB)
Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A
1982-01-01
The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.
Intelligent DNA-based molecular diagnostics using linked genetic markers
Energy Technology Data Exchange (ETDEWEB)
Pathak, D.K.; Perlin, M.W.; Hoffman, E.P.
1994-12-31
This paper describes a knowledge-based system for molecular diagnostics, and its application to fully automated diagnosis of X-linked genetic disorders. Molecular diagnostic information is used in clinical practice for determining genetic risks, such as carrier determination and prenatal diagnosis. Initially, blood samples are obtained from related individuals, and PCR amplification is performed. Linkage-based molecular diagnosis then entails three data analysis steps. First, for every individual, the alleles (i.e., DNA composition) are determined at specified chromosomal locations. Second, the flow of genetic material among the individuals is established. Third, the probability that a given individual is either a carrier of the disease or affected by the disease is determined. The current practice is to perform each of these three steps manually, which is costly, time consuming, labor-intensive, and error-prone. As such, the knowledge-intensive data analysis and interpretation supersede the actual experimentation effort as the major bottleneck in molecular diagnostics. By examining human problem solving for the task, we have designed and implemented a prototype knowledge-based system capable of fully automating linkage-based molecular diagnostics in X-linked genetic disorders, including Duchenne Muscular Dystrophy (DMD). Our system uses knowledge-based interpretation of gel electrophoresis images to determine individual DNA marker labels, a constraint satisfaction search for consistent genetic flow among individuals, and a blackboard-style problem solver for risk assessment. We describe the system's successful diagnosis of DMD carrier and affected individuals from raw clinical data.
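The third analysis step, risk determination, is at heart a Bayesian update over the pedigree. As a minimal, hypothetical illustration of that step (a textbook X-linked counseling calculation, not the blackboard-style solver described above), a woman's prior carrier probability can be updated on the evidence of her unaffected sons:

```python
def carrier_posterior(prior: float, unaffected_sons: int) -> float:
    """Posterior probability that a woman carries an X-linked recessive
    disorder, given that all of her sons are unaffected.

    Each son of a carrier is unaffected with probability 1/2; each son
    of a non-carrier is unaffected with probability 1 (new mutations
    are ignored for simplicity).
    """
    like_carrier = 0.5 ** unaffected_sons
    like_noncarrier = 1.0
    num = prior * like_carrier
    return num / (num + (1.0 - prior) * like_noncarrier)

# A 1/2 prior and two unaffected sons reduce the carrier risk to 0.2.
print(carrier_posterior(0.5, 2))
```

Real pedigree software propagates such updates through the whole family graph rather than a single mother-son relationship, but the elementary step is this likelihood-weighted update.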
Solving global optimization problems on GPU cluster
Energy Technology Data Exchange (ETDEWEB)
Barkalov, Konstantin; Gergel, Victor; Lebedev, Ilya [Lobachevsky State University of Nizhni Novgorod, Gagarin Avenue 23, 603950 Nizhni Novgorod (Russian Federation)
2016-06-08
The paper presents the results of an investigation of a parallel global optimization algorithm combined with a dimension reduction scheme, which makes it possible to solve multidimensional problems by reducing them to data-independent subproblems of smaller dimension that are solved in parallel. The novel element of this work is the use of several graphics accelerators at different computing nodes. The paper also includes results of solving problems from the well-known multiextremal GKLS test class on the Lobachevsky supercomputer using tens of thousands of GPU cores.
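The dimension-reduction idea can be caricatured in a few lines: fix one coordinate on a coarse grid and solve the resulting lower-dimensional subproblems independently in parallel. The sketch below is a hypothetical toy, not the authors' algorithm, with threads standing in for GPU nodes; it minimizes the classic 2-D Rastrigin test function:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def rastrigin(x, y):
    # Multiextremal test function; global minimum 0 at (0, 0).
    return (20.0 + x * x - 10.0 * np.cos(2 * np.pi * x)
            + y * y - 10.0 * np.cos(2 * np.pi * y))

grid = np.linspace(-2.0, 2.0, 81)      # coarse grid per coordinate

def solve_slice(x):
    # Data-independent 1-D subproblem: minimize over y with x fixed.
    vals = rastrigin(x, grid)
    j = int(vals.argmin())
    return float(vals[j]), float(x), float(grid[j])

# The slices share no data, so they can run concurrently
# (or on separate accelerators in the real setting).
with ThreadPoolExecutor() as pool:
    best_val, best_x, best_y = min(pool.map(solve_slice, grid))
```

The paper's reduction scheme and GPU mapping are far more sophisticated, but the structure is the same: independent lower-dimensional searches merged by a global minimum.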
Some Applications of Algebraic System Solving
Roanes-Lozano, Eugenio
2011-01-01
Technology and, in particular, computer algebra systems, allows us to change both the way we teach mathematics and the mathematical curriculum. Curiously enough, unlike what happens with linear system solving, algebraic system solving is not widely known. The aim of this paper is to show that, although the theory lying behind the "exact…
Green, David L.; Berry, Lee A.; Simpson, Adam B.; Younkin, Timothy R.
2018-04-01
We present the KINETIC-J code, a computational kernel for evaluating the linearized Vlasov equation with application to calculating the kinetic plasma response (current) to an applied time harmonic wave electric field. This code addresses the need for a configuration space evaluation of the plasma current to enable kinetic full-wave solvers for waves in hot plasmas to move beyond the limitations of the traditional Fourier spectral methods. We benchmark the kernel via comparison with the standard k-space forms of the hot plasma conductivity tensor.
Validation of DNA-based identification software by computation of pedigree likelihood ratios
Slooten, K.
Disaster victim identification (DVI) can be aided by DNA evidence, by comparing the DNA profiles of unidentified individuals with those of surviving relatives. The DNA evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually
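As a sketch of what "calculating the appropriate likelihood ratios" involves at a single autosomal locus, the following compares a parent-child hypothesis against unrelatedness under Hardy-Weinberg assumptions. This is an illustrative textbook formula with made-up allele frequencies, not Slooten's validation software:

```python
def genotype_prob(g, freq):
    # Hardy-Weinberg probability of an unordered genotype (a, b).
    a, b = g
    return freq[a] ** 2 if a == b else 2.0 * freq[a] * freq[b]

def transmit(parent, allele):
    # Probability that `parent` passes `allele` to a child.
    return parent.count(allele) / 2.0

def child_given_parent(child, parent, freq):
    # P(child genotype | one parent known, other parent random).
    a, b = child
    if a == b:
        return transmit(parent, a) * freq[a]
    return transmit(parent, a) * freq[b] + transmit(parent, b) * freq[a]

def lr_parent_child(child, parent, freq):
    # LR = P(pair | parent-child) / P(pair | unrelated); the parent's
    # own genotype probability cancels between the two hypotheses.
    return child_given_parent(child, parent, freq) / genotype_prob(child, freq)

freq = {"A": 0.1, "B": 0.9}            # hypothetical allele frequencies
lr = lr_parent_child(("A", "A"), ("A", "A"), freq)   # ~ 10, i.e. 1/p_A
```

A full DVI comparison multiplies such single-locus ratios over many loci and, as the abstract notes, generalizes them to whole pedigrees rather than single pairs.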
Improving mathematical problem solving : A computerized approach
Harskamp, EG; Suhre, CJM
Mathematics teachers often experience difficulties in teaching students to become skilled problem solvers. This paper evaluates the effectiveness of two interactive computer programs for high school mathematics problem solving. Both programs present students with problems accompanied by instruction
Physics: Quantum problems solved through games
Maniscalco, Sabrina
2016-04-01
Humans are better than computers at performing certain tasks because of their intuition and superior visual processing. Video games are now being used to channel these abilities to solve problems in quantum physics. See Letter p.210
Directory of Open Access Journals (Sweden)
Yaritza Tardo Fernández
2013-02-01
Full Text Available The eminently social, cultural and technological character of the computer-programming problem-solving process, together with the complexity and difficulties observed in teaching it, has heightened interest in how programming is communicated, transmitted and understood, and has attracted a wide scientific community as the field continues to grow. This paper therefore aims to uncover, from a didactic point of view, the integrating axes of an algorithmic logic that resolves the contradiction revealed in the formative process between mathematical modeling and its algorithmic systematization, so as to foster the effective performance of Computer Science and Computer Engineering professionals. On this basis a new didactic proposal is grounded: an algorithmic logic that specifies and explains the essential processes to be carried out in solving computer-programming problems. From the theoretical foundations, we conclude that these processes constitute the didactic moments required to resolve the aforementioned contradiction.
CERN. Geneva
2008-01-01
What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...
Students’ difficulties in probabilistic problem-solving
Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.
2018-03-01
Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study investigates students' difficulties in solving probabilistic problems, focusing on analyzing and describing their errors during the solution process. The research used a qualitative method with a case-study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews about their difficulties, analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the problem; second, difficulties in choosing and using appropriate solution strategies; and third, difficulties with the computational process. The findings suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.
The Influence of Square Planar Platinum Complexes on DNA Bases Pairing. An ab initio DFT Study
Czech Academy of Sciences Publication Activity Database
Burda, J. V.; Šponer, Jiří; Leszczynski, J.
2001-01-01
Roč. 3, č. 19 (2001), s. 4404-4411 ISSN 1463-9076 R&D Projects: GA MŠk LN00A032 Institutional research plan: CEZ:AV0Z4040901 Keywords : DNA base pairing * platinated base pairs * ab initio DFT study Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.787, year: 2001
Potential for DNA-based ID of Great Lakes fauna: Species inventories vs. barcode libraries
DNA-based identification of mixed-organism samples offers the potential to greatly reduce the need for resource-intensive morphological identification, which would be of value both to biotic condition assessment and non-native species early-detection monitoring. However the abil...
Research Report Non-invasive DNA-based species and sex ...
Indian Academy of Sciences (India)
shrushti modi
Non-invasive DNA-based species and sex identification of Asiatic wild dog (Cuon alpinus) .... We did not find any cross-gender amplification with any of the reference or field-collected samples. Success rate for sex discrimination for all field-.
Performance of various density functionals for the hydrogen bonds in DNA base pairs
van der Wijst, T.; Fonseca Guerra, C.; Swart, M.; Bickelhaupt, F.M.
2006-01-01
We have investigated the performance of seven popular density functionals (B3LYP, BLYP, BP86, mPW, OPBE, PBE, PW91) for describing the geometry and stability of the hydrogen bonds in DNA base pairs. For the gas-phase situation, the hydrogen-bond lengths and strengths in the DNA pairs have been
DNA-based asymmetric catalysis : Sequence-dependent rate acceleration and enantioselectivity
Boersma, Arnold J.; Klijn, Jaap E.; Feringa, Ben L.; Roelfes, Gerard
2008-01-01
This study shows that the role of DNA in the DNA-based enantioselective Diels-Alder reaction of azachalcone with cyclopentadiene is not limited to that of a chiral scaffold. DNA in combination with the copper complex of 4,4'-dimethyl-2,2'-bipyridine (Cu-L1) gives rise to a rate acceleration of up to
Metalophillic attraction in the consecutive T-HgII-T DNA base pairs
Czech Academy of Sciences Publication Activity Database
Benda, Ladislav; Straka, Michal; Bouř, Petr; Tanaka, Y.; Sychrovský, Vladimír
2012-01-01
Roč. 12, č. 1 (2012), s. 50-50 ISSN 1210-8529. [10th Discussions in Structural Molecular Biology. 22.03.2012-24.03.2012, Nové Hrady] Institutional research plan: CEZ:AV0Z40550506 Keywords : T-HgII-T * DNA base pairs Subject RIV: CF - Physical ; Theoretical Chemistry
DNA-based stable isotope probing: a link between community structure and function
Czech Academy of Sciences Publication Activity Database
Uhlík, Ondřej; Ječná, K.; Leigh, M. B.; Macková, Martina; Macek, Tomáš
2009-01-01
Roč. 407, č. 12 (2009), s. 3611-3619 ISSN 0048-9697 Grant - others:GA MŠk(CZ) 2B08031 Program:2B Institutional research plan: CEZ:AV0Z40550506 Keywords : DNA-based stable isotope probing * microbial diversity * bioremediation Subject RIV: EI - Biotechnology ; Bionics Impact factor: 2.905, year: 2009
DNA-based approaches to identify forest fungi in Pacific Islands: A pilot study
Anna E. Case; Sara M. Ashiglar; Phil G. Cannon; Ernesto P. Militante; Edwin R. Tadiosa; Mutya Quintos-Manalo; Nelson M. Pampolina; John W. Hanna; Fred E. Brooks; Amy L. Ross-Davis; Mee-Sook Kim; Ned B. Klopfenstein
2013-01-01
DNA-based diagnostics have been successfully used to characterize diverse forest fungi (e.g., Hoff et al. 2004, Kim et al. 2006, Glaeser & Lindner 2011). DNA sequencing of the internal transcribed spacer (ITS) and large subunit (LSU) regions of nuclear ribosomal DNA (rDNA) has proved especially useful (Sonnenberg et al. 2007, Seifert 2009, Schoch et al. 2012) for...
DNA-based identification of Armillaria isolates from peach orchards in Mexico state
Ruben Damian Elias Roman; Ned B. Klopfenstein; Dionicio Alvarado Rosales; Mee-Sook Kim; Anna E. Case; Sara M. Ashiglar; John W. Hanna; Amy L. Ross-Davis; Remigio A. Guzman Plazola
2012-01-01
A collaborative project between the Programa de Fitopatología, Colegio de Postgraduados, Texcoco, Estado de Mexico and the USDA Forest Service - RMRS, Moscow Forest Pathology Laboratory has begun this year (2011) to assess which species of Armillaria are causing widespread and severe damage to the peach orchards from México state, Mexico. We are employing a DNA-based...
DNA-based identification and phylogeny of North American Armillaria species
Amy L. Ross-Davis; John W. Hanna; Ned B. Klopfenstein
2011-01-01
Because Armillaria species display different ecological behaviors across diverse forest ecosystems, it is critical to identify Armillaria species accurately for any assessment of forest health. To further develop DNA-based identification methods, partial sequences of the translation elongation factor-1 alpha (EF-1α) gene were used to examine the phylogenetic...
DNA-based catalytic enantioselective intermolecular oxa-Michael addition reactions
Megens, Rik P.; Roelfes, Gerard
2012-01-01
Using the DNA-based catalysis concept, a novel Cu(II) catalyzed enantioselective oxa-Michael addition of alcohols to enones is reported. Enantioselectivities of up to 86% were obtained. The presence of water is important for the reactivity, possibly by reverting unwanted side reactions such as
M. Kasemann
Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...
The semantic system is involved in mathematical problem solving.
Zhou, Xinlin; Li, Mengyi; Li, Leinian; Zhang, Yiyun; Cui, Jiaxin; Liu, Jie; Chen, Chuansheng
2018-02-01
Numerous studies have shown that the brain regions around bilateral intraparietal cortex are critical for number processing and arithmetical computation. However, the neural circuits for more advanced mathematics such as mathematical problem solving (with little routine arithmetical computation) remain unclear. Using functional magnetic resonance imaging (fMRI), this study (N = 24 undergraduate students) compared neural bases of mathematical problem solving (i.e., number series completion, mathematical word problem solving, and geometric problem solving) and arithmetical computation. Direct subject- and item-wise comparisons revealed that mathematical problem solving typically had greater activation than arithmetical computation in all 7 regions of the semantic system (which was based on a meta-analysis of 120 functional neuroimaging studies on semantic processing). Arithmetical computation typically had greater activation in the supplementary motor area and left precentral gyrus. The results suggest that the semantic system in the brain supports mathematical problem solving. Copyright © 2017 Elsevier Inc. All rights reserved.
I. Fisk
2011-01-01
Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...
P. McBride
The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...
M. Kasemann
Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions for, the main challenges facing CMS computing. The lack of manpower is particul...
The integration of DNA-based identification methods into bioassessments could result in more accurate representations of species distributions and species-habitat relationships. DNA-based approaches may be particularly informative for tracking the distributions of rare and/or inv...
Capturing Problem-Solving Processes Using Critical Rationalism
Chitpin, Stephanie; Simon, Marielle
2012-01-01
The examination of problem-solving processes continues to be a current research topic in education. Knowing how to solve problems is not only a key aspect of learning mathematics but is also at the heart of cognitive theories, linguistics, artificial intelligence, and computers sciences. Problem solving is a multistep, higher-order cognitive task…
I. Fisk
2013-01-01
Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...
Solving the Schroedinger equation using Smolyak interpolants
International Nuclear Information System (INIS)
Avila, Gustavo; Carrington, Tucker Jr.
2013-01-01
In this paper, we present a new collocation method for solving the Schroedinger equation. Collocation has the advantage that it obviates integrals. All previous collocation methods have, however, the crucial disadvantage that they require solving a generalized eigenvalue problem. By combining Lagrange-like functions with a Smolyak interpolant, we devise a collocation method that does not require solving a generalized eigenvalue problem. We exploit the structure of the grid to develop an efficient algorithm for evaluating the matrix-vector products required to compute energy levels and wavefunctions. Energies converge systematically as the number of points and basis functions is increased
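For contrast, the simplest discretizations already reduce to a standard (not generalized) eigenvalue problem. The sketch below diagonalizes a uniform-grid finite-difference Hamiltonian for the 1-D harmonic oscillator in atomic units; it is a minimal illustration of solving the Schroedinger equation on a grid, not the Smolyak/Lagrange construction of the paper:

```python
import numpy as np

N, L = 1000, 10.0
x = np.linspace(-L, L, N)
h = x[1] - x[0]

# Kinetic energy -(1/2) d^2/dx^2 via the three-point stencil.
T = (np.diag(np.full(N, 2.0))
     - np.diag(np.ones(N - 1), 1)
     - np.diag(np.ones(N - 1), -1)) / (2.0 * h * h)
H = T + np.diag(0.5 * x ** 2)          # potential V(x) = x^2 / 2

E = np.linalg.eigvalsh(H)              # standard, not generalized, eigenproblem
print(E[:3])                           # close to the exact 0.5, 1.5, 2.5
```

The Smolyak approach targets many dimensions, where a dense uniform grid like this is hopeless; the point of the comparison is only the algebraic form of the final eigenproblem.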
Hydrogen bond disruption in DNA base pairs from (14)C transmutation.
Sassi, Michel; Carter, Damien J; Uberuaga, Blas P; Stanek, Christopher R; Mancera, Ricardo L; Marks, Nigel A
2014-09-04
Recent ab initio molecular dynamics simulations have shown that radioactive carbon does not normally fragment DNA bases when it decays. Motivated by this finding, density functional theory and Bader analysis have been used to quantify the effect of C → N transmutation on hydrogen bonding in DNA base pairs. We find that (14)C decay has the potential to significantly alter hydrogen bonds in a variety of ways including direct proton shuttling (thymine and cytosine), thermally activated proton shuttling (guanine), and hydrogen bond breaking (cytosine). Transmutation substantially modifies both the absolute and relative strengths of the hydrogen bonding pattern, and in two instances (adenine and cytosine), the density at the critical point indicates development of mild covalent character. Since hydrogen bonding is an important component of Watson-Crick pairing, these (14)C-induced modifications, while infrequent, may trigger errors in DNA transcription and replication.
Nutrigenetics and personalized nutrition: are we ready for DNA-based dietary advice?
Grimaldi, Keith A
2014-05-01
Common genetic variation affects individual nutrient requirements, and the use of DNA-based dietary advice derived from nutrigenetics has been growing. The growth is about to accelerate as the cost of genotyping continues to fall and research results from major nutrigenetics projects are published. Some skepticism remains, and some barriers persist, including commercial tests that make exaggerated or incorrect claims. There is a need for more public resources dedicated to unbiased, objective review and dissemination of nutrigenetics information; however, nutrigenetics evidence should be assessed in the context of standard nutritional evidence and should not be held to higher standards. This article argues that we are ready for some DNA-based dietary advice in general nutrition and that it can be beneficial. Examples of the scientific validity and health utility of gene-diet interactions will be given and the development of guidelines for assessment and validation of benefits will be discussed.
Ann-Charlotte Wallenhammar; Albin Gunnarson; Fredrik Hansson; Anders Jonsson
2016-01-01
Outbreaks of clubroot disease caused by the soil-borne obligate parasite Plasmodiophora brassicae are common in oilseed rape (OSR) in Sweden. A DNA-based soil testing service that identifies fields where P. brassicae poses a significant risk of clubroot infection is now commercially available. It was applied here in field surveys to monitor the prevalence of P. brassicae DNA in field soils intended for winter OSR production and winter OSR field experiments. In 2013 in Scania, prior to plantin...
Advanced DNA-Based Point-of-Care Diagnostic Methods for Plant Diseases Detection
Lau, Han Yih; Botella, Jose R.
2017-01-01
Diagnostic technologies for the detection of plant pathogens with point-of-care capability and high multiplexing ability are an essential tool in the fight to reduce the large agricultural production losses caused by plant diseases. The main desirable characteristics for such diagnostic assays are high specificity, sensitivity, reproducibility, quickness, cost efficiency and high-throughput multiplex detection capability. This article describes and discusses various DNA-based point-of-care di...
The role of DNA base excision repair in brain homeostasis and disease
DEFF Research Database (Denmark)
Akbari, Mansour; Morevati, Marya; Croteau, Deborah
2015-01-01
Chemical modification and spontaneous loss of nucleotide bases from DNA are estimated to occur at the rate of thousands per human cell per day. DNA base excision repair (BER) is a critical mechanism for repairing such lesions in nuclear and mitochondrial DNA. Defective expression or function of p...... energy homeostasis, mitochondrial function and cellular bioenergetics, with especially strong influence on neurological function. Further studies in this area could lead to novel approaches to prevent and treat human neurodegenerative disease....
Recent advance in DNA-based traceability and authentication of livestock meat PDO and PGI products.
Nicoloso, Letizia; Crepaldi, Paola; Mazza, Raffaele; Ajmone-Marsan, Paolo; Negrini, Riccardo
2013-04-01
This review updates the available molecular techniques and technologies and discusses how they can be used for traceability, food control and enforcement activities. The review also provides examples of how molecular techniques have succeeded in tracing unknowns back to their breeds of origin, fingerprinting single individuals and generating evidence in court cases. The examples demonstrate the potential of DNA-based traceability techniques and explore possibilities for translating next-generation genomics tools into a food and feed control and enforcement framework.
Singh, Chandralekha
2009-07-01
One finding of cognitive research is that people do not automatically acquire usable knowledge by spending lots of time on task. Because students' knowledge hierarchy is more fragmented, "knowledge chunks" are smaller than those of experts. The limited capacity of short term memory makes the cognitive load high during problem solving tasks, leaving few cognitive resources available for meta-cognition. The abstract nature of the laws of physics and the chain of reasoning required to draw meaningful inferences makes these issues critical. In order to help students, it is crucial to consider the difficulty of a problem from the perspective of students. We are developing and evaluating interactive problem-solving tutorials to help students in the introductory physics courses learn effective problem-solving strategies while solidifying physics concepts. The self-paced tutorials can provide guidance and support for a variety of problem solving techniques, and opportunity for knowledge and skill acquisition.
Teaching Creative Problem Solving.
Christensen, Kip W.; Martin, Loren
1992-01-01
Interpersonal and cognitive skills, adaptability, and critical thinking can be developed through problem solving and cooperative learning in technology education. These skills have been identified as significant needs of the workplace as well as for functioning in society. (SK)
I. Fisk
2010-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...
M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley
Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...
DNA-Based Nanobiosensors as an Emerging Platform for Detection of Disease
Directory of Open Access Journals (Sweden)
Khalid M. Abu-Salah
2015-06-01
Full Text Available Detection of disease at an early stage is one of the biggest challenges in medicine. Different disciplines of science are working together in this regard. The goal of nanodiagnostics is to provide more accurate tools for earlier diagnosis, to reduce cost and to simplify healthcare delivery of effective and personalized medicine, especially with regard to chronic diseases (e.g., diabetes and cardiovascular diseases that have high healthcare costs. Up-to-date results suggest that DNA-based nanobiosensors could be used effectively to provide simple, fast, cost-effective, sensitive and specific detection of some genetic, cancer, and infectious diseases. In addition, they could potentially be used as a platform to detect immunodeficiency, and neurological and other diseases. This review examines different types of DNA-based nanobiosensors, the basic principles upon which they are based and their advantages and potential in diagnosis of acute and chronic diseases. We discuss recent trends and applications of new strategies for DNA-based nanobiosensors, and emphasize the challenges in translating basic research to the clinical laboratory.
International Nuclear Information System (INIS)
Ahmed, Towfiq; Haraldsen, Jason T; Balatsky, Alexander V; Rehr, John J; Di Ventra, Massimiliano; Schuller, Ivan
2014-01-01
Nanopore-based sequencing has demonstrated a significant potential for the development of fast, accurate, and cost-efficient fingerprinting techniques for next generation molecular detection and sequencing. We propose a specific multilayered graphene-based nanopore device architecture for the recognition of single biomolecules. Molecular detection and analysis can be accomplished through the detection of transverse currents as the molecule or DNA base translocates through the nanopore. To increase the overall signal-to-noise ratio and the accuracy, we implement a new ‘multi-point cross-correlation’ technique for identification of DNA bases or other molecules on the single molecular level. We demonstrate that the cross-correlations between each nanopore will greatly enhance the transverse current signal for each molecule. We implement first-principles transport calculations for DNA bases surveyed across a multilayered graphene nanopore system to illustrate the advantages of the proposed geometry. A time-series analysis of the cross-correlation functions illustrates the potential of this method for enhancing the signal-to-noise ratio. This work constitutes a significant step forward in facilitating fingerprinting of single biomolecules using solid state technology. (paper)
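The core of the cross-correlation idea can be sketched in a few lines: two detection channels share the same molecular signal but carry independent noise, so their windowed product averages toward the signal power where a molecule is present and toward zero elsewhere. The sketch below uses synthetic traces and a simple zero-lag windowed correlation; the variable names, trace shapes and window width are illustrative assumptions, not the authors' actual multi-point pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transverse-current traces from two stacked graphene layers.
# The molecular signal is common to both layers; the noise is independent.
n = 2000
signal = np.where((np.arange(n) > 900) & (np.arange(n) < 1100), 1.0, 0.0)
trace1 = signal + rng.normal(0.0, 1.0, n)
trace2 = signal + rng.normal(0.0, 1.0, n)

def windowed_cross_correlation(a, b, w):
    """Zero-lag cross-correlation averaged over a sliding window of width w."""
    kernel = np.ones(w) / w
    return np.convolve(a * b, kernel, mode="same")

cc = windowed_cross_correlation(trace1, trace2, w=100)

# Correlated (signal) regions stand out because uncorrelated noise
# averages toward zero in the product.
inside = cc[950:1050].mean()
outside = cc[:800].mean()
print(inside > outside)
```

Averaging the product over a window is what suppresses the independent noise; with more than two layers, pairwise correlations can be combined to suppress it further.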
Yaney, Perry P.; Ouchen, Fahima; Grote, James G.
2009-08-01
DC resistivity studies were carried out on biopolymer films of DNA-CTMA and silk fibroin, and on selected traditional polymer films, including PMMA and APC. Films of DNA-CTMA versus molecular weight and with conductive dopants PCBM, BAYTRON P and ammonium tetrachloroplatinate are reported. The films were spin coated on glass slides configured for measurements of volume dc resistance. The measurements used the alternating polarity method to record the applied voltage-dependent current independent of charging and background currents. The Arrhenius equation plus a constant was fitted to the conductivity versus temperature data of the polymers and the non-doped DNA-based biopolymers with activation energies ranging from 0.8 to 1.4 eV.
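The fitted model, the Arrhenius equation plus a constant, sigma(T) = sigma0 * exp(-Ea/(kB*T)) + c, can be sketched with synthetic data. The values below are illustrative, not the paper's measurements; since the model is linear in (sigma0, c) once Ea is fixed, a grid search over Ea combined with linear least squares is enough for a self-contained example.

```python
import numpy as np

KB = 8.617333e-5  # Boltzmann constant in eV/K

def fit_arrhenius_plus_const(T, sigma, ea_grid):
    """Fit sigma(T) = sigma0 * exp(-Ea/(kB*T)) + c.

    For fixed Ea the model is linear in (sigma0, c), so we solve a linear
    least-squares problem at each grid point and keep the best-fitting Ea.
    """
    best = None
    for ea in ea_grid:
        basis = np.column_stack([np.exp(-ea / (KB * T)), np.ones_like(T)])
        coef, *_ = np.linalg.lstsq(basis, sigma, rcond=None)
        rss = np.sum((basis @ coef - sigma) ** 2)
        if best is None or rss < best[0]:
            best = (rss, ea, coef[0], coef[1])
    return best[1], best[2], best[3]  # Ea, sigma0, c

# Synthetic conductivity data with Ea = 1.0 eV, inside the 0.8-1.4 eV
# range reported for these films (amplitudes are made up).
T = np.linspace(300.0, 420.0, 40)
sigma = 1e12 * np.exp(-1.0 / (KB * T)) + 1e-3
rng = np.random.default_rng(1)
sigma *= 1 + 0.01 * rng.normal(size=T.size)

ea_fit, s0_fit, c_fit = fit_arrhenius_plus_const(T, sigma, np.linspace(0.8, 1.4, 121))
print(f"fitted activation energy: {ea_fit:.3f} eV")
```

The constant term c captures the temperature-independent background conductivity; without it, a plain Arrhenius fit would be biased at the low-temperature end where the exponential term is negligible.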
P. McBride
It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...
M. Kasemann
Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...
I. Fisk
2011-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs run by users each day, which was already hitting the computing-model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...
I. Fisk
2012-01-01
Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...
M. Kasemann
CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...
I. Fisk
2010-01-01
Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...
Adams, Wendy Kristine
The purpose of my research was to produce a problem solving evaluation tool for physics. To do this it was necessary to gain a thorough understanding of how students solve problems. Although physics educators highly value problem solving and have put extensive effort into understanding successful problem solving, there is currently no efficient way to evaluate problem solving skill. Attempts have been made in the past; however, knowledge of the principles required to solve the given problem is so critical that it completely overshadows any other skills students may use when solving a problem. The work presented here is unique because the evaluation tool removes the requirement that the student already have a grasp of physics concepts. It is also unique because I selected a wide range of people and a wide range of tasks for evaluation, an important design feature that helps patterns emerge more clearly. This dissertation includes an extensive literature review of problem solving in physics, math, education and cognitive science, as well as descriptions of studies involving student use of interactive computer simulations, the design and validation of a beliefs-about-physics survey and, finally, the design of the problem solving evaluation tool. I have successfully developed and validated a problem solving evaluation tool that identifies 44 separate assets (skills) necessary for solving problems. Rigorous validation studies, including work with an independent interviewer, show these assets identified by this content-free evaluation tool are the same assets that students use to solve problems in mechanics and quantum mechanics. Understanding this set of component assets will help teachers and researchers address problem solving within the classroom.
DEFF Research Database (Denmark)
Chemi, Tatiana
2016-01-01
This chapter aims to deconstruct some persistent myths about creativity: the myth of individualism and of the genius. By looking at literature that approaches creativity as a participatory and distributed phenomenon and by bringing empirical evidence from artists' studios, the author presents a perspective that is relevant to higher education. The focus here is on how artists solve problems in distributed paths, and on the elements of creative collaboration. Creative problem-solving will be looked at as an ongoing dialogue that artists engage with themselves, with others, with recipients… What can educators in higher education learn from the ways creative groups solve problems? How can artists contribute to inspiring higher education?
2010-01-01
Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...
Contributions from I. Fisk
2012-01-01
Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...
M. Kasemann
Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...
Matthias Kasemann
Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...
P. McBride
The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...
I. Fisk
2013-01-01
Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...
I. Fisk
2012-01-01
Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...
Solving Environmental Problems
DEFF Research Database (Denmark)
Ørding Olsen, Anders; Sofka, Wolfgang; Grimpe, Christoph
2017-01-01
…dispersed. Hence, firms need to collaborate. We shed new light on collaborative search strategies led by firms in general and for solving environmental problems in particular. Both topics are largely absent in the extant open innovation literature. Using data from the European Seventh Framework Programme for Research and Technological Development (FP7), our results indicate that the problem-solving potential of a search strategy increases with the diversity of existing knowledge of the partners in a consortium and with the experience of the partners involved. Moreover, we identify a substantial negative effect…
Learning Matlab a problem solving approach
Gander, Walter
2015-01-01
This comprehensive and stimulating introduction to Matlab, a computer language now widely used for technical computing, is based on an introductory course held at Qian Weichang College, Shanghai University, in the fall of 2014. Teaching and learning a substantial programming language aren’t always straightforward tasks. Accordingly, this textbook is not meant to cover the whole range of this high-performance technical programming environment, but to motivate first- and second-year undergraduate students in mathematics and computer science to learn Matlab by studying representative problems, developing algorithms and programming them in Matlab. While several topics are taken from the field of scientific computing, the main emphasis is on programming. A wealth of examples are completely discussed and solved, allowing students to learn Matlab by doing: by solving problems, comparing approaches and assessing the proposed solutions.
I. Fisk
2011-01-01
Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, added to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...
Dual functions of ASCIZ in the DNA base damage response and pulmonary organogenesis.
Directory of Open Access Journals (Sweden)
Sabine Jurado
2010-10-01
Zn²⁺-finger proteins comprise one of the largest protein superfamilies with diverse biological functions. The ATM substrate Chk2-interacting Zn²⁺-finger protein (ASCIZ; also known as ATMIN and ZNF822) was originally linked to functions in the DNA base damage response and has also been proposed to be an essential cofactor of the ATM kinase. Here we show that absence of ASCIZ leads to p53-independent late-embryonic lethality in mice. Asciz-deficient primary fibroblasts exhibit increased sensitivity to the DNA base damaging agents MMS and H2O2, but Asciz deletion or knock-down does not affect ATM levels and activation in mouse, chicken, or human cells. Unexpectedly, Asciz-deficient embryos also exhibit severe respiratory tract defects with complete pulmonary agenesis and severe tracheal atresia. Nkx2.1-expressing respiratory precursors are still specified in the absence of ASCIZ, but fail to segregate properly within the ventral foregut, and as a consequence lung buds never form and separation of the trachea from the oesophagus stalls early. Comparison of phenotypes suggests that ASCIZ functions between Wnt2-2b/β-catenin and FGF10/FGF-receptor 2b signaling pathways in the mesodermal/endodermal crosstalk regulating early respiratory development. We also find that ASCIZ can activate expression of reporter genes via its SQ/TQ-cluster domain in vitro, suggesting that it may exert its developmental functions as a transcription factor. Altogether, the data indicate that, in addition to its role in the DNA base damage response, ASCIZ has separate developmental functions as an essential regulator of respiratory organogenesis.
Introspection in Problem Solving
Jäkel, Frank; Schreiber, Cornell
2013-01-01
Problem solving research has encountered an impasse. Since the seminal work of Newell and Simon (1972), researchers do not seem to have made much theoretical progress (Batchelder and Alexander, 2012; Ohlsson, 2012). In this paper we argue that one factor that is holding back the field is the widespread rejection of introspection among cognitive…
Greene, Kim; Heyck-Williams, Jeff; Timpson Gray, Elicia
2017-01-01
Problem solving spans all grade levels and content areas, as evidenced by this compilation of projects from schools across the United States. In one project, high school girls built a solar-powered tent to serve their city's homeless population. In another project, 4th graders explored historic Jamestown to learn about the voices lost to history.…
Solving Linear Differential Equations
Nguyen, K.A.; Put, M. van der
2010-01-01
The theme of this paper is to 'solve' an absolutely irreducible differential module explicitly in terms of modules of lower dimension and finite extensions of the differential field K. Representations of semi-simple Lie algebras and differential Galois theory are the main tools. The results extend
Utomo, P.H.; Makarim, R.H.
2017-01-01
A binary puzzle is a Sudoku-like puzzle with values in each cell taken from the set {0,1}. Let n ≥ 4 be an even integer; a solved binary puzzle is an n×n binary array that satisfies the following conditions: (1) no three consecutive ones and no three consecutive zeros in each row and each column…
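The stated conditions are easy to check mechanically. The sketch below verifies the no-three-consecutive rule on every row and column; it also checks the balanced-count condition (equally many zeros and ones per line), which belongs to the usual formulation of the puzzle even though the abstract is truncated before listing it. The function and grid are illustrative, not from the paper.

```python
def no_triples(line):
    """True if the line has no three consecutive equal values."""
    return all(not (line[i] == line[i + 1] == line[i + 2])
               for i in range(len(line) - 2))

def is_solved_binary_puzzle(grid):
    """Check an n x n binary array (n even, n >= 4): no three consecutive
    equal symbols in any row or column, and (in the usual formulation)
    equally many zeros and ones in every row and column.
    """
    n = len(grid)
    if n < 4 or n % 2 or any(len(row) != n for row in grid):
        return False
    cols = [list(col) for col in zip(*grid)]
    for line in list(grid) + cols:
        if not no_triples(line) or sum(line) != n // 2:
            return False
    return True

solved = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]
print(is_solved_binary_puzzle(solved))  # True
```

Some variants additionally require all rows and all columns to be pairwise distinct; that check is omitted here since the abstract does not reach it.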
Ayrinhac, Simon
2014-01-01
We present in this work a demonstration of the maze-solving problem with electricity. Electric current flowing in a maze as a printed circuit produces Joule heating and the right way is instantaneously revealed with infrared thermal imaging. The basic properties of electric current can be discussed in this context, with this challenging question:…
Transport equation solving methods
International Nuclear Information System (INIS)
Granjean, P.M.
1984-06-01
This work is mainly devoted to the C_N and F_N methods. C_N method: starting from a lemma stated by Placzek, an equivalence is established between two problems: the first is defined in a finite medium bounded by a surface S, the second in the whole space. In the first problem the angular flux on the surface S is shown to be the solution of an integral equation. This equation is solved by Galerkin's method. The C_N method is applied here to one-velocity problems: in plane geometry, slab albedo and transmission with Rayleigh scattering, and calculation of the extrapolation length; in cylindrical geometry, albedo and extrapolation-length calculation with linear scattering. F_N method: the basic integral transport equation of the C_N method is integrated over Case's elementary distributions; another integral transport equation is obtained, and this equation is solved by a collocation method. The plane problems solved by the C_N method are also solved by the F_N method. The F_N method is extended to any polynomial scattering law. Some simple spherical problems are also studied. Chandrasekhar's method, the collision probability method and Case's method are presented for comparison with the C_N and F_N methods. This comparison shows the respective advantages of the two methods: (a) fast convergence and possible extension to various geometries for the C_N method; (b) easy calculations and easy extension to polynomial scattering for the F_N method.
Dobbs, David E.
2013-01-01
A direct method is given for solving first-order linear recurrences with constant coefficients. The limiting value of that solution is studied as n → ∞. This classroom note could serve as enrichment material for the typical introductory course on discrete mathematics that follows a calculus course.
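The recurrence in question, x_{k+1} = a·x_k + b, has the standard closed form x_n = aⁿ·x_0 + b·(1 − aⁿ)/(1 − a) for a ≠ 1, which tends to the fixed point b/(1 − a) as n → ∞ when |a| < 1. A minimal sketch (the function name and sample values are illustrative, not the note's own examples):

```python
from fractions import Fraction

def solve_recurrence(a, b, x0, n):
    """Closed form for x_{k+1} = a*x_k + b:
    x_n = a**n * x0 + b*(1 - a**n)/(1 - a)   for a != 1,
    x_n = x0 + n*b                           for a == 1.
    """
    if a == 1:
        return x0 + n * b
    return a**n * x0 + b * (1 - a**n) / (1 - a)

a, b, x0 = Fraction(1, 2), Fraction(3), Fraction(10)

# Direct iteration agrees with the closed form.
x = x0
for _ in range(8):
    x = a * x + b
print(x == solve_recurrence(a, b, x0, 8))  # True

# For |a| < 1 the solution tends to the fixed point b/(1 - a) as n -> infinity.
print(b / (1 - a))  # 6
```

Exact `Fraction` arithmetic is used so the comparison with the iterated value holds with no floating-point slack.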
Analytical method for solving radioactive transformations
International Nuclear Information System (INIS)
Vukadin, Z.
1999-01-01
The exact method of solving radioactive transformations is presented. Nonsingular Bateman coefficients, which can be computed using recurrence formulas, greatly reduce computational time and eliminate singularities that often arise in problems involving nuclide transmutations. A power-series expansion of the depletion function enables high accuracy in the performed calculations, especially in the case of decay constants with closely spaced values. The generality and simplicity of the method make it useful for many practical applications. (author)
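For context, the classical Bateman solution for a linear decay chain N1 → N2 → … with only N1 present at t = 0 can be sketched directly. Note that this sketch uses the classical, singular coefficients, whose denominators (λ_k − λ_j) blow up for nearly equal decay constants; that is precisely the difficulty the paper's nonsingular recurrence formulas avoid. The decay constants below are made-up illustrative values.

```python
import math

def bateman(lams, n0, t):
    """Amounts of each member of a decay chain N1 -> N2 -> ... at time t,
    with N1(0) = n0 and all other members initially absent:

        N_m(t) = n0 * (prod_{i<m} lam_i)
                    * sum_j exp(-lam_j * t) / prod_{k != j} (lam_k - lam_j)

    Classical (singular) Bateman coefficients; requires distinct lam values.
    """
    amounts = []
    for m in range(1, len(lams) + 1):
        prefactor = math.prod(lams[: m - 1])
        total = 0.0
        for j in range(m):
            denom = math.prod(lams[k] - lams[j] for k in range(m) if k != j)
            total += math.exp(-lams[j] * t) / denom
        amounts.append(n0 * prefactor * total)
    return amounts

lams = [2.0, 0.5, 0.1]  # hypothetical decay constants, 1/time unit
n_t = bateman(lams, n0=1000.0, t=1.0)

# Sanity check: material that left N1 is in N2 or N3, minus what N3 has
# itself decayed away, so the total never exceeds the initial amount.
print(sum(n_t) <= 1000.0)
```

Running the same chain with two nearly equal decay constants makes the denominators tiny and the terms large and cancelling, which illustrates why a nonsingular formulation is worth having.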
Hering, Daniel; Borja, Angel; Jones, J Iwan; Pont, Didier; Boets, Pieter; Bouchez, Agnes; Bruce, Kat; Drakare, Stina; Hänfling, Bernd; Kahlert, Maria; Leese, Florian; Meissner, Kristian; Mergen, Patricia; Reyjol, Yorick; Segurado, Pedro; Vogler, Alfried; Kelly, Martyn
2018-07-01
Assessment of ecological status for the European Water Framework Directive (WFD) is based on "Biological Quality Elements" (BQEs), namely phytoplankton, benthic flora, benthic invertebrates and fish. Morphological identification of these organisms is a time-consuming and expensive procedure. Here, we assess the options for complementing and, perhaps, replacing morphological identification with procedures using eDNA, metabarcoding or similar approaches. We rate the applicability of DNA-based identification for the individual BQEs and water categories (rivers, lakes, transitional and coastal waters) against eleven criteria, summarised under the headlines representativeness (for example suitability of current sampling methods for DNA-based identification, errors from DNA-based species detection), sensitivity (for example capability to detect sensitive taxa, unassigned reads), precision of DNA-based identification (knowledge about uncertainty), comparability with conventional approaches (for example sensitivity of metrics to differences in DNA-based identification), cost effectiveness and environmental impact. Overall, suitability of DNA-based identification is particularly high for fish, as eDNA is a well-suited sampling approach which can replace expensive and potentially harmful methods such as gill-netting, trawling or electrofishing. Furthermore, there are attempts to replace absolute by relative abundance in metric calculations. For invertebrates and phytobenthos, the main challenges include the modification of indices and completing barcode libraries. For phytoplankton, the barcode libraries are even more problematic, due to the high taxonomic diversity in plankton samples. If current assessment concepts are kept, DNA-based identification is least appropriate for macrophytes (rivers, lakes) and angiosperms/macroalgae (transitional and coastal waters), which are surveyed rather than sampled. We discuss general implications of implementing DNA-based identification
Alteration of gene conversion patterns in Sordaria fimicola by supplementation with DNA bases.
Kitani, Y; Olive, L S
1970-08-01
Supplementation with DNA bases in crosses of Sordaria fimicola heterozygous for spore color markers (g(1), h(2)) within the gray-spore (g) locus has been found to cause significant alterations in patterns of gene conversion at the two mutant sites. Each base had its own characteristic effect in altering the conversion pattern, and responses of the two mutant sites to the four bases were different in several ways. Also, the responses of the two involved chromatids of the meiotic bivalent were different.
Electronic properties and assembly of DNA-based molecules on gold surfaces
DEFF Research Database (Denmark)
Salvatore, Princia
, highly base specific voltammetric peak in the presence of spermidine ions. A capacitive origin was attributed to this peak, and a novel route to detection of hybridization and base pair mismatches proposed on the basis of the high sensitivity to base pair mismatches shown by such ON-based monolayers...... as widely employed as Au(111) surfaces). In particular, SERS offered a valuable and rapid way of characterising interactions between the DNA-based molecules and the NP surface, with no need for complex sample preparation....
Toward Solving the Problem of Problem Solving: An Analysis Framework
Roesler, Rebecca A.
2016-01-01
Teaching is replete with problem solving. Problem solving as a skill, however, is seldom addressed directly within music teacher education curricula, and research in music education has not examined problem solving systematically. A framework detailing problem-solving component skills would provide a needed foundation. I observed problem solving…
M. Kasemann
CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-fold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...
Augmentation of French grunt diet description using combined visual and DNA-based analyses
Hargrove, John S.; Parkyn, Daryl C.; Murie, Debra J.; Demopoulos, Amanda W.J.; Austin, James D.
2012-01-01
Trophic linkages within a coral-reef ecosystem may be difficult to discern in fish species that reside on, but do not forage on, coral reefs. Furthermore, dietary analysis of fish can be difficult in situations where prey is thoroughly macerated, resulting in many visually unrecognisable food items. The present study examined whether the inclusion of a DNA-based method could improve the identification of prey consumed by French grunt, Haemulon flavolineatum, a reef fish that possesses pharyngeal teeth and forages on soft-bodied prey items. Visual analysis indicated that crustaceans were most abundant numerically (38.9%), followed by sipunculans (31.0%) and polychaete worms (5.2%), with a substantial number of unidentified prey (12.7%). For the subset of prey with both visual and molecular data, there was a marked reduction in the number of unidentified sipunculans (visual – 31.1%, combined – 4.4%), unidentified crustaceans (visual – 15.6%, combined – 6.7%), and unidentified taxa (visual – 11.1%, combined – 0.0%). Utilising results from both methodologies resulted in an increased number of prey placed at the family level (visual – 6, combined – 33) and species level (visual – 0, combined – 4). Although more costly than visual analysis alone, our study demonstrated the feasibility of DNA-based identification of visually unidentifiable prey in the stomach contents of fish.
The Five Immune Forces Impacting DNA-Based Cancer Immunotherapeutic Strategy
Directory of Open Access Journals (Sweden)
Suneetha Amara
2017-03-01
Full Text Available DNA-based vaccine strategy is increasingly realized as a viable cancer treatment approach. Strategies to enhance immunogenicity utilizing tumor-associated antigens have been investigated in several pre-clinical and clinical studies. The promising outcomes of these studies have suggested that DNA-based vaccines induce potent T-cell effector responses while causing only minimal side-effects in cancer patients. However, the immune-evasive tumor microenvironment is still an important hindrance to long-term vaccine success. Several options are currently under various stages of study to overcome the immune inhibitory effect of the tumor microenvironment. Some of these approaches include, but are not limited to, identification of neoantigens, mutanome studies, designing fusion plasmids, vaccine adjuvant modifications, and co-treatment with immune-checkpoint inhibitors. In this review, we follow an analogy to Porter's five forces analysis, commonly used in business models, to analyze the various immune forces that determine the potential success and sustainable positive outcomes following DNA vaccination using non-viral tumor-associated antigens in treatment against cancer.
Ionically conducting Er3+-doped DNA-based biomembranes for electrochromic devices
International Nuclear Information System (INIS)
Leones, R.; Fernandes, M.; Sentanin, F.; Cesarino, I.; Lima, J.F.; Zea Bermudez, V. de; Pawlicka, A.; Magon, C.J.; Donoso, J.P.; Silva, M.M.
2014-01-01
Biopolymer-based membranes are of particular interest due to their biocompatibility, biodegradability, easy extraction from natural resources and low cost. The incorporation of Er³⁺ ions into natural macromolecule hosts with the purpose of producing highly efficient emitting phosphors is of widespread interest in materials science, due to their important roles in display devices. Thus, biomembranes may be viewed as innovative materials for the area of optics. This paper describes studies of luminescent DNA-based membranes doped with erbium triflate and demonstrates that their potential technological applications may be expanded to electrochromic devices. The sample that exhibits the highest ionic conductivity is DNA 10 Er (1.17 × 10⁻⁵ and 7.76 × 10⁻⁴ S cm⁻¹ at 30 and 100 °C, respectively). DSC, XRD and POM showed that the inclusion of the guest salt into DNA does not change significantly its amorphous nature. The overall redox stability was ca. 2.0 V, indicating that these materials have an acceptable stability window for applications in solid-state electrochemical devices. The EPR analysis suggested that the Er³⁺ ions are distributed in various environments. A small ECD comprising an Er³⁺-doped DNA-based membrane was assembled and tested by cyclic voltammetry and chronoamperometry. These electrochemical analyses revealed a pale-blue-to-transparent color change and a decrease of the charge density from −4.0 to −1.2 mC cm⁻² during 4000 color/bleaching cycles.
DNA-based stable isotope probing: a link between community structure and function
International Nuclear Information System (INIS)
Uhlik, Ondrej; Jecna, Katerina; Leigh, Mary Beth; Mackova, Martina; Macek, Tomas
2009-01-01
DNA-based molecular techniques permit the comprehensive determination of microbial diversity but generally do not reveal the relationship between the identity and the function of microorganisms. The first direct molecular technique to enable the linkage of phylogeny with function is DNA-based stable isotope probing (DNA-SIP). This method first helped describe the utilization of simple compounds, such as methane, methanol or glucose, and has since been used to detect microbial communities active in the utilization of a wide variety of compounds, including various xenobiotics. The principle of the method lies in providing a 13C-labeled substrate to a microbial community and subsequently analyzing the 13C-DNA isolated from the community. Isopycnic centrifugation permits separating the 13C-labeled DNA of organisms that utilized the substrate from the 12C-DNA of the inactive majority. As the whole metagenome of the active populations is isolated, its follow-up analysis provides successful taxonomic identification as well as the potential for functional gene analyses. Because of its power, DNA-SIP has become one of the leading techniques of microbial ecology research. On the other hand, it is a labor-intensive method that requires careful attention to detail during each experimental step in order to avoid misinterpretation of results.
Ultrafast dynamics of solvation and charge transfer in a DNA-based biomaterial.
Choudhury, Susobhan; Batabyal, Subrata; Mondol, Tanumoy; Sao, Dilip; Lemmens, Peter; Pal, Samir Kumar
2014-05-01
Charge migration along DNA molecules is a key factor for DNA-based devices in optoelectronics and biotechnology. The association of a significant amount of water molecules in DNA-based materials with the intactness of the DNA structure, and their dynamic role in charge-transfer (CT) dynamics, is less documented in the contemporary literature. In the present study, we have used a genomic DNA-cetyltrimethyl ammonium chloride (CTMA) complex, a technologically important biomaterial, and Hoechst 33258 (H258), a well-known DNA minor-groove binder, as a fluorogenic probe for dynamic solvation studies. The CT dynamics of CdSe/ZnS quantum dots (QDs; 5.2 nm) embedded in the as-prepared and swollen biomaterial have also been studied and correlated with the timescale of solvation. We have extended our studies to the temperature-dependent CT dynamics of QDs in the nanoenvironment of an anionic sodium bis(2-ethylhexyl)sulfosuccinate (AOT) reverse micelle (RM), whereby the number of water molecules and their dynamics can be tuned in a controlled manner. A direct correlation of the dynamics of solvation and that of the CT in the nanoenvironments clearly suggests that the hydration barrier within the Arrhenius framework essentially dictates the charge-transfer dynamics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
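As a minimal illustration of the Arrhenius analysis invoked in the abstract above (this is not code or data from the study; the rates, prefactor and barrier below are synthetic, made-up values), an activation barrier can be extracted from temperature-dependent CT rates by a linear fit of ln k against 1/T:

```python
import numpy as np

# Synthetic CT rates (1/ps) at several temperatures (K); illustrative values only,
# generated here with a known barrier Ea/kB = 2000 K so the fit can be checked.
T = np.array([283.0, 293.0, 303.0, 313.0, 323.0])
k = 5e3 * np.exp(-2000.0 / T)

# Arrhenius: ln k = ln A - Ea/(kB*T), so the slope of ln k vs 1/T equals -Ea/kB.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)

kB = 1.380649e-23    # Boltzmann constant, J/K
NA = 6.02214076e23   # Avogadro constant, 1/mol
Ea_kcal_per_mol = -slope * kB * NA / 4184.0  # barrier converted to kcal/mol
```

Because the synthetic data are exactly Arrhenius, the fit recovers the barrier that was put in; with real rate data the residuals of the linear fit indicate how well a single-barrier picture holds.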
Structuring polymers for delivery of DNA-based therapeutics: updated insights.
Gupta, Madhu; Tiwari, Shailja; Vyas, Suresh
2012-01-01
Gene therapy offers great opportunities for treating numerous incurable diseases, ranging from genetic disorders and infections to cancer. However, the development of appropriate delivery systems could be one of the most important factors in overcoming the numerous biological barriers to delivery of various therapeutic molecules. A number of nonviral polymer-mediated vectors have been developed for DNA delivery and offer the potential to surmount the problems associated with their viral counterparts. To address the concerns associated with safety issues, a wide range of polymeric vectors are available and have been utilized successfully to deliver their therapeutics in vivo. Today's research is mainly focused on natural or synthetic polymer-based delivery carriers that protect the DNA molecule from degradation, offer specific targeting to the desired cells after systemic administration, have transfection efficiencies equivalent to virus-mediated gene delivery, and provide long-term gene expression through sustained-release mechanisms. This review explores an updated overview of different nonviral polymeric delivery systems for delivery of DNA-based therapeutics. These polymeric carriers have been evaluated in vitro and in vivo and are being utilized in various stages of clinical evaluation. Continued research and understanding of the principles of polymer-based gene delivery systems will enable us to develop new and efficient delivery systems for DNA-based therapeutics, to achieve the goal of efficacious and specific gene therapy for a vast array of clinical disorders as the therapeutic solutions of tomorrow.
Creativity and Problem Solving
DEFF Research Database (Denmark)
Vidal, Rene Victor Valqui
2004-01-01
This paper presents some modern and interdisciplinary concepts about creativity and creative processes of special relevance for Operational Research workers. Central publications in the area Creativity-Operational Research are briefly reviewed. Some creative tools and the Creative Problem Solving...... approach are also discussed. Finally, some applications of these concepts and tools are outlined. Some central references are presented for further study of themes related to creativity or creative tools....
Creativity and problem Solving
Directory of Open Access Journals (Sweden)
René Victor Valqui Vidal
2004-12-01
Full Text Available This paper presents some modern and interdisciplinary concepts about creativity and creative processes of special relevance for Operational Research workers. Central publications in the area Creativity-Operational Research are briefly reviewed. Some creative tools and the Creative Problem Solving approach are also discussed. Finally, some applications of these concepts and tools are outlined. Some central references are presented for further study of themes related to creativity or creative tools.
Programming languages for business problem solving
Wang, Shouhong
2007-01-01
It has become crucial for managers to be computer literate in today's business environment. It is also important that those entering the field acquire the fundamental theories of information systems, the essential practical skills in computer applications, and the desire for life-long learning in information technology. Programming Languages for Business Problem Solving presents a working knowledge of the major programming languages, including COBOL, C++, Java, HTML, JavaScript, VB.NET, VBA, ASP.NET, Perl, PHP, XML, and SQL, used in the current business computing environment. The book examin
Multiscale empirical interpolation for solving nonlinear PDEs
Calo, Victor M.
2014-12-01
In this paper, we propose a multiscale empirical interpolation method for solving nonlinear multiscale partial differential equations. The proposed method combines empirical interpolation techniques and local multiscale methods, such as the Generalized Multiscale Finite Element Method (GMsFEM). To solve nonlinear equations, the GMsFEM is used to represent the solution on a coarse grid with multiscale basis functions computed offline. Computing the GMsFEM solution involves calculating the system residuals and Jacobians on the fine grid. We use empirical interpolation concepts to evaluate these residuals and Jacobians of the multiscale system with a computational cost which is proportional to the size of the coarse-scale problem rather than the fully-resolved fine-scale one. The empirical interpolation method uses basis functions which are built by sampling the nonlinear function we want to approximate a limited number of times. The coefficients needed for this approximation are computed in the offline stage by inverting an inexpensive linear system. The proposed multiscale empirical interpolation techniques: (1) divide computing the nonlinear function into coarse regions; (2) evaluate contributions of nonlinear functions in each coarse region taking advantage of a reduced-order representation of the solution; and (3) introduce multiscale proper-orthogonal-decomposition techniques to find appropriate interpolation vectors. We demonstrate the effectiveness of the proposed methods on several nonlinear multiscale PDEs that are solved with Newton's method and fully-implicit time marching schemes. Our numerical results show that the proposed methods provide a robust framework for solving nonlinear multiscale PDEs on a coarse grid with bounded error and significant computational cost reduction.
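The empirical interpolation idea summarized above can be sketched in a few lines. The following is a minimal illustration, not the paper's GMsFEM implementation: a POD basis is built from snapshots of a nonlinear function, a DEIM-style greedy loop selects interpolation rows, and a new function evaluation is then approximated from only those sampled entries. The snapshot family exp(-μx) is an assumed toy example chosen because it is smooth in the parameter and hence low-rank.

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM point selection from a POD basis U (n x m)."""
    n, m = U.shape
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, m):
        # Interpolate the j-th basis vector at the current points and
        # pick the location where the interpolation residual is largest.
        c = np.linalg.solve(U[np.ix_(idx, range(j))], U[idx, j])
        r = U[:, j] - U[:, :j] @ c
        idx.append(int(np.argmax(np.abs(r))))
    return np.array(idx)

# Offline stage: snapshots of a parameterized nonlinear function f(x; mu).
x = np.linspace(0.0, 1.0, 200)
mus = np.linspace(1.0, 5.0, 10)
snapshots = np.column_stack([np.exp(-mu * x) for mu in mus])

Uf, s, _ = np.linalg.svd(snapshots, full_matrices=False)
m = 6
U = Uf[:, :m]          # truncated POD basis
P = deim_indices(U)    # m interpolation rows

# Online stage: approximate a new evaluation from only m sampled entries.
f_new = np.exp(-2.7 * x)
coeffs = np.linalg.solve(U[P, :], f_new[P])
f_approx = U @ coeffs
err = np.linalg.norm(f_new - f_approx) / np.linalg.norm(f_new)
```

The key cost saving mirrors the paper's point: the online stage touches only m entries of the nonlinear function instead of the full fine-grid vector.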
Dreams and creative problem-solving.
Barrett, Deirdre
2017-10-01
Dreams have produced art, music, novels, films, mathematical proofs, and designs for architecture, telescopes, and computers. Dreaming is essentially our brain thinking in another neurophysiologic state, and therefore it is likely to solve some problems on which our waking minds have become stuck. This neurophysiologic state is characterized by high activity in brain areas associated with imagery, so problems requiring vivid visualization are also more likely to get help from dreaming. This article reviews great historical dreams and modern laboratory research to suggest how dreams can aid creativity and problem-solving. © 2017 New York Academy of Sciences.
Surprising conformers of the biologically important A·T DNA base pairs: QM/QTAIM proofs
Brovarets', Ol'ha O.; Tsiupa, Kostiantyn S.; Hovorun, Dmytro M.
2018-02-01
For the first time novel high-energy conformers – A·T(wWC) (5.36), A·T(wrWC) (5.97), A·T(wH) (5.78) and A·T(wrH) (ΔG = 5.82 kcal·mol⁻¹) – were revealed for each of the four biologically important A·T DNA base pairs – Watson-Crick A·T(WC), reverse Watson-Crick A·T(rWC), Hoogsteen A·T(H) and reverse Hoogsteen A·T(rH) – at the MP2/aug-cc-pVDZ//B3LYP/6-311++G(d,p) level of quantum-mechanical theory in the continuum with ε = 4 under normal conditions. Each of these conformers possesses a substantially non-planar wobble (w) structure and is stabilized by the participation of the two anti-parallel N6H/N6H'…O4/O2 and N3H…N6 H-bonds, involving the pyramidalized amino group of the A DNA base as an acceptor and a donor of the H-bonding. The transition states – TSA·T(WC)↔A·T(wWC), TSA·T(rWC)↔A·T(wrWC), TSA·T(H)↔A·T(wH) and TSA·T(rH)↔A·T(wrH) – controlling the dipole-active transformations of the conformers from the main plane-symmetric state into the high-energy, significantly non-planar state and vice versa, were localized. They also possess wobble structures similarly to the high-energy conformers and are stabilized by the participation of the N6H/N6H'…O4/O2 and N3H…N6 H-bonds. Discovered conformers of the A·T DNA base pairs are dynamically stable short-lived structures (lifetime τ = (1.4–3.9) ps). Their possible biological significance and future perspectives have been briefly discussed.
Metal-mediated DNA base pairing: alternatives to hydrogen-bonded Watson-Crick base pairs.
Takezawa, Yusuke; Shionoya, Mitsuhiko
2012-12-18
With its capacity to store and transfer genetic information within a sequence of monomers, DNA plays a central role in chemical evolution through replication and amplification. This elegant behavior is largely based on highly specific molecular recognition between nucleobases through the specific hydrogen bonds of the Watson-Crick base-pairing system. While the native base pairs have become amazingly sophisticated through the long history of evolution, synthetic chemists have devoted considerable effort in recent decades to creating alternative base-pairing systems. Most of these new systems were designed based on the shape complementarity of the pairs or the rearrangement of hydrogen-bonding patterns. We wondered whether metal coordination could serve as an alternative driving force for DNA base pairing, and why hydrogen bonding was selected on Earth in the course of molecular evolution. Therefore, we envisioned an alternative design strategy: we replaced hydrogen bonding with another important bonding scheme in biological systems, metal coordination. In this Account, we provide an overview of the chemistry of metal-mediated base pairing, including basic concepts, molecular design, characteristic structures and properties, and possible applications of DNA-based molecular systems. We describe several examples of artificial metal-mediated base pairs, such as the Cu(2+)-mediated hydroxypyridone base pair, H-Cu(2+)-H (where H denotes a hydroxypyridone-bearing nucleoside), developed by us and other researchers. To design the metallo-base pairs we carefully chose appropriate combinations of ligand-bearing nucleosides and metal ions. As expected from their stronger bonding through metal coordination, DNA duplexes possessing metallo-base pairs exhibited higher thermal stability than natural hydrogen-bonded DNAs. Furthermore, we could also use metal-mediated base pairs to construct or induce other high-order structures. These features could lead to metal-responsive functional
Surprising Conformers of the Biologically Important A·T DNA Base Pairs: QM/QTAIM Proofs
Directory of Open Access Journals (Sweden)
Ol'ha O. Brovarets'
2018-02-01
Full Text Available For the first time novel high-energy conformers – A·T(wWC) (5.36), A·T(wrWC) (5.97), A·T(wH) (5.78), and A·T(wrH) (ΔG = 5.82 kcal·mol⁻¹) – were revealed for each of the four biologically important A·T DNA base pairs – Watson-Crick A·T(WC), reverse Watson-Crick A·T(rWC), Hoogsteen A·T(H) and reverse Hoogsteen A·T(rH) – at the MP2/aug-cc-pVDZ//B3LYP/6-311++G(d,p) level of quantum-mechanical theory in the continuum with ε = 4 under normal conditions. Each of these conformers possesses a substantially non-planar wobble (w) structure and is stabilized by the participation of the two anti-parallel N6H/N6H′…O4/O2 and N3H…N6 H-bonds, involving the pyramidalized amino group of the A DNA base as an acceptor and a donor of the H-bonding. The transition states – TSA·T(WC)↔A·T(wWC), TSA·T(rWC)↔A·T(wrWC), TSA·T(H)↔A·T(wH), and TSA·T(rH)↔A·T(wrH) – controlling the dipole-active transformations of the conformers from the main plane-symmetric state into the high-energy, significantly non-planar state and vice versa, were localized. They also possess wobble structures similarly to the high-energy conformers and are stabilized by the participation of the N6H/N6H′…O4/O2 and N3H…N6 H-bonds. Discovered conformers of the A·T DNA base pairs are dynamically stable short-lived structures [lifetime τ = (1.4–3.9 ps]. Their possible biological significance and future perspectives have been briefly discussed.
DNA-based nanobiostructured devices: The role of quasiperiodicity and correlation effects
Energy Technology Data Exchange (ETDEWEB)
Albuquerque, E.L., E-mail: eudenilson@gmail.com [Departamento de Biofísica e Farmacologia, Universidade Federal do Rio Grande do Norte, 59072-970, Natal-RN (Brazil); Fulco, U.L. [Departamento de Biofísica e Farmacologia, Universidade Federal do Rio Grande do Norte, 59072-970, Natal-RN (Brazil); Freire, V.N. [Departamento de Física, Universidade Federal do Ceará, 60455-760, Fortaleza-CE (Brazil); Caetano, E.W.S. [Instituto Federal de Educação, Ciência e Tecnologia do Ceará, 60040-531, Fortaleza-CE (Brazil); Lyra, M.L.; Moura, F.A.B.F. de [Instituto de Física, Universidade Federal de Alagoas, 57072-970, Maceió-AL (Brazil)
2014-02-01
The purpose of this review is to present a comprehensive and up-to-date account of the main physical properties of DNA-based nanobiostructured devices, stressing the role played by their quasiperiodic arrangement and correlation effects. Although the DNA-like molecule is usually described as a short-range-correlated random ladder, artificial segments can be grown following quasiperiodic sequences such as, for instance, the Fibonacci and Rudin–Shapiro ones. They have interesting properties, like a complex fractal energy spectrum, which can be considered their indelible mark, and collective properties that are not shared by their constituents. These collective properties are due to the presence of long-range correlations, which are expected to be reflected somehow in their various spectra (electronic transmission, density of states, etc.), defining another description of disorder. Although long-range correlations are responsible for the effective electronic transport at specific resonant energies of finite DNA segments, much of the anomalous spread of an initially localized electron wave-packet can be accounted for by short-range pair correlations, suggesting that an approach based on the inclusion of further short-range correlations in the nucleotide distribution leads to an adequate description of the electronic properties of DNA segments. The introduction of defects may generate states within the gap and substantially improves the conductance, especially of finite branches. They usually become exponentially localized for any amount of disorder, and have the property to tailor the electronic transport properties of DNA-based nanoelectronic devices. In particular, symmetric and antisymmetric correlations have quite distinct influence on the nature of the electronic states, and a diluted distribution of defects leads to an anomalous diffusion of the electronic wave-packet. Nonlinear contributions, arising from the coupling between electrons and the molecular
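The two quasiperiodic sequences named in the review are easy to generate. The following minimal sketch (illustrative code, not from the review) produces a Fibonacci letter sequence via its substitution rule and the Rudin–Shapiro sign sequence via the count of adjacent 1-pairs in the binary expansion; such sequences could, for instance, assign on-site energies in a model DNA-like tight-binding chain.

```python
def fibonacci_word(generation):
    """Fibonacci quasiperiodic letter sequence via the substitution
    rule A -> AB, B -> A, iterated `generation` times from 'A'.
    The resulting lengths follow the Fibonacci numbers 1, 2, 3, 5, 8, ..."""
    s = "A"
    for _ in range(generation):
        s = "".join("AB" if c == "A" else "A" for c in s)
    return s

def rudin_shapiro(n_terms):
    """Rudin-Shapiro sequence: sign (-1)**(number of occurrences of '11'
    as adjacent digits in the binary expansion of n), for n = 0..n_terms-1."""
    seq = []
    for n in range(n_terms):
        b = bin(n)[2:]
        pairs = sum(1 for i in range(len(b) - 1) if b[i] == b[i + 1] == "1")
        seq.append((-1) ** pairs)
    return seq

# Example: map letters / signs to two on-site energy values of a model chain.
energies = {"A": 0.0, "B": 0.5}
fib_chain = [energies[c] for c in fibonacci_word(6)]
rs_chain = [0.25 * s for s in rudin_shapiro(16)]
```

Both generators are deterministic, which is exactly what distinguishes these "correlated disorder" arrangements from a random base sequence.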
DEFF Research Database (Denmark)
Hansen, David
2012-01-01
Many industrial production work systems have increased in complexity, and their new business models compete on innovation rather than low cost. At a medical device production facility committed to Lean Production, a research project was carried out to use Appreciative Inquiry to better engage...... employee strengths in continuous improvements of the work system. The research question was: "How can Lean problem solving and Appreciative Inquiry be combined for optimized work system innovation?" The research project was carried out as a co-creation process with close cooperation between researcher...
DEFF Research Database (Denmark)
Foss, Kirsten; Foss, Nicolai Juul
2006-01-01
Two of Herbert Simon's best-known papers are 'The Architecture of Complexity' and 'The Structure of Ill-Structured Problems.' We discuss the neglected links between these two papers, highlighting the role of decomposition in the context of problems on which constraints have been imposed...... as a general approach to problem solving. We apply these Simonian ideas to organisational issues, specifically new organisational forms. Specifically, Simonian ideas allow us to develop a morphology of new organisational forms and to point to some design problems that characterise these forms.
1982-10-01
Planning and Problem Solving, by Paul R. Cohen: Chapter XV of Volume III of the Handbook of Artificial Intelligence, edited by Paul R. Cohen and Edward A. Feigenbaum. The chapter was written by Paul R. Cohen, with contributions by Stephen...
DNA-Based Single-Molecule Electronics: From Concept to Function.
Wang, Kun
2018-01-17
Beyond being the repository of genetic information, DNA is playing an increasingly important role as a building block for molecular electronics. Its inherent structural and molecular recognition properties render it a leading candidate for molecular electronics applications. The structural stability, diversity and programmability of DNA provide overwhelming freedom for the design and fabrication of molecular-scale devices. In the past two decades DNA has therefore attracted inordinate amounts of attention in molecular electronics. This review gives a brief survey of recent experimental progress in DNA-based single-molecule electronics with special focus on single-molecule conductance and I-V characteristics of individual DNA molecules. Existing challenges and exciting future opportunities are also discussed.
Ab initio Calculations of Electronic Fingerprints of DNA bases on Graphene
Ahmed, Towfiq; Rehr, John J.; Kilina, Svetlana; Das, Tanmoy; Haraldsen, Jason T.; Balatsky, Alexander V.
2012-02-01
We have carried out first principles DFT calculations of the electronic local density of states (LDOS) of DNA nucleotide bases (A,C,G,T) adsorbed on graphene using LDA with ultra-soft pseudo-potentials. We have also calculated the longitudinal transmission currents T(E) through graphene nano-pores as an individual DNA base passes through it, using a non-equilibrium Green's function (NEGF) formalism. We observe several dominant base-dependent features in the LDOS and T(E) in an energy range within a few eV of the Fermi level. These features can serve as electronic fingerprints for the identification of individual bases from dI/dV measurements in scanning tunneling spectroscopy (STS) and nano-pore experiments. Thus these electronic signatures can provide an alternative approach to DNA sequencing.
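The transmission calculation described above can be illustrated schematically. The following is a minimal sketch of a Landauer/NEGF transmission function, under an assumed wide-band-limit lead coupling on a toy tight-binding chain; this is a drastic simplification and not the paper's graphene-nanopore NEGF setup (the chain, couplings and on-site energies are invented for illustration).

```python
import numpy as np

def transmission(E, H, gamma_L, gamma_R, eta=1e-6):
    """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dagger] for a small
    tight-binding scatterer H, with wide-band-limit lead self-energies
    -i*gamma/2 placed on the first (left) and last (right) sites."""
    n = H.shape[0]
    Sigma_L = np.zeros((n, n), dtype=complex)
    Sigma_R = np.zeros((n, n), dtype=complex)
    Sigma_L[0, 0] = -0.5j * gamma_L
    Sigma_R[-1, -1] = -0.5j * gamma_R
    # Retarded Green's function of the coupled scatterer
    G = np.linalg.inv((E + 1j * eta) * np.eye(n) - H - Sigma_L - Sigma_R)
    Gamma_L = 1j * (Sigma_L - Sigma_L.conj().T)   # broadening from left lead
    Gamma_R = 1j * (Sigma_R - Sigma_R.conj().T)   # broadening from right lead
    return float(np.real(np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T)))

# Toy 4-site chain: on-site energies stand in for base-dependent levels.
eps = [0.0, 0.2, -0.3, 0.1]
H = np.diag(eps) + np.diag([-1.0] * 3, 1) + np.diag([-1.0] * 3, -1)
T_of_E = [transmission(E, H, 1.0, 1.0) for E in np.linspace(-2.5, 2.5, 11)]
```

As a sanity check, a single resonant level with symmetric coupling transmits perfectly on resonance (T approaches 1 at E = 0), and for this single-channel chain T(E) stays between 0 and 1.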
Energy Technology Data Exchange (ETDEWEB)
Khadsai, Sudarat; Rutnakornpituk, Boonjira [Naresuan University, Department of Chemistry and Center of Excellence in Biomaterials, Faculty of Science (Thailand); Vilaivan, Tirayut [Chulalongkorn University, Department of Chemistry, Organic Synthesis Research Unit, Faculty of Science (Thailand); Nakkuntod, Maliwan [Naresuan University, Department of Biology, Faculty of Science (Thailand); Rutnakornpituk, Metha, E-mail: methar@nu.ac.th [Naresuan University, Department of Chemistry and Center of Excellence in Biomaterials, Faculty of Science (Thailand)
2016-09-15
Magnetite nanoparticles (MNPs) were surface-modified with anionic poly(N-acryloyl glycine) (PNAG) and streptavidin for specific interaction with biotin-conjugated pyrrolidinyl peptide nucleic acid (PNA). The hydrodynamic size (D_h) of PNAG-grafted MNPs varied from 334 to 496 nm depending on the loading ratio of MNP to NAG in the reaction. UV–visible and fluorescence spectrophotometries were used to confirm the successful immobilization of streptavidin and PNA on the MNPs. About 291 pmol of PNA per mg of MNP was immobilized on the particle surface. The PNA-functionalized MNPs were effectively used as solid supports to differentiate between fully complementary and non-complementary/single-base-mismatch DNA using the PNA probe. These novel anionic MNPs can be efficiently applied as a magnetically guidable support for DNA base discrimination.
DNA-Based Single-Molecule Electronics: From Concept to Function
2018-01-01
Beyond being the repository of genetic information, DNA is playing an increasingly important role as a building block for molecular electronics. Its inherent structural and molecular recognition properties render it a leading candidate for molecular electronics applications. The structural stability, diversity and programmability of DNA provide overwhelming freedom for the design and fabrication of molecular-scale devices. In the past two decades DNA has therefore attracted inordinate amounts of attention in molecular electronics. This review gives a brief survey of recent experimental progress in DNA-based single-molecule electronics with special focus on single-molecule conductance and I–V characteristics of individual DNA molecules. Existing challenges and exciting future opportunities are also discussed. PMID:29342091
Satellite DNA-based artificial chromosomes for use in gene therapy.
Hadlaczky, G
2001-04-01
Satellite DNA-based artificial chromosomes (SATACs) can be made by induced de novo chromosome formation in cells of different mammalian species. These artificially generated accessory chromosomes are composed of predictable DNA sequences and they contain defined genetic information. Prototype human SATACs have been successfully constructed in different cell types from 'neutral' endogenous DNA sequences from the short arm of the human chromosome 15. SATACs have already passed a number of hurdles crucial to their further development as gene therapy vectors, including: large-scale purification; transfer of purified artificial chromosomes into different cells and embryos; generation of transgenic animals and germline transmission with purified SATACs; and the tissue-specific expression of a therapeutic gene from an artificial chromosome in the milk of transgenic animals.
International Nuclear Information System (INIS)
Khadsai, Sudarat; Rutnakornpituk, Boonjira; Vilaivan, Tirayut; Nakkuntod, Maliwan; Rutnakornpituk, Metha
2016-01-01
Magnetite nanoparticles (MNPs) were surface modified with anionic poly(N-acryloyl glycine) (PNAG) and streptavidin for specific interaction with biotin-conjugated pyrrolidinyl peptide nucleic acid (PNA). Hydrodynamic size (Dh) of PNAG-grafted MNPs varied from 334 to 496 nm depending on the loading ratio of the MNP to NAG in the reaction. UV–visible and fluorescence spectrophotometries were used to confirm the successful immobilization of streptavidin and PNA on the MNPs. About 291 pmol of PNA/mg MNP was immobilized on the particle surface. The PNA-functionalized MNPs were effectively used as solid supports to differentiate between fully complementary and non-complementary/single-base mismatch DNA using the PNA probe. These novel anionic MNPs are efficiently applicable as a magnetically guidable support for DNA base discrimination.
International Nuclear Information System (INIS)
Sharma, Geeta K.
2011-01-01
In the emerging field of nanoscience and nanotechnology, tremendous focus has been placed by researchers on exploring the applications of nanomaterials for human welfare by converting research findings into technology. Examples include the use of nanoparticles in opto-electronics, fuel cells, medicine and catalysis. These wide applications and their significance lie in the fact that nanoparticles possess unique physical and chemical properties very different from those of their bulk precursors. Numerous methods for the synthesis of noble nanoparticles with tunable shape and size have been reported in the literature. The goal of our group is to use different methods of synthesis of noble metal nanoparticles (Au, Ag, Pt and Pd), to test their protective/damaging role towards DNA base damage induced by ionizing radiation (Au and Ag), and to test the catalytic activity of nanoparticles (Pt and Pd) in certain known organic synthesis/electron transfer reactions. Using radiation chemical techniques such as pulse radiolysis and steady state radiolysis, complemented by product analysis using HPLC/LC-MS, a detailed mechanism for the formation of transient species and the kinetics leading to the formation of stable end products is studied for DNA base damage induced by ionizing radiation in the presence and absence of Au and Ag nanoparticles. Unraveling the complex interaction between catalysts and reactants under operando conditions is a key step towards gaining fundamental insight in catalysis. The catalytic activity of Pt and Pd nanoparticles in electron transfer and Suzuki coupling reactions has been determined. Investigations are currently underway to gain insight into the interaction between catalysts and reactants using time-resolved spectroscopic measurements. These studies will be detailed during the presentation. (author)
Solving Differential Equations in R: Package deSolve
Soetaert, K.E.R.; Petzoldt, T.; Setzer, R.W.
2010-01-01
In this paper we present the R package deSolve to solve initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1 and partial differential equations (PDE), the latter solved using the method of lines approach.
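The kind of initial value problem deSolve handles can be illustrated outside R as well. Below is a hedged sketch in plain Python (a hand-rolled fixed-step RK4 integrator, not the deSolve package itself), applied to dy/dt = -y with exact solution y(t) = e^(-t):

```python
import math

def rk4(f, y0, t0, t1, n):
    """Integrate dy/dt = f(t, y) from t0 to t1 with n fixed RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# dy/dt = -y, y(0) = 1 has the exact solution y(t) = exp(-t).
y1 = rk4(lambda t, y: -y, 1.0, 0.0, 1.0, 100)
print(abs(y1 - math.exp(-1)))  # error well below 1e-6
```

Packages such as deSolve wrap far more sophisticated adaptive solvers behind the same "define f, hand it to an integrator" interface sketched here.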
Quantum Computing for Computer Architects
Metodi, Tzvetan
2011-01-01
Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer.
High School Students' Use of Meiosis When Solving Genetics Problems.
Wynne, Cynthia F.; Stewart, Jim; Passmore, Cindy
2001-01-01
Paints a different picture of students' reasoning with meiosis as they solved complex, computer-generated genetics problems, some of which required them to revise their understanding of meiosis in response to anomalous data. Students were able to develop a rich understanding of meiosis and to utilize that knowledge to solve genetics problems…
Logo Programming, Problem Solving, and Knowledge-Based Instruction.
Swan, Karen; Black, John B.
The research reported in this paper was designed to investigate the hypothesis that computer programming may support the teaching and learning of problem solving, but that to do so, problem solving must be explicitly taught. Three studies involved students in several grades: 4th, 6th, 8th, 11th, and 12th. Findings collectively show that five…
Problem solving and Program design using the TI-92
Ir.ing. Ton Marée; ir Martijn van Dongen
2000-01-01
This textbook is intended for a basic course in problem solving and program design needed by scientists and engineers using the TI-92. The TI-92 is an extremely powerful problem solving tool that can help you manage complicated problems quickly. We assume no prior knowledge of computers.
Examining Multiscale Movement Coordination in Collaborative Problem Solving
DEFF Research Database (Denmark)
Wiltshire, Travis; Steffensen, Sune Vork
2017-01-01
During collaborative problem solving (CPS), coordination occurs at different spatial and temporal scales. This multiscale coordination should, at least on some scales, play a functional role in facilitating effective collaboration outcomes. To evaluate this, we conducted a study of computer...
Fostering Information Problem Solving Skills Through Completion Problems and Prompts
Frerejean, Jimmy; Brand-Gruwel, Saskia; Kirschner, Paul A.
2012-01-01
Frerejean, J., Brand-Gruwel, S., & Kirschner, P. A. (2012, September). Fostering Information Problem Solving Skills Through Completion Problems and Prompts. Poster presented at the EARLI SIG 6 & 7 "Instructional Design" and "Learning and Instruction with Computers", Bari, Italy.
van Hoorn, Jelke J.; Nogueira, Agustín; Ojea, Ignacio; Gromicho Dos Santos, Joaquim Antonio
2017-01-01
In [1] an algorithm is proposed for solving the job-shop scheduling problem optimally using a dynamic programming strategy. This is, to our knowledge, the first exact algorithm for the job-shop problem that is not based on integer linear programming and branch and bound.
International Nuclear Information System (INIS)
Ethier, C.R.
2004-01-01
Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)
Solved problems in electromagnetics
Salazar Bloise, Félix; Bayón Rojo, Ana; Gascón Latasa, Francisco
2017-01-01
This book presents the fundamental concepts of electromagnetism through problems, with a brief theoretical introduction at the beginning of each chapter. The present book has a strong didactic character. It explains all the mathematical steps and the theoretical concepts connected with the development of the problem. It guides the reader to understand the employed procedures and to learn to solve the exercises independently. The exercises are structured in a similar way: the chapters begin with easy problems, increasing progressively in level of difficulty. This book is written for students of physics and engineering in the framework of the new European Plans of Study for Bachelor and Master, and also for tutors and lecturers.
Solved problems in electrochemistry
International Nuclear Information System (INIS)
Piron, D.L.
2004-01-01
This book presents calculated solutions to problems in fundamental and applied electrochemistry. It uses industrial data to illustrate scientific concepts and scientific knowledge to solve practical problems. It is subdivided into three parts. The first uses modern basic concepts, the second studies the scientific basis for electrode and electrolyte thermodynamics (including E-pH diagrams and the minimum energy involved in transformations) and the kinetics of rate processes (including the energy lost in heat and in parasite reactions). The third part treats larger problems in electrolysis and power generation, as well as in corrosion and its prevention. Each chapter includes three sections: the presentation of useful principles; some twenty problems with their solutions; and a set of unsolved problems.
Solving multiconstraint assignment problems using learning automata.
Horn, Geir; Oommen, B John
2010-02-01
This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where elements that are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index, which is inappropriate for the type of multiconstraint problems considered here, where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem. First, a fixed-structure stochastic automata algorithm is presented, in which the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. The three VSSA algorithms model the processes as automata having, first, the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes.
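The learning-automaton machinery underlying such algorithms can be sketched with a minimal two-action linear reward-inaction (L_RI) automaton. This is a hedged textbook illustration, not the authors' mapping algorithms; the environment, parameters, and names are assumptions for the sketch:

```python
import random

def l_ri(reward_probs, a=0.05, steps=5000, seed=1):
    """Two-action linear reward-inaction (L_RI) automaton.

    On reward for the chosen action i: p_i += a * (1 - p_i),
    and the other probability is renormalized. On penalty: no change.
    """
    rng = random.Random(seed)
    p = [0.5, 0.5]
    for _ in range(steps):
        i = 0 if rng.random() < p[0] else 1
        if rng.random() < reward_probs[i]:  # environment rewards action i
            p[i] += a * (1 - p[i])
            p[1 - i] = 1 - p[i]
    return p

# Action 0 is always rewarded, action 1 never: p[0] is driven toward 1.
p = l_ri([1.0, 0.0])
print(p[0])  # close to 1
```

In the paper's setting the "actions" would be candidate hosting nodes (or peer processes) rather than this toy two-armed environment.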
Effects of sampling conditions on DNA-based estimates of American black bear abundance
Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.
2013-01-01
DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and the probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and the accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that the 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression.
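The mark-recapture logic behind such abundance estimates can be illustrated with the simple two-occasion Chapman estimator. This is a hedged textbook sketch with made-up numbers; the study itself fits heterogeneity mixture models in Program MARK, not this closed-form formula:

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator.

    n1: animals marked (e.g., genotyped from hair snares) on occasion 1
    n2: animals sampled on occasion 2
    m2: recaptures (marked animals detected again on occasion 2)
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical example: 100 bears detected on each of two occasions,
# 25 of them on both, suggests a population of roughly 391.
print(round(chapman_estimate(100, 100, 25), 1))  # 391.3
```

Heterogeneity in capture probability (the pA/pB mixtures above) biases this simple estimator low, which is precisely why the mixture models the abstract describes are needed.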
Exploiting Quantum Resonance to Solve Combinatorial Problems
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
An Integer Programming Approach to Solving Tantrix on Fixed Boards
Directory of Open Access Journals (Sweden)
Yushi Uno
2012-03-01
Tantrix (Tantrix® is a registered trademark of Colour of Strategy Ltd. in New Zealand and of TANTRIX JAPAN in Japan, under the license of M. McManaway, the inventor) is a puzzle in which a loop is made by connecting lines drawn on hexagonal tiles, and the objective of this research is to solve it by computer. For this purpose, we first pose the problem of solving Tantrix as making a loop on a given fixed board. We then formulate it as an integer program by describing the rules of Tantrix as its constraints, and solve it with a mathematical programming solver to obtain a solution. As a result, we establish a formulation that can solve Tantrix of moderate size, and even when the solutions obtained under only the elementary constraints are invalid, we achieve valid ones by introducing additional constraints and re-solving. With this approach we succeeded in solving Tantrix of size up to 60.
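The formulate-then-solve pattern described above can be sketched generically. As a hedged toy (an exhaustive 0/1 search over a tiny set-cover model, standing in for the paper's Tantrix formulation and for a real mathematical programming solver):

```python
from itertools import product

def solve_binary_ip(costs, A, rhs):
    """Minimize costs.x over binary x subject to A x >= rhs, by enumeration.

    A stand-in for a mathematical programming solver at toy scale:
    the puzzle rules become linear constraints, the solver returns x.
    """
    best, best_x = None, None
    for x in product((0, 1), repeat=len(costs)):
        feasible = all(sum(a * xi for a, xi in zip(row, x)) >= b
                       for row, b in zip(A, rhs))
        if feasible:
            cost = sum(c * xi for c, xi in zip(costs, x))
            if best is None or cost < best:
                best, best_x = cost, x
    return best, best_x

# Toy rules: three candidate sets, every element must be covered once.
A = [[1, 1, 0],   # element 1 lies in sets 1 and 2
     [0, 1, 1],   # element 2 lies in sets 2 and 3
     [1, 0, 1]]   # element 3 lies in sets 1 and 3
cost, x = solve_binary_ip([1, 1, 1], A, [1, 1, 1])
print(cost, x)  # two of the three sets suffice
```

At realistic sizes one would hand the same constraint matrix to an IP solver rather than enumerate, exactly as the paper does for the Tantrix rules.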
DNA-based identification of mixed-organism samples offers the potential to greatly reduce the need for resource-intensive morphological identification, which would be of value both to biotic condition assessment and non-native species early-detection monitoring. However, the abi...
Rosati, F.; Roelfes, J.G.
A structure-activity relationship study of the first-generation ligands for the DNA-based asymmetric hydration of enones and the Diels-Alder reaction in water is reported. The design of the ligand was optimized, resulting in a maximum ee of 83% in the hydration reaction and 75% in the Diels-Alder reaction.
Traditional environmental mold analysis is based-on microscopic observations and counting of mold structures collected from the air on a sticky surface or culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...
Proton tunneling in the A∙T Watson-Crick DNA base pair: myth or reality?
Brovarets', Ol'ha O; Hovorun, Dmytro M
2015-01-01
The results and conclusions reached by Godbeer et al. in their recent work, that proton tunneling in the A∙T(WC) Watson-Crick (WC) DNA base pair occurs according to Löwdin's (L) model but with a small (~10(-9)) probability, were critically analyzed. Here, it was shown that this finding overestimates the possibility of proton tunneling at the A∙T(WC)↔A*∙T*(L) tautomerization, because this process cannot be implemented as a chemical reaction. Furthermore, those biologically important nucleobase mispairs (A∙A*↔A*∙A, G∙G*↔G*∙G, T∙T*↔T*∙T, C∙C*↔C*∙C, H∙H*↔H*∙H (H - hypoxanthine)), the players in the field of spontaneous point mutagenesis, were outlined for which proton tunneling is expected and for which the application of the model proposed by Godbeer et al. can be productive.
Solid state radiation chemistry of co-crystallized DNA base pairs studied with EPR and ENDOR
International Nuclear Information System (INIS)
Nelson, W.H.; Nimmala, S.; Hole, E.O.; Sagstuen, E.; Close, D.M.
1995-01-01
For a number of years, the authors' group has focused on the identification of radicals formed from x-irradiation of DNA components by applying EPR and ENDOR spectroscopic techniques to samples in the form of single crystals. With single crystals as samples, it is possible to use the detailed packing and structural information available from x-ray or neutron diffraction reports. This report summarizes results from two crystal systems in which DNA bases are paired by hydrogen bonding. Extensive results are available from one of these, 1-methylthymine:9-methyladenine (MTMA), in which the base pairing is in the Hoogsteen configuration. Although this configuration is different from the Watson-Crick configuration found in DNA, the hydrogen bond between T(O4) and A(NH2) is nonetheless present. Although MTMA crystals have been studied previously, the objective was to apply the high-resolution technique of ENDOR to crystals irradiated and studied at temperatures of 10 K or lower in an effort to obtain direct evidence for specific proton transfers. The second system, from which the results are only preliminary, is 9-ethylguanine:1-methyl-5-fluorocytosine (GFC), in which the G:C base pair is in the Watson-Crick configuration. Both crystal systems are anhydrous, so the results include no possible effects from water interactions.
Akdemir, Hülya; Suzerer, Veysel; Tilkat, Engin; Onay, Ahmet; Çiftçi, Yelda Ozden
2016-12-01
Determination of the genetic stability of in vitro-grown plantlets is needed for safe and large-scale production of mature trees. In this study, genetic variation of long-term micropropagated mature pistachio developed through direct shoot bud regeneration using apical buds (protocol A) and in vitro-derived leaves (protocol B) was assessed via DNA-based molecular markers. Randomly amplified polymorphic DNA (RAPD), inter-simple sequence repeat (ISSR), and amplified fragment length polymorphism (AFLP) markers were employed, and the PIC values obtained from RAPD (0.226), ISSR (0.220), and AFLP (0.241) showed that micropropagation of pistachio for different periods of time resulted in "reasonable polymorphism" among the donor plant and its 18 clones. Mantel's test showed a consistent polymorphism level between marker systems based on similarity matrices. In conclusion, this is the first study on the occurrence of genetic variability in long-term micropropagated mature pistachio plantlets. The obtained results clearly indicate that the different marker approaches used in this study are reliable for assessing tissue culture-induced variations in long-term cultured pistachio plantlets.
Reversible Modulation of DNA-Based Hydrogel Shapes by Internal Stress Interactions.
Hu, Yuwei; Kahn, Jason S; Guo, Weiwei; Huang, Fujian; Fadeev, Michael; Harries, Daniel; Willner, Itamar
2016-12-14
We present the assembly of asymmetric two-layer hybrid DNA-based hydrogels revealing stimuli-triggered, reversibly modulated shape transitions. Asymmetric, linear hydrogels that include layer-selective switchable stimuli-responsive elements that control the hydrogel stiffness are designed. Trigger-induced stress in one of the layers results in the bending of the linear hybrid structure, thereby minimizing the elastic free energy of the system. The removal of the stress by a counter-trigger restores the original linear bilayer hydrogel. The stiffness of the DNA hydrogel layers is controlled by thermal, pH (i-motif), K+ ion/crown ether (G-quadruplexes), chemical (pH-doped polyaniline), or biocatalytic (glucose oxidase/urease) triggers. A theoretical model relating the experimental bending radii of curvature of the hydrogels to the Young's moduli and geometrical parameters of the hydrogels is provided. Promising applications of shape-regulated stimuli-responsive asymmetric hydrogels include their use as valves, actuators, sensors, and drug delivery devices.
Self-assembling of calcium salt of the new DNA base 5-carboxylcytosine
Energy Technology Data Exchange (ETDEWEB)
Irrera, Simona [Department of Chemistry, SAPIENZA University of Rome, Piazzale A. Moro 5, 00185 Rome (Italy); Department of Chemistry, University College London, 20 Gordon Street, WC1H0AJ London (United Kingdom); Ruiz-Hernandez, Sergio E. [School of Chemistry, Cardiff University Main Building, Park Place, CF103AT Cardiff (United Kingdom); Reggente, Melania [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Passeri, Daniele, E-mail: daniele.passeri@uniroma1.it [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Natali, Marco [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Gala, Fabrizio [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Department of Medical-Surgical, Techno-Biomedical Sciences and Translational Medicine of SAPIENZA University of Rome, Sant’Andrea Hospital, Rome (Italy); Zollo, Giuseppe [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Rossi, Marco [Department of Basic and Applied Sciences for Engineering, SAPIENZA University of Rome, Via A. Scarpa 16, 00161 Rome (Italy); Research Center for Nanotechnology applied to Engineering of SAPIENZA University of Rome (CNIS), Piazzale A. Moro 5, 00185 Rome (Italy); Portalone, Gustavo, E-mail: gustavo.portalone@uniroma1.it [Department of Chemistry, SAPIENZA University of Rome, Piazzale A. Moro 5, 00185 Rome (Italy)
2017-06-15
Highlights: • Ca salt of 5-carboxylcytosine has been deposited on an HOPG substrate. • Molecules self-assembled into monolayers and filaments. • Heights of the features were measured by atomic force microscopy. • Ab-initio calculations confirmed the AFM results. - Abstract: Supramolecular architectures involving DNA bases can have a strong impact in several fields such as nanomedicine and nanodevice manufacturing. To date, in addition to the four canonical nucleobases (adenine, thymine, guanine and cytosine), four other forms of cytosine modified at the 5 position have been identified in DNA. Among these four new cytosine derivatives, 5-carboxylcytosine has recently been discovered in mammalian stem cell DNA and proposed as the final product of the oxidative epigenetic demethylation pathway at the 5 position of cytosine. In this work, a calcium salt of 5-carboxylcytosine has been synthesized and deposited on a graphite surface, where it forms self-assembled features such as long-range monolayers and filaments up to one micron long. These structures have been analyzed in detail by combining different theoretical and experimental approaches: X-ray single-crystal diffraction data were used to simulate the molecule-graphite interaction, first using molecular dynamics and then refining the results using density functional theory (DFT); finally, the data obtained with DFT were used to rationalize the atomic force microscopy (AFM) results.
Mitochondrial DNA-based identification of some forensically important blowflies in Thailand.
Preativatanyou, Kanok; Sirisup, Nantana; Payungporn, Sunchai; Poovorawan, Yong; Thavara, Usavadee; Tawatsin, Apiwat; Sungpradit, Sivapong; Siriyasatien, Padet
2010-10-10
Accurate identification of insects collected from death scenes provides not only specific developmental data assisting forensic entomologists in determining the postmortem interval more precisely but also other kinds of forensic evidence. However, morphological identification can be complicated by the similarity among species, especially in the early larval stages. To make species identification simpler, more practical and more reliable, DNA-based identification is preferred. In this study, we demonstrate the application of partial mitochondrial cytochrome oxidase I (COI) and cytochrome oxidase II (COII) sequences for differentiation of forensically important blowflies in Thailand, Chrysomya megacephala, Chrysomya rufifacies and Lucilia cuprina, by polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP). The PCR yields a single 1324 bp amplicon in all blowfly specimens, followed by direct DNA sequencing. Digestion with Taq(α)I and VspI, with cleavage sites predicted from the sequencing data, provides different RFLP profiles among these three species. Sequence analysis reveals no significant intraspecific divergence in blowfly specimens captured from different geographical regions of Thailand. Accordingly, a neighbor-joining tree using Kimura's 2-parameter model illustrates reciprocal monophyly between species. Thus, these approaches serve as promising tools for molecular identification of these three common forensically important blowfly species in Thailand. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Aleksandra Delplanque
Lanthanide-doped nanoparticles are of considerable interest for biodetection and bioimaging techniques thanks to their unique chemical and optical properties. As a sensitive luminescence material, they can be used as (bio) probes in Förster Resonance Energy Transfer (FRET) where trivalent lanthanide ions (La3+) act as energy donors. In this paper we present an efficient method to transfer ultrasmall (ca. 8 nm) NaYF4 nanoparticles dispersed in organic solvent to an aqueous solution via oxidation of the oleic acid ligand. Nanoparticles were then functionalized with single strand DNA oligomers (ssDNA) by inducing covalent bonds between surface carboxylic groups and a 5' amine modified-ssDNA. Hybridization with the 5' fluorophore (Cy5) modified complementary ssDNA strand demonstrated the specificity of binding and allowed the fine control over the distance between Eu3+ ions doped nanoparticle and the fluorophore by varying the number of the dsDNA base pairs. First, our results confirmed nonradiative resonance energy transfer and demonstrate the dependence of its efficiency on the distance between the donor (Eu3+) and the acceptor (Cy5) with sensitivity at a nanometre scale.
Delplanque, Aleksandra; Wawrzynczyk, Dominika; Jaworski, Pawel; Matczyszyn, Katarzyna; Pawlik, Krzysztof; Buckle, Malcolm; Nyk, Marcin; Nogues, Claude; Samoc, Marek
2015-01-01
Lanthanide-doped nanoparticles are of considerable interest for biodetection and bioimaging techniques thanks to their unique chemical and optical properties. As a sensitive luminescence material, they can be used as (bio) probes in Förster Resonance Energy Transfer (FRET) where trivalent lanthanide ions (La3+) act as energy donors. In this paper we present an efficient method to transfer ultrasmall (ca. 8 nm) NaYF4 nanoparticles dispersed in organic solvent to an aqueous solution via oxidation of the oleic acid ligand. Nanoparticles were then functionalized with single strand DNA oligomers (ssDNA) by inducing covalent bonds between surface carboxylic groups and a 5' amine modified-ssDNA. Hybridization with the 5' fluorophore (Cy5) modified complementary ssDNA strand demonstrated the specificity of binding and allowed the fine control over the distance between Eu3+ ions doped nanoparticle and the fluorophore by varying the number of the dsDNA base pairs. First, our results confirmed nonradiative resonance energy transfer and demonstrate the dependence of its efficiency on the distance between the donor (Eu3+) and the acceptor (Cy5) with sensitivity at a nanometre scale.
Focke, Felix; Haase, Ilka; Fischer, Markus
2011-01-26
Usually spices are identified morphologically using simple methods like magnifying glasses or microscopic instruments. On the other hand, molecular biological methods like the polymerase chain reaction (PCR) enable accurate and specific detection even in complex matrices. Generally, the origins of spices are plants with diverse genetic backgrounds and relationships. The processing methods used for the production of spices are complex and individual. Consequently, the development of a reliable DNA-based method for spice analysis is a challenging task. However, once established, such a method can easily be adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices were developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.
Slow elimination of injured liver DNA bases of γ-irradiated old mice
International Nuclear Information System (INIS)
Gaziev, A.I.; Malakhov, L.V.; Fomenko, L.A.
1982-01-01
The paper presents a study of the elimination of injured bases from the liver DNA of old and young mice after exposure to γ rays. The data show that if DNA from the liver of irradiated mice is treated with incision enzymes, its priming activity is increased. In the case of enzymatic treatment of DNA isolated 5 h after irradiation, we find a great difference between the priming activity of the liver DNA of old and young mice. The reason for this difference is that the liver DNA of 20-month-old mice still has many unrepaired injured bases 5 h after irradiation. These data indicate that the rate of incision of γ-injured DNA bases in the liver of old mice is lower than in the liver of young mice. In the liver of mice of different ages, the rate of restitution of DNA single-strand breaks induced by γ rays in doses up to 100 Gy is the same. At the same time, the level of induced reparative synthesis of DNA in cells of an old organism is lower than in cells of a young organism. The obtained data suggest that the reduced rate of elimination of modified bases from the cell DNA of 20-month-old mice is due to reduced activity of the DNA repair enzymes or to restrictions within the chromatin on the access of these enzymes to the injured regions of DNA in the cells of old animals.
De novo formed satellite DNA-based mammalian artificial chromosomes and their possible applications.
Katona, Robert L
2015-02-01
Mammalian artificial chromosomes (MACs) are non-integrating, autonomously replicating, natural chromosome-based vectors that can carry a vast amount of genetic material. They enable potentially prolonged, safe, and regulated therapeutic transgene expression, which makes MACs attractive genetic vectors for "gene replacement" or for controlling differentiation pathways in target cells. Satellite-DNA-based artificial chromosomes (SATACs) can be made by induced de novo chromosome formation in cells of different mammalian and plant species. These artificially generated accessory chromosomes are composed of predictable DNA sequences and contain defined genetic information. SATACs have already passed a number of obstacles crucial to their further development as gene therapy vectors, including large-scale purification, transfer of purified artificial chromosomes into different cells and embryos, generation of transgenic animals and germline transmission with purified SATACs, and tissue-specific expression of a therapeutic gene from an artificial chromosome in the milk of transgenic animals. SATACs could also be used in cell therapy protocols. For these methods, the most versatile target cell would be one that is pluripotent and self-renewing, so as to address multiple disease target cell types; this makes multilineage stem cells, such as adult-derived early progenitor cells and embryonic stem cells, attractive universal host cells.
A Constant Rate of Spontaneous Mutation in DNA-Based Microbes
Drake, John W.
1991-08-01
In terms of evolution and fitness, the most significant spontaneous mutation rate is likely to be that for the entire genome (or its nonfrivolous fraction). Information is now available to calculate this rate for several DNA-based haploid microbes, including bacteriophages with single- or double-stranded DNA, a bacterium, a yeast, and a filamentous fungus. Their genome sizes vary by ≈6500-fold. Their average mutation rates per base pair vary by ≈16,000-fold, whereas their mutation rates per genome vary by only ≈2.5-fold, apparently randomly, around a mean value of 0.0033 per DNA replication. The average mutation rate per base pair is inversely proportional to genome size. Therefore, a nearly invariant microbial mutation rate appears to have evolved. Because this rate is uniform in such diverse organisms, it is likely to be determined by deep general forces, perhaps by a balance between the usually deleterious effects of mutation and the physiological costs of further reducing mutation rates.
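The near-constancy reported above (per-base-pair rates varying ≈16,000-fold but per-genome rates only ≈2.5-fold) is easy to check numerically: the per-genome rate is just the per-base-pair rate multiplied by genome size. The sketch below uses rough, illustrative genome sizes and per-bp rates, not the exact figures from the paper.

```python
# Illustrative check of the paper's observation: per-genome mutation rate
# (per-bp rate x genome size) clusters near ~0.003 per DNA replication
# even though genome sizes and per-bp rates each span orders of magnitude.
# All numbers below are rough illustrative values, not the paper's data.

organisms = {
    # name: (genome size in bp, mutation rate per bp per replication)
    "phage M13":     (6.4e3, 7.2e-7),
    "phage T4":      (1.7e5, 2.0e-8),
    "E. coli":       (4.6e6, 5.4e-10),
    "S. cerevisiae": (1.2e7, 2.2e-10),
}

for name, (genome_bp, rate_per_bp) in organisms.items():
    per_genome = genome_bp * rate_per_bp
    print(f"{name:14s} per-genome rate = {per_genome:.4f}")
```

Despite the ≈2000-fold spread in genome size among these toy entries, every per-genome product lands within a few-fold of 0.003, mirroring the inverse proportionality between per-bp rate and genome size described in the abstract.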
Dihydropyridines decrease X-ray-induced DNA base damage in mammalian cells
Energy Technology Data Exchange (ETDEWEB)
Wojewodzka, M., E-mail: marylaw@ichtj.waw.pl [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland); Gradzka, I.; Buraczewska, I.; Brzoska, K.; Sochanowicz, B. [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland); Goncharova, R.; Kuzhir, T. [Institute of Genetics and Cytology, Belarussian National Academy of Sciences, Minsk (Belarus); Szumiel, I. [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland)
2009-12-01
Compounds with the structural motif of 1,4-dihydropyridine display a broad spectrum of biological activities, often defined as bioprotective. Among them are L-type calcium channel blockers; however, derivatives that do not block calcium channels also exert various effects at the cellular and organismal levels. We examined the effect of sodium 3,5-bis-ethoxycarbonyl-2,6-dimethyl-1,4-dihydropyridine-4-carboxylate (denoted here as DHP and previously also as AV-153) on X-ray-induced DNA damage and mutation frequency at the HGPRT (hypoxanthine-guanine phosphoribosyl transferase) locus in Chinese hamster ovary CHO-K1 cells. Using the formamidopyrimidine-DNA glycosylase (FPG) comet assay, we found that 1-h DHP (10 nM) treatment before X-irradiation considerably reduced the initial level of FPG-recognized DNA base damage, consistent with decreased 8-oxo-7,8-dihydro-2'-deoxyguanosine content and a mutation frequency lowered by about 40%. No effect on single-strand break rejoining or on cell survival was observed. A similar base damage-protective effect was observed for two calcium channel blockers: nifedipine (structurally similar to DHP) and verapamil (structurally unrelated). So far, the specificity of the DHP-caused reduction in DNA damage, practically limited to base damage, has no satisfactory explanation.
Advanced DNA-Based Point-of-Care Diagnostic Methods for Plant Diseases Detection
Directory of Open Access Journals (Sweden)
Han Yih Lau
2017-12-01
Diagnostic technologies for the detection of plant pathogens with point-of-care capability and high multiplexing ability are an essential tool in the fight to reduce the large agricultural production losses caused by plant diseases. The main desirable characteristics for such diagnostic assays are high specificity, sensitivity, reproducibility, speed, cost efficiency, and high-throughput multiplex detection capability. This article describes and discusses various DNA-based point-of-care diagnostic methods for applications in plant disease detection. Polymerase chain reaction (PCR) is the most common DNA amplification technology used for detecting various plant and animal pathogens. However, following PCR-based assays, several other nucleic acid amplification technologies have been developed that achieve higher sensitivity and more rapid detection and that are suitable for field applications, such as loop-mediated isothermal amplification, helicase-dependent amplification, rolling circle amplification, recombinase polymerase amplification, and molecular inversion probes. The principles behind these technologies have been thoroughly discussed in several review papers; herein we emphasize the application of these technologies to detect plant pathogens by outlining the advantages and disadvantages of each technology in detail.
Self-assembling of calcium salt of the new DNA base 5-carboxylcytosine
Irrera, Simona; Ruiz-Hernandez, Sergio E.; Reggente, Melania; Passeri, Daniele; Natali, Marco; Gala, Fabrizio; Zollo, Giuseppe; Rossi, Marco; Portalone, Gustavo
2017-06-01
Supramolecular architectures involving DNA bases can have a strong impact in several fields such as nanomedicine and nanodevice manufacturing. To date, in addition to the four canonical nucleobases (adenine, thymine, guanine and cytosine), four other forms of cytosine modified at the 5 position have been identified in DNA. Among these four new cytosine derivatives, 5-carboxylcytosine has recently been discovered in mammalian stem cell DNA and proposed as the final product of the oxidative epigenetic demethylation pathway at the 5 position of cytosine. In this work, a calcium salt of 5-carboxylcytosine has been synthesized and deposited on a graphite surface, where it forms self-assembled features such as long-range monolayers and filaments up to one micron long. These structures have been analyzed in detail by combining different theoretical and experimental approaches: X-ray single-crystal diffraction data were used to simulate the molecule-graphite interaction, first using molecular dynamics and then refining the results using density functional theory (DFT); finally, the DFT results were used to rationalize atomic force microscopy (AFM) observations.
Horses for courses: a DNA-based test for race distance aptitude in thoroughbred racehorses.
Hill, Emmeline W; Ryan, Donal P; MacHugh, David E
2012-12-01
Variation at the myostatin (MSTN) gene locus has been shown to influence racing phenotypes in Thoroughbred horses, and in particular, early skeletal muscle development and the aptitude for racing at short distances. Specifically, a single nucleotide polymorphism (SNP) in the first intron of MSTN (g.66493737C/T) is highly predictive of best race distance among Flat racing Thoroughbreds: homozygous C/C horses are best suited to short distance races, heterozygous C/T horses are best suited to middle distance races, and homozygous T/T horses are best suited to longer distance races. Patent applications for this gene marker association, and other linked markers, have been filed. The information contained within the patent applications is exclusively licensed to the commercial biotechnology company Equinome Ltd, which provides a DNA-based test to the international Thoroughbred horse racing and breeding industry. The application of this information in the industry enables informed decision making in breeding and racing and can be used to assist selection to accelerate the rate of change of genetic types among distinct populations (Case Study 1) and within individual breeding operations (Case Study 2).
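The genotype-to-aptitude association described above is a simple three-way mapping, which can be sketched directly. This is an illustrative encoding of the categories quoted in the abstract; the function name and the handling of allele order are our own choices, not part of the commercial test.

```python
# Minimal sketch of the reported MSTN g.66493737C/T association in
# Flat racing Thoroughbreds: C/C -> short, C/T -> middle, T/T -> long.
# Distance categories follow the abstract; everything else is illustrative.

def predicted_aptitude(genotype: str) -> str:
    """Map an MSTN SNP genotype to the best-race-distance category."""
    mapping = {
        "C/C": "short distance",
        "C/T": "middle distance",
        "T/C": "middle distance",  # heterozygous, allele order irrelevant
        "T/T": "long distance",
    }
    return mapping[genotype.upper()]

print(predicted_aptitude("C/T"))  # middle distance
```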
Modeling visual problem solving as analogical reasoning.
Lovett, Andrew; Forbus, Kenneth
2017-01-01
We present a computational model of visual problem solving, designed to solve problems from the Raven's Progressive Matrices intelligence test. The model builds on the claim that analogical reasoning lies at the heart of visual problem solving, and intelligence more broadly. Images are compared via structure mapping, aligning the common relational structure in 2 images to identify commonalities and differences. These commonalities or differences can themselves be reified and used as the input for future comparisons. When images fail to align, the model dynamically rerepresents them to facilitate the comparison. In our analysis, we find that the model matches adult human performance on the Standard Progressive Matrices test, and that problems which are difficult for the model are also difficult for people. Furthermore, we show that model operations involving abstraction and rerepresentation are particularly difficult for people, suggesting that these operations may be critical for performing visual problem solving, and reasoning more generally, at the highest level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Sahler, Olle Jane Z.; Sherman, Sandra A.; Fairclough, Diane L.; Butler, Robert W.; Katz, Ernest R.; Dolgin, Michael J.; Varni, James W.; Noll, Robert B.; Phipps, Sean
2009-01-01
Objectives To evaluate the feasibility and efficacy of a handheld personal digital assistant (PDA)-based supplement for maternal Problem-Solving Skills Training (PSST) and to explore Spanish-speaking mothers’ experiences with it. Methods Mothers (n = 197) of children with newly diagnosed cancer were randomized to traditional PSST or PSST + PDA 8-week programs. Participants completed the Social Problem-Solving Inventory-Revised, Beck Depression Inventory-II, Profile of Mood States, and Impact of Event Scale-Revised pre-, post-treatment, and 3 months after completion of the intervention. Mothers also rated optimism, logic, and confidence in the intervention and technology. Results Both groups demonstrated significant positive change over time on all psychosocial measures. No between-group differences emerged. Despite technological “glitches,” mothers expressed moderately high optimism, appreciation for logic, and confidence in both interventions and rated the PDA-based program favorably. Technology appealed to all Spanish-speaking mothers, with younger mothers showing greater proficiency. Conclusions Well-designed, supported technology holds promise for enhancing psychological interventions. PMID:19091804
Schoenfeld's problem solving theory in a student controlled learning environment
Harskamp, E.; Suhre, C.
2007-01-01
This paper evaluates the effectiveness of a student controlled computer program for high school mathematics based on instruction principles derived from Schoenfeld's theory of problem solving. The computer program allows students to choose problems and to make use of hints during different episodes
Hyndman, D E
2013-01-01
Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl
Genomic DNA-based absolute quantification of gene expression in Vitis.
Gambetta, Gregory A; McElrone, Andrew J; Matthews, Mark A
2013-07-01
Many studies in which gene expression is quantified by polymerase chain reaction represent the expression of a gene of interest (GOI) relative to that of a reference gene (RG). Relative expression is founded on the assumptions that RG expression is stable across samples, treatments, organs, etc., and that reaction efficiencies of the GOI and RG are equal; assumptions which are often faulty. The true variability in RG expression and actual reaction efficiencies are seldom determined experimentally. Here we present a rapid and robust method for absolute quantification of expression in Vitis where varying concentrations of genomic DNA were used to construct GOI standard curves. This methodology was utilized to absolutely quantify and determine the variability of the previously validated RG ubiquitin (VvUbi) across three test studies in three different tissues (roots, leaves and berries). In addition, in each study a GOI was absolutely quantified. Data sets resulting from relative and absolute methods of quantification were compared and the differences were striking. VvUbi expression was significantly different in magnitude between test studies and variable among individual samples. Absolute quantification consistently reduced the coefficients of variation of the GOIs by more than half, often resulting in differences in statistical significance and in some cases even changing the fundamental nature of the result. Utilizing genomic DNA-based absolute quantification is fast and efficient. Through eliminating error introduced by assuming RG stability and equal reaction efficiencies between the RG and GOI this methodology produces less variation, increased accuracy and greater statistical power. © 2012 Scandinavian Plant Physiology Society.
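The core of the genomic-DNA-based approach above is a standard curve relating quantification cycle (Cq) to known copy numbers, which is then inverted for unknowns. The sketch below shows the arithmetic under idealized assumptions: a perfectly linear ten-fold dilution series with fabricated Cq values, not data from the study.

```python
# Sketch of absolute quantification from a genomic-DNA standard curve:
# fit Cq = slope * log10(copies) + intercept over the standards, then
# invert the fit to estimate absolute copies for an unknown sample.
# The Cq values below are fabricated for illustration only.
import math

def fit_standard_curve(copies, cqs):
    """Least-squares fit of Cq against log10(copy number)."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cqs))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def absolute_copies(cq, slope, intercept):
    """Invert the standard curve for an unknown sample's Cq."""
    return 10 ** ((cq - intercept) / slope)

# Ten-fold dilution series of genomic DNA (copies) with idealized Cq values
standards = [1e2, 1e3, 1e4, 1e5, 1e6]
cq_values = [33.2, 29.9, 26.6, 23.3, 20.0]  # exactly linear, slope -3.3

slope, intercept = fit_standard_curve(standards, cq_values)
print(f"amplification efficiency = {10 ** (-1 / slope) - 1:.1%}")
print(f"unknown at Cq 25.0 ~ {absolute_copies(25.0, slope, intercept):.0f} copies")
```

A slope near -3.32 corresponds to ~100% amplification efficiency; because the curve is built from genomic DNA of the gene of interest itself, no assumption of equal reaction efficiencies between a gene of interest and a reference gene is needed, which is the point the abstract makes.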
Modification of DNA bases in mammalian chromatin by radiation-generated free radicals
International Nuclear Information System (INIS)
Gajewski, E.; Rao, G.; Nackerdien, Z.; Dizdaroglu, M.
1990-01-01
Modification of DNA bases in mammalian chromatin in aqueous suspension by ionizing radiation generated free radicals was investigated. Argon, air, N2O, and N2O/O2 were used for saturation of the aqueous system in order to provide different radical environments. Radiation doses ranging from 20 to 200 Gy (J.kg-1) were used. Thirteen products resulting from radical interactions with pyrimidines and purines in chromatin were identified and quantitated by using the technique of gas chromatography/mass spectrometry with selected-ion monitoring after acidic hydrolysis and trimethylsilylation of chromatin. The methodology used permitted analysis of the modified bases directly in chromatin without the necessity of isolation of DNA from chromatin first. The results indicate that the radical environment provided by the presence of different gases in the system had a substantial effect on the types of products and their quantities. Some products were produced only in the presence of oxygen, whereas other products were detected only in the absence of oxygen. Products produced under all four gaseous conditions were also observed. Generally, the presence of oxygen in the system increased the yields of the products with the exception of formamidopyrimidines. Superoxide radical formed in the presence of air, and to a lesser extent in the presence of N2O/O2, had no effect on product formation. The presence of oxygen dramatically increased the yields of 8-hydroxypurines, whereas the yields of formamidopyrimidines were not affected by oxygen, although these products result from respective oxidation and reduction of the same hydroxyl-adduct radicals of purines. The yields of the products were much lower than those observed previously with DNA
Phylogenetic analysis and DNA-based species confirmation in Anopheles (Nyssorhynchus).
Directory of Open Access Journals (Sweden)
Peter G Foster
Specimens of neotropical Anopheles (Nyssorhynchus) were collected and identified morphologically. We amplified three genes for phylogenetic analysis: the single-copy nuclear white and CAD genes, and the COI barcode region. Since we had multiple specimens for most species, we were able to test how well the single or combined genes corroborated morphologically defined species by placing the species into exclusive groups. We found that single genes, including the COI barcode region, were poor at confirming species, but that the three genes combined were able to do so much better. This has implications for species identification, species delimitation, and species discovery, and we caution that single genes are not enough. Higher-level groupings were partially resolved, with some well supported, whereas others were found to be either polyphyletic or paraphyletic. There were examples of known groups, such as the Myzorhynchella Section, that were poorly supported with single genes but well supported with combined genes. From this we can infer that more sequence data will be needed to resolve more higher-level groupings with good support. We obtained unambiguously good support (0.94-1.0 Bayesian posterior probability) from all DNA-based analyses for a grouping of An. dunhami with An. nuneztovari and An. goeldii; because of this, and because of morphological similarities, we propose that An. dunhami be included in the Nuneztovari Complex. We obtained phylogenetic corroboration for new species that had been recognised by morphological differences; these will need to be formally described and named.
Development of a multiplex DNA-based traceability tool for crop plant materials.
Voorhuijzen, Marleen M; van Dijk, Jeroen P; Prins, Theo W; Van Hoef, A M Angeline; Seyfarth, Ralf; Kok, Esther J
2012-01-01
The authenticity of food is of increasing importance for producers, retailers and consumers. All groups benefit from the correct labelling of the contents of food products. Producers and retailers want to guarantee the origin of their products and check for adulteration with cheaper or inferior ingredients. Consumers are also more demanding about the origin of their food for various socioeconomic reasons. In contrast to this increasing demand, correct labelling has become much more complex because of global transportation networks of raw materials and processed food products. Within the European integrated research project 'Tracing the origin of food' (TRACE), a DNA-based multiplex detection tool was developed-the padlock probe ligation and microarray detection (PPLMD) tool. In this paper, this method is extended to a 15-plex traceability tool with a focus on products of commercial importance such as the emmer wheat Farro della Garfagnana (FdG) and Basmati rice. The specificity of 14 plant-related padlock probes was determined and initially validated in mixtures comprising seven or nine plant species/varieties. One nucleotide difference in target sequence was sufficient for the distinction between the presence or absence of a specific target. At least 5% FdG or Basmati rice was detected in mixtures with cheaper bread wheat or non-fragrant rice, respectively. The results suggested that even lower levels of (un-)intentional adulteration could be detected. PPLMD has been shown to be a useful tool for the detection of fraudulent/intentional admixtures in premium foods and is ready for the monitoring of correct labelling of premium foods worldwide.
A DNA-based registry for all animal species: the Barcode Index Number (BIN) system.
Directory of Open Access Journals (Sweden)
Sujeevan Ratnasingham
Because many animal species are undescribed, and because the identification of known species is often difficult, interim taxonomic nomenclature has often been used in biodiversity analysis. By assigning individuals to presumptive species, called operational taxonomic units (OTUs), these systems speed investigations into the patterning of biodiversity and enable studies that would otherwise be impossible. Although OTUs have conventionally been separated through their morphological divergence, DNA-based delineations are not only feasible but have important advantages: OTU designation can be automated, data can be readily archived, and results can be easily compared among investigations. This study exploits these attributes to develop a persistent, species-level taxonomic registry for the animal kingdom based on the analysis of patterns of nucleotide variation in the barcode region of the cytochrome c oxidase I (COI) gene. It begins by examining the correspondence between groups of specimens identified to species through prior taxonomic work and those inferred from the analysis of COI sequence variation using one new (RESL) and four established (ABGD, CROP, GMYC, jMOTU) algorithms. It subsequently describes the implementation and structural attributes of the Barcode Index Number (BIN) system. Aside from a pragmatic role in biodiversity assessments, BINs will aid revisionary taxonomy by flagging possible cases of synonymy, and by collating geographical information, descriptive metadata, and images for specimens that are likely to belong to the same species, even if it is undescribed. More than 274,000 BIN web pages are now available, creating a biodiversity resource that is positioned for rapid growth.
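The idea of grouping barcode sequences into OTUs by sequence divergence can be illustrated with a toy clustering routine. This is a deliberately simplified stand-in for RESL/ABGD-style algorithms, using greedy single-linkage clustering on uncorrected p-distances; the threshold and sequences are invented for the example.

```python
# Toy illustration of DNA-based OTU assignment: cluster short "barcode"
# sequences by greedy single-linkage with a fixed p-distance threshold.
# This is NOT an implementation of RESL or the other cited algorithms.

def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def cluster_otus(seqs, threshold):
    """Join each sequence to the first cluster containing a neighbour
    within the threshold; otherwise start a new cluster (OTU)."""
    clusters = []
    for s in seqs:
        for c in clusters:
            if any(p_distance(s, t) <= threshold for t in c):
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters

seqs = [
    "ACGTACGTACACGTACGTAC",  # seq 1
    "ACGTACGTACACGTACGTAT",  # 1 mismatch vs seq 1 (p = 0.05): same OTU
    "TTTTTTTTTTACGTACGTAC",  # highly divergent: its own OTU
]
print(f"{len(cluster_otus(seqs, threshold=0.1))} OTUs")  # 2 OTUs
```

Real barcode OTU delineation works on alignments of the ~650 bp COI region and uses more sophisticated clustering criteria, but the principle of replacing morphological divergence with a nucleotide-divergence rule is the same.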
Solving fault diagnosis problems linear synthesis techniques
Varga, Andreas
2017-01-01
This book addresses fault detection and isolation topics from a computational perspective. Unlike most existing literature, it bridges the gap between the existing well-developed theoretical results and the realm of reliable computational synthesis procedures. The model-based approach to fault detection and diagnosis has been the subject of ongoing research for the past few decades. While the theoretical aspects of fault diagnosis on the basis of linear models are well understood, most of the computational methods proposed for the synthesis of fault detection and isolation filters are not satisfactory from a numerical standpoint. Several features make this book unique in the fault detection literature: Solution of standard synthesis problems in the most general setting, for both continuous- and discrete-time systems, regardless of whether they are proper or not; consequently, the proposed synthesis procedures can solve a specific problem whenever a solution exists Emphasis on the best numerical algorithms to ...
Difficulties in Genetics Problem Solving.
Tolman, Richard R.
1982-01-01
Examined problem-solving strategies of 30 high school students as they solved genetics problems. Proposes a new sequence of teaching genetics based on results: meiosis, sex chromosomes, sex determination, sex-linked traits, monohybrid and dihybrid crosses (humans), codominance (humans), and Mendel's pea experiments. (JN)
Problem Solving, Scaffolding and Learning
Lin, Shih-Yin
2012-01-01
Helping students to construct a robust understanding of physics concepts and develop good problem-solving skills is a central goal in many physics classrooms. This thesis examines students' problem-solving abilities from different perspectives and explores strategies to scaffold students' learning. In studies involving analogical problem solving…
Problem Solving on a Monorail.
Barrow, Lloyd H.; And Others
1994-01-01
This activity was created to address a lack of problem-solving activities for elementary children. A "monorail" activity from the Evening Science Program for K-3 Students and Parents program is presented to illustrate the problem-solving format. Designed for performance at stations by groups of two students. (LZ)
Solving complex fisheries management problems
DEFF Research Database (Denmark)
Petter Johnsen, Jahn; Eliasen, Søren Qvist
2011-01-01
A crucial issue for the new EU common fisheries policy is how to solve the discard problem. Through a study of the institutional set up and the arrangements for solving the discard problem in Denmark, the Faroe Islands, Iceland and Norway, the article identifies the discard problem as related...
Solving ptychography with a convex relaxation
Horstmeyer, Roarke; Chen, Richard Y.; Ou, Xiaoze; Ames, Brendan; Tropp, Joel A.; Yang, Changhuei
2015-05-01
Ptychography is a powerful computational imaging technique that transforms a collection of low-resolution images into a high-resolution sample reconstruction. Unfortunately, algorithms that currently solve this reconstruction problem lack stability, robustness, and theoretical guarantees. Recently, convex optimization algorithms have improved the accuracy and reliability of several related reconstruction efforts. This paper proposes a convex formulation of the ptychography problem. This formulation has no local minima, it can be solved using a wide range of algorithms, it can incorporate appropriate noise models, and it can include multiple a priori constraints. The paper considers a specific algorithm, based on low-rank factorization, whose runtime and memory usage are near-linear in the size of the output image. Experiments demonstrate that this approach offers a 25% lower background variance on average than alternating projections, the ptychographic reconstruction algorithm that is currently in widespread use.
Mathematics for computer graphics
Vince, John
2006-01-01
Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications
Qin, Yulin; Xiang, Jie; Wang, Rifeng; Zhou, Haiyan; Li, Kuncheng; Zhong, Ning
2012-12-01
Newell and Simon postulated that the basic steps in human problem-solving involve iteratively applying operators to transform the state of the problem until a goal is achieved. To examine the neural basis of this framework, the present study focused on the basic processes of human heuristic problem-solving, in which participants identified the current problem state and then recalled and applied the corresponding heuristic rules to change the problem state. A new paradigm, solving simplified Sudoku puzzles, was developed for an event-related functional magnetic resonance imaging (fMRI) study of problem solving. Regions of interest (ROIs), including the left prefrontal cortex, the bilateral posterior parietal cortex, the anterior cingulate cortex, the bilateral caudate nuclei, the bilateral fusiform, as well as the bilateral frontal eye fields, were found to be involved in the task. To obtain convergent evidence, in addition to traditional statistical analysis, we used a multivariate voxel classification method to check the accuracy of predicting the task condition from the blood oxygen level dependent (BOLD) response of the ROIs, using a new classifier developed in this study for fMRI data. To reveal the roles that the ROIs play in problem solving, we developed an ACT-R computational model of the information-processing steps in human problem solving and tried to predict the BOLD response of the ROIs from the task. Advances in human problem-solving research after Newell and Simon are then briefly discussed. © 2012 The Institute of Psychology, Chinese Academy of Sciences and Blackwell Publishing Asia Pty Ltd.
Planning under uncertainty solving large-scale stochastic linear programs
Energy Technology Data Exchange (ETDEWEB)
Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft
1992-12-01
For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but up to recently seemed to be intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
Watanabe, Kohei; Koga, Hajime; Nakamura, Kodai; Fujita, Akiko; Hattori, Akimasa; Matsuda, Masaru; Koga, Akihiko
2014-04-01
DNA-based transposable elements are ubiquitous constituents of eukaryotic genomes. Vertebrates are, however, exceptional in that most of their DNA-based elements appear to be inactivated. The Tol1 element of the medaka fish, Oryzias latipes, is one of the few elements for which copies containing an undamaged gene have been found. Spontaneous transposition of this element in somatic cells has previously been demonstrated, but there is only indirect evidence for its germline transposition. Here, we show direct evidence of spontaneous excision in the germline. Tyrosinase is the key enzyme in melanin biosynthesis. In an albino laboratory strain of medaka fish, which is homozygous for a mutant tyrosinase gene in which a Tol1 copy is inserted, we identified de novo reversion mutations related to melanin pigmentation. The gamete-based reversion rate was as high as 0.4%. The revertant fish carried the tyrosinase gene from which the Tol1 copy had been excised. We previously reported the germline transposition of Tol2, another DNA-based element that is thought to be a recent invader of the medaka fish genome. Tol1 is an ancient resident of the genome. Our results indicate that even an old element can contribute to genetic variation in the host genome as a natural mutator.
Optimal calculational schemes for solving multigroup photon transport problem
International Nuclear Information System (INIS)
Dubinin, A.A.; Kurachenko, Yu.A.
1987-01-01
A complex algorithm for solving the multigroup equation of radiation transport is suggested. The algorithm is based on the method of successive collisions, the method of forward scattering, and the spherical harmonics method, and is realized in the FORAP program (FORTRAN, BESM-6 computer). As an example, results of calculating reactor photon transport in water are presented. The algorithm, suitably modified, may also be used for solving neutron transport problems.
Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli
2017-11-01
Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers mutations cannot be ignored and therefore current models and
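The exceedance-probability statistic advocated above is estimated by simulation: draw the POI's genotypes under the hypothesis that the POI is the MP, compute the likelihood ratio (LR), and report the fraction of simulations whose LR exceeds a threshold. The sketch below uses a drastically simplified model, direct identity matching at independent biallelic loci with known allele frequencies, rather than the paper's conditioning on relatives' genotypes; allele frequencies, locus count, and threshold are all invented for illustration.

```python
# Toy Monte Carlo estimate of the Probability of Exceedance:
# P(log10 LR > threshold | H1: POI is the MP). Simplified to direct
# matching at independent biallelic loci under Hardy-Weinberg; real
# casework conditions on available family members' DNA profiles.
import math
import random

random.seed(1)

def simulate_log10_lr(freqs):
    """Draw one POI profile under H1 and return log10 of the match LR,
    where each locus contributes LR = 1 / P(observed genotype)."""
    total = 0.0
    for p in freqs:
        q = 1 - p
        u = random.random()
        if u < p * p:
            g_prob = p * p          # homozygote, common allele
        elif u < p * p + 2 * p * q:
            g_prob = 2 * p * q      # heterozygote
        else:
            g_prob = q * q          # homozygote, rare allele
        total += math.log10(1 / g_prob)
    return total

freqs = [0.3] * 20        # 20 loci, illustrative allele frequency
threshold = 6.0           # require LR > 10**6 to report a match
n_sim = 5000
hits = sum(simulate_log10_lr(freqs) > threshold for _ in range(n_sim))
print(f"estimated exceedance probability: {hits / n_sim:.3f}")
```

An exceedance probability near 1 indicates that, if the POI really is the MP, the available genetic data will almost certainly produce a conclusive LR; a low value signals that genotyping additional family members (or loci) is needed before a reliable conclusion can be expected.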
Wallenhammar, Ann-Charlotte; Gunnarson, Albin; Hansson, Fredrik; Jonsson, Anders
2016-04-22
Outbreaks of clubroot disease caused by the soil-borne obligate parasite Plasmodiophora brassicae are common in oilseed rape (OSR) in Sweden. A DNA-based soil testing service that identifies fields where P. brassicae poses a significant risk of clubroot infection is now commercially available. It was applied here in field surveys to monitor the prevalence of P. brassicae DNA in field soils intended for winter OSR production and winter OSR field experiments. In 2013 in Scania, prior to planting, P. brassicae DNA was detected in 60% of 45 fields on 10 of 18 farms. In 2014, P. brassicae DNA was detected in 44% of 59 fields on 14 of 36 farms in the main winter OSR producing region in southern Sweden. P. brassicae was present at levels indicative of a risk of >10% yield loss with susceptible cultivars (>1300 DNA copies g(-1) soil) in 47% and 44% of fields in 2013 and 2014, respectively. Furthermore, P. brassicae DNA was indicative of sites at risk of complete crop failure if susceptible cultivars were grown (>50,000 copies g(-1) soil) in 14% and 8% of fields in 2013 and 2014, respectively. A survey of all fields at Lanna research station in western Sweden showed that P. brassicae was spread throughout the farm, as only three of the fields (20%) showed infection levels below the detection limit for P. brassicae DNA, while the level was >50,000 DNA copies g(-1) soil in 20% of the fields. Soil-borne spread is of critical importance, and soil scraped off footwear showed levels of up to 682 million spores g(-1) soil. Soil testing is an important tool for determining the presence of P. brassicae and providing an indication of potential yield loss, e.g., in advisory work on planning for a sustainable OSR crop rotation. This soil test is gaining acceptance as a tool that increases the likelihood of success in precision agriculture and in applied research conducted in commercial oilseed fields and at research stations. The present application highlights the importance of prevention of
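The survey's reported DNA-copy thresholds can be folded into a small risk classifier. Only the two thresholds (1,300 and 50,000 copies g⁻¹ soil) come from the survey; the category labels and the default detection limit are illustrative shorthand.

```python
def clubroot_risk(dna_copies_per_g: float, detection_limit: float = 1.0) -> str:
    """Classify P. brassicae soil DNA levels for susceptible cultivars using
    the survey's reported thresholds: >1,300 copies/g soil implies a risk of
    >10% yield loss, and >50,000 copies/g implies a risk of complete crop
    failure. Label wording and the detection limit are illustrative."""
    if dna_copies_per_g > 50_000:
        return "risk of complete crop failure"
    if dna_copies_per_g > 1_300:
        return "risk of >10% yield loss"
    if dna_copies_per_g >= detection_limit:
        return "detected, lower risk"
    return "below detection limit"
```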
Directory of Open Access Journals (Sweden)
Ann-Charlotte Wallenhammar
2016-04-01
Full Text Available Outbreaks of clubroot disease caused by the soil-borne obligate parasite Plasmodiophora brassicae are common in oilseed rape (OSR) in Sweden. A DNA-based soil testing service that identifies fields where P. brassicae poses a significant risk of clubroot infection is now commercially available. It was applied here in field surveys to monitor the prevalence of P. brassicae DNA in field soils intended for winter OSR production and winter OSR field experiments. In 2013 in Scania, prior to planting, P. brassicae DNA was detected in 60% of 45 fields on 10 of 18 farms. In 2014, P. brassicae DNA was detected in 44% of 59 fields on 14 of 36 farms in the main winter OSR producing region in southern Sweden. P. brassicae was present at levels indicative of a risk of >10% yield loss with susceptible cultivars (>1300 DNA copies g−1 soil) in 47% and 44% of fields in 2013 and 2014, respectively. Furthermore, P. brassicae DNA was indicative of sites at risk of complete crop failure if susceptible cultivars were grown (>50 000 copies g−1 soil) in 14% and 8% of fields in 2013 and 2014, respectively. A survey of all fields at Lanna research station in western Sweden showed that P. brassicae was spread throughout the farm, as only three of the fields (20%) showed infection levels below the detection limit for P. brassicae DNA, while the level was >50,000 DNA copies g−1 soil in 20% of the fields. Soil-borne spread is of critical importance, and soil scraped off footwear showed levels of up to 682 million spores g−1 soil. Soil testing is an important tool for determining the presence of P. brassicae and providing an indication of potential yield loss, e.g., in advisory work on planning for a sustainable OSR crop rotation. This soil test is gaining acceptance as a tool that increases the likelihood of success in precision agriculture and in applied research conducted in commercial oilseed fields and at research stations. The present application highlights the importance of
Würtz, Rolf P
2008-01-01
Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, the hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.
How to solve mathematical problems
Wickelgren, Wayne A
1995-01-01
Seven problem-solving techniques include inference, classification of action sequences, subgoals, contradiction, working backward, relations between problems, and mathematical representation. Also included are problems from mathematics, science, and engineering, with complete solutions.
Interactive Problem-Solving Interventions
African Journals Online (AJOL)
Frew Demeke Alemu
concerted efforts of unofficial actors to establish unofficial communication ... Frew Demeke Alemu (LLB, LLM in International Human Rights Law from Lund ..... 24 Tamra Pearson d'Estrée (2009), “Problem-Solving Approaches”, (in The SAGE ...
Marques, Severino P C
2012-01-01
This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations, and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element and boundary element formulations, and presents solutions of viscoelastic problems with Abaqus.
Tangram solved? Prefrontal cortex activation analysis during geometric problem solving.
Ayaz, Hasan; Shewokis, Patricia A; Izzetoğlu, Meltem; Çakır, Murat P; Onaral, Banu
2012-01-01
Recent neuroimaging studies have implicated prefrontal and parietal cortices in mathematical problem solving. Mental arithmetic tasks have been used extensively to study neural correlates of mathematical reasoning. In the present study we used geometric problem sets (tangram tasks) that require executive planning and visuospatial reasoning without any linguistic representation interference. We used portable optical brain imaging (functional near infrared spectroscopy--fNIR) to monitor hemodynamic changes within anterior prefrontal cortex during tangram tasks. Twelve healthy subjects were asked to solve a series of computerized tangram puzzles and control tasks that required the same geometric shape manipulation without problem solving. Total hemoglobin (HbT) concentration changes indicated a significant increase during tangram problem solving in the right hemisphere. Moreover, HbT changes during failed trials (when no solution was found) were significantly higher than during successful trials. These preliminary results suggest that fNIR can be used to assess cortical activation changes induced by geometric problem solving. Since fNIR is safe, wearable and can be used in ecologically valid environments such as classrooms, this neuroimaging tool may help to improve and optimize learning in educational settings.
Building problem solving environments with the arches framework
Energy Technology Data Exchange (ETDEWEB)
Debardeleben, Nathan [Los Alamos National Laboratory]; Sass, Ron [U NORTH CAROLINA]; Stanzione, Jr., Daniel [ASU]; Ligon, III, Walter [CLEMSON UNIV]
2009-01-01
The computational problems that scientists face are rapidly escalating in size and scope. Moreover, the computer systems used to solve these problems are becoming significantly more complex than the familiar, well-understood sequential model on their desktops. While it is possible to re-train scientists to use emerging high-performance computing (HPC) models, it is much more effective to provide them with a higher-level programming environment that has been specialized to their particular domain. By fostering interaction between HPC specialists and the domain scientists, problem-solving environments (PSEs) provide a collaborative environment. A PSE allows scientists to focus on expressing their computational problem while the PSE and associated tools support mapping that domain-specific problem to a high-performance computing system. This article describes Arches, an object-oriented framework for building domain-specific PSEs. The framework was designed to support a wide range of problem domains and to be extensible to support very different high-performance computing targets. To demonstrate this flexibility, two PSEs have been developed from the Arches framework to solve problems in two different domains and target very different computing platforms. The Coven PSE supports parallel applications that require large-scale parallelism found in cost-effective Beowulf clusters. In contrast, RCADE targets FPGA-based reconfigurable computing and was originally designed to aid NASA Earth scientists studying satellite instrument data.
Directory of Open Access Journals (Sweden)
Abderrahim Oussalah
2018-04-01
associated with HCC diagnosis (initial: OR = 6.30 for each mSEPT9-positive triplicate [2.92–13.61, p < 0.0001]; replication: OR = 6.07 [3.25–11.35, p < 0.0001]; meta-analysis: OR = 6.15 [2.93–9.38, p < 0.0001]; no heterogeneity: I2 = 0%, p = 0.95; no publication bias). AUROCs associated with the discrimination of the logistic regression models in the initial and validation studies were 0.969 (0.930–0.989) and 0.942 (0.878–0.978), respectively, with a pooled AUROC of 0.962 ([0.937–0.987], p < 0.0001; no heterogeneity: I2 = 0%, p = 0.36; no publication bias). Interpretation: Among patients with cirrhosis, the mSEPT9 test constitutes a promising circulating epigenetic biomarker for HCC diagnosis at the individual patient level. Future prospective studies should assess the mSEPT9 test in the screening algorithm for cirrhotic patients to improve risk prediction and personalized therapeutic management of HCC. Keywords: Cirrhosis, Hepatocellular carcinoma, Circulating cell-free DNA-based epigenetic biomarker, DNA methylation, mSEPT9
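The pooled OR reported above is consistent with a standard inverse-variance fixed-effect combination of the two study ORs (plausible given I2 = 0%, though the authors' exact method is not stated here). A sketch, recovering each standard error from the reported 95% CI width:

```python
import math

def pooled_or(studies, z=1.96):
    """Inverse-variance fixed-effect pooling of odds ratios.

    Each study is (OR, ci_low, ci_high); the standard error of log OR is
    recovered from the CI width: SE = (ln(hi) - ln(lo)) / (2 * z).
    Returns the pooled OR with its 95% CI."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2            # inverse-variance weight
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

# ORs and 95% CIs as reported for the initial and replication studies.
studies = [(6.30, 2.92, 13.61), (6.07, 3.25, 11.35)]
```

Running `pooled_or(studies)` yields a pooled OR close to the reported 6.15.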
Solving-Problems and Hypermedia Systems
Directory of Open Access Journals (Sweden)
Ricardo LÓPEZ FERNÁNDEZ
2009-06-01
Full Text Available Problem solving and transfer constitute two related, essential nuclei in cognitive research and in mathematics education. It is no accident that, from the first moment, research on applying computer science to mathematics teaching developed cybernetic models that simulated problem-solving processes and transfer contexts (GPS, 1969, and IDEA (Interactive Decision Envisioning Aid), Pea, Bruner-Cohen, Webster & Mellen, 1987). The present article analyzes what the new hypermedia technologies can contribute in this respect: applications suited to implementing learning processes for heuristic thinking and for the capacity of "transfer". From our perspective, and from the experience we have developed in this field, carrying out such an analysis of the theories on problem solving requires a prior interpretation of the central aspects of the theories of problem solving and transfer, starting from the classic theories of information processing. In this sense, both dual-memory theory and the more recent theory of J. Anderson (1993), based on mechanisms of activation of information nodes, allow a suggestive interpretation of the mental mechanisms that operate in heuristic processes. On this analysis, the present article develops a theoretical interpretation of the function of hypermedia-based supports, advancing the definition of a necessary theoretical body, bearing in mind that ongoing practical experimentation continues to confirm the efficiency and effectiveness of hypermedia support as a communication mechanism in heuristic learning processes.
Lemkul, Justin A; MacKerell, Alexander D
2017-05-09
Empirical force fields seek to relate the configuration of a set of atoms to its energy, thus yielding the forces governing its dynamics, using classical physics rather than more expensive quantum mechanical calculations that are computationally intractable for large systems. Most force fields used to simulate biomolecular systems use fixed atomic partial charges, neglecting the influence of electronic polarization, instead making use of a mean-field approximation that may not be transferable across environments. Recent hardware and software developments make polarizable simulations feasible, and to this end, polarizable force fields represent the next generation of molecular dynamics simulation technology. In this work, we describe the refinement of a polarizable force field for DNA based on the classical Drude oscillator model by targeting quantum mechanical interaction energies and conformational energy profiles of model compounds necessary to build a complete DNA force field. The parametrization strategy employed in the present work seeks to correct weak base stacking in A- and B-DNA and the unwinding of Z-DNA observed in the previous version of the force field, called Drude-2013. Refinement of base nonbonded terms and reparametrization of dihedral terms in the glycosidic linkage, deoxyribofuranose rings, and important backbone torsions resulted in improved agreement with quantum mechanical potential energy surfaces. Notably, we expand on previous efforts by explicitly including Z-DNA conformational energetics in the refinement.
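The classical Drude oscillator model mentioned above represents electronic polarization by binding a charged auxiliary particle to each polarizable atom with a harmonic spring. In outline (the standard formulation, not the specific Drude-2013 parameter values):

```latex
% An auxiliary Drude particle of charge q_D is bound to the atomic core
% by a harmonic spring of force constant k_D; displacing it by d costs
E_{\text{self}} = \tfrac{1}{2}\, k_D\, d^2 .
% In a local electric field E the equilibrium displacement satisfies
% k_D d = q_D E, so the induced dipole is
\mu = q_D\, d = \frac{q_D^2}{k_D}\, E ,
% giving the isotropic atomic polarizability
\alpha = \frac{q_D^2}{k_D} .
```

Refinement of the nonbonded and dihedral terms described in the abstract changes the parameters entering these expressions, not the functional form.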
National Research Council Canada - National Science Library
Harvey, Tia
2003-01-01
The DNA Base Excision Repair (BER) pathway is responsible for the repair of alkylation and oxidative DNA damage, resulting in protection against the deleterious effects of endogenous and exogenous agents encountered on a daily basis...
Directory of Open Access Journals (Sweden)
Touihri Leila
2012-12-01
Full Text Available Abstract Background During the vaccination campaigns, puppies younger than 3 months old are not targeted and remain unvaccinated for at least the first year of their lives. Almost half of the reported rabid dogs are 6 months or younger. Hence, we should recommend the vaccination against rabies of young puppies. Unfortunately, owing to the exposure of puppies to infections with either canine parvovirus (CPV) or distemper virus (CDV) after the intervention of the vaccinators, owners are reluctant to vaccinate puppies against rabies. Therefore, it is necessary to include the CPV and CDV valences in the vaccine against rabies. Multivalent DNA-based vaccination in dogs, including rabies and distemper valences, could help in raising vaccine coverage. Methods We have designed monovalent and multivalent DNA-based vaccine candidates for in vitro and in vivo assays. These plasmids encode the rabies virus glycoprotein and/or the canine distemper virus hemagglutinin. The first strategy of multivalent DNA-based vaccination is by mixing plasmids each encoding a single antigen. The second is by simply fusing the genes of the antigens together. The third is by adding the foot and mouth disease virus (FMDV) 2A oligopeptide gene into the antigen genes. The last strategy is by the design and use of a bicistronic plasmid with an "Internal Ribosome Entry Site" (IRES) domain. Results The monovalent construct against canine distemper was efficiently validated by inducing higher humoral immune responses compared to the cell-culture-derived vaccine both in mice and dogs. All multivalent plasmids efficiently expressed both valences after in vitro transfection of BHK-21 cells. In BALB/c mice, the bicistronic IRES-dependent construct was the most efficient inducer of virus-neutralizing antibodies against both valences. It was able to induce better humoral immune responses compared to the administration of either cell-culture-derived vaccines or monovalent plasmids. The
Touihri, Leila; Ahmed, Sami Belhaj; Chtourou, Yacine; Daoud, Rahma; Bahloul, Chokri
2012-12-27
During the vaccination campaigns, puppies younger than 3 months old are not targeted and remain unvaccinated for at least the first year of their lives. Almost half of the reported rabid dogs are 6 months or younger. Hence, we should recommend the vaccination against rabies of young puppies. Unfortunately, owing to the exposure of puppies to infections with either canine parvovirus (CPV) or distemper virus (CDV) after the intervention of the vaccinators, owners are reluctant to vaccinate puppies against rabies. Therefore, it is necessary to include the CPV and CDV valences in the vaccine against rabies. Multivalent DNA-based vaccination in dogs, including rabies and distemper valences, could help in raising vaccine coverage. We have designed monovalent and multivalent DNA-based vaccine candidates for in vitro and in vivo assays. These plasmids encode the rabies virus glycoprotein and/or the canine distemper virus hemagglutinin. The first strategy of multivalent DNA-based vaccination is by mixing plasmids each encoding a single antigen. The second is by simply fusing the genes of the antigens together. The third is by adding the foot and mouth disease virus (FMDV) 2A oligopeptide gene into the antigen genes. The last strategy is by the design and use of a bicistronic plasmid with an "Internal Ribosome Entry Site" (IRES) domain. The monovalent construct against canine distemper was efficiently validated by inducing higher humoral immune responses compared to the cell-culture-derived vaccine both in mice and dogs. All multivalent plasmids efficiently expressed both valences after in vitro transfection of BHK-21 cells. In BALB/c mice, the bicistronic IRES-dependent construct was the most efficient inducer of virus-neutralizing antibodies against both valences. It was able to induce better humoral immune responses compared to the administration of either cell-culture-derived vaccines or monovalent plasmids. The FMDV 2A was also efficient in the design of multivalent
DNA-based molecular markers as tools for the discovery of γ-induced mutants in cereals and soybean
International Nuclear Information System (INIS)
Bondarenco, E.; Bondarenco, V.; Barbacar, N.; Coretchi, L.
2009-01-01
γ-induced mutagenesis is one of the present techniques effective in producing crops with enhanced quality and novel properties. Fast detection of mutants can nowadays be assured by the employment of DNA-based molecular markers. Different kinds of molecular markers are widely used all over the world to monitor DNA sequence variation and to identify desired traits. In this paper we present a short overview of the types of molecular markers and the first steps toward using them to characterize mutants in the Republic of Moldova (authors)
Customer-centered problem solving.
Samelson, Q B
1999-11-01
If there is no single best way to attract new customers and retain current customers, there is surely an easy way to lose them: fail to solve the problems that arise in nearly every buyer-supplier relationship, or solve them in an unsatisfactory manner. Yet, all too frequently, companies do just that. Either we deny that a problem exists, we exert all our efforts to pin the blame elsewhere, or we "Band-Aid" the problem instead of fixing it, almost guaranteeing that we will face it again and again.
DEFF Research Database (Denmark)
Foss, Kirsten; Foss, Nicolai Juul
Two of Herbert Simon's best-known papers are "The Architecture of Complexity" and "The Structure of Ill-Structured Problems." We discuss the neglected links between these two papers, highlighting the role of decomposition in the context of problems on which constraints have been imposed as a general approach to problem solving. We apply these Simonian ideas to organizational issues, specifically new organizational forms. In particular, Simonian ideas allow us to develop a morphology of new organizational forms and to point to some design problems that characterize these forms. Keywords: Herbert Simon, problem-solving, new organizational forms. JEL Code: D23, D83
Measurement and Theory of Hydrogen Bonding Contribution to Isosteric DNA Base Pairs
Khakshoor, Omid; Wheeler, Steven E.; Houk, K. N.; Kool, Eric T.
2012-01-01
We address the recent debate surrounding the ability of 2,4-difluorotoluene (F), a low-polarity mimic of thymine (T), to form a hydrogen-bonded complex with adenine in DNA. The hydrogen bonding ability of F has been characterized as small to zero in various experimental studies, and moderate to small in computational studies. However, recent X-ray crystallographic studies of difluorotoluene in DNA/RNA have indicated, based on interatomic distances, possible hydrogen bonding interactions betwe...
The limits of quantum computers
International Nuclear Information System (INIS)
Aaronson, S.
2008-01-01
Future computers that work with quantum bits would indeed solve certain special problems extremely fast, but for most problems they would hardly be superior to contemporary computers. This insight could manifest a new fundamental physical principle.
Funke, Joachim
2013-01-01
This paper presents a bibliography of 263 references related to human problem solving, arranged by subject matter. The references were taken from the PsycInfo and Academic Premier databases. Journal papers, book chapters, and dissertations are included. The topics include human development, education, neuroscience, and research in applied settings. It…
Solved problems in classical electromagnetism
Franklin, Jerrold
2018-01-01
This original Dover publication is the companion to a new edition of the author's Classical Electromagnetism: Second Edition. The latter volume will feature only basic answers; this book will contain some problems from the reissue as well as many other new ones. All feature complete, worked-out solutions and form a valuable source of problem-solving material for students.
Error Patterns in Problem Solving.
Babbitt, Beatrice C.
Although many common problem-solving errors within the realm of school mathematics have been previously identified, a compilation of such errors is not readily available within learning disabilities textbooks, mathematics education texts, or teacher's manuals for school mathematics texts. Using data on error frequencies drawn from both the Fourth…
Quantitative Reasoning in Problem Solving
Ramful, Ajay; Ho, Siew Yin
2015-01-01
In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.
Students' Problem Solving and Justification
Glass, Barbara; Maher, Carolyn A.
2004-01-01
This paper reports on methods of students' justifications of their solution to a problem in the area of combinatorics. From the analysis of the problem solving of 150 students in a variety of settings from high-school to graduate study, four major forms of reasoning evolved: (1) Justification by Cases, (2) Inductive Argument, (3) Elimination…
A genetic algorithm for solving supply chain network design model
Firoozi, Z.; Ismail, N.; Ariafar, S. H.; Tang, S. H.; Ariffin, M. K. M. A.
2013-09-01
Network design is by nature costly, and optimization models play a significant role in reducing the unnecessary cost components of a distribution network. This study proposes a genetic algorithm to solve a distribution network design model. The structure of the chromosome in the proposed algorithm is defined in a novel way that, in addition to producing feasible solutions, also reduces the computational complexity of the algorithm. Computational results are presented to show the algorithm's performance.
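A minimal sketch of the kind of algorithm described, applied to a toy facility-location-style network design problem. All costs, the chromosome encoding, and the operators below are hypothetical illustrations, not the paper's model; the decoding step guarantees feasibility, echoing the paper's point that chromosome structure can enforce feasible solutions.

```python
import random

# Toy data: fixed opening costs and customer-to-warehouse service costs
# (hypothetical numbers; the paper's actual model is richer).
OPEN_COST = [30, 25, 40, 20]
SERVE_COST = [  # SERVE_COST[customer][warehouse]
    [4, 9, 3, 8],
    [7, 2, 6, 5],
    [3, 8, 5, 9],
    [6, 4, 7, 2],
    [5, 6, 4, 3],
]

def cost(chrom):
    """Total cost: opening costs plus each customer served by the cheapest
    open warehouse. Decoding assigns customers greedily, so any chromosome
    with at least one open site represents a feasible network."""
    if not any(chrom):
        return float("inf")
    open_sites = [j for j, bit in enumerate(chrom) if bit]
    total = sum(OPEN_COST[j] for j in open_sites)
    for row in SERVE_COST:
        total += min(row[j] for j in open_sites)
    return total

def genetic_algorithm(pop_size=30, generations=60, p_mut=0.1, seed=7):
    rng = random.Random(seed)
    n = len(OPEN_COST)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        next_pop = pop[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:10], 2)      # truncation selection
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            if not any(child):                  # repair: keep feasibility
                child[rng.randrange(n)] = 1
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=cost)
    return best, cost(best)
```

On this 4-bit toy instance the search space is tiny, so the GA reliably recovers the brute-force optimum; the point of the sketch is the encode/decode/repair pattern, not performance.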
Domain decomposition method for solving elliptic problems in unbounded domains
International Nuclear Information System (INIS)
Khoromskij, B.N.; Mazurkevich, G.E.; Zhidkov, E.P.
1991-01-01
Computational aspects of the box domain decomposition (DD) method for solving boundary value problems in an unbounded domain are discussed. A new variant of the DD method for elliptic problems in unbounded domains is suggested. It is based on a partitioning of the unbounded domain adapted to the given asymptotic decay of the unknown function at infinity. A comparison of computational expenditures is given for the boundary integral method and the suggested DD algorithm. 29 refs.; 2 figs.; 2 tabs
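The alternating (Schwarz) flavor of domain decomposition can be sketched on a bounded 1-D model problem. The grids, overlap, and right-hand side below are illustrative only and do not implement the paper's unbounded-domain partitioning:

```python
import numpy as np

def solve_poisson(x, ua, ub):
    """Finite-difference solve of -u'' = 1 on the uniform grid x with
    Dirichlet boundary values ua, ub."""
    n = len(x)
    h = x[1] - x[0]
    A = np.zeros((n - 2, n - 2))
    np.fill_diagonal(A, 2.0)
    np.fill_diagonal(A[1:], -1.0)       # subdiagonal
    np.fill_diagonal(A[:, 1:], -1.0)    # superdiagonal
    b = np.full(n - 2, h * h)
    b[0] += ua
    b[-1] += ub
    u = np.empty(n)
    u[0], u[-1] = ua, ub
    u[1:-1] = np.linalg.solve(A, b)
    return u

def schwarz(n_iter=30):
    """Alternating Schwarz iteration for -u'' = 1 on (0, 1), u(0)=u(1)=0,
    split into the overlapping subdomains [0, 0.6] and [0.4, 1]."""
    x1 = np.linspace(0.0, 0.6, 61)   # subdomain 1 grid (h = 0.01)
    x2 = np.linspace(0.4, 1.0, 61)   # subdomain 2 grid
    u1 = np.zeros_like(x1)
    u2 = np.zeros_like(x2)
    for _ in range(n_iter):
        # Solve on subdomain 1 with interface data taken from subdomain 2.
        u1 = solve_poisson(x1, 0.0, np.interp(0.6, x2, u2))
        # Solve on subdomain 2 with interface data taken from subdomain 1.
        u2 = solve_poisson(x2, np.interp(0.4, x1, u1), 0.0)
    return x1, u1, x2, u2
```

The exact solution is u(x) = x(1-x)/2, and the overlap makes the interface iteration contract geometrically, so a few dozen sweeps reproduce it to solver accuracy.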
Solving Differential Equations in R: Package deSolve
Directory of Open Access Journals (Sweden)
Karline Soetaert
2010-02-01
Full Text Available In this paper we present the R package deSolve to solve initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1, and partial differential equations (PDE), the latter solved using the method of lines approach. The differential equations can be represented in R code or as compiled code. In the latter case, R is used as a tool to trigger the integration and post-process the results, which facilitates model development and application, whilst the compiled code significantly increases simulation speed. The methods implemented are efficient, robust, and well documented public-domain Fortran routines. They include four integrators from the ODEPACK package (LSODE, LSODES, LSODA, LSODAR), DVODE and DASPK2.0. In addition, a suite of Runge-Kutta integrators and special-purpose solvers to efficiently integrate 1-, 2- and 3-dimensional partial differential equations are available. The routines solve both stiff and non-stiff systems, and include many options, e.g., to deal in an efficient way with the sparsity of the Jacobian matrix, or finding the root of equations. In this article, our objectives are threefold: (1) to demonstrate the potential of using R for dynamic modeling, (2) to highlight typical uses of the different methods implemented and (3) to compare the performance of models specified in R code and in compiled code for a number of test cases. These comparisons demonstrate that, if the use of loops is avoided, R code can efficiently integrate problems comprising several thousands of state variables. Nevertheless, the same problem may be solved from 2 to more than 50 times faster by using compiled code compared to an implementation using only R code. Still, amongst the benefits of R are a more flexible and interactive implementation, better readability of the code, and access to R’s high-level procedures. deSolve is the successor of package odesolve which will be deprecated in
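SciPy exposes LSODA from the same ODEPACK lineage that deSolve wraps, so an analogous stiff IVP solve in Python (a sketch of the same workflow, not deSolve itself) can use Robertson's classic chemical kinetics test problem:

```python
from scipy.integrate import solve_ivp

# Robertson's stiff chemical kinetics problem, a standard test case for
# ODEPACK-style solvers. y1 + y2 + y3 is conserved (it stays equal to 1).
def robertson(t, y):
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
            0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
            3e7 * y2 ** 2]

# LSODA switches automatically between stiff and non-stiff methods,
# mirroring deSolve's lsoda() in R.
sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                method="LSODA", rtol=1e-8, atol=1e-10)
```

Checking that the computed concentrations still sum to 1 at the final time is a quick sanity test of the integration.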
Problem solving skills for schizophrenia.
Xia, J; Li, Chunbo
2007-04-18
The severe and long-lasting symptoms of schizophrenia are often the cause of severe disability. Environmental stress such as life events and the practical problems people face in their daily lives can worsen the symptoms of schizophrenia. Deficits in problem solving skills in people with schizophrenia affect their independent and interpersonal functioning and impair their quality of life. As a result, therapies such as problem solving therapy have been developed to improve problem solving skills for people with schizophrenia. To review the effectiveness of problem solving therapy compared with other comparable therapies or routine care for those with schizophrenia. We searched the Cochrane Schizophrenia Group's Register (September 2006), which is based on regular searches of BIOSIS, CENTRAL, CINAHL, EMBASE, MEDLINE and PsycINFO. We inspected references of all identified studies for further trials. We included all clinical randomised trials comparing problem solving therapy with other comparable therapies or routine care. We extracted data independently. For homogeneous dichotomous data we calculated random-effects relative risks (RR), 95% confidence intervals (CI) and, where appropriate, numbers needed to treat (NNT) on an intention-to-treat basis. For continuous data, we calculated weighted mean differences (WMD) using a random effects statistical model. We included only three small trials (n=52) that evaluated problem solving versus routine care, coping skills training or non-specific interaction. Inadequate reporting of data rendered many outcomes unusable. We were unable to undertake meta-analysis. Overall results were limited and inconclusive with no significant differences between treatment groups for hospital admission, mental state, behaviour, social skills or leaving the study early. No data were presented for global state, quality of life or satisfaction. We found insufficient evidence to confirm or refute the benefits of problem solving therapy as an additional
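The review's summary statistics for dichotomous outcomes (relative risk with a 95% CI, and NNT from the absolute risk difference) can be computed from a 2x2 table as follows; the counts in the usage note are illustrative only, not trial data:

```python
import math

def rr_with_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Relative risk with a 95% CI computed on the log scale, plus NNT
    from the absolute risk difference (single-study case; the review's
    random-effects pooling across trials is a separate step)."""
    risk_t = events_t / n_t
    risk_c = events_c / n_c
    rr = risk_t / risk_c
    # Standard error of log(RR) for a 2x2 table.
    se = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    nnt = 1.0 / abs(risk_t - risk_c)
    return rr, (lo, hi), nnt
```

For example, 10/50 events under treatment versus 20/50 under control gives RR = 0.5 and NNT = 5.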
Genetics problem solving and worldview
Dale, Esther
The research goal was to determine whether worldview relates to traditional and real-world genetics problem solving. Traditionally, scientific literacy emphasized content knowledge alone because it was sufficient to solve traditional problems. The contemporary definition of scientific literacy is, "The knowledge and understanding of scientific concepts and processes required for personal decision-making, participation in civic and cultural affairs and economic productivity" (NRC, 1996). An expanded definition of scientific literacy is needed to solve socioscientific issues (SSI), complex social issues with conceptual, procedural, or technological associations with science. Teaching content knowledge alone assumes that students will find the scientific explanation of a phenomenon to be superior to a non-science explanation. Formal science and everyday ways of thinking about science are two different cultures (Palmer, 1999). Students address this rift with cognitive apartheid, the boxing away of science knowledge from other types of knowledge (Jedege & Aikenhead, 1999). By addressing worldview, cognitive apartheid may decrease and scientific literacy may increase. Introductory biology students at the University of Minnesota during fall semester 2005 completed a written questionnaire-including a genetics content-knowledge test, four genetic dilemmas, the Worldview Assessment Instrument (WAI) and some items about demographics and religiosity. Six students responded to the interview protocol. Based on statistical analysis and interview data, this study concluded the following: (1) Worldview, in the form of metaphysics, relates to solving traditional genetic dilemmas. (2) Worldview, in the form of agency, relates to solving traditional genetics problems. (3) Thus, worldview must be addressed in curriculum, instruction, and assessment.
Future Computer Requirements for Computational Aerodynamics
1978-01-01
Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.
High-fidelity in vivo replication of DNA base shape mimics without Watson–Crick hydrogen bonds
Delaney, James C.; Henderson, Paul T.; Helquist, Sandra A.; Morales, Juan C.; Essigmann, John M.; Kool, Eric T.
2003-01-01
We report studies testing the importance of Watson–Crick hydrogen bonding, base-pair geometry, and steric effects during DNA replication in living bacterial cells. Nonpolar DNA base shape mimics of thymine and adenine (abbreviated F and Q, respectively) were introduced into Escherichia coli by insertion into a phage genome followed by transfection of the vector into bacteria. Genetic assays showed that these two base mimics were bypassed with moderate to high efficiency in the cells and with very high efficiency under damage-response (SOS induction) conditions. Under both sets of conditions, the T-shape mimic (F) encoded genetic information in the bacteria as if it were thymine, directing incorporation of adenine opposite it with high fidelity. Similarly, the A mimic (Q) directed incorporation of thymine opposite itself with high fidelity. The data establish that Watson–Crick hydrogen bonding is not necessary for high-fidelity replication of a base pair in vivo. The results suggest that recognition of DNA base shape alone serves as the most powerful determinant of fidelity during transfer of genetic information in a living organism. PMID:12676985
Tipu, Hamid Nawaz; Bashir, Muhammad Mukarram; Noman, Muhammad
2016-10-01
Serology and DNA techniques are employed for Human Leukocyte Antigen (HLA) typing in different transplant centers. Results may not always correlate well and may require retyping with a different technique. All patients (with aplastic anemia, thalassemia, or immunodeficiency) and their donors requiring HLA typing for bone marrow transplant were enrolled in the study. Serological HLA typing was done by complement-dependent lymphocytotoxicity, while DNA-based typing was done with sequence-specific primers (SSP). Serology identified 167 HLA A and 165 HLA B antigens, while SSP in the same samples identified 181 HLA A and 184 HLA B alleles. A11 and B51 were the commonest antigens/alleles by both methods. There were a total of 21 misreads and 32 dropouts on serology for both HLA A and B loci, with HLA A32, B52 and B61 being the most ambiguous antigens. Inherent limitations of serological techniques warrant careful interpretation or use of DNA-based methods for resolution of ambiguous typing.
Solution Tree Problem Solving Procedure for Engineering Analysis ...
African Journals Online (AJOL)
Illustrations are provided in the thermofluid engineering area to showcase the procedure's applications. This approach has proved to be a veritable tool for enhancing the problem-solving and computer algorithmic skills of engineering students, eliciting their curiosity, active participation and appreciation of the taught course.
(CBTP) on knowledge, problem-solving and learning approach
African Journals Online (AJOL)
In the first instance attention is paid to the effect of a computer-based teaching programme (CBTP) on the knowledge, problem-solving skills and learning approach of student ... In the practice group (oncology wards) no statistically significant change in the learning approach of respondents was found after using the CBTP.
Pendekatan Problem Solving berbantuan Komputer dalam Pembelajaran Matematika [A Computer-Assisted Problem-Solving Approach in Mathematics Learning]
Directory of Open Access Journals (Sweden)
Laswadi Laswadi
2015-06-01
Creating effective mathematics learning is a complex and continuous undertaking. Using the right learning approach and utilizing technological developments is an attempt to improve the quality of learning. This paper examines computer-assisted problem-solving learning and its potential for developing students' higher-order thinking skills.
Solving the Water Jugs Problem by an Integer Sequence Approach
Man, Yiu-Kwong
2012-01-01
In this article, we present an integer sequence approach to solve the classic water jugs problem. The solution steps can be obtained easily by additions and subtractions only, which is suitable for manual calculation or programming by computer. This approach can be introduced to secondary and undergraduate students, and also to teachers and…
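The article's integer-sequence method is not reproduced here; as a generic illustration of the same puzzle, the sketch below solves the water jugs problem by breadth-first search over jug states, where every move is a fill, an empty, or a pour, i.e. only additions and subtractions of volumes. The function name and the classic 3/5-jug instance are illustrative choices, not taken from the article.

```python
from collections import deque

def water_jugs(cap_a, cap_b, target):
    """Breadth-first search over jug states (a, b); each move is a fill,
    an empty, or a pour, so only additions and subtractions occur."""
    start = (0, 0)
    parent = {start: None}
    queue = deque([start])
    while queue:
        a, b = queue.popleft()
        if target in (a, b):
            # reconstruct the sequence of states leading to the target
            path, s = [], (a, b)
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        pour_ab = min(a, cap_b - b)   # amount movable from jug A to jug B
        pour_ba = min(b, cap_a - a)   # amount movable from jug B to jug A
        for nxt in ((cap_a, b), (a, cap_b), (0, b), (a, 0),
                    (a - pour_ab, b + pour_ab), (a + pour_ba, b - pour_ba)):
            if nxt not in parent:
                parent[nxt] = (a, b)
                queue.append(nxt)
    return None   # target volume unreachable

# classic instance: 3- and 5-unit jugs, measure 4 units
print(water_jugs(3, 5, 4))
```

The state space is finite (at most (cap_a+1)(cap_b+1) states), so the search also terminates with None on unreachable targets such as measuring 3 units with 2- and 4-unit jugs.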
Lobe, Elisabeth; Stollenwerk, Tobias; Tröltzsch, Anke
2015-01-01
In the recent years, the field of adiabatic quantum computing has gained importance due to the advances in the realisation of such machines, especially by the company D-Wave Systems. These machines are suited to solve discrete optimisation problems which are typically very hard to solve on a classical computer. Due to the quantum nature of the device it is assumed that there is a substantial speedup compared to classical HPC facilities. We explain the basic principles of adiabatic ...
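Discrete optimisation problems for adiabatic machines such as D-Wave's are typically cast as QUBO (quadratic unconstrained binary optimisation) instances. As a hedged sketch of that formulation only (the hardware anneals toward the minimum physically; nothing below models the quantum device), a brute-force minimiser over a toy matrix of my own choosing:

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force minimiser of a QUBO: min over x in {0,1}^n of x^T Q x.
    Exhaustive search is exact for a handful of variables and serves only
    to illustrate the problem an adiabatic annealer is given."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# toy instance: penalise x0 and x1 differing, reward picking x2
# energy = (x0 - x1)^2 - x2
Q = [[1.0, -2.0, 0.0],
     [0.0,  1.0, 0.0],
     [0.0,  0.0, -1.0]]
print(solve_qubo(Q))
```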
Measurement and theory of hydrogen bonding contribution to isosteric DNA base pairs.
Khakshoor, Omid; Wheeler, Steven E; Houk, K N; Kool, Eric T
2012-02-15
We address the recent debate surrounding the ability of 2,4-difluorotoluene (F), a low-polarity mimic of thymine (T), to form a hydrogen-bonded complex with adenine in DNA. The hydrogen bonding ability of F has been characterized as small to zero in various experimental studies, and moderate to small in computational studies. However, recent X-ray crystallographic studies of difluorotoluene in DNA/RNA have indicated, based on interatomic distances, possible hydrogen bonding interactions between F and natural bases in nucleic acid duplexes and in a DNA polymerase active site. Since F is widely used to measure electrostatic contributions to pairing and replication, it is important to quantify the impact of this isostere on DNA stability. Here, we studied the pairing stability and selectivity of this compound and a closely related variant, dichlorotoluene deoxyriboside (L), in DNA, using both experimental and computational approaches. We measured the thermodynamics of duplex formation in three sequence contexts and with all possible pairing partners by thermal melting studies using the van't Hoff approach, and for selected cases by isothermal titration calorimetry (ITC). Experimental results showed that internal F-A pairing in DNA is destabilizing by 3.8 kcal/mol (van't Hoff, 37 °C) as compared with T-A pairing. At the end of a duplex, base-base interactions are considerably smaller; however, the net F-A interaction remains repulsive while T-A pairing is attractive. As for selectivity, F is found to be slightly selective for adenine over C, G, T by 0.5 kcal mol, as compared with thymine's selectivity of 2.4 kcal/mol. Interestingly, dichlorotoluene in DNA is slightly less destabilizing and slightly more selective than F, despite the lack of strongly electronegative fluorine atoms. Experimental data were complemented by computational results, evaluated at the M06-2X/6-31+G(d) and MP2/cc-pVTZ levels of theory. These computations suggest that the pairing energy of F to A
Learning via problem solving in mathematics education
Directory of Open Access Journals (Sweden)
Piet Human
2009-09-01
Three forms of mathematics education at school level are distinguished: direct expository teaching with an emphasis on procedures, with the expectation that learners will at some later stage make logical and functional sense of what they have learnt and practised (the prevalent form); mathematically rigorous teaching in terms of fundamental mathematical concepts, as in the so-called "modern mathematics" programmes of the sixties; and teaching and learning in the context of engaging with meaningful problems, focused both on learning to become good problem solvers (teaching for problem solving) and on utilising problems as vehicles for the development of mathematical knowledge and proficiency by learners (problem-centred learning), in conjunction with substantial teacher-led social interaction and mathematical discourse in classrooms. Direct expository teaching of mathematical procedures dominated in school systems after World War II, and was augmented by the "modern mathematics" movement in the period 1960-1970. The latter was experienced as a major failure, and was soon abandoned. Persistent poor outcomes of direct expository procedural teaching of mathematics for the majority of learners, as are still being experienced in South Africa, triggered a world-wide movement promoting teaching mathematics for and via problem solving in the seventies and eighties of the previous century. This movement took the form of a variety of curriculum experiments in which problem solving was the dominant classroom activity, mainly in the USA, Netherlands, France and South Africa. While initially focusing on basic arithmetic (computation with whole numbers) and elementary calculus, the problem-solving movement started to address other mathematical topics (for example, elementary statistics, algebra, differential equations) around the turn of the century. The movement also spread rapidly to other countries, including Japan, Singapore and Australia. Parallel with the
Polyomino Problems to Confuse Computers
Coffin, Stewart
2009-01-01
Computers are very good at solving certain types of combinatorial problems, such as fitting sets of polyomino pieces into square or rectangular trays of a given size. However, most puzzle-solving programs now in use assume orthogonal arrangements. When one departs from the usual square grid layout, complications arise. The author, using a computer,…
CSIR Research Space (South Africa)
Motara, YM
2017-09-01
…the intersection between the SHA-1 preimage problem, the encoding of that problem for SAT-solving, and SAT-solving. The results demonstrate that SAT-solving is not yet a viable approach to take to solve the preimage problem, and also indicate that some…
Assessing Algebraic Solving Ability: A Theoretical Framework
Lian, Lim Hooi; Yew, Wun Thiam
2012-01-01
Algebraic solving ability had been discussed by many educators and researchers. There exists no definite definition for algebraic solving ability as it can be viewed from different perspectives. In this paper, the nature of algebraic solving ability in terms of algebraic processes that demonstrate the ability in solving algebraic problem is…
Rerouting algorithms solving the air traffic congestion
Adacher, Ludovica; Flamini, Marta; Romano, Elpidio
2017-06-01
Congestion in the air traffic network is a problem with an increasing relevance for airlines costs as well as airspace safety. One of the major issue is the limited operative capacity of the air network. In this work an Autonomous Agent approach is proposed to solve in real time the problem of air traffic congestion. The air traffic infrastructures are modeled with a graph and are considered partitioned in different sectors. Each sector has its own decision agent dealing with the air traffic control involved in it. Each agent sector imposes a real time aircraft scheduling to respect both delay and capacity constrains. When a congestion is predicted, a new aircraft scheduling is computed. Congestion is solved when the capacity constrains are satisfied once again. This can be done by delaying on ground aircraft or/and rerouting aircraft and/or postponing the congestion. We have tested two different algorithms that calculate K feasible paths for each aircraft involved in the congestion. Some results are reported on North Italian air space.
Methods of solving nonstandard problems
Grigorieva, Ellina
2015-01-01
This book, written by an accomplished female mathematician, is the second to explore nonstandard mathematical problems – those that are not directly solved by standard mathematical methods but instead rely on insight and the synthesis of a variety of mathematical ideas. It promotes mental activity as well as greater mathematical skills, and is an ideal resource for successful preparation for the mathematics Olympiad. Numerous strategies and techniques are presented that can be used to solve intriguing and challenging problems of the type often found in competitions. The author uses a friendly, non-intimidating approach to emphasize connections between different fields of mathematics and often proposes several different ways to attack the same problem. Topics covered include functions and their properties, polynomials, trigonometric and transcendental equations and inequalities, optimization, differential equations, nonlinear systems, and word problems. Over 360 problems are included with hints, ...
Computational thinking as an emerging competence domain
Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.
2016-01-01
Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been
Solving Kepler's equation using implicit functions
Mortari, Daniele; Elipe, Antonio
2014-01-01
A new approach to solve Kepler's equation based on the use of implicit functions is proposed here. First, new upper and lower bounds are derived for two ranges of mean anomaly. These upper and lower bounds initialize a two-step procedure involving the solution of two implicit functions. These two implicit functions, which are non-rational (polynomial) Bézier functions, can be linear or quadratic, depending on the derivatives of the initial bound values. These new initial bounds have been compared with and proven more accurate than Serafin's bounds. The procedure reaches machine-error accuracy with no more than one quadratic and one linear iteration, even in the "tough range", where the eccentricity is close to one and the mean anomaly is close to zero. The proposed method is particularly suitable for space-based applications with limited computational capability.
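The paper's Bézier-function bounds are not reproduced here; purely to show the equation being solved, the sketch below applies standard Newton iteration to Kepler's equation M = E - e·sin(E). The crude initial guesses are my own (the paper's bounds are precisely what makes sharper starts possible).

```python
import math

def solve_kepler(M, e, tol=1e-14, max_iter=50):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric
    anomaly E by Newton's method. E0 = M works for moderate e;
    E0 = pi is a safe start for high eccentricity (crude choices,
    unrelated to the paper's Bezier-based bounds)."""
    E = M if e < 0.8 else math.pi
    for _ in range(max_iter):
        f = E - e * math.sin(E) - M
        fp = 1.0 - e * math.cos(E)
        dE = f / fp
        E -= dE
        if abs(dE) < tol:
            break
    return E

E = solve_kepler(0.5, 0.1)
print(E, E - 0.1 * math.sin(E))   # residual check: second value recovers M
```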
Algorithms for solving common fixed point problems
Zaslavski, Alexander J
2018-01-01
This book details approximate solutions to common fixed point problems and convex feasibility problems in the presence of perturbations. Convex feasibility problems search for a common point of a finite collection of subsets in a Hilbert space; common fixed point problems pursue a common fixed point of a finite collection of self-mappings in a Hilbert space. A variety of algorithms are considered in this book for solving both types of problems, the study of which has fueled a rapidly growing area of research. This monograph is timely and highlights the numerous applications to engineering, computed tomography, and radiation therapy planning. Totaling eight chapters, this book begins with an introduction to foundational material and moves on to examine iterative methods in metric spaces. The dynamic string-averaging methods for common fixed point problems in normed space are analyzed in Chapter 3. Dynamic string methods, for common fixed point problems in a metric space are introduced and discussed in Chapter ...
Confluent-Functional solving systems
Directory of Open Access Journals (Sweden)
V.N. Koval
2001-08-01
The paper proposes a statistical knowledge-acquisition approach. Solving systems are considered that are able to find unknown structural dependences between situational and transforming variables on the basis of statistically analyzed input information. Situational variables describe features, states and relations between environment objects. Transforming variables describe transforming influences exerted by a goal-oriented system onto an environment. Unknown environment rules are simulated by a system of structural equations associating situational and transforming variables.
Mono- and Di-Alkylation Processes of DNA Bases by Nitrogen Mustard Mechlorethamine.
Larrañaga, Olatz; de Cózar, Abel; Cossío, Fernando P
2017-12-06
The reactivity of nitrogen mustard mechlorethamine (mec) with purine bases towards formation of mono- (G-mec and A-mec) and dialkylated (AA-mec, GG-mec and AG-mec) adducts has been studied using density functional theory (DFT). To gain a complete overview of DNA-alkylation processes, direct chloride substitution and formation through activated aziridinium species were considered as possible reaction paths for adduct formation. Our results confirm that DNA alkylation by mec occurs via aziridine intermediates instead of direct substitution. Consideration of explicit water molecules in conjunction with polarizable continuum model (PCM) was shown as an adequate computational method for a proper representation of the system. Moreover, Runge-Kutta numerical kinetic simulations including the possible bisadducts have been performed. These simulations predicted a product ratio of 83:17 of GG-mec and AG-mec diadducts, respectively. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Computer mathematics for programmers
Abney, Darrell H; Sibrel, Donald W
1985-01-01
Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
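Two of the book's opening topics, number systems and round-off error, can be illustrated in a few lines (the helper name and the example values are mine, not the book's):

```python
def to_base(n, b):
    """Convert a non-negative integer to its digit string in base b (2..16)."""
    digits = "0123456789ABCDEF"
    if n == 0:
        return "0"
    out = ""
    while n:
        out = digits[n % b] + out
        n //= b
    return out

print(to_base(45, 2), to_base(45, 8), to_base(45, 16))   # 101101 55 2D

# round-off: 0.1 has no finite binary expansion, so repeated addition drifts
total = sum(0.1 for _ in range(10))
print(total == 1.0, total)
```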
Shankar, Akshaya; Jagota, Anand; Mittal, Jeetain
2012-10-11
Single- and double-stranded DNA are increasingly being paired with surfaces and nanoparticles for numerous applications, such as sensing, imaging, and drug delivery. Unlike the majority of DNA structures in bulk that are stabilized by canonical Watson-Crick pairing between Ade-Thy and Gua-Cyt, those adsorbed on surfaces are often stabilized by noncanonical base pairing, quartet formation, and base-surface stacking. Not much is known about these kinds of interactions. To build an understanding of the role of non-Watson-Crick pairing on DNA behavior near surfaces, one requires basic information on DNA base pair stacking and hydrogen-bonding interactions. All-atom molecular simulations of DNA bases in two cases--in bulk water and strongly adsorbed on a graphite surface--are conducted to study the relative strengths of stacking and hydrogen bond interactions for each of the 10 possible combinations of base pairs. The key information obtained from these simulations is the free energy as a function of distance between two bases in a pair. We find that stacking interactions exert the dominant influence on the stability of DNA base pairs in bulk water as expected. The strength of stability for these stacking interactions is found to decrease in the order Gua-Gua > Ade-Gua > Ade-Ade > Gua-Thy > Gua-Cyt > Ade-Thy > Ade-Cyt > Thy-Thy > Cyt-Thy > Cyt-Cyt. On the other hand, mutual interactions of surface-adsorbed base pairs are stabilized mostly by hydrogen-bonding interactions in the order Gua-Cyt > Ade-Gua > Ade-Thy > Ade-Ade > Cyt-Thy > Gua-Gua > Cyt-Cyt > Ade-Cyt > Thy-Thy > Gua-Thy. Interestingly, several non-Watson-Crick base pairings, which are commonly ignored, have similar stabilization free energies due to interbase hydrogen bonding as Watson-Crick pairs. This clearly highlights the importance of non-Watson-Crick base pairing in the development of secondary structures of oligonucleotides near surfaces.
A Model for Solving the Maxwell Quasi Stationary Equations in a 3-Phase Electric Reduction Furnace
Directory of Open Access Journals (Sweden)
S. Ekrann
1982-10-01
A computer code has been developed for the approximate computation of electric and magnetic fields within an electric reduction furnace. The paper describes the numerical methods used to solve Maxwell's quasi-stationary equations, which are the governing equations for this problem. The equations are discretized by a staggered grid finite difference technique. The resulting algebraic equations are solved by iterating between computations of electric and magnetic quantities. This 'outer' iteration converges only when the skin depth is larger or of about the same magnitude as the linear dimensions of the computational domain. In solving for electric quantities with magnetic quantities being regarded as known, and vice versa, the central computational task is the solution of a Poisson equation for a scalar potential. These equations are solved by line successive overrelaxation combined with a rebalancing technique.
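The paper's staggered-grid Maxwell solver is not reproduced here, but its central computational task, solving a Poisson equation by successive overrelaxation, can be sketched. This uses point SOR on a small 2-D grid with unit spacing and zero boundary values (the paper uses line SOR with rebalancing; grid size, source, and omega below are illustrative choices of mine):

```python
def sor_poisson(n, source, omega=1.7, tol=1e-10, max_sweeps=10000):
    """Point-SOR solve of the 2-D Poisson equation lap(phi) = -source on
    an n x n interior grid (unit spacing, phi = 0 on the boundary)."""
    phi = [[0.0] * (n + 2) for _ in range(n + 2)]
    for _ in range(max_sweeps):
        delta = 0.0
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                # Gauss-Seidel target value for this cell
                gs = 0.25 * (phi[i-1][j] + phi[i+1][j] +
                             phi[i][j-1] + phi[i][j+1] + source(i, j))
                delta = max(delta, abs(gs - phi[i][j]))
                # over-relax toward the Gauss-Seidel value
                phi[i][j] = (1 - omega) * phi[i][j] + omega * gs
        if delta < tol:
            break
    return phi

# point source in the middle of a 20 x 20 grid
phi = sor_poisson(20, lambda i, j: 1.0 if (i, j) == (10, 10) else 0.0)
print(phi[10][10])
```

Choosing omega between 1 and 2 accelerates plain Gauss-Seidel; the optimal value depends on the grid, which is one reason production codes pair SOR with rebalancing as the paper describes.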
Using graph theory for automated electric circuit solving
International Nuclear Information System (INIS)
Toscano, L; Stella, S; Milotti, E
2015-01-01
Graph theory plays many important roles in modern physics and in many different contexts, spanning diverse topics such as the description of scale-free networks and the structure of the universe as a complex directed graph in causal set theory. Graph theory is also ideally suited to describe many concepts in computer science. Therefore it is increasingly important for physics students to master the basic concepts of graph theory. Here we describe a student project where we develop a computational approach to electric circuit solving which is based on graph theoretic concepts. This highly multidisciplinary approach combines abstract mathematics, linear algebra, the physics of circuits, and computer programming to reach the ambitious goal of implementing automated circuit solving. (paper)
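The student project's code is not available; as a minimal sketch of the graph-to-linear-algebra step it describes, the following builds the conductance (weighted Laplacian) matrix of a resistor network and solves G·v = i for the node voltages with hand-rolled Gaussian elimination. All function names, the edge format, and the voltage-divider example are my own illustrative choices.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting (plain lists, no libraries)."""
    n = len(A)
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def nodal_voltages(edges, currents, n_nodes):
    """Nodal analysis: build the conductance matrix G from the circuit
    graph and solve G v = i for node voltages (node 0 is ground)."""
    G = [[0.0] * (n_nodes - 1) for _ in range(n_nodes - 1)]
    for a, b, R in edges:               # edge = resistor R between nodes a, b
        g = 1.0 / R
        for u, v in ((a, b), (b, a)):
            if u > 0:
                G[u-1][u-1] += g
                if v > 0:
                    G[u-1][v-1] -= g
    return solve_linear(G, currents[1:])  # injected currents, ground row dropped

# divider: 1 A into node 1, 1-ohm resistors node1-node2 and node2-ground
v = nodal_voltages([(1, 2, 1.0), (2, 0, 1.0)], [0.0, 1.0, 0.0], 3)
print(v)   # voltages at nodes 1 and 2 ~ [2.0, 1.0]
```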
EISPACK-J: subprogram package for solving eigenvalue problems
International Nuclear Information System (INIS)
Fujimura, Toichiro; Tsutsui, Tsuneo
1979-05-01
EISPACK-J, a subprogram package for solving eigenvalue problems, has been developed and subprograms with a variety of functions have been prepared. These subprograms can solve standard problems of complex matrices, general problems of real matrices and special problems in which only the required eigenvalues and eigenvectors are calculated. They are compared to existing subprograms, showing their features through benchmark tests. Many test problems, including realistic scale problems, are provided for the benchmark tests. Discussions are made on computer core storage and computing time required for each subprogram, and accuracy of the solution. The results show that the subprograms of EISPACK-J, based on Householder, QR and inverse iteration methods, are the best in computing time and accuracy. (author)
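EISPACK-J's Householder, QR, and inverse-iteration routines are far beyond a sketch, but the simplest relative of those methods, power iteration for the dominant eigenpair, shows what such subprograms compute. The test matrix and iteration count below are illustrative assumptions.

```python
import math

def power_iteration(A, iters=500):
    """Dominant eigenvalue and eigenvector of a square matrix by power
    iteration (the simplest eigenvalue method; EISPACK-J itself uses
    Householder reduction, QR, and inverse iteration)."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        # Rayleigh quotient estimate of the eigenvalue
        lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

# symmetric test matrix with known spectrum {3, 1}
lam, v = power_iteration([[2.0, 1.0], [1.0, 2.0]])
print(lam)   # ~ 3.0
```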
Karlheinz Schwarz; Rainer Breitling; Christian Allen
2013-01-01
Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
Doing physics with scientific notebook a problem solving approach
Gallant, Joseph
2012-01-01
The goal of this book is to teach undergraduate students how to use Scientific Notebook (SNB) to solve physics problems. SNB software combines word processing and mathematics in standard notation with the power of symbolic computation. As its name implies, SNB can be used as a notebook in which students set up a math or science problem, write and solve equations, and analyze and discuss their results. Written by a physics teacher with over 20 years experience, this text includes topics that have educational value, fit within the typical physics curriculum, and show the benefits of using SNB.
A Predictor-Corrector Method for Solving Equilibrium Problems
Directory of Open Access Journals (Sweden)
Zong-Ke Bao
2014-01-01
We suggest and analyze a predictor-corrector method for solving nonsmooth convex equilibrium problems based on the auxiliary problem principle. In the main algorithm each stage of computation requires two proximal steps. One step serves to predict the next point; the other helps to correct the new prediction. At the same time, we present convergence analysis under both perfect and imperfect foresight. In particular, we introduce a stopping criterion which gives rise to Δ-stationary points. Moreover, we apply this algorithm to a particular case: variational inequalities.
DNA-Based Sensor for Real-Time Measurement of the Enzymatic Activity of Human Topoisomerase I
DEFF Research Database (Denmark)
Marcussen, Lærke Bay; Jepsen, Morten Leth; Kristoffersen, Emil Laust
2013-01-01
Sensors capable of quantitative real-time measurements may present the easiest and most accurate way to study enzyme activities. Here we present a novel DNA-based sensor for specific and quantitative real-time measurement of the enzymatic activity of the essential human enzyme, topoisomerase I. The basic design of the sensor relies on two DNA strands that hybridize to form a hairpin structure with a fluorophore-quencher pair. The quencher moiety is released from the sensor upon reaction with human topoisomerase I, thus enabling real-time optical measurement of enzymatic activity. The sensor… The cytotoxic effect of camptothecins correlates directly with the intracellular topoisomerase I activity. We therefore envision that the presented sensor may find use for the prediction of cellular drug response. Moreover, inhibition of topoisomerase I by camptothecin is readily detectable using the presented…
Siqueira, José F; Rôças, Isabela N; Andrade, Arnaldo F B; de Uzeda, Milton
2003-02-01
A 16S rDNA-based polymerase chain reaction (PCR) method was used to detect Peptostreptococcus micros in primary root canal infections. Samples were collected from 50 teeth having carious lesions, necrotic pulps, and different forms of periradicular diseases. DNA extracted from the samples was amplified using the PCR assay, which yielded a specific fragment of P. micros 16S rDNA. P. micros was detected in 6 of 22 root canals associated with asymptomatic chronic periradicular lesions (27.3%), 2 of 8 teeth with acute apical periodontitis (25%), and 6 of 20 cases of acute periradicular abscess (30%). In general, P. micros was found in 14 of 50 cases (28%). There was no correlation between the presence of P. micros and the occurrence of symptoms. Findings suggested that P. micros can be involved in the pathogenesis of different forms of periradicular lesions.
Mondal Roy, Sutapa
2018-08-01
The quantum chemical descriptors based on density functional theory (DFT) are applied to predict the biological activity (log IC50) of one class of acyl-CoA:cholesterol O-acyltransferase (ACAT) inhibitors, viz. aminosulfonyl ureas. ACAT inhibitors are very effective agents for reduction of triglyceride and cholesterol levels in the human body. Successful two-parameter quantitative structure-activity relationship (QSAR) models are developed with a combination of relevant global and local DFT-based descriptors for prediction of the biological activity of aminosulfonyl ureas. The global descriptors, electron affinity of the ACAT inhibitors (EA) and/or charge transfer (ΔN) between inhibitors and model biosystems (NA bases and DNA base pairs), along with the local group atomic charge on the sulfonyl moiety (∑QSul) of the inhibitors, reveal more than 90% efficacy of the selected descriptors for predicting the experimental log IC50 values. Copyright © 2018 Elsevier Ltd. All rights reserved.
DEFF Research Database (Denmark)
Dunshea, G.; Barros, N. B.; Wells, R. S.
2008-01-01
Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample, representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA…
Quantum computing with trapped ions
International Nuclear Information System (INIS)
Haeffner, H.; Roos, C.F.; Blatt, R.
2008-01-01
Quantum computers hold the promise of solving certain computational tasks much more efficiently than classical computers. We review recent experimental advances towards a quantum computer with trapped ions. In particular, various implementations of qubits, quantum gates and some key experiments are discussed. Furthermore, we review some implementations of quantum algorithms such as a deterministic teleportation of quantum information and an error correction scheme
Complexes of DNA bases and Watson-Crick base pairs with small neutral gold clusters.
Kryachko, E S; Remacle, F
2005-12-08
The nature of the DNA-gold interaction determines and differentiates the affinity of the nucleobases (adenine, thymine, guanine, and cytosine) to gold. Our preliminary computational study [Kryachko, E. S.; Remacle, F. Nano Lett. 2005, 5, 735] demonstrates that two major bonding factors govern this interaction: the anchoring, either of the Au-N or Au-O type, and the nonconventional N-H...Au hydrogen bonding. In this paper, we offer insight into the nature of nucleobase-gold interactions and provide a detailed characterization of their different facets, i.e., geometrical, energetic, and spectroscopic aspects; the gold cluster size and gold coordination effects; proton affinity; and deprotonation energy. We then investigate how the Watson-Crick DNA pairing patterns are modulated by the nucleobase-gold interaction. We do so in terms of the proton affinities and deprotonation energies of those proton acceptors and proton donors which are involved in the interbase hydrogen bondings. A variety of properties of the most stable Watson-Crick [A x T]-Au3 and [G x C]-Au3 hybridized complexes are described and compared with the isolated Watson-Crick A x T and G x C ones. It is shown that enlarging the gold cluster size to Au6 results in a rather short gold-gold bond in the Watson-Crick interbase region of the [G x C]-Au6 complex that bridges the G x C pair and thus leads to a significant strengthening of G x C pairing.
Solving project scheduling problems by minimum cut computations
Möhring, R.H.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen
In project scheduling, a set of precedence-constrained jobs has to be scheduled so as to minimize a given objective. In resource-constrained project scheduling, the jobs additionally compete for scarce resources. Due to its universality, the latter problem has a variety of applications in
Solving the Curriculum Sequencing Problem with DNA Computing Approach
Debbah, Amina; Ben Ali, Yamina Mohamed
2014-01-01
In the e-learning systems, a learning path is known as a sequence of learning materials linked to each others to help learners achieving their learning goals. As it is impossible to have the same learning path that suits different learners, the Curriculum Sequencing problem (CS) consists of the generation of a personalized learning path for each…
High Performance Computing for Solving Fractional Differential Equations with Applications
Zhang, Wei
2014-01-01
Fractional calculus is the generalization of integer-order calculus to rational order. This subject has at least three hundred years of history. However, it was traditionally regarded as a pure mathematical field and lacked real world applications for a very long time. In recent decades, fractional calculus has re-attracted the attention of scientists and engineers. For example, many researchers have found that fractional calculus is a useful tool for describing hereditary materials and p...
Problem solving through recreational mathematics
Averbach, Bonnie
1999-01-01
Historically, many of the most important mathematical concepts arose from problems that were recreational in origin. This book takes advantage of that fact, using recreational mathematics - problems, puzzles and games - to teach students how to think critically. Encouraging active participation rather than just observation, the book focuses less on mathematical results than on how these results can be applied to thinking about problems and solving them. Each chapter contains a diverse array of problems in such areas as logic, number and graph theory, two-player games of strategy, solitaire ga
Solving Differential Equations in R
Although R is still predominantly applied for statistical analysis and graphical representation, it is rapidly becoming more suitable for mathematical computing. One of the fields where considerable progress has been made recently is the solution of differential equations. Here w...
Trangenstein, John A
2017-01-01
This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...
Developing Student Programming and Problem-Solving Skills with Visual Basic
Siegle, Del
2009-01-01
Although most computer users will never need to write a computer program, many students enjoy the challenge of creating one. Computer programming enhances students' problem solving by forcing students to break a problem into its component pieces and reassemble it in a generic format that can be understood by a nonsentient entity. It promotes…
Problem-solving in a Constructivist Environment
Directory of Open Access Journals (Sweden)
Lee Chien Sing
1999-01-01
Full Text Available The dynamic challenges of an increasingly borderless world, buoyed by advances in telecommunications and information technology, have resulted in educational reform and, subsequently, a reconceptualisation of what constitutes a learner, learning and the influence of the learning environment on the process of learning. In keeping up with the changing trends and challenges of an increasingly networked, dynamic and challenging international community, means to provide an alternative environment that stimulates inquiry and equips learners with the skills needed to manage technological change and innovations must be considered. This paper discusses the importance of interaction, cognition and context, collaboration in a networked computer-mediated environment, the problem-solving approach as a catalyst in stimulating creative and critical thinking and in providing context for meaningful interaction, and whether the interactive environment created through computer-mediated collaboration will motivate learners to be responsible for their own learning and be independent thinkers. The sample involved learners from three schools in three different countries. Findings conclude that a rich interactive environment must be personally relevant to the learner by simulating authentic problems without lowering the degree of cognitive complexity. Reviews of curriculum, assessment and teacher training around constructivist principles are also imperative, as these interrelated factors form part of the learning process system.
A Newton method for solving continuous multiple material minimum compliance problems
DEFF Research Database (Denmark)
Stolpe, Mathias; Stegmann, Jan
2007-01-01
method, one or two linear saddle point systems are solved. These systems involve the Hessian of the objective function, which is both expensive to compute and completely dense. Therefore, the linear algebra is arranged such that the Hessian is not explicitly formed. The main concern is to solve...
Peterson, Sharon L.; Palmer, Louann Bierlein
2011-01-01
This study identified the problem solving strategies used by students within a university course designed to teach pre-service teachers educational technology, and whether those strategies were influenced by the format of the course (i.e., face-to-face computer lab vs. online). It also examined to what extent the type of problem solving strategies…
Knowledge-Based Instruction: Teaching Problem Solving in a Logo Learning Environment.
Swan, Karen; Black, John B.
1993-01-01
Discussion of computer programming and knowledge-based instruction focuses on three studies of elementary and secondary school students which show that five particular problem-solving strategies can be developed in students explicitly taught the strategies and given practice applying them to solve LOGO programming problems. (Contains 53…
Organizational/Memory Tools: A Technique for Improving Problem Solving Skills.
Steinberg, Esther R.; And Others
1986-01-01
This study was conducted to determine whether students would use a computer-presented organizational/memory tool as an aid in problem solving, and whether and how locus of control would affect tool use and problem-solving performance. Learners did use the tools, which were most effective in the learner control with feedback condition. (MBR)
Domain decomposition methods for solving an image problem
Energy Technology Data Exchange (ETDEWEB)
Tsui, W.K.; Tong, C.S. [Hong Kong Baptist College (Hong Kong)
1994-12-31
The domain decomposition method is a technique to break up a problem so that the ensuing sub-problems can be solved on a parallel computer. In order to improve the convergence rate of the capacitance systems, preconditioned conjugate gradient methods are commonly used. In the last decade, most of the efficient preconditioners have been based on elliptic partial differential equations and are therefore particularly effective for elliptic problems. In this paper, the authors apply the so-called covering preconditioner, which is based on the information of the operator under investigation and is therefore suitable for various kinds of applications; specifically, they apply the preconditioned domain decomposition method to an image restoration problem. The image restoration problem is to extract an original image which has been degraded by a known convolution process and additive Gaussian noise.
Insight and analysis problem solving in microbes to machines.
Clark, Kevin B
2015-11-01
A key feature for obtaining solutions to difficult problems, insight is oftentimes vaguely regarded as a special discontinuous intellectual process and/or a cognitive restructuring of problem representation or goal approach. However, this nearly century-old state of the art devised by the Gestalt tradition to explain the non-analytical or non-trial-and-error, goal-seeking aptitude of primate mentality tends to neglect problem-solving capabilities of lower animal phyla, Kingdoms other than Animalia, and advancing smart computational technologies built from biological, artificial, and composite media. Attempting to provide an inclusive, precise definition of insight, two major criteria of insight, discontinuous processing and problem restructuring, are here reframed using terminology and statistical mechanical properties of computational complexity classes. Discontinuous processing becomes abrupt state transitions in algorithmic/heuristic outcomes or in types of algorithms/heuristics executed by agents using classical and/or quantum computational models. And problem restructuring becomes combinatorial reorganization of resources, problem-type substitution, and/or exchange of computational models. With insight bounded by computational complexity, humans, ciliated protozoa, and complex technological networks, for example, show insight when restructuring time requirements, combinatorial complexity, and problem type to solve polynomial and nondeterministic polynomial decision problems. Similar effects are expected from other problem types, supporting the idea that insight might be an epiphenomenon of analytical problem solving and consequently a larger information processing framework. Thus, this computational complexity definition of insight improves the power, external and internal validity, and reliability of operational parameters with which to classify, investigate, and produce the phenomenon for computational agents ranging from microbes to man-made devices.
Teaching effective problem solving skills to radiation protection students
International Nuclear Information System (INIS)
Waller, Edward
2008-01-01
Full text: Problem solving skills are essential for all radiation protection personnel. Although some students have more natural problem solving skills than others, all students require practice to become comfortable using these skills. At the University of Ontario Institute of Technology (UOIT), a unique one-semester course was developed as part of the core curriculum to teach students problem solving skills and elements of modelling and simulation. The underlying emphasis of the course was to allow students to develop their own problem solving strategies, both individually and in groups. Direction was provided on how to examine problems from different perspectives, and how to determine the proper root problem statement. A five-point problem solving strategy was presented as: 1) Problem definition; 2) Solution generation; 3) Decision; 4) Implementation; 5) Evaluation. Within the strategy, problem solving techniques were integrated from diverse areas such as: De Bono's six thinking hats, Kepner-Tregoe decision analysis, Covey's seven habits of highly effective people, Reason's Swiss cheese theory of complex failure, and Howlett's common failure modes. As part of the evaluation step, students critically explore areas such as ethics and environmental responsibility. In addition to exploring problem solving methods, students learn the usefulness of simulation methods, and how to model and simulate complex phenomena of relevance to radiation protection. Computational aspects of problem solving are explored using the commercially available MATLAB computer code. A number of case studies are presented as both examples and problems to the students. Emphasis was placed on solutions to problems of interest to radiation protection, health physics and nuclear engineering. A group project, pertaining to an accident or event related to the nuclear industry, is a course requirement. Students learn to utilize common time and project management tools such as flowcharting, Pareto
Solving rational expectations models using Excel
DEFF Research Database (Denmark)
Strulik, Holger
2004-01-01
Problems of discrete time optimal control can be solved using backward iteration and Microsoft Excel. The author explains the method in general and shows how the basic models of neoclassical growth and real business cycles are solved.
Solving Math Problems Approximately: A Developmental Perspective.
Directory of Open Access Journals (Sweden)
Dana Ganor-Stern
Full Text Available Although solving arithmetic problems approximately is an important skill in everyday life, little is known about the development of this skill. Past research has shown that when children are asked to solve multi-digit multiplication problems approximately, they provide estimates that are often very far from the exact answer. This is unfortunate as computation estimation is needed in many circumstances in daily life. The present study examined 4th graders', 6th graders' and adults' ability to estimate the results of arithmetic problems relative to a reference number. A developmental pattern was observed in accuracy, speed and strategy use. With age there was a general increase in speed, and an increase in accuracy mainly for trials in which the reference number was close to the exact answer. The children tended to use the sense-of-magnitude strategy, which does not involve any calculation but relies mainly on an intuitive coarse sense of magnitude, while the adults used the approximated calculation strategy, which involves rounding and multiplication procedures and relies to a greater extent on calculation skills and working memory resources. Importantly, the children were less accurate than the adults, but were well above chance level. In all age groups performance was enhanced when the reference number was smaller (vs. larger) than the exact answer and when it was far (vs. close) from it, suggesting the involvement of an approximate number system. The results suggest the existence of an intuitive sense of magnitude for the results of arithmetic problems that might help children and even adults with difficulties in math. The present findings are discussed in the context of past research reporting poor estimation skills among children, and the conditions that might allow using children's estimation skills in an effective manner.
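The approximated-calculation strategy described above (round the operands, multiply, then compare to the reference number) can be sketched in a few lines of Python. This is our own illustration, not the study's materials; all function names are ours.

```python
def round_1sf(x):
    """Round an integer to one significant figure, e.g. 47 -> 50, 82 -> 80."""
    if x == 0:
        return 0
    scale = 10 ** (len(str(abs(x))) - 1)
    return round(x / scale) * scale

def estimate_product(a, b):
    """The 'approximated calculation' strategy: round both operands, then multiply."""
    return round_1sf(a) * round_1sf(b)

def beats_reference(a, b, ref):
    """Decide whether a*b exceeds a reference number using only the estimate,
    as participants in such comparison tasks are asked to do."""
    return estimate_product(a, b) > ref
```

For example, 47 x 82 is estimated as 50 x 80 = 4000 (exact answer: 3854), close enough to judge it against reference numbers that are far from the exact result.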
LEGO Robotics: An Authentic Problem Solving Tool?
Castledine, Alanah-Rei; Chalmers, Chris
2011-01-01
With the current curriculum focus on correlating classroom problem solving lessons to real-world contexts, are LEGO robotics an effective problem solving tool? This present study was designed to investigate this question and to ascertain what problem solving strategies primary students engaged with when working with LEGO robotics and whether the…
Perspectives on Problem Solving and Instruction
van Merrienboer, Jeroen J. G.
2013-01-01
Most educators claim that problem solving is important, but they take very different perspectives on it and there is little agreement on how it should be taught. This article aims to sort out the different perspectives and discusses problem solving as a goal, a method, and a skill. As a goal, problem solving should not be limited to well-structured…
Bricolage Programming and Problem Solving Ability in Young Children : an Exploratory Study
Rose, Simon
2016-01-01
Visual programming environments, such as Scratch, are increasingly being used by schools to teach problem solving and computational thinking skills. However, academic research is divided on the effect that visual programming has on problem solving in a computational context. This paper focuses on the role of bricolage programming in this debate; a bottom-up programming approach that arises when using block-style programming interfaces. Bricolage programming was a term originally used to descr...
Quantum speedup in solving the maximal-clique problem
Chang, Weng-Long; Yu, Qi; Li, Zhaokai; Chen, Jiahui; Peng, Xinhua; Feng, Mang
2018-03-01
The maximal-clique problem, to find the maximally sized clique in a given graph, is classically an NP-complete computational problem, which has potential applications ranging from electrical engineering, computational chemistry, and bioinformatics to social networks. Here we develop a quantum algorithm to solve the maximal-clique problem for any graph G with n vertices with quadratic speedup over its classical counterparts, where the time and spatial complexities are reduced to, respectively, O(√(2^n)) and O(n^2). With respect to oracle-related quantum algorithms for the NP-complete problems, we identify our algorithm as optimal. To justify the feasibility of the proposed quantum algorithm, we successfully solve a typical clique problem for a graph G with two vertices and one edge by carrying out a nuclear magnetic resonance experiment involving four qubits.
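For contrast with the quadratic quantum speedup, the classical baseline is an exhaustive search over all 2^n vertex subsets. A minimal sketch of that baseline (our own illustration, not the authors' implementation):

```python
from itertools import combinations

def is_clique(vertices, edges):
    """A vertex set is a clique iff every pair of its vertices is connected."""
    return all((u, v) in edges or (v, u) in edges
               for u, v in combinations(vertices, 2))

def max_clique(n, edge_list):
    """Exhaustive O(2^n) search over vertex subsets, largest first;
    the quantum algorithm above reduces the search to O(sqrt(2^n)) oracle queries."""
    edges = set(edge_list)
    for k in range(n, 0, -1):
        for subset in combinations(range(n), k):
            if is_clique(subset, edges):
                return subset
    return ()
```

On a triangle with a pendant vertex, `max_clique(4, [(0, 1), (0, 2), (1, 2), (2, 3)])` finds the triangle `(0, 1, 2)`.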
Vipin; Sharma, Vinita; Sharma, Chandra Prakash; Kumar, Ved Prakash; Goyal, Surendra Prakash
2016-09-01
The illegal trade in wildlife is a serious threat to the existence of wild animals throughout the world. The short supply of and high demand for wildlife articles have caused an influx of many different forms of fake wildlife articles into this trade. The task of identifying the materials used in making such articles poses challenges in wildlife forensics, as different approaches are required for species identification. Claws constitute 3.8% of the illegal animal parts (n=2899) received at the Wildlife Institute of India (WII) for species identification. We describe the identification of seized suspected tiger claws (n=18) using a combined approach of morphometric and DNA-based analysis. The differential keratin density, determined using X-ray radiographs, indicated that none of the 18 claws were of any large cat but were fake. We determined three claw measurements, viz. ac (from the external coronary dermo-epidermal interface to the epidermis of the skin fold connecting the palmar flanges of the coronary horn), bc (from the claw tip to the epidermis of the skin fold connecting the palmar flanges of the coronary horn) and the ratio bc/ac, for all the seized (n=18), tiger (n=23) and leopard (n=49) claws. Univariate and multivariate statistical analyses were performed using SPSS. A scatter plot generated using canonical discriminant function analysis revealed that of the 18 seized claws, 14 claws formed a cluster separate from the clusters of the tiger and leopard claws, whereas the remaining four claws were within the leopard cluster. Because a discrepancy was observed between the X-ray images and the measurements of these four claws, one of the claws that clustered with the leopard claws was chosen at random and DNA analysis was carried out using the cyt b (137bp) and 16S rRNA (410bp) genes. A BLAST search and comparison with the reference database at WII indicated that the keratin material of the claw was derived from Bos taurus (cattle). This is a pioneering discovery, and
Readiness for Solving Story Problems.
Dunlap, William F.
1982-01-01
Readiness activities are described which are designed to help learning disabled (LD) students learn to perform computations in story problems. Activities proceed from concrete objects to numbers and involve the students in devising story problems. The language experience approach is incorporated with the enactive, iconic, and symbolic levels of…
ADM For Solving Linear Second-Order Fredholm Integro-Differential Equations
Karim, Mohd F.; Mohamad, Mahathir; Saifullah Rusiman, Mohd; Che-Him, Norziha; Roslan, Rozaini; Khalid, Kamil
2018-04-01
In this paper, we apply the Adomian Decomposition Method (ADM) to numerically solve linear second-order Fredholm integro-differential equations. The approximate solutions of the problems are calculated with the Maple package. Some numerical examples have been considered to illustrate the ADM for solving this equation. The results are compared with the existing exact solution. Thus, the Adomian decomposition method can be the best alternative method for solving linear second-order Fredholm integro-differential equations. It converges to the exact solution quickly and at the same time reduces the computational work for solving the equation. The results obtained by ADM show its ability and efficiency for solving these equations.
New Efficient Fourth Order Method for Solving Nonlinear Equations
Directory of Open Access Journals (Sweden)
Farooq Ahmad
2013-12-01
Full Text Available In a paper [Appl. Math. Comput., 188 (2) (2007) 1587-1591], the authors suggested and analyzed a method for solving nonlinear equations. In the present work, we modified this method by using the finite difference scheme, which has a quintic convergence. We have compared this modified Halley method with some other iterative methods of fifth-order convergence, which shows that this new method, having convergence of fourth order, is efficient.
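The classical Halley iteration on which such modified methods build can be sketched as follows. This is our own illustration of the base iteration, not the paper's finite-difference variant, which replaces the second derivative and is not reproduced here.

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=50):
    """Classical Halley iteration for f(x) = 0:
    x_{n+1} = x_n - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        step = 2 * fx * dfx / (2 * dfx * dfx - fx * d2fx)
        x -= step
        if abs(step) < tol:
            break
    return x

# Solve x^2 - 2 = 0 starting from x0 = 1
root = halley(lambda x: x * x - 2, lambda x: 2 * x, lambda x: 2.0, 1.0)
```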
Modified Projection Algorithms for Solving the Split Equality Problems
Directory of Open Access Journals (Sweden)
Qiao-Li Dong
2014-01-01
proposed a CQ algorithm for solving it. In this paper, we propose a modification for the CQ algorithm, which computes the stepsize adaptively and performs an additional projection step onto two half-spaces in each iteration. We further propose a relaxation scheme for the self-adaptive projection algorithm by using projections onto half-spaces instead of those onto the original convex sets, which is much more practical. Weak convergence results for both algorithms are analyzed.
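The half-space projections that make the relaxation scheme practical admit a simple closed form, unlike projections onto general convex sets. A minimal sketch of that building block (our own illustration, not the authors' code):

```python
import numpy as np

def project_halfspace(x, a, b):
    """Euclidean projection of x onto the half-space {y : <a, y> <= b}:
    if x is infeasible, move it back along a by the scaled violation."""
    violation = np.dot(a, x) - b
    if violation <= 0:
        return x.copy()  # already inside the half-space
    return x - (violation / np.dot(a, a)) * a
```

For example, projecting the point (2, 0) onto {y : y_1 <= 1} yields (1, 0), while any feasible point is returned unchanged.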
New numerical method for solving the solute transport equation
International Nuclear Information System (INIS)
Ross, B.; Koplik, C.M.
1978-01-01
The solute transport equation can be solved numerically by approximating the water flow field by a network of stream tubes and using a Green's function solution within each stream tube. Compared to previous methods, this approach permits greater computational efficiency and easier representation of small discontinuities, and the results are easier to interpret physically. The method has been used to study hypothetical sites for disposal of high-level radioactive waste
Deterministic methods to solve the integral transport equation in neutronics
International Nuclear Information System (INIS)
Warin, X.
1993-11-01
We present a synthesis of the methods used to solve the integral transport equation in neutronics. This formulation is above all used to compute solutions in 2D in heterogeneous assemblies. Three kinds of methods are described: the collision probability method; the interface current method; the current coupling collision probability method. These methods don't seem to be the most effective in 3D. (author). 9 figs
Solving large mixed linear models using preconditioned conjugate gradient iteration.
Strandén, I; Lidauer, M
1999-12-01
Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique feasible in Jacobi and conjugate gradient based iterative methods using iteration on data is presented. In the new computing technique, the calculations in multiplication of a vector by a matrix were reordered into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software. Programs keeping the mixed model equations in random access memory required at least 20 and 435% more time to solve the univariate and multivariate animal models, respectively. The second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than did the new program. Good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
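The preconditioned conjugate gradient iteration at the core of such solvers can be sketched as follows. This is a minimal Jacobi-preconditioned illustration of our own; production breeding-value programs like the one above additionally perform the matrix-vector product by iterating on data rather than storing the coefficient matrix.

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=1000):
    """Preconditioned conjugate gradient for SPD systems A x = b,
    using a Jacobi (diagonal) preconditioner M = diag(A)."""
    M_inv = 1.0 / np.diag(A)          # applying M^-1 is elementwise
    x = np.zeros_like(b)
    r = b - A @ x                     # residual
    z = M_inv * r                     # preconditioned residual
    p = z.copy()                      # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p                    # the one matrix-vector product per iteration
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

The single matrix-vector product per iteration is the step that the paper restructures, since it dominates the cost on large mixed-model systems.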
Domain decomposition method for solving the neutron diffusion equation
International Nuclear Information System (INIS)
Coulomb, F.
1989-03-01
The aim of this work is to study methods for solving the neutron diffusion equation; we are interested in methods based on a classical finite element discretization and well suited for use on parallel computers. Domain decomposition methods seem to answer this concern. This study deals with a decomposition of the domain. A theoretical study is carried out for Lagrange finite elements and some examples are given; in the case of mixed dual finite elements, the study is based on examples [fr]
Boersma, A.J.; Feringa, B.L.; Roelfes, G.
2007-01-01
alpha,beta-Unsaturated 2-acyl imidazoles are a novel and practical class of dienophiles for the DNA-based catalytic asymmetric Diels-Alder reaction in water. The Diels-Alder products are obtained with very high diastereoselectivities and enantioselectivities in the range of 83-98%. The catalytic
Ortega, J. M.
1986-01-01
Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
Sukhamrit Kaur; Sandeep Kaur
2015-01-01
Genomics, the study of genomes, produces large amounts of data that demand extensive storage and computational power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...
Improve Problem Solving Skills through Adapting Programming Tools
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
There are numerous ways for engineers and students to become better problem-solvers. The use of command line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improved problem solving skills through using a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and plotting of functions and data. MatLab can be used as an interactive command line or a sequence of commands that can be saved in a file as a script or named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU project, is a free computer program for performing numerical computations comparable to MatLab. MatLab visual and command programming are presented here.
Lee, Junyeong; Hwang, Hyuncheol; Min, Sung-Wook; Shin, Jae Min; Kim, Jin Sung; Jeon, Pyo Jin; Lee, Hee Sung; Im, Seongil
2015-01-28
Although organic field-effect transistors (OFETs) have various advantages of light weight, low cost, mechanical flexibility, and nowadays even higher mobility than amorphous Si-based FETs, stability issues under bias and ambient conditions critically hinder their practical application. One of the most detrimental effects on the organic layer comes from penetrated atmospheric species such as oxygen and water. To solve such degradation problems, several molecular engineering tactics have been introduced: forming a kinetic barrier, lowering the level of molecular orbitals, and increasing the band gap. However, direct passivation of organic channels, the most promising strategy, has not been reported as often as other methods. Here, we resolved the ambient stability issues of p-type (heptazole) or n-type (PTCDI-C13) OFETs and their bias-stability issues at once, using a DNA-base small molecule guanine (C5H5N5O)/Al2O3 bilayer. The guanine protects the organic channels as a buffer and H-getter layer between the channels and the capping Al2O3, whereas the oxide capping resists ambient molecules. As a result, both p-type and n-type OFETs are simultaneously protected from gate-bias stress and 30-day-long ambient aging, finally demonstrating a highly stable, high-gain complementary-type logic inverter.
International Nuclear Information System (INIS)
Foehe, C.; Dikomey, E.
1994-01-01
DNA base damage was measured in Chinese hamster ovary cells X-irradiated under aerobic conditions using an extract of the bacterium Micrococcus luteus. The glycosylases and endonucleases present in this extract recognize damaged bases and convert them into strand breaks (termed endonuclease-sensitive sites, enss). Strand breaks were detected by the alkaline unwinding technique. The induction of enss was measured for X-ray doses ranging up to 45 Gy. The relative frequency of all enss related to all radiation-induced strand breaks was 1.7 ± 0.4. Repair of enss was studied for a radiation dose of 45 Gy. The number of enss was found to decrease exponentially with time after irradiation with a half-time of τ enss = 37 ± 8 min. The repair kinetics that were also measured for all X-ray-induced DNA strand breaks were found to consist of three phases: fast, intermediate and slow. The intermediate phase was fitted under the assumption that this phase results from the formation and repair of secondary single-strand breaks generated by enzymatic incision at the sites of base damage repair. (author)
Gull, Iram; Javed, Attia; Aslam, Muhammad Shahbaz; Mushtaq, Roohi; Athar, Muhammad Amin
2016-01-01
The use of Moringa oleifera as a natural food preservative has been evaluated in the present study. In addition, for quality assurance, the study also focused on the shelf life of the product and on authenticating the identification of the plant by development of a DNA-based marker. Among the different extracts prepared from flower pods of Moringa oleifera, the methanol and aqueous extracts exhibited high antibacterial and antioxidant activity, respectively. The high phenolic contents (53.5 ± 0.169 mg GAE/g) and flavonoid contents (10.9 ± 0.094 mg QE/g) were also recorded in the methanol and aqueous extracts, respectively. Due to the instability of bioactive compounds in the aqueous extract, the methanol extract is considered the more potent natural preservative. The shelf life of the methanol extract was observed for two months at 4°C under dark conditions. The developed SCAR primers (MOF217/317/MOR317) specifically amplified a fragment of 317 bp from DNA of Moringa oleifera samples collected from different regions of the Punjab province of Pakistan. The methanol extract of Moringa oleifera flower pods has great potential to be used as a natural preservative and nutraceutical in the food industry.
Liu, Chuan; Duan, Weixia; Xu, Shangcheng; Chen, Chunhai; He, Mindi; Zhang, Lei; Yu, Zhengping; Zhou, Zhou
2013-03-27
Whether exposure to radiofrequency electromagnetic radiation (RF-EMR) emitted from mobile phones can induce DNA damage in male germ cells remains unclear. In this study, we conducted a 24-h intermittent exposure (5 min on, 10 min off) of a mouse spermatocyte-derived GC-2 cell line to 1800 MHz Global System for Mobile Communication (GSM) signals in GSM-Talk mode at specific absorption rates (SAR) of 1 W/kg, 2 W/kg or 4 W/kg. Subsequently, through the use of formamidopyrimidine DNA glycosylase (FPG) in a modified comet assay, we determined that the extent of DNA migration was significantly increased at a SAR of 4 W/kg. Flow cytometry analysis demonstrated that levels of the DNA adduct 8-oxoguanine (8-oxoG) were also increased at a SAR of 4 W/kg. These increases were concomitant with similar increases in the generation of reactive oxygen species (ROS); these phenomena were mitigated by co-treatment with the antioxidant α-tocopherol. However, no detectable DNA strand breakage was observed by the alkaline comet assay. Taken together, these findings may imply the novel possibility that RF-EMR with insufficient energy for the direct induction of DNA strand breaks may produce genotoxicity through oxidative DNA base damage in male germ cells. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
Brovarets', O O; Hovorun, D M
2010-01-01
A novel physico-chemical mechanism of the Watson-Crick DNA base pair Gua·Cyt tautomerization, Gua·Cyt ↔ Gua·Cyt*/Gua*·Cyt (mutagenic tautomers of bases are marked by asterisks), has been revealed; it is realized via a pathway of single proton transfer through two mutually isoenergetic transition states with Gibbs free energies of activation of 30.4 and 30.6 kcal/mol. These transition states are ion pairs stabilized by three (N2H...N3, N1H...N4- and O6+H...N4-) and five (N2H...O2, N1H...O2, N1H...N3, O6+H...N4- and O6+H...N4-) H-bonds, respectively. The stable base pairs Gua·Cyt* and Gua*·Cyt, which dissociate comparatively easily into monomers, have relative Gibbs energies (12.9 and 14.3 kcal/mol) acceptable for explaining the nature of the spontaneous transitions arising during DNA replication. The results were obtained at the MP2/6-311++G(2df,pd)//B3LYP/6-311++G(d,p) level of theory in the vacuum approximation.
Ensafi, Ali A; Jamei, Hamid Reza; Heydari-Bafrooei, Esmaeil; Rezaei, B
2016-10-01
This paper presents the results of an experimental investigation of voltammetric and impedimetric DNA-based biosensors for monitoring biological and chemical redox cycling reactions involving free radical intermediates. The concept is based on associating the amounts of radicals generated with the electrochemical signals produced, using differential pulse voltammetry (DPV) and electrochemical impedance spectroscopy (EIS). For this purpose, a pencil graphite electrode (PGE) modified with multiwall carbon nanotubes and poly(diallyldimethylammonium chloride) decorated with double-stranded fish sperm DNA was prepared to detect DNA damage induced by the radicals generated from a redox cycling quinone (i.e., menadione (MD; 2-methyl-1,4-naphthoquinone)). Menadione was employed as a model compound to study the redox cycling of quinones. A direct relationship was found between free radical production and DNA damage. The relationship between MD-induced DNA damage and free radical generation was investigated in an attempt to identify the possible mechanism(s) involved in the action of MD. Results showed that DPV and EIS were appropriate, simple and inexpensive techniques for the quantitative and qualitative comparisons of different reducing reagents. These techniques may be recommended for monitoring DNA damage and investigating the mechanisms involved in the action of redox cycling compounds. Copyright © 2016 Elsevier B.V. All rights reserved.
Poletto, Mattia; Yang, Di; Fletcher, Sally C; Vendrell, Iolanda; Fischer, Roman; Legrand, Arnaud J; Dianov, Grigory L
2017-09-29
Ataxia telangiectasia (A-T) is a syndrome associated with loss of ATM protein function. Neurodegeneration and cancer predisposition, both hallmarks of A-T, are likely to emerge as a consequence of the persistent oxidative stress and DNA damage observed in this disease. Surprisingly however, despite these severe features, a lack of functional ATM is still compatible with early life, suggesting that adaptation mechanisms contributing to cell survival must be in place. Here we address this gap in our knowledge by analysing the process of human fibroblast adaptation to the lack of ATM. We identify profound rearrangement in cellular proteostasis occurring very early on after loss of ATM in order to counter protein damage originating from oxidative stress. Change in proteostasis, however, is not without repercussions. Modulating protein turnover in ATM-depleted cells also has an adverse effect on the DNA base excision repair pathway, the major DNA repair system that deals with oxidative DNA damage. As a consequence, the burden of unrepaired endogenous DNA lesions intensifies, progressively leading to genomic instability. Our study provides a glimpse at the cellular consequences of loss of ATM and highlights a previously overlooked role for proteostasis in maintaining cell survival in the absence of ATM function. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Rose, Emily; Masonjones, Heather D; Jones, Adam G
2016-11-01
Isolated populations provide special opportunities to study local adaptation and incipient speciation. In some cases, however, morphological evolution can obscure the taxonomic status of recently founded populations. Here, we use molecular markers to show that an anchialine-lake-restricted population of seahorses, originally identified as Hippocampus reidi, appears on the basis of DNA data to be Hippocampus erectus. We collected seahorses from Sweetings Pond, on Eleuthera Island, Bahamas, during the summer of 2014. We measured morphological traits and sequenced 2 genes, cytochrome b and ribosomal protein S7, from 19 seahorses in our sample. On the basis of morphology, Sweetings Pond seahorses could not be assigned definitively to either of the 2 species of seahorse, H. reidi and H. erectus, that occur in marine waters surrounding the Bahamas. However, our DNA-based phylogenetic analysis showed that the Sweetings Pond fish were firmly nested within the H. erectus clade with a Bayesian posterior probability greater than 0.99. Thus, Sweetings Pond seahorses most recently shared a common ancestor with H. erectus populations from the Western Atlantic. Interestingly, the seahorses from Sweetings Pond differ morphologically from other marine populations of H. erectus in having a more even torso-to-tail length ratio. The substantial habitat differences between Sweetings Pond and the surrounding coastal habitat make Sweetings Pond seahorses particularly interesting from the perspectives of conservation, local adaptation, and incipient speciation. © The American Genetic Association 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Lee, Andrea J; Wallace, Susan S
2017-06-01
The first step of the base excision repair (BER) pathway responsible for removing oxidative DNA damage utilizes DNA glycosylases to find and remove the damaged DNA base. How glycosylases find the damaged base amidst a sea of undamaged bases has long been a question in the BER field. Single molecule total internal reflection fluorescence microscopy (SM TIRFM) experiments have allowed for an exciting look into this search mechanism and have found that DNA glycosylases scan along the DNA backbone in a bidirectional and random fashion. By comparing the search behavior of bacterial glycosylases from different structural families and with varying substrate specificities, it was found that glycosylases search for damage by periodically inserting a wedge residue into the DNA stack as they redundantly search tracks of DNA that are 450-600 bp in length. These studies open up a wealth of possibilities for further study in real time of the interactions of DNA glycosylases and other BER enzymes with various DNA substrates. Copyright © 2016 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Eryn E. Slankster
2012-06-01
We have developed DNA-based genetic markers for rapid-cycling Brassica rapa (RCBr), also known as Fast Plants. Although markers for Brassica rapa already exist, ours were intentionally designed for use in a teaching-laboratory environment. The qualities we selected for were robust amplification in PCR, polymorphism in RCBr strains, and alleles that can be easily resolved in simple agarose slab gels. We have developed two single nucleotide polymorphism (SNP)-based markers and 14 variable number tandem repeat (VNTR)-type markers spread over four chromosomes. The DNA sequences of these markers represent variation in a wide range of genomic features. Among the VNTR-type markers, there are examples of variation in a nongenic region, variation within an intron, and variation in the coding sequence of a gene. Among the SNP-based markers there are examples of polymorphism in intronic DNA and synonymous substitution in a coding sequence. Thus these markers can serve laboratory exercises in both transmission genetics and molecular biology.
Czarny, Piotr; Merecz-Sadowska, Anna; Majchrzak, Kinga; Jabłkowski, Maciej; Szemraj, Janusz; Śliwiński, Tomasz; Karwowski, Bolesław
2017-07-01
Hepatitis C virus (HCV) can infect extrahepatic tissues, including lymphocytes, creating a reservoir of the virus. Moreover, HCV proteins can interact with the DNA damage response proteins of infected cells. In this article we investigated the influence of the virus infection and of a new ombitasvir/paritaprevir/ritonavir ± dasabuvir ± ribavirin (OBV/PTV/r ± DSV ± RBV) anti-HCV therapy on the DNA base excision repair (BER) system of PBMCs (peripheral blood mononuclear cells, mainly lymphocytes). BER protein activity was analyzed in the nuclear and mitochondrial extracts (NE and ME) of PBMCs isolated from patients before and after therapy, and from subjects without HCV, using a model double-stranded DNA with a 2'-deoxyuridine substitution as the DNA damage. The NE and ME obtained from patients before therapy demonstrated lower efficacy of 2'-deoxyuridine removal and DNA repair polymerization than those of the control group or of patients after therapy. Moreover, the extracts from the patients after therapy had activity similar to those from the control group. However, the efficacy of apurinic/apyrimidinic site excision in NE did not differ between the studied groups. We postulate that infection of lymphocytes by HCV can lead to a decrease in the activity of BER enzymes. However, the use of the novel therapy results in the improvement of glycosylase activity as well as the regeneration of endonuclease and other crucial repair enzymes.
Energy Technology Data Exchange (ETDEWEB)
Romac, S; Leong, P; Sockett, H; Hutchinson, F [Yale Univ., New Haven, CT (USA). Dept. of Molecular Biophysics and Biochemistry
1989-09-20
The DNA base sequence changes induced by mutagenesis with ultraviolet light have been determined in a gene on a chromosome of cultured Chinese hamster ovary (CHO) cells. The gene was the Escherichia coli gpt gene, of which a single copy was stably incorporated and expressed in the CHO cell genome. The cells were irradiated with ultraviolet light and gpt⁻ colonies were selected by resistance to 6-thioguanine. The gpt gene was amplified from chromosomal DNA by use of the polymerase chain reaction (PCR) and the amplified DNA was sequenced directly by the dideoxy method. Of the 58 sequenced mutants of independent origin, 53 were base-change mutations. Forty-one base substitutions were single base changes, ten had two adjacent (or tandem) base changes, and one had two base changes separated by a single base-pair. Only one mutant had a multiple-base-change mutation with two or more well-separated base changes. In contrast, much higher levels of such mutations were reported in ultraviolet mutagenesis of genes on a shuttle vector in primate cells. Two deletions of a single base-pair were observed, and three deletions ranging from 6 to 37 base-pairs. The mutation spectrum in the gpt gene had similarities to the ultraviolet mutation spectra for several genes in prokaryotes, which suggests similarities in mutational mechanisms between prokaryotes and eukaryotes. (author).
Mondol, Samrat; Sridhar, Vanjulavalli; Yadav, Prasanjeet; Gubbi, Sanjay; Ramakrishnan, Uma
2015-04-01
Illicit trade in wildlife products is rapidly decimating many species across the globe. Such trade is often underestimated for wide-ranging species until it is too late for the survival of their remaining populations. Policing this trade could be vastly improved if one could reliably determine geographic origins of illegal wildlife products and identify areas where greater enforcement is needed. Using DNA-based assignment tests (i.e., samples are assigned to geographic locations), we addressed these factors for leopards (Panthera pardus) on the Indian subcontinent. We created geography-specific allele frequencies from a genetic reference database of 173 leopards across India to infer geographic origins of DNA samples from 40 seized leopard skins. Sensitivity analyses of samples of known geographic origins and assignments of seized skins demonstrated robust assignments for Indian leopards. We found that confiscated pelts seized in small numbers were not necessarily from local leopards. The geographic footprint of large seizures appeared to be bigger than the cumulative footprint of several smaller seizures, indicating widespread leopard poaching across the subcontinent. Our seized samples had male-biased sex ratios, especially the large seizures. From multiple seized sample assignments, we identified central India as a poaching hotspot for leopards. The techniques we applied can be used to identify origins of seized illegal wildlife products and trade routes at the subcontinent scale and beyond. © 2014 Society for Conservation Biology.
Brettar, Ingrid; Christen, Richard; Höfle, Manfred G
2012-01-01
Understanding structure-function links of microbial communities has been a central theme of microbial ecology since its beginning. To this end, we studied the spatial variability of the bacterioplankton community structure and composition across the central Baltic Sea at four stations, which were up to 450 km apart, and along a depth profile representative of the central part (Gotland Deep, 235 m). Bacterial community structure was followed by 16S ribosomal RNA (rRNA)- and 16S rRNA gene-based fingerprints using single-strand conformation polymorphism (SSCP) electrophoresis. Species composition was determined by sequence analysis of SSCP bands. High similarities of the bacterioplankton communities across several hundred kilometers were observed in the surface water using RNA- and DNA-based fingerprints. In these surface communities, the RNA- and DNA-based fingerprints resulted in very different patterns, presumably indicating a large difference between the active members of the community, as represented by RNA-based fingerprints, and the present members, represented by the DNA-based fingerprints. This large discrepancy changed gradually over depth, resulting in highly similar RNA- and DNA-based fingerprints in the anoxic part of the water column below 130 m depth. A conceivable mechanism explaining this high similarity could be the reduced oxidative stress in the anoxic zone. The stable communities at the surface and in the anoxic zone indicate the strong influence of the hydrography on the bacterioplankton community structure. Comparative analysis of RNA- and DNA-based community structure provided criteria for the identification of the core community, its key members and their links to biogeochemical functions.
Community-powered problem solving.
Gouillart, Francis; Billings, Douglas
2013-04-01
Traditionally, companies have managed their constituencies with specific processes: marketing to customers, procuring from vendors, developing HR policies for employees, and so on. The problem is, such processes focus on repeatability and compliance, so they can lead to stagnation. Inviting your constituencies to collectively help you solve problems and exploit opportunities--"co-creation"--is a better approach. It allows you to continually tap the skills and insights of huge numbers of stakeholders and develop new ways to produce value for all. The idea is to provide stakeholders with platforms (physical and digital forums) on which they can interact, get them to start exploring new experiences and connections, and let the system grow organically. A co-creation initiative by a unit of Becton, Dickinson and Company demonstrates how this works. A global leader in syringes, BD set out to deepen its ties with hospital customers and help them reduce the incidence of infections from unsafe injection and syringe disposal practices. The effort began with a cross-functional internal team, brought in the hospital procurement and supply managers BD had relationships with, and then reached out to hospitals' infection-prevention and occupational health leaders. Eventually product designers, nurses, sustainability staffers, and even hospital CFOs were using the platform, contributing data that generated new best practices and reduced infections.
Solve the Dilemma of Over-Simplification
Schmitt, Gerhard
Complexity science can help us understand the functioning and the interaction of the components of a city. In his 1965 essay A City is not a Tree, Christopher Alexander described the complex nature of urban organization. At that time, neither high-speed computers nor urban big data existed. Today, Luis Bettencourt et al. use complexity science to analyze data for countries, regions, or cities, and the results can be applied globally to other cities. The objectives of complexity science with regard to future cities are the observation and identification of tendencies and regularities in behavioral patterns, and finding correlations between them and spatial configurations. Complex urban systems cannot yet be understood in total, but research focuses on describing the system by finding simple, preferably general and emergent patterns and rules that can be used for urban planning. It is important that the influencing factors are not just geo-spatial patterns but also include variables that are important for design quality. Complexity science is a way to solve the dilemma of oversimplifying insights from existing cities and their application to new cities. An example: the effects of streets, public places and city structures on citizens and their behavior depend on how they are perceived. To describe this perception, it is not sufficient to consider only particular characteristics of the urban environment; different aspects play a role and influence each other. Complexity science can take this fact into consideration and handle the non-linearity of the system...
Solving the Examination Timetabling Problem in GPUs
Directory of Open Access Journals (Sweden)
Vasileios Kolonias
2014-07-01
The examination timetabling problem belongs to the class of combinatorial optimization problems and is of great importance for every university. In this paper, a hybrid evolutionary algorithm running on a GPU is employed to solve the examination timetabling problem. The hybrid evolutionary algorithm proposed has a genetic algorithm component and a greedy steepest-descent component. The GPU's computational capabilities allow the use of very large population sizes, leading to a more thorough exploration of the problem solution space. The GPU implementation, depending on the size of the problem, is up to twenty-six times faster than the identical single-threaded CPU implementation of the algorithm. The algorithm is evaluated with the well-known Toronto datasets and compares well with the best results found in the literature. Moreover, the selection of the chromosome encoding and of the tournament selection size as the population grows are examined and optimized. The compressed sparse row format is used for the conflict matrix and proved essential to the process, since most of the datasets have a small conflict density, which translates into an extremely sparse matrix.
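The compressed-sparse-row (CSR) idea mentioned above can be sketched briefly. This is not the paper's GPU code; it is a minimal CPU illustration with hypothetical exam data, showing how a sparse conflict matrix is stored and how hard-constraint violations of a candidate timetable are counted:

```python
# Build a CSR representation (indptr, indices, data) of an exam-conflict
# matrix, where conflicts[(i, j)] = number of students taking both exams.
def to_csr(conflicts, n_exams):
    indptr, indices, data = [0], [], []
    for i in range(n_exams):
        for j in range(n_exams):
            w = conflicts.get((i, j), 0)
            if w:
                indices.append(j)
                data.append(w)
        indptr.append(len(indices))
    return indptr, indices, data

def hard_violations(slot_of, indptr, indices, data):
    """Count students with two exams in the same slot (each clash once)."""
    total = 0
    for i in range(len(indptr) - 1):
        for k in range(indptr[i], indptr[i + 1]):
            j, w = indices[k], data[k]
            if i < j and slot_of[i] == slot_of[j]:
                total += w
    return total

# Hypothetical data: 3 exams; exams 1 and 2 share 5 students.
conflicts = {(0, 1): 3, (1, 0): 3, (1, 2): 5, (2, 1): 5}
indptr, indices, data = to_csr(conflicts, 3)
print(hard_violations([0, 1, 1], indptr, indices, data))  # 5
```

A genetic algorithm's fitness function would call `hard_violations` for each chromosome (a slot assignment per exam); the CSR layout keeps that inner loop proportional to the number of nonzero conflicts rather than to the full matrix.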
Projective geometry solved problems and theory review
Fortuna, Elisabetta; Pardini, Rita
2016-01-01
This book starts with a concise but rigorous overview of the basic notions of projective geometry, using straightforward and modern language. The goal is not only to establish the notation and terminology used, but also to offer the reader a quick survey of the subject matter. In the second part, the book presents more than 200 solved problems, for many of which several alternative solutions are provided. The level of difficulty of the exercises varies considerably: they range from computations to harder problems of a more theoretical nature, up to some actual complements of the theory. The structure of the text allows the reader to use the solutions of the exercises both to master the basic notions and techniques and to further their knowledge of the subject, thus learning some classical results not covered in the first part of the book. The book addresses the needs of undergraduate and graduate students in the theoretical and applied sciences, and will especially benefit those readers with a solid grasp of ...
IDEAL Problem Solving in Mathematics Learning
Directory of Open Access Journals (Sweden)
Eny Susiana
2012-01-01
Most educators agree that problem solving is among the most meaningful and important kinds of learning and thinking; that is, the central focus of learning and instruction should be learning to solve problems. Several warrants support that claim: authenticity, relevance, and the fact that problem solving engages deeper learning and therefore enhances meaning making. This is why teaching and learning must be designed to develop students' problem-solving skills. There are many information-processing models of problem solving, such as Gick's simplified model of the problem-solving process, Polya's problem-solving process, etc. One of them is IDEAL problem solving. Each letter of IDEAL stands for an aspect of thinking that is important for problem solving: Identify the problem, Define the goal, Explore possible strategies, Anticipate outcomes and Act, and Look back and learn. Using peer interaction and question prompts in small groups in IDEAL problem-solving teaching and learning can improve problem-solving skills. Keywords: IDEAL problem solving, peer interaction, question prompts, small groups.
Administrative Computing in Continuing Education.
Broxton, Harry
1982-01-01
Describes computer applications in the Division of Continuing Education at Brigham Young University. These include instructional applications (computer assisted instruction, computer science education, and student problem solving) and administrative applications (registration, payment records, grades, reports, test scoring, mailing, and others).…
Indirection and computer security.
Energy Technology Data Exchange (ETDEWEB)
Berg, Michael J.
2011-09-01
The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
Conceptual problem solving in high school physics
Jennifer L. Docktor; Natalie E. Strand; José P. Mestre; Brian H. Ross
2015-01-01
Problem solving is a critical element of learning physics. However, traditional instruction often emphasizes the quantitative aspects of problem solving such as equations and mathematical procedures rather than qualitative analysis for selecting appropriate concepts and principles. This study describes the development and evaluation of an instructional approach called Conceptual Problem Solving (CPS) which guides students to identify principles, justify their use, and plan their solution in w...
Mobile learning and computational thinking
Directory of Open Access Journals (Sweden)
José Manuel Freixo Nunes
2017-11-01
Computational thinking can be thought of as an approach to problem solving which has been applied to different areas of learning and which has become an important field of investigation in the area of educational research.
Evolutionary computation for reinforcement learning
Whiteson, S.; Wiering, M.; van Otterlo, M.
2012-01-01
Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,
Mobile learning and computational thinking
José Manuel Freixo Nunes; Teresa Margarida Loureiro Cardoso
2017-01-01
Computational thinking can be thought of as an approach to problem solving which has been applied to different areas of learning and which has become an important field of investigation in the area of educational research.
Lara, A; Riquelme, M; Vöhringer-Martinez, E
2018-05-11
Partition coefficients serve in various areas, such as pharmacology and the environmental sciences, to predict the hydrophobicity of different substances. Recently, they have also been used to assess the accuracy of force fields for various organic compounds, and specifically for the methylated DNA bases. In this study, atomic charges were derived by different partitioning methods (Hirshfeld and Minimal Basis Iterative Stockholder) directly from the electron density obtained by electronic structure calculations in a vacuum, with an implicit solvation model, or with explicit solvation taking the dynamics of the solute and the solvent into account. To test the ability of these charges to describe electrostatic interactions in force fields for condensed phases, the original atomic charges of the AMBER99 force field were replaced with the new atomic charges and combined with different solvent models to obtain the hydration and chloroform solvation free energies by molecular dynamics simulations. Chloroform-water partition coefficients derived from the obtained free energies were compared to experimental values and to values previously reported with the GAFF or the AMBER99 force field. The results show that good agreement with experimental data is obtained when the polarization of the electron density by the solvent has been taken into account, and when the energy needed to polarize the electron density of the solute has been considered in the transfer free energy. These results were further confirmed by hydration free energies of polar and aromatic amino acid side chain analogs. Comparison of the two partitioning methods, Hirshfeld-I and Minimal Basis Iterative Stockholder (MBIS), revealed some deficiencies in the Hirshfeld-I method related to the unstable isolated anionic nitrogen pro-atom used in the method. Hydration free energies and partitioning coefficients obtained with atomic charges from the MBIS partitioning method accounting for polarization by the implicit solvation model
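Deriving a partition coefficient from the two solvation free energies follows the standard thermodynamic relation log P = −ΔG_transfer / (RT ln 10), with ΔG_transfer the water-to-chloroform transfer free energy. A minimal sketch with illustrative numbers (not values from the paper):

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol*K)
T = 298.15          # temperature, K

def log_p(dG_water, dG_chloroform, temperature=T):
    """log10 chloroform-water partition coefficient from solvation free
    energies in kJ/mol. A compound solvated more favourably (more negative
    dG) in chloroform than in water has a positive log P."""
    dG_transfer = dG_chloroform - dG_water  # water -> chloroform
    return -dG_transfer / (math.log(10) * R * temperature)

# Illustrative values: a solute solvated 10 kJ/mol more favourably
# in chloroform than in water partitions preferentially into chloroform.
print(round(log_p(-20.0, -30.0), 2))  # 1.75
```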
Brovarets', Ol'ha O; Hovorun, Dmytro M
2015-01-01
In this study, we have theoretically demonstrated the intrinsic ability of the wobble G·T(w)/G*·T*(w)/G·T(w1)/G·T(w2) and Watson-Crick-like G*·T(WC) DNA base mispairs to interconvert into each other via DPT tautomerization. We have established that among all these transitions, only one single G·T(w) ↔ G*·T(WC) pathway is eligible from a biological perspective. It involves a short-lived intermediate, the G·T*(WC) base mispair, and is governed by the planar, highly stable, zwitterionic [Formula: see text] transition state stabilized by the participation of a unique pattern of five intermolecular O6(+)H⋯O4(-), O6(+)H⋯N3(-), N1(+)H⋯N3(-), N1(+)H⋯O2(-), and N2(+)H⋯O2(-) H-bonds. This non-dissociative G·T(w) ↔ G*·T(WC) tautomerization occurs without opening of the pair: the bases within the mispair remain connected by 14 different patterns of specific intermolecular interactions that successively change into each other along the IRC. A novel kinetically controlled mechanism of the thermodynamically non-equilibrium spontaneous point GT/TG incorporation errors has been suggested. The mutagenic effect of analogues of the nucleotide bases, in particular 5-bromouracil, can be attributed to the lowering of the barrier to the acquisition, by a wobble pair containing these compounds, of the enzymatically competent Watson-Crick geometry via intrapair mutagenic tautomerization directly in the essentially hydrophobic recognition pocket of the replicative DNA-polymerase machinery. The proposed approaches are able to explain experimental data, namely the growth of the rate of spontaneous point incorporation errors during DNA biosynthesis with increasing temperature.
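The temperature dependence of a barrier-crossing rate invoked above follows from transition-state theory. A sketch using the Eyring equation, k = (k_B·T/h)·exp(−ΔG‡/RT), with an illustrative ~30 kcal/mol barrier (the paper's exact rate model is not given in the abstract):

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(dg_act_kcal, temp_k):
    """First-order Eyring rate constant (s^-1) for an activation free
    energy given in kcal/mol (transmission coefficient assumed to be 1)."""
    dg_j = dg_act_kcal * 4184.0  # kcal/mol -> J/mol
    return (KB * temp_k / H) * math.exp(-dg_j / (R * temp_k))

# A ~30 kcal/mol barrier is crossed exceedingly rarely at 298 K,
# and the rate grows steeply with temperature:
print(eyring_rate(30.0, 298.15) < eyring_rate(30.0, 310.15))  # True
```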
Directory of Open Access Journals (Sweden)
Mudasir Mudasir
2010-06-01
The base-pair specificity of the DNA binding of the [Fe(phen)3]2+, [Fe(phen)2(dip)]2+ and [Fe(phen)(dip)2]2+ complexes, and the effect of their binding to calf-thymus DNA (ct-DNA) on its thermal denaturation, have been investigated. This research is intended to evaluate the preferential binding of the complexes to DNA sequences (A-T or G-C) and to investigate the binding strength and mode upon their interaction with DNA. Base-pair specificity of the DNA binding of the complexes was determined by comparing the equilibrium binding constant (Kb) of each complex to synthetic polynucleotide DNA containing only A-T or G-C sequences. The Kb value of the interaction was determined by spectrophotometric titration, and the thermal denaturation temperature (Tm) was determined by monitoring the absorbance of the mixture of each complex and ct-DNA at λ = 260 nm as the temperature was raised from 25 to 100 °C. Results of the study show that, in general, all of the iron(II) complexes studied exhibit base-pair specificity in their DNA binding, preferring the relatively facile A-T sequence over the G-C one. The thermal denaturation experiments demonstrated that [Fe(phen)3]2+ and [Fe(phen)2(dip)]2+ interact weakly with double-helical DNA via electrostatic interaction, as indicated by insignificant changes in melting temperature, whereas [Fe(phen)(dip)2]2+ most probably binds to DNA in mixed modes of interaction, i.e., intercalation and electrostatic interaction. This conclusion is based on the fact that the binding of [Fe(phen)(dip)2]2+ to ct-DNA moderately increases the Tm value of ct-DNA. Keywords: DNA binding, mixed-ligand complexes
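The abstract does not state the fitting procedure behind Kb; a common analysis for spectrophotometric titrations of this kind is the Wolfe-Shimer relation, [DNA]/(εa−εf) = [DNA]/(εb−εf) + 1/(Kb(εb−εf)), so that a linear plot of [DNA]/(εa−εf) against [DNA] yields Kb = slope/intercept. A sketch with synthetic data (the actual fitting method used in the paper is an assumption here):

```python
# Least-squares fit of y = [DNA]/(ea - ef) versus [DNA]; by the
# Wolfe-Shimer equation, Kb = slope / intercept (in M^-1).
def kb_from_titration(dna_conc, y):
    n = len(dna_conc)
    mx = sum(dna_conc) / n
    my = sum(y) / n
    sxx = sum((x - mx) ** 2 for x in dna_conc)
    sxy = sum((x - mx) * (v - my) for x, v in zip(dna_conc, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope / intercept

# Synthetic, perfectly linear data constructed with Kb = 1e5 M^-1
# (intercept 1/(Kb*(eb-ef)) chosen arbitrarily as 2e-9):
kb_true, inv = 1e5, 2.0e-9
xs = [1e-6 * k for k in range(1, 6)]        # [DNA] in M
ys = [inv * kb_true * x + inv for x in xs]  # [DNA]/(ea - ef)
print(f"{kb_from_titration(xs, ys):.3g}")   # ≈ 1e+05
```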
Kopecká, J; Němec, M; Matoulková, D
2016-06-01
Brewing yeasts are classified into two species, Saccharomyces pastorianus and Saccharomyces cerevisiae. Most brewing yeast strains are natural interspecies hybrids, typically polyploid, and their identification is thus often difficult, giving heterogeneous results depending on the method used. We performed genetic characterization of a set of brewing yeast strains from several yeast culture collections by a combination of DNA-based techniques. The aim of this study was to select a method for species-specific identification of yeast and for discrimination of yeast strains according to their technological classification. A group of 40 yeast strains was characterized using PCR-RFLP analysis of the ITS-5.8S, NTS, HIS4 and COX2 genes, multiplex PCR, RAPD-PCR of genomic DNA, mtDNA-RFLP and electrophoretic karyotyping. Reliable differentiation of yeast to the species level was achieved by PCR-RFLP of the HIS4 gene. Numerical analysis of the obtained RAPD fingerprints and karyotypes revealed species-specific clustering corresponding to the technological classification of the strains. The taxonomic position and partial hybrid nature of the strains were verified by multiplex PCR. Differentiation among species using PCR-RFLP of the ITS-5.8S and NTS regions was shown to be unreliable. Karyotyping and RFLP of mitochondrial DNA showed small inaccuracies in strain categorization. PCR-RFLP of the HIS4 gene and RAPD-PCR of genomic DNA are reliable and suitable methods for fast identification of yeast strains. RAPD-PCR with primer 21 is a fast and reliable method applicable to the differentiation of brewing yeasts, with only 35% similarity of fingerprint profiles between the two main technological groups (ale and lager) of brewing strains. It was proved that the PCR-RFLP method of the HIS4 gene enables precise discrimination among three technologically important Saccharomyces species. Differentiation of brewing yeast to the strain level can be achieved using the RAPD-PCR technique.
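PCR-RFLP distinguishes species and strains by the fragment-length pattern a restriction enzyme produces from an amplicon. A minimal in-silico digest sketch; the sequences are made up, and the EcoRI site (GAATTC) merely stands in for whichever enzymes the study actually used:

```python
def digest(seq, site):
    """In-silico restriction digest: cut at the start of each occurrence of
    the recognition site and return the resulting fragment lengths."""
    cuts, i = [0], seq.find(site)
    while i != -1:
        cuts.append(i)
        i = seq.find(site, i + 1)
    cuts.append(len(seq))
    return [b - a for a, b in zip(cuts, cuts[1:]) if b > a]

# Two made-up amplicons differing by a point change that destroys one
# EcoRI (GAATTC) site, yielding distinguishable fragment patterns.
strain_a = "AAGAATTCTTGAATTCAA"
strain_b = "AAGAATTCTTGAATGCAA"   # second site destroyed
print(digest(strain_a, "GAATTC"))  # [2, 8, 8]
print(digest(strain_b, "GAATTC"))  # [2, 16]
```

The differing fragment lists are the in-silico analogue of the distinct gel band patterns used to tell strains apart.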
Directory of Open Access Journals (Sweden)
Geoffrey R Bennett
Full Text Available Host base excision repair (BER) proteins that repair oxidative damage enhance HIV infection. These proteins include the oxidative DNA damage glycosylases 8-oxo-guanine DNA glycosylase (OGG1) and mutY homolog (MYH) as well as DNA polymerase beta (Polβ). While deletion of oxidative BER genes leads to decreased HIV infection and integration efficiency, the mechanism remains unknown. One hypothesis is that BER proteins repair the DNA gapped integration intermediate. An alternative hypothesis considers that the most common oxidative DNA base damages occur on guanines. The subtle consensus sequence preference at HIV integration sites includes multiple G:C base pairs surrounding the points of joining. These observations suggest a role for oxidative BER during integration targeting at the nucleotide level. We examined the hypothesis that BER repairs a gapped integration intermediate by measuring HIV infection efficiency in Polβ null cell lines complemented with active site point mutants of Polβ. A DNA synthesis defective mutant, but not a 5'dRP lyase mutant, rescued HIV infection efficiency to wild type levels; this suggested Polβ DNA synthesis activity is not necessary while 5'dRP lyase activity is required for efficient HIV infection. An alternate hypothesis that BER events in the host genome influence HIV integration site selection was examined by sequencing integration sites in OGG1 and MYH null cells. In the absence of these 8-oxo-guanine specific glycosylases the chromatin elements of HIV integration site selection remain the same as in wild type cells. However, the HIV integration site sequence preference at G:C base pairs is altered at several positions in OGG1 and MYH null cells. Inefficient HIV infection in the absence of oxidative BER proteins does not appear related to repair of the gapped integration intermediate; instead oxidative damage repair may participate in HIV integration site preference at the sequence level.
Zhao, Yu-Wen; Wang, Hai-Xia; Bie, Song-Tao; Shao, Qian; Wang, Chun-Hua; Wang, Dong-Heng; Li, Zheng
2018-03-01
A new method for the detection of Escherichia coli in licorice decoction was developed using a DNA-based electrochemical biosensor. A thiolated capture probe was first immobilized on a gold electrode. The aptamer for E. coli was then bound to the capture probe by hybridization. Because of the stronger interaction between the aptamer and E. coli, the aptamer dissociates from the capture probe in the presence of E. coli in the licorice decoction. A biotinylated detection probe was then hybridized with the single-stranded capture probe. As a result, the electrochemical response to E. coli can be measured by differential pulse voltammetry in the presence of α-naphthyl phosphate. The plot of peak current vs. the logarithm of concentration in the range from 2.7×10² to 2.7×10⁸ CFU·mL⁻¹ displayed a linear relationship with a detection limit of 50 CFU·mL⁻¹. The relative standard deviation of 3 successive scans was 2.5%, 2.1% and 4.6% for 2×10², 2×10⁴ and 2×10⁶ CFU·mL⁻¹ E. coli, respectively. The proposed procedure showed better specificity to E. coli in comparison to Pseudomonas aeruginosa, Staphylococcus aureus and Bacillus subtilis. In the detection of a real extractum glycyrrhizae, the results of the proposed strategy and the GB assay showed a high degree of agreement, demonstrating that the designed biosensor could be utilized as a powerful tool for the microbial examination of traditional Chinese medicine. Copyright© by the Chinese Pharmaceutical Association.
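The reported linear relationship between peak current and the logarithm of concentration is a standard calibration curve that can be fitted and then inverted to quantify an unknown sample. A minimal sketch with made-up current readings over the reported range (the `estimate_cfu` helper and all numbers below are illustrative, not values from the paper):

```python
import numpy as np

# Hypothetical calibration points spanning the reported linear range
# (2.7e2 to 2.7e8 CFU/mL); the peak currents in µA are invented.
conc = np.array([2.7e2, 2.7e4, 2.7e6, 2.7e8])   # CFU/mL
current = np.array([0.42, 0.81, 1.23, 1.60])     # µA (made-up values)

# Fit peak current against log10(concentration), as in the reported plot.
slope, intercept = np.polyfit(np.log10(conc), current, 1)

def estimate_cfu(i_peak):
    """Invert the calibration line: peak current -> estimated concentration."""
    return 10 ** ((i_peak - intercept) / slope)

print(f"slope = {slope:.3f} µA/decade, est = {estimate_cfu(1.0):.3g} CFU/mL")
```

A measured current within the calibrated window maps back to a concentration estimate; outside that window (below the 50 CFU·mL⁻¹ detection limit) the fit should not be trusted.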
Applying Cooperative Techniques in Teaching Problem Solving
Directory of Open Access Journals (Sweden)
Krisztina Barczi
2013-12-01
Full Text Available Teaching how to solve problems – from solving simple equations to solving difficult competition tasks – has been one of the greatest challenges for mathematics education for many years. Trying to find an effective method is an important educational task. Among others, the question arises as to whether a method in which students help each other might be useful. The present article describes part of an experiment that was designed to determine the effects of cooperative teaching techniques on the development of problem-solving skills.
Assertiveness and problem solving in midwives.
Yurtsal, Zeliha Burcu; Özdemir, Levent
2015-01-01
The midwifery profession is required to bring solutions to problems, and a midwife is expected to be an assertive person and to develop midwifery care. This study was planned to examine the relationship between the assertiveness and problem-solving skills of midwives. This cross-sectional study was conducted with 201 midwives between July 2008 and February 2009 in the city center of Sivas. The Rathus Assertiveness Schedule (RAS) and the Problem Solving Inventory (PSI) were used to determine the midwives' levels of assertiveness and problem-solving skill. Statistical analyses included mean, standard deviation, percentage, Student's t, ANOVA and Tukey HSD, Kruskal-Wallis, Fisher exact, Pearson correlation and Chi-square tests. A statistically significant negative correlation was found between the RAS and PSI scores: the RAS scores decreased while the problem-solving scores increased (r = -0.451), and midwives who were assertive solved their problems better than others did. Assertiveness and problem-solving skills training will contribute to the success of the midwifery profession. Midwives able to solve problems and display assertive behaviors will contribute to the development of the midwifery profession.
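The reported association is a Pearson correlation between paired RAS and PSI scores. A minimal sketch with made-up score pairs; only the sign of the association mirrors the study's reported r of -0.451:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two paired lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented RAS/PSI score pairs in which one falls as the other rises,
# mimicking the negative association the study reports.
ras = [10, -5, 20, -30, 0, 15, -12, 8]
psi = [95, 110, 88, 130, 105, 90, 118, 99]
r = pearson_r(ras, psi)
print(round(r, 3))
```

A negative r here simply means higher scores on one scale tend to accompany lower scores on the other; interpreting the direction requires knowing how each instrument is scored.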
An Integrated Architecture for Engineering Problem Solving
National Research Council Canada - National Science Library
Pisan, Yusuf
1998-01-01
This thesis describes the Integrated Problem Solving Architecture (IPSA) that combines qualitative, quantitative and diagrammatic reasoning skills to produce annotated solutions to engineering problems...
System to solve three designs of the fuel management
International Nuclear Information System (INIS)
Castillo M, J. A.; Ortiz S, J. J.; Montes T, J. L.; Perusquia del C, R.; Marinez R, R.
2015-09-01
In this paper, preliminary results are presented that were obtained with a computer system that solves three stages of nuclear fuel management: the axial and radial fuel designs, as well as the design of nuclear fuel reloads. The novelty of the system is that the solution is obtained by solving the three mentioned stages in a coupled manner. To this end, heuristic techniques are used for each stage; each stage has an objective function applied to its particular problem, and in all cases the partial results obtained are used as input data for the next stage. The heuristic techniques used to solve the coupled problem are tabu search, neural networks, and a hybrid of scatter search and path relinking. The system applies an iterative process from the design of a fuel cell through to the reload design; since these are preliminary results, the reload is designed using a Haling-type operation strategy. In each of the stages, nuclear parameters inherent to the design are monitored. The results so far show the advantage of solving the problem in a coupled manner, even though a large amount of computer resources is used. (Author)
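Tabu search, one of the heuristics named above, repeatedly moves to the best neighbouring solution while keeping a short-term memory of recently visited solutions to avoid cycling back into local optima. A minimal generic sketch on a toy one-dimensional objective; the objective functions and neighbourhoods of the actual fuel-design stages are, of course, far richer:

```python
def tabu_search(initial, neighbours, objective, iters=200, tenure=7):
    """Generic tabu search: greedily move to the best non-tabu neighbour,
    keeping a short-term memory (tabu list) of recently visited solutions."""
    current = best = initial
    tabu = []
    for _ in range(iters):
        candidates = [n for n in neighbours(current) if n not in tabu]
        if not candidates:
            break
        current = min(candidates, key=objective)
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)          # expire the oldest tabu entry
        if objective(current) < objective(best):
            best = current
    return best

# Toy stage: minimise a 1-D quadratic over the integers, standing in for one
# design stage; in a coupled run its result would seed the next stage.
obj = lambda x: (x - 13) ** 2
nbrs = lambda x: [x - 1, x + 1]
print(tabu_search(0, nbrs, obj))  # finds the minimiser 13
```

The tabu list is what lets the search walk past a local optimum instead of oscillating around it; in the coupled scheme described above, the best solution from one stage would become the `initial` argument of the next.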
Pedagogy and/or technology: Making difference in improving students' problem solving skills
Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.
2013-01-01
Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.