Landweber, Laura F; Baum, Eric B
1998-01-01
The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.
Directory of Open Access Journals (Sweden)
Kenli Li
2008-01-01
Full Text Available Elliptic curve cryptographic algorithms convert input data into an unrecognizable encrypted form and back again into the original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method for finding solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle it is possible. Three DNA-based algorithms are described: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n). The biological operation time of all of these algorithms is polynomial with respect to n. In light of this analysis, public-key cryptography might be less secure than assumed. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
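The difficulty the abstract leans on can be made concrete with a toy instance. The sketch below uses a small prime field GF(p) rather than the paper's GF(2^n), and illustrative curve parameters; it shows why exhaustively searching for k with Q = kP scales exponentially with the key size.

```python
# Toy elliptic curve discrete logarithm (ECDLP) instance. Curve parameters
# are illustrative, and GF(97) stands in for the GF(2^n) fields of the paper.
p, a, b = 97, 2, 3  # curve: y^2 = x^3 + 2x + 3 over GF(97)

def ec_add(P, Q):
    """Add two curve points (None represents the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Scalar multiplication kP by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def brute_force_dlog(P, Q, bound):
    """Search for k with Q = kP -- cost grows exponentially in the bit length."""
    R = None
    for k in range(bound):
        if R == Q:
            return k
        R = ec_add(R, P)
    return None

P = (0, 10)           # on the curve: 10^2 = 100 ≡ 3 (mod 97)
Q = ec_mul(5, P)      # public point; k = 5 is the hidden scalar
assert brute_force_dlog(P, Q, 200) == 5
```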
A DNA based model for addition computation
Institute of Scientific and Technical Information of China (English)
GAO Lin; YANG Xiao; LIU Wenbin; XU Jin
2004-01-01
Much effort has been made to solve computing problems using DNA, an organic simulating method that in some cases is preferable to the current electronic computer. However, no one has yet proposed an effective and applicable molecular algorithm for addition, owing to the difficulty of handling the carry, which is easily solved by the hardware of an electronic computer. In this article, we solve this problem by employing two kinds of DNA strands: one, called the result-and-operation strand, carries some of the carry information and denotes the ultimate result; the other, called the carrier, is used purely for carrying. The significance of this algorithm lies in its original encoding, its fairly easy steps, and its feasibility under current molecular biological technology.
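The carry propagation that makes molecular addition hard, and that the result-and-operation strands encode, is the same carry a ripple-carry binary adder tracks. A plain-Python sketch (not the paper's DNA encoding):

```python
# Ripple-carry binary addition with the carry made explicit -- the quantity
# the DNA encoding must represent. Bit lists are little-endian.
def ripple_carry_add(a_bits, b_bits):
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        total = a + b + carry
        out.append(total % 2)   # sum bit at this position
        carry = total // 2      # carry passed to the next position
    out.append(carry)           # final carry-out
    return out

# 6 (110) + 7 (111) = 13 (1101), written little-endian below
assert ripple_carry_add([0, 1, 1], [1, 1, 1]) == [1, 0, 1, 1]
```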
Linearly programmed DNA-based molecular computer operated on magnetic particle surface in test-tube
Institute of Scientific and Technical Information of China (English)
ZHAO Jian; ZHANG Zhizhou; SHI Yongyong; Li Xiuxia; HE Lin
2004-01-01
The postgenomic era has seen the emergence of new applications of DNA manipulation technologies, including DNA-based molecular computing. Surface DNA computing has already been reported in a number of studies, all of which, however, employ mechanisms other than automaton functions. Here we describe a programmable DNA surface-computing device that acts as a Turing machine-like finite automaton. The laboratory automaton is composed primarily of DNA (inputs, output detectors, and transition molecules as software), DNA-manipulating enzymes, and a buffer system, and it solves artificial computational problems autonomously. When fluoresceins were labeled at the 5′ end of the (-) strand of the input molecule, all reaction intermediates could be observed directly along the time scale, so that the dynamic process of DNA computing could be conveniently visualized. The features of this study are: (i) achievement of finite automaton functions by a linearly programmed DNA computer operated on a magnetic particle surface, and (ii) direct detection of all DNA computing intermediates by capillary electrophoresis. Since DNA computing offers massive parallelism and is amenable to automation, this achievement lays a basis for large-scale applications of DNA computing in functional genomics in the near future.
Robot computer problem solving system
Becker, J. D.; Merriam, E. W.
1974-01-01
The conceptual, experimental, and practical aspects of the development of a robot computer problem solving system were investigated. The distinctive characteristics of the approach, in relation to various studies of cognition and robotics, were formulated. Vehicle and eye control systems were structured, and the information to be generated by the visual system was defined.
Solving computationally expensive engineering problems
Leifsson, Leifur; Yang, Xin-She
2014-01-01
Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...
Designing Computer Software for Problem-Solving Instruction.
Duffield, Judith A.
1991-01-01
Discusses factors that might influence the effectiveness of computer software designed to teach problem solving. Topics discussed include the structure of knowledge; transfer of training; computers and problem solving instruction; human-computer interactions; and types of software, including drill and practice programs, tutorials, instructional…
Computer-Based Assessment of Problem Solving.
Baker, E. L.; Mayer, R. E.
1999-01-01
Examines the components required to assess student problem solving in technology environments. Discusses the purposes of testing, provides an example demonstrating the difference between retention and transfer, defines and analyzes problem solving, and explores techniques and standards for measuring the quality of student understanding. Contains…
Hybrid computer techniques for solving partial differential equations
Hammond, J. L., Jr.; Odowd, W. M.
1971-01-01
Techniques overcome equipment limitations that restrict other computer techniques in solving trivial cases. The use of curve fitting by quadratic interpolation greatly reduces required digital storage space.
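The storage saving works because three stored samples determine an entire parabola between them. A hedged sketch of Lagrange quadratic interpolation (function and sample points illustrative, not from the paper):

```python
# Lagrange quadratic interpolation: evaluate the parabola through three
# (x, y) samples instead of storing a dense table of function values.
def quadratic_interpolate(pts, x):
    (x0, y0), (x1, y1), (x2, y2) = pts
    L0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
    L1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
    L2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
    return y0 * L0 + y1 * L1 + y2 * L2

# Three samples of f(x) = x^2 reproduce f exactly everywhere
assert abs(quadratic_interpolate([(0, 0), (1, 1), (2, 4)], 1.5) - 2.25) < 1e-12
```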
Exploring Programmable Self-Assembly in Non-DNA based Molecular Computing
Terrazas, German; Krasnogor, Natalio
2013-01-01
Self-assembly is a phenomenon observed in nature at all scales, in which autonomous entities build complex structures without external influence or a centralised master plan. Modelling such entities and programming correct interactions among them is crucial for controlling the manufacture of desired complex structures at the molecular and supramolecular scale. This work focuses on a programmability model for non-DNA-based molecules and on complex-behaviour analysis of their self-assembled conformations. In particular, we look into the modelling, programming and simulation of porphyrin molecule self-assembly, and apply Kolmogorov-complexity-based techniques to classify and assess simulation results in terms of information content. The analysis focuses on phase transitions, clustering, variability and parameter discovery, which as a whole pave the way to the notion of complex-systems programmability.
A New Searching Problem Solved by Quantum Computers
Institute of Scientific and Technical Information of China (English)
闫海洋
2002-01-01
It is well known that a quantum computer can search more quickly than a classical computer when solving the so-called Grover searching problem. We present a new searching problem which cannot be reduced to Grover's problem and which can be solved by using modified searching iterations with the same efficiency as for Grover's problem.
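For reference, the baseline Grover iteration that the abstract modifies can be simulated directly on a state vector: an oracle phase-flip followed by inversion about the mean, repeated roughly (π/4)·√N times. A minimal sketch (not the paper's modified iteration):

```python
import math

# State-vector simulation of standard Grover search on N classical items.
def grover_probability(n_items, marked, iterations):
    amp = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]             # oracle: phase-flip the target
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]      # diffusion: reflect about the mean
    return amp[marked] ** 2                    # probability of measuring the target

# ~(pi/4)*sqrt(N) iterations concentrate amplitude on the marked item
best = round(math.pi / 4 * math.sqrt(16))      # 3 iterations for N = 16
assert grover_probability(16, 3, best) > 0.9
```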
Computational physics problem solving with Python
Landau, Rubin H; Bordeianu, Cristian C
2015-01-01
The use of computation and simulation has become an essential part of the scientific process. Being able to transform a theory into an algorithm requires significant theoretical insight, detailed physical and mathematical understanding, and a working level of competency in programming. This upper-division text provides an unusually broad survey of the topics of modern computational physics from a multidisciplinary, computational science point of view. Its philosophy is rooted in learning by doing (assisted by many model programs), with new scientific materials as well as with the Python progr
Formulating and Solving Problems in Computational Chemistry.
Norris, A. C.
1980-01-01
Considered are the main elements of computational chemistry problems and how these elements can be used to formulate the problems mathematically. Techniques that are useful in devising an appropriate solution are also considered. (Author/TG)
Comprehension and computation in Bayesian problem solving
Directory of Open Access Journals (Sweden)
Eric D. Johnson
2015-07-01
Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements, and of the corresponding individual abilities and skills necessary for producing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pairing could benefit from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with the contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this point of departure.
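The probability-format versus natural-frequency contrast the abstract discusses can be shown with a worked example in the style of the classic mammography problem (the numbers below are illustrative):

```python
# Two routes to the same posterior. Illustrative parameters in the style of
# the classic Bayesian word problems (1% base rate, 80% hit rate, 9.6% false alarms).
base_rate, sensitivity, false_alarm = 0.01, 0.8, 0.096

# Normalized-probability format: Bayes' rule applied directly
posterior = (sensitivity * base_rate) / (
    sensitivity * base_rate + false_alarm * (1 - base_rate))

# Natural-frequency format: "imagine 1000 people"
sick = base_rate * 1000                     # 10 people have the condition
sick_pos = sensitivity * sick               # 8 of them test positive
healthy_pos = false_alarm * (1000 - sick)   # ~95 healthy people test positive
posterior_freq = sick_pos / (sick_pos + healthy_pos)

assert abs(posterior - posterior_freq) < 1e-9   # same answer, ~7.8%
```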
A New Approach: Computer-Assisted Problem-Solving Systems
Gok, Tolga
2010-01-01
Computer-assisted problem-solving systems are rapidly growing in educational use with the advent of the Internet. These systems allow students to do their homework and solve problems online with the help of programs like Blackboard, WebAssign and LON-CAPA. These systems have both benefits and drawbacks. In this study, the…
Constructing Bio-molecular Databases on a DNA-based Computer
Chang, Weng-Long; Ho,; Guo, Minyi
2007-01-01
Codd [Codd 1970] wrote the first paper in which the model of a relational database was proposed. Adleman [Adleman 1994] wrote the first paper in which DNA strands in a test tube were used to solve an instance of the Hamiltonian path problem. From [Adleman 1994], it is evident that storing information in molecules of DNA allows an information density of approximately 1 bit per cubic nanometer, a dramatic improvement over existing storage media such as video tape, which store information at a density of approximately 1 bit per 10^12 cubic nanometers. This paper demonstrates that biological operations can be applied to construct bio-molecular databases in which data records in relational tables are encoded as DNA strands. To achieve this goal, DNA algorithms are proposed to perform eight operations of relational algebra (calculus) on bio-molecular relational databases: Cartesian product, union, set difference, selection, projection, intersection, join and division. Fu...
Experimental quantum computing to solve systems of linear equations.
Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei
2013-06-07
Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
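The 2×2 instance realized in the experiment is of course trivial for a classical machine; Cramer's rule solves it exactly in a handful of operations, which is the baseline the log(N) quantum scaling is measured against. A sketch with an illustrative system:

```python
# Classical baseline for a 2x2 linear system Ax = b: Cramer's rule.
# Exact for small systems; it is the growth with N, not this instance,
# that motivates the quantum algorithm.
def solve_2x2(A, b):
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("singular system")
    x1 = (b[0] * a22 - a12 * b[1]) / det
    x2 = (a11 * b[1] - b[0] * a21) / det
    return (x1, x2)

# x + 2y = 5, 3x + 4y = 11  ->  x = 1, y = 2
assert solve_2x2(((1, 2), (3, 4)), (5, 11)) == (1.0, 2.0)
```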
A multidisciplinary approach to solving computer related vision problems.
Long, Jennifer; Helland, Magne
2012-09-01
This paper proposes a multidisciplinary approach to solving computer related vision issues by including optometry as a part of the problem-solving team. Computer workstation design is increasing in complexity. There are at least ten different professions who contribute to workstation design or who provide advice to improve worker comfort, safety and efficiency. Optometrists have a role identifying and solving computer-related vision issues and in prescribing appropriate optical devices. However, it is possible that advice given by optometrists to improve visual comfort may conflict with other requirements and demands within the workplace. A multidisciplinary approach has been advocated for solving computer related vision issues. There are opportunities for optometrists to collaborate with ergonomists, who coordinate information from physical, cognitive and organisational disciplines to enact holistic solutions to problems. This paper proposes a model of collaboration and examples of successful partnerships at a number of professional levels including individual relationships between optometrists and ergonomists when they have mutual clients/patients, in undergraduate and postgraduate education and in research. There is also scope for dialogue between optometry and ergonomics professional associations. A multidisciplinary approach offers the opportunity to solve vision related computer issues in a cohesive, rather than fragmented way. Further exploration is required to understand the barriers to these professional relationships. © 2012 The College of Optometrists.
6th International Conference on Soft Computing for Problem Solving
Bansal, Jagdish; Das, Kedar; Lal, Arvind; Garg, Harish; Nagar, Atulya; Pant, Millie
2017-01-01
This two-volume book gathers the proceedings of the Sixth International Conference on Soft Computing for Problem Solving (SocProS 2016), offering a collection of research papers presented during the conference at Thapar University, Patiala, India. Providing a veritable treasure trove for scientists and researchers working in the field of soft computing, it highlights the latest developments in the broad area of “Computational Intelligence” and explores both theoretical and practical aspects using fuzzy logic, artificial neural networks, evolutionary algorithms, swarm intelligence, soft computing, computational intelligence, etc.
Applying natural evolution for solving computational problems - Lecture 1
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists for solving computational problems. In a similar way to how humans and animals have evolved along millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundaments of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for researching in such field, will be also explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
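The evolutionary loop the lecture outlines, selection, crossover and mutation repeated over generations, can be sketched in a few lines. The OneMax task and every parameter below are illustrative choices, not taken from the lecture or from ECJ:

```python
import random
random.seed(1)

# Minimal generational evolutionary algorithm: evolve bit-strings toward
# the all-ones target ("OneMax"). All parameters are illustrative.
def evolve(bits=20, pop_size=30, generations=60):
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)        # fitness = number of ones
        parents = pop[: pop_size // 2]         # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, bits)    # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(bits)         # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = children
    return max(sum(ind) for ind in pop)        # best fitness found

assert evolve() >= 14   # selection pushes the population toward all ones
```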
Applying natural evolution for solving computational problems - Lecture 2
CERN. Geneva
2017-01-01
Darwin’s natural evolution theory has inspired computer scientists for solving computational problems. In a similar way to how humans and animals have evolved along millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundaments of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for researching in such field, will be also explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
DNA computation model to solve 0-1 programming problem.
Zhang, Fengyue; Yin, Zhixiang; Liu, Bo; Xu, Jin
2004-01-01
The 0-1 programming problem is an important problem in operations research with very widespread applications. In this paper, a new DNA computation model utilizing solution-based and surface-based methods is presented to solve the 0-1 programming problem. This model combines the major benefits of both solution-based and surface-based methods, including vast parallelism, extraordinary information density and ease of operation. The result, verified by biological experimentation, reveals the potential of DNA computation for solving complex programming problems.
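The "vast parallelism" of the DNA model explores all 0-1 assignments at once; a serial computer must enumerate them. A toy instance (objective and constraint values are illustrative):

```python
from itertools import product

# Exhaustive search over 0-1 assignments, the serial analogue of the DNA
# model's parallelism. Toy instance:
#   maximize 3*x1 + 5*x2 + 4*x3  subject to  2*x1 + 3*x2 + x3 <= 4, x_i in {0,1}
def solve_01_program():
    best_value, best_x = None, None
    for x in product((0, 1), repeat=3):
        if 2 * x[0] + 3 * x[1] + x[2] <= 4:          # feasibility check
            value = 3 * x[0] + 5 * x[1] + 4 * x[2]
            if best_value is None or value > best_value:
                best_value, best_x = value, x
    return best_value, best_x

assert solve_01_program() == (9, (0, 1, 1))
```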
Solving the Stokes problem on a massively parallel computer
DEFF Research Database (Denmark)
Axelsson, Owe; Barker, Vincent A.; Neytcheva, Maya
2001-01-01
We describe a numerical procedure for solving the stationary two‐dimensional Stokes problem based on piecewise linear finite element approximations for both velocity and pressure, a regularization technique for stability, and a defect‐correction technique for improving accuracy. Eliminating...... boundary value problem for each velocity component, are solved by the conjugate gradient method with a preconditioning based on the algebraic multi‐level iteration (AMLI) technique. The velocity is found from the computed pressure. The method is optimal in the sense that the computational work...... is proportional to the number of unknowns. Further, it is designed to exploit a massively parallel computer with distributed memory architecture. Numerical experiments on a Cray T3E computer illustrate the parallel performance of the method....
The Use of GPUs for Solving the Computed Tomography Problem
Directory of Open Access Journals (Sweden)
A.E. Kovtanyuk
2014-07-01
Full Text Available Computed tomography (CT) is a widespread method used to study the internal structure of objects. The method has applications in medicine, industry and other fields of human activity. In particular, electronic imaging, as a species of CT, can be used to recover the structure of nanosized objects. Accurate and rapid results are in high demand in modern science. However, there are computational limitations that bound the possible usefulness of CT. On the other hand, the introduction of high-performance computing using Graphics Processing Units (GPUs) improves the quality and performance of computed tomography investigations. Moreover, parallel computing with GPUs gives significantly higher computation speeds than Central Processing Units (CPUs), because of the architectural advantages of the former. In this paper, a computed tomography method that recovers the image using parallel computations powered by NVIDIA CUDA technology is considered. The implementation of this approach significantly reduces the time required to solve the CT problem.
Solving Tensor Structured Problems with Computational Tensor Algebra
Morozov, Oleksii
2010-01-01
Since its introduction by Gauss, matrix algebra has facilitated the understanding of scientific problems, hiding distracting details and revealing more elegant and efficient ways of solving them computationally. Today's largest problems, which often originate from multidimensional data, might profit from even higher levels of abstraction. We developed a framework for solving tensor-structured problems with tensor algebra that unifies concepts from tensor analysis, multilinear algebra and multidimensional signal processing. In contrast to the conventional matrix approach, it allows the formulation of multidimensional problems in a multidimensional way, preserving structure and data coherence, and the implementation of automated optimizations of solving algorithms, based on the commutativity of all tensor operations. Its ability to handle large scientific tasks is showcased by a real-world, 4D medical imaging problem, with more than 30 million unknown parameters solved on current, inexpensive hardware. This significantly...
Internet computer coaches for introductory physics problem solving
Xu Ryan, Qing
The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem solving is a complex process that is important for everyday life and crucial for learning physics. Although a great deal of effort has gone into improving students' problem-solving skills throughout the educational system, national studies have shown that the majority of students emerge from such courses having made little progress toward developing good problem-solving skills. The Physics Education Research Group at the University of Minnesota has been developing Internet computer coaches to help students become more expert-like problem solvers. During the Fall 2011 and Spring 2013 semesters, the coaches were introduced into large sections (200+ students) of the calculus-based introductory mechanics course at the University of Minnesota. This dissertation will address the research background of the project, including the pedagogical design of the coaches and the assessment of problem solving. The methodological framework for conducting the experiments will be explained. The data collected from the large-scale experimental studies will be discussed from the following aspects: the usage and usability of the coaches; the usefulness perceived by students; and the usefulness measured by the final exam and a problem-solving rubric. It will also address the implications drawn from this study, including using the data to direct future coach design, and the difficulties in conducting authentic assessment of problem solving.
Third International Conference on Soft Computing for Problem Solving
Deep, Kusum; Nagar, Atulya; Bansal, Jagdish
2014-01-01
The present book is based on the research papers presented in the 3rd International Conference on Soft Computing for Problem Solving (SocProS 2013), held as a part of the golden jubilee celebrations of the Saharanpur Campus of IIT Roorkee, at the Noida Campus of Indian Institute of Technology Roorkee, India. This book is divided into two volumes and covers a variety of topics including mathematical modelling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining etc. Particular emphasis is laid on soft computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems, which are otherwise difficult to solve by the usual and traditional methods. The book is directed ...
Second International Conference on Soft Computing for Problem Solving
Nagar, Atulya; Deep, Kusum; Pant, Millie; Bansal, Jagdish; Ray, Kanad; Gupta, Umesh
2014-01-01
The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2012), held at JK Lakshmipat University, Jaipur, India. This book provides the latest developments in the area of soft computing and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining, etc. The objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.
Computational neural networks driving complex analytical problem solving.
Hanrahan, Grady
2010-06-01
Neural network computing demonstrates advanced analytical problem solving abilities to meet the demands of modern chemical research. (To listen to a podcast about this article, please go to the Analytical Chemistry multimedia page at pubs.acs.org/page/ancham/audio/index.html .).
Solve the partitioning problem by sticker model in DNA computing
Institute of Scientific and Technical Information of China (English)
QU Huiqin; LU Mingming; ZHU Hong
2004-01-01
The aim of this work is to solve the partitioning problem, the most canonical NP-complete problem containing numerical parameters, within the sticker model of DNA computing. We first design a parallel program for addition, then give a program to calculate the subset sums of a set, and finally give a program for partitioning, which builds on the former programs. Furthermore, the correctness of each program is proved in this paper.
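The pipeline the abstract describes, addition, then subset sums, then the partition decision, can be sketched serially in plain Python (this is the mathematical structure only, not the sticker-model encoding):

```python
# Serial analogue of the sticker-model pipeline: build all subset sums,
# then decide whether the set splits into two halves with equal sums.
def subset_sums(nums):
    sums = {0}
    for n in nums:                       # the "parallel addition" step, serialized
        sums |= {s + n for s in sums}
    return sums

def can_partition(nums):
    total = sum(nums)
    return total % 2 == 0 and total // 2 in subset_sums(nums)

assert can_partition([3, 1, 1, 2, 2, 1])   # half-sum 5 = 3 + 2
assert not can_partition([2, 3, 4])        # total 9 is odd
```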
4th International Conference on Soft Computing for Problem Solving
Deep, Kusum; Pant, Millie; Bansal, Jagdish; Nagar, Atulya
2015-01-01
This two volume book is based on the research papers presented at the 4th International Conference on Soft Computing for Problem Solving (SocProS 2014) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and healthcare, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.
Notes on solving and playing peg solitaire on a computer
Bell, George I.
2009-01-01
We consider the one-person game of peg solitaire played on a computer. Two popular board shapes are the 33-hole cross-shaped board, and the 15-hole triangle board---we use them as examples throughout. The basic game begins from a full board with one peg missing and the goal is to finish at a board position with one peg. First, we discuss ways to solve the basic game on a computer. Then we consider the problem of quickly distinguishing board positions where the goal can still be reached ("winn...
A DNA computer model for solving vertex coloring problem
Institute of Scientific and Technical Information of China (English)
XU Jin; QIANG Xiaoli; FANG Gang; ZHOU Kang
2006-01-01
A special DNA computer was designed to solve the vertex coloring problem. The main body of this DNA computer was a polyacrylamide gel electrophoresis system, which could be divided into three parts: a melting region, an unsatisfied-solution region and a solution region. The polyacrylamide gel was connected to a controllable temperature device, with the respective temperatures Tm1, Tm2 and Tm3. Furthermore, with emphasis on the encoding scheme, we succeeded in performing the experiment on a graph with 5 vertices. In this paper we introduce the basic structure, the principle and the method of forming the library DNA sequences.
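The search the gel performs in parallel can be written as an exhaustive scan over color assignments. The 5-vertex graph below is illustrative, not the graph from the experiment:

```python
from itertools import product

# Exhaustive analogue of the gel-electrophoresis search: test every assignment
# of colors to vertices and keep the first proper coloring.
def color_graph(n_vertices, edges, n_colors):
    for colors in product(range(n_colors), repeat=n_vertices):
        if all(colors[u] != colors[v] for u, v in edges):
            return colors
    return None

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # a 5-cycle (illustrative)
assert color_graph(5, edges, 3) is not None        # 3 colors suffice
assert color_graph(5, edges, 2) is None            # an odd cycle needs 3
```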
A mathematical model of a computational problem solving system
Aris, Teh Noranis Mohd; Nazeer, Shahrin Azuan
2015-05-01
This paper presents a mathematical model based on fuzzy logic for a computational problem-solving system. The fuzzy logic uses truth degrees as a mathematical model to represent vague algorithms. The fuzzy logic mathematical model consists of fuzzy solution and fuzzy optimization modules. An algorithm is evaluated through a software-metrics calculation that produces fuzzy set memberships. The fuzzy solution mathematical model is integrated into the fuzzy inference engine, which predicts various solutions to computational problems; each solution is extracted from a fuzzy rule base and then evaluated through the software-metrics calculation, which produces its level of fuzzy set membership. The fuzzy optimization mathematical model is integrated into the recommendation generation engine, which generates the optimized solution.
5th International Conference on Soft Computing for Problem Solving
Deep, Kusum; Bansal, Jagdish; Nagar, Atulya; Das, Kedar
2016-01-01
This two volume book is based on the research papers presented at the 5th International Conference on Soft Computing for Problem Solving (SocProS 2015) and covers a variety of topics, including mathematical modelling, image processing, optimization methods, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, medical and health care, data mining, etc. Mainly the emphasis is on Soft Computing and its applications in diverse areas. The prime objective of this book is to familiarize the reader with the latest scientific developments in various fields of Science, Engineering and Technology and is directed to the researchers and scientists engaged in various real-world applications of ‘Soft Computing’.
Numerical methods for solving ODEs on the infinity computer
Mazzia, F.; Sergeyev, Ya. D.; Iavernaro, F.; Amodio, P.; Mukhametzhanov, M. S.
2016-10-01
New algorithms for the numerical solution of Ordinary Differential Equations (ODEs) with initial conditions are proposed. They are designed for working on a new kind of a supercomputer - the Infinity Computer - that is able to deal numerically with finite, infinite and infinitesimal numbers. Due to this fact, the Infinity Computer allows one to calculate the exact derivatives of functions using infinitesimal values of the stepsize. As a consequence, the new methods are able to work with the exact values of the derivatives, instead of their approximations. Within this context, variants of one-step multi-point methods closely related to the classical Taylor formulae and to the Obrechkoff methods are considered. To get numerical evidence of the theoretical results, test problems are solved by means of the new methods and the results compared with the performance of classical methods.
Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko
2013-06-18
Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoeba-like spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers as the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
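For context, a conventional stochastic local search for SAT, the kind of baseline such methods are compared against, can be sketched as follows; the solver below is a generic WalkSAT-style loop and the formula is an example, neither being the paper's algorithm or benchmark.

```python
import random

# Generic stochastic local search for SAT (a sketch, not the paper's method).
# Clauses are tuples of signed ints: 2 means x2, -2 means NOT x2.

def satisfied(clause, assign):
    return any(assign[abs(lit)] == (lit > 0) for lit in clause)

def walksat(clauses, n_vars, max_flips=10000, seed=0):
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c, assign)]
        if not unsat:
            return assign                        # satisfying assignment found
        v = abs(rng.choice(rng.choice(unsat)))   # flip a var from a failed clause
        assign[v] = not assign[v]
    return None

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
model = walksat([(1, 2), (-1, 3), (-2, -3)], 3)
```

Such search is stochastic and serial; the abstract's claim is that amoeba-like spatiotemporal dynamics explore the assignment space more effectively.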
Brovarets', Ol'ha O; Hovorun, Dmytro M
2013-11-15
It was established that the cytosine·thymine (C·T) mismatched DNA base pair with cis-oriented N1H glycosidic bonds has a propeller-like structure (|N3C4C4N3| = 38.4°), which is stabilized by three specific intermolecular interactions: two antiparallel N4H…O4 (5.19 kcal·mol⁻¹) and N3H…N3 (6.33 kcal·mol⁻¹) H-bonds and a van der Waals (vdW) contact O2…O2 (0.32 kcal·mol⁻¹). The C·T base mispair is a thermodynamically stable structure (ΔG(int) = -1.54 kcal·mol⁻¹) and even slightly more stable than the A·T Watson-Crick DNA base pair (ΔG(int) = -1.43 kcal·mol⁻¹) at room temperature. It was shown that the C·T ↔ C*·T* tautomerization via double proton transfer (DPT) is assisted by the O2…O2 vdW contact along the entire range of the intrinsic reaction coordinate (IRC). The positive values of the Grunenberg compliance constants (31.186, 30.265, and 22.166 Å/mdyn for the C·T, C*·T*, and TS(C·T ↔ C*·T*) structures, respectively) prove that the O2…O2 vdW contact is a stabilizing interaction. Based on the sweeps of the H-bond energies, it was found that the N4H…O4/O4H…N4 and N3H…N3 H-bonds in the C·T and C*·T* base pairs are anticooperative and weaken each other, whereas the middle N3H…N3 H-bond and the O2…O2 vdW contact are cooperative and mutually reinforce each other. It was found that the tautomerization of the C·T base mispair through the DPT is a concerted and asynchronous reaction that proceeds via the TS(C·T ↔ C*·T*) stabilized by the loosened N4-H-O4 covalent bridge, the N3H…N3 H-bond (9.67 kcal·mol⁻¹) and the O2…O2 vdW contact (0.41 kcal·mol⁻¹). The nine key points describing the evolution of the C·T ↔ C*·T* tautomerization via the DPT were detected and completely investigated along the IRC. The C*·T* mispair was revealed to be a dynamically unstable structure with a lifetime of 2.13 × 10⁻¹³ s. In this case, as for the A·T Watson-Crick DNA base pair, activates the mechanism of the quantum protection of the C
Solving a Hamiltonian Path Problem with a bacterial computer
Directory of Open Access Journals (Sweden)
Treece Jessica
2009-07-01
Full Text Available Abstract Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP-complete, exhibiting surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP-complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three-node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node
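On a silicon computer, the search space the bacteria explore can be enumerated directly. The sketch below brute-forces Hamiltonian paths for a three-node directed graph; the edge set is an illustrative assumption, not necessarily the graph used in the paper.

```python
from itertools import permutations

# Enumerate all node orderings of a small directed graph and keep those
# whose consecutive nodes are all joined by edges (a Hamiltonian path).

def hamiltonian_paths(nodes, edges):
    return [p for p in permutations(nodes)
            if all((a, b) in edges for a, b in zip(p, p[1:]))]

nodes = ["A", "B", "C"]                      # an illustrative three-node graph
edges = {("A", "B"), ("B", "C"), ("A", "C")}
print(hamiltonian_paths(nodes, edges))       # → [('A', 'B', 'C')]
```

The bacterial computer effectively runs this enumeration in parallel, one candidate ordering per cell, with fluorescence reporting success.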
Computer Based Collaborative Problem Solving for Introductory Courses in Physics
Ilie, Carolina; Lee, Kevin
2010-03-01
We discuss a computer-based, recitation-style approach to collaborative problem solving. The course is designed by Lee [1], and the idea was proposed earlier by Christian, Belloni and Titus [2,3]. The students find the problems on a web page containing simulations (physlets) and write the solutions on an accompanying worksheet after discussing them with a classmate. Physlets have the advantage of being much more like real-world problems than textbook problems. We also compare two protocols for web-based instruction using simulations in an introductory physics class [1]. The inquiry protocol allowed students to control input parameters while the worked-example protocol did not. We will discuss which of the two methods is more efficient in relation to Scientific Discovery Learning and Cognitive Load Theory. 1. Lee, Kevin M., Nicoll, Gayle and Brooks, Dave W. (2004). ``A Comparison of Inquiry and Worked Example Web-Based Instruction Using Physlets'', Journal of Science Education and Technology 13, No. 1: 81-88. 2. Christian, W., and Belloni, M. (2001). Physlets: Teaching Physics With Interactive Curricular Material, Prentice Hall, Englewood Cliffs, NJ. 3. Christian, W., and Titus, A. (1998). ``Developing web-based curricula using Java Physlets.'' Computers in Physics 12: 227-232.
Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I
2013-01-01
To elucidate details of DNA-water interactions, we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two molecular mechanics (MM) force fields and the correlated ab initio MP2/6-31G(d,p) method of quantum mechanics (QM) have been compared with one another and with experimental data. The calculations demonstrated a qualitative agreement between the geometry characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base, forming hydrogen bonds with both of them. Nevertheless, the relative depth of some minima and the peculiarities of the mutual water-base positions in these minima depend on the method used. The analysis revealed that some differences between the results of the different methods are insignificant, while others are important for the description of DNA hydration. The calculations via MM methods enabled us to reproduce quantitatively all the experimental data on the enthalpies of complex formation of a single water molecule with the set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in crystals of DNA duplex fragments, while some of these data cannot be rationalized by the QM calculations.
Molecular computing towards a novel computing architecture for complex problem solving
Chang, Weng-Long
2014-01-01
This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to the computational aspects of digital computers and molecular computing, data representation in molecular computing, molecular operations of molecular computing, and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single-precision and double-precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations...
Performance of parallel computation using CUDA for solving the one-dimensional elasticity equations
Darmawan, J. B. B.; Mungkasi, S.
2017-01-01
In this paper, we investigate the performance of parallel computation in solving the one-dimensional elasticity equations. Elasticity equations arise frequently in engineering science, and solving them quickly and efficiently is desirable. Therefore, we propose the use of parallel computation. Our parallel computation uses NVIDIA's CUDA platform. Our research results show that parallel computation using CUDA offers a great advantage and is powerful when the computation is large scale.
Using Evolutionary Computation to Solve the Economic Load Dispatch Problem
Directory of Open Access Journals (Sweden)
Samir SAYAH
2008-06-01
Full Text Available This paper reports on an evolutionary-algorithm-based method for solving the economic load dispatch (ELD) problem. The objective is to minimize a nonlinear function, the total fuel cost of the thermal generating units, subject to the usual constraints. The IEEE 30-bus test system was used for testing and validation purposes. The results obtained demonstrate the effectiveness of the proposed method for solving the economic load dispatch problem.
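A minimal sketch of the idea, minimizing a quadratic fuel cost under a demand constraint with a simple (1+1) evolution strategy, is given below. The cost coefficients, demand, and penalty weight are illustrative assumptions, not the paper's method or the IEEE 30-bus data.

```python
import random

# Toy economic load dispatch: minimize sum of quadratic fuel costs while
# meeting demand, via a (1+1) evolution strategy with a penalty term.
# All numbers below are illustrative, not from the paper.

COEFFS = [(0.004, 5.3, 500), (0.006, 5.5, 400), (0.009, 5.8, 200)]  # a, b, c
DEMAND = 800.0  # MW

def cost(p):
    fuel = sum(a * x * x + b * x + c for (a, b, c), x in zip(COEFFS, p))
    return fuel + 1000.0 * abs(sum(p) - DEMAND)   # penalty for imbalance

def evolve(gens=20000, seed=1):
    rng = random.Random(seed)
    best = [DEMAND / len(COEFFS)] * len(COEFFS)   # start from an equal split
    for _ in range(gens):
        trial = [max(0.0, x + rng.gauss(0, 5)) for x in best]
        if cost(trial) < cost(best):              # keep only improvements
            best = trial
    return best

dispatch = evolve()
```

For this convex toy instance the analytic optimum (equal incremental costs) is about (400, 250, 150) MW; the evolutionary search approaches it without needing derivatives, which is the appeal for the non-smooth costs of real ELD.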
Kim, SugHee; Chung, KwangSik; Yu, HeonChang
2013-01-01
The purpose of this paper is to propose a training program for creative problem solving based on computer programming. The proposed program will encourage students to solve real-life problems through a creative thinking spiral related to cognitive skills with computer programming. With the goal of enhancing digital fluency through this proposed…
Computer as a Medium for Overcoming Misconceptions in Solving Inequalities
Abramovich, Sergei; Ehrlich, Amos
2007-01-01
Inequalities are considered among the most useful tools of investigation in pure and applied mathematics; yet their didactical aspects have not received much attention in mathematics education research until recently. An important aspect of teaching problem solving at the secondary level deals with the notion of equivalence of algebraic…
SOLVING MINIMUM SPANNING TREE PROBLEM WITH DNA COMPUTING
Institute of Scientific and Technical Information of China (English)
Liu Xikui; Li Yan; Xu Jin
2005-01-01
Molecular programming is applied to the minimum spanning tree problem, whose solution requires encoding real values in DNA strands. A new encoding scheme for real values is proposed that is biologically plausible and has a fixed code length. According to the characteristics of the problem, a DNA algorithm solving the minimum spanning tree problem is given. The effectiveness of the proposed method is verified by simulation. The advantages and disadvantages of this algorithm are discussed.
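For checking such simulations against a known answer, a conventional MST algorithm (Kruskal's) fits in a few lines; the graph below is illustrative.

```python
# Kruskal's algorithm: a conventional reference for the problem the DNA
# algorithm targets, handy for verifying simulated results.

def kruskal(n, edges):
    """edges: (weight, u, v) tuples over vertices 0..n-1; returns MST edges."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    tree = []
    for w, u, v in sorted(edges):           # greedily add lightest edges
        ru, rv = find(u), find(v)
        if ru != rv:                        # joins two components: no cycle
            parent[ru] = rv
            tree.append((w, u, v))
    return tree

mst = kruskal(4, [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 1, 3), (3, 2, 3)])
print(sum(w for w, _, _ in mst))            # total MST weight → 6
```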
Use of Interactive Computer Graphics to Solve Routing Problems.
Gillett, B. E.; Lawrence, J. L.
1981-01-01
Discusses vehicle routing problems and solutions. Describes testing of an interactive computer graphics package combining several types of solutions that allows users with little or no experience to work out routing problems. (Author/RW)
Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story
Gunbas, N.
2015-01-01
The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…
USING CLOUD COMPUTING IN SOLVING THE PROBLEMS OF LOGIC
Directory of Open Access Journals (Sweden)
Pavlo V. Mykytenko
2017-02-01
Full Text Available The article provides an overview of the most popular cloud services, in particular those offering complete office suites, describes their basic functional characteristics, and highlights the advantages and disadvantages of cloud services in the educational process. A comparative analysis was made of the spreadsheets included in the office suites of such cloud services as Zoho Office Suite, Microsoft Office 365 and Google Docs. On the basis of the research findings, the most suitable cloud services for use in the educational process were suggested. The possibility of using spreadsheets in the study of logic was considered, from creating formulas that implement logical operations to building means of automating the problem-solving process.
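The logic exercises described, spreadsheet formulas implementing logical operations, amount to tabulating a Boolean formula over all inputs; the sketch below does the same in code, with implication as the example formula.

```python
from itertools import product

# Tabulate a Boolean formula over all inputs, the same exercise a
# spreadsheet performs with AND/OR/NOT formulas (the formula is an example).

def truth_table(variables, formula):
    rows = []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        rows.append((*values, formula(**env)))   # inputs plus formula output
    return rows

# implication p -> q, expressed with the same primitives a spreadsheet offers
table = truth_table(["p", "q"], lambda p, q: (not p) or q)
```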
Solving wood chip transport problems with computer simulation.
Dennis P. Bradley; Sharon A. Winsauer
1976-01-01
Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
Gender and Software Effects in Computer-Based Problem Solving.
Littleton, Karen; And Others
Whether gender differences in performance using computer software are due to sex stereotyping or gender differentiation in the programs was investigated in two studies. An adventure game, "King and Crown," with all male characters, and a gender neutral game, "Honeybears," were played by 26 female and 26 male 11- and 12-year-olds in Milton Keynes…
Computer-Assisted Instruction; How to Solve Drug Formulation Problems.
Mezei, Janos; And Others
1990-01-01
Computer simulation of drug formulation problems involves a database of pharmacological properties, chemical stability, and compatibility data on 20 active ingredients, physiological factors and requirements for parenteral solutions, and additives. The user gathers data from the database, formulates a stable and effective solution, and the drug is…
Huang, Tzu-Hua; Liu, Yuan-Chen; Chang, Hsiu-Chen
2012-01-01
This study developed a computer-assisted mathematical problem-solving system in the form of a network instruction website to help low-achieving second- and third-graders in mathematics with word-based addition and subtraction questions in Taiwan. According to Polya's problem-solving model, the system is designed to guide these low-achievers…
Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction
Zoanetti, Nathan
2010-01-01
This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…
Solving the 0/1 Knapsack Problem by a Biomolecular DNA Computer
Directory of Open Access Journals (Sweden)
Hassan Taghipour
2013-01-01
Full Text Available Solving some mathematical problems, such as NP-complete problems, with conventional silicon-based computers is problematic and takes a very long time. DNA computing is an alternative method of computing which uses DNA molecules for computing purposes. DNA computers have massive degrees of parallel processing capability, which is of particular interest in solving NP-complete and hard combinatorial problems. NP-complete problems such as the knapsack problem and other hard combinatorial problems can be solved by DNA computers in a very short period of time compared to conventional silicon-based computers. Sticker-based DNA computing is one of the methods of DNA computing. In this paper, sticker-based DNA computing was used for solving the 0/1 knapsack problem. First, a biomolecular solution space was constructed by using appropriate DNA memory complexes. Then, by the application of a sticker-based parallel algorithm using biological operations, the knapsack problem was solved in polynomial time.
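The sticker model's exhaustive solution space can be mirrored serially: enumerate every 0/1 selection vector and keep the best feasible one. The weights, values, and capacity below are illustrative, not from the paper.

```python
from itertools import product

# Serial mirror of the sticker-model search: every 0/1 vector corresponds
# to one DNA memory complex examined in parallel (instance is illustrative).

def knapsack_exhaustive(weights, values, capacity):
    best = (-1, None)                                   # (value, selection)
    for bits in product([0, 1], repeat=len(weights)):   # all selection vectors
        w = sum(b * x for b, x in zip(bits, weights))
        v = sum(b * x for b, x in zip(bits, values))
        if w <= capacity:
            best = max(best, (v, bits))
    return best

print(knapsack_exhaustive([2, 3, 4, 5], [3, 4, 5, 6], 5))  # → (7, (1, 1, 0, 0))
```

Serially this is O(2ⁿ); the DNA computer's appeal is that the 2ⁿ candidates are built and filtered simultaneously in the test tube.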
Fuchs, Lynn S; Fuchs, Douglas; Hamlett, Carol L; Lambert, Warren; Stuebing, Karla; Fletcher, Jack M
2008-02-01
The purpose of this study was to explore patterns of difficulty in 2 domains of mathematical cognition: computation and problem solving. Third graders (n = 924; 47.3% male) were representatively sampled from 89 classrooms; assessed on computation and problem solving; classified as having difficulty with computation, problem solving, both domains, or neither domain; and measured on 9 cognitive dimensions. Difficulty occurred across domains with the same prevalence as difficulty with a single domain; specific difficulty was distributed similarly across domains. Multivariate profile analysis on cognitive dimensions and chi-square tests on demographics showed that specific computational difficulty was associated with strength in language and weaknesses in attentive behavior and processing speed; problem-solving difficulty was associated with deficient language as well as race and poverty. Implications for understanding mathematics competence and for the identification and treatment of mathematics difficulties are discussed.
Application of Computer Algebra in Solving Chaffee Infante Equation
Xie, Fu-Ding; Liu, Xiao-Dan; Sun, Xiao-Peng; Tang, Di
2008-04-01
In this paper, a series of two-line-soliton solutions and double periodic solutions of the Chaffee-Infante equation have been obtained by using a new transformation. Unlike the existing methods used to find multiple soliton solutions of nonlinear partial differential equations, this approach is constructive and purely algebraic. The results found here have been verified by computer, ensuring their validity.
Cloud Computing for Solving E-Learning Problems
Directory of Open Access Journals (Sweden)
N. S. Abu El-Ala
2013-01-01
Full Text Available Following the global trend, the integration of information and communication technologies in education has attracted great interest in the Arab world through e-learning techniques, which cast education into the form of services within a Service-Oriented Architecture (SOA), mix their inputs and outputs with the components of Education Business Intelligence (EBI), and enhance them to simulate reality through educational virtual worlds. This paper presents a creative environment derived from both virtual and personal learning environments based on cloud computing, which contains a variety of tools and techniques to enhance the educational process. The proposed environment focuses on designing and monitoring an educational environment based on reusing existing web tools, techniques, and services to provide browser-based applications.
Alcoholado, Cristián; Diaz, Anita; Tagle, Arturo; Nussbaum, Miguel; Infante, Cristián
2016-01-01
This study aims to understand the differences in student learning outcomes and classroom behaviour when using the interpersonal computer, personal computer and pen-and-paper to solve arithmetic exercises. In this multi-session experiment, third grade students working on arithmetic exercises from various curricular units were divided into three…
Computer-Presented Organizational/Memory Aids as Instruction for Solving Pico-Fomi Problems.
Steinberg, Esther R.; And Others
1985-01-01
Describes investigation of effectiveness of computer-presented organizational/memory aids (matrix and verbal charts controlled by computer or learner) as instructional technique for solving Pico-Fomi problems, and the acquisition of deductive inference rules when such aids are present. Results indicate chart use control should be adapted to…
Engelmann, Tanja; Tergan, Sigmar-Olaf; Hesse, Friedrich W.
2010-01-01
Computer-supported collaboration by spatially distributed group members still involves interaction problems within the group. This article presents an empirical study investigating the question of whether computer-supported collaborative problem solving by spatially distributed group members can be fostered by evoking knowledge and information…
Anchoring Problem-Solving and Computation Instruction in Context-Rich Learning Environments
Bottge, Brian A.; Rueda, Enrique; Grant, Timothy S.; Stephens, Ana C.; Laroque, Perry T.
2010-01-01
Middle school students with learning disabilities in math (MLD) used two versions of Enhanced Anchored Instruction (EAI). In one condition, students learned how to compute with fractions on an as-needed basis while they worked to solve the EAI problems. In the other condition, teachers used a computer-based instructional module in place of one of…
Lee, Young-Jin
2015-01-01
This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…
Data science in R a case studies approach to computational reasoning and problem solving
Nolan, Deborah
2015-01-01
Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar
Talk and task mastery: the importance of socially shared talk during computer-based problem solving.
Hagstrom, Fran; White, Michelle
2006-01-01
In order to examine more closely the ways that children use socially constructed dialogue to mediate task mastery, a hierarchical set of computer tasks was presented in an animated game format (ToonTalk) to three adult/child (US Kindergarten) dyads over five sessions. Transcriptions of the adult-child talk were used to determine (1) the types of discourses utilized by the children (i.e., procedural, conversation, narrative) during problem solving and (2) the relationship of this talk to task mastery. It was found that (1) shared talk was associated with more successful problem solving; (2) socially shared talk did not have to be on task to be beneficial; and (3) procedural discourse was more successfully and frequently used for independent problem solving if first requested by the child. These results highlight the importance of socially shared talk in the development of problem-solving strategies even when using computer technology.
Gourgoulhon, Eric
2011-04-01
clearly the research work of one of the authors, but it is also an opportunity to discuss the Cosmic Censorship conjecture and the Hoop conjecture. Chapter 11 presents the basics of hyperbolic systems and focuses on the famous BSSN formalism employed in most numerical codes. The electromagnetism analogy introduced in chapter 2 is developed, providing some very useful insight. The remainder of the book is devoted to the collapse of rotating stars (chapter 14) and to the coalescence of binary systems of compact objects, either neutron stars or black holes (chapters 12, 13, 15, 16 and 17). This is a unique introduction and review of results about the expected main sources of gravitational radiation. It includes a detailed presentation of the major triumph of numerical relativity: the successful computation of binary black hole merger. I think that Baumgarte and Shapiro have accomplished a genuine tour de force by writing such a comprehensive and self-contained textbook on a highly evolving subject. The primary value of the book is to be extremely pedagogical. The style is definitively at the textbook level and not that of a review article. One may point out the use of boxes to recap important results and the very instructive aspect of many figures, some of them in colour. There are also numerous exercises in the main text, to encourage the reader to find some useful results by himself. The pedagogical trend is manifest up to the book cover, with the subtitle explaining what the title means! Another great value of the book is indisputably its encyclopedic aspect, making it a very good starting point for research on many topics of modern relativity. I have no doubt that Baumgarte and Shapiro's monograph will quicken considerably the learning phase of any master or PhD student beginning numerical relativity. It will also prove to be very valuable for all researchers of the field and should become a major reference. Beyond numerical relativity, the richness and variety of
Computer problem-solving coaches for introductory physics: Design and usability studies
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-06-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.
Ergul, Ozgur
2014-01-01
The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book: presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on parallel computation, and a number of application examples; covers solutions of electromagnetic problems involving dielectric objects and perfectly conducting objects; discusses applications including scattering from airborne targets, scattering from red
Computer science. Heads-up limit hold'em poker is solved.
Bowling, Michael; Burch, Neil; Johanson, Michael; Tammelin, Oskari
2015-01-09
Poker is a family of games that exhibit imperfect information, where players do not have full knowledge of past events. Whereas many perfect-information games have been solved (e.g., Connect Four and checkers), no nontrivial imperfect-information game played competitively by humans has previously been solved. Here, we announce that heads-up limit Texas hold'em is now essentially weakly solved. Furthermore, this computation formally proves the common wisdom that the dealer in the game holds a substantial advantage. This result was enabled by a new algorithm, CFR(+), which is capable of solving extensive-form games orders of magnitude larger than previously possible. Copyright © 2015, American Association for the Advancement of Science.
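CFR(+) builds on the regret-matching update; the sketch below applies that core update to a one-shot toy game (rock-paper-scissors) rather than poker, so it illustrates only the principle, not the paper's algorithm or its scale.

```python
import random

# Regret matching on rock-paper-scissors: a toy version of the update at
# the heart of CFR-family algorithms (payoffs are for the row player).

PAYOFF = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]  # rock, paper, scissors

def current_strategy(regrets):
    """Play each action in proportion to its positive cumulative regret."""
    pos = [max(r, 0.0) for r in regrets]
    total = sum(pos)
    return [p / total for p in pos] if total > 0 else [1 / 3] * 3

def train(iters=20000, seed=0):
    rng = random.Random(seed)
    regrets = [0.0] * 3
    strategy_sum = [0.0] * 3
    for _ in range(iters):
        strat = current_strategy(regrets)
        strategy_sum = [s + p for s, p in zip(strategy_sum, strat)]
        me = rng.choices(range(3), weights=strat)[0]
        opp = rng.choices(range(3), weights=strat)[0]   # self-play opponent
        for a in range(3):   # regret: how much better action a would have done
            regrets[a] += PAYOFF[a][opp] - PAYOFF[me][opp]
    return [s / iters for s in strategy_sum]            # average strategy

avg = train()   # the average strategy approaches the uniform equilibrium
```

In self-play the average strategy converges toward the Nash equilibrium (uniform play here); CFR(+) extends this idea to the enormous imperfect-information game tree of poker.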
Proceedings of the International Conference on Soft Computing for Problem Solving
Nagar, Atulya; Pant, Millie; Bansal, Jagdish
2012-01-01
The present book is based on the research papers presented in the International Conference on Soft Computing for Problem Solving (SocProS 2011), held at Roorkee, India. This book is divided into two volumes and covers a variety of topics, including mathematical modeling, image processing, optimization, swarm intelligence, evolutionary algorithms, fuzzy logic, neural networks, forecasting, data mining etc. Particular emphasis is laid on Soft Computing and its application to diverse fields. The prime objective of the book is to familiarize the reader with the latest scientific developments that are taking place in various fields and the latest sophisticated problem solving tools that are being developed to deal with the complex and intricate problems that are otherwise difficult to solve by the usual and traditional methods. The book is directed to the researchers and scientists engaged in various fields of Science and Technology.
The benefits of computer-generated feedback for mathematics problem solving.
Fyfe, Emily R; Rittle-Johnson, Bethany
2016-07-01
The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving.
Nurturing Students' Problem-Solving Skills and Engagement in Computer-Mediated Communications (CMC)
Chen, Ching-Huei
2014-01-01
The present study sought to investigate how to enhance students' well- and ill-structured problem-solving skills and increase productive engagement in computer-mediated communication with the assistance of external prompts, namely procedural and reflection. Thirty-three graduate students were randomly assigned to two conditions: procedural and…
A Computer Algebra Approach to Solving Chemical Equilibria in General Chemistry
Kalainoff, Melinda; Lachance, Russ; Riegner, Dawn; Biaglow, Andrew
2012-01-01
In this article, we report on a semester-long study of the incorporation into our general chemistry course, of advanced algebraic and computer algebra techniques for solving chemical equilibrium problems. The method presented here is an alternative to the commonly used concentration table method for describing chemical equilibria in general…
Interaction Between Conceptual Level and Training Method in Computer Based Problem Solving.
Sustik, Joan M.; Brown, Bobby R.
A study of 130 undergraduate students enrolled in a course on audiovisual techniques sought to determine whether heuristic or algorithmic computer-based problem-solving training would be differentially effective for students varying in cognitive complexity as measured by the Educational Set Scale (ESS). The interaction was investigated between one…
Gambari, Amosa Isiaka; Yusuf, Mudasiru Olalere
2015-01-01
This study investigated the effectiveness of computer-assisted Students' Team Achievement Division (STAD) cooperative learning strategy on physics problem solving, students' achievement and retention. It also examined if the student performance would vary with gender. Purposive sampling technique was used to select two senior secondary schools…
EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.
Jarvis, John J.; And Others
Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…
DEMONSTRATION COMPUTER MODELS USE WHILE SOLVING THE BUILDING OF THE CUT OF THE CYLINDER
Directory of Open Access Journals (Sweden)
Inna O. Gulivata
2010-10-01
Full Text Available The relevance of the material presented in this article lies in the use of effective methods of illustrating geometric material to develop students' spatial imagination. As one way to improve problem solving, we propose illustrating the objects under study with a display computer model (DCM) created in the PowerPoint software environment. The technique of applying a DCM while solving problems on constructing a section of a cylinder supports an effective learning process and promotes the formation of students' spatial representations, taking into account their individual characteristics and the principles of differentiated instruction.
Solving the Maximum Weighted Clique Problem Based on Parallel Biological Computing Model
Directory of Open Access Journals (Sweden)
Zhaocai Wang
2015-01-01
Full Text Available The maximum weighted clique (MWC) problem, a typical NP-complete problem, is difficult to solve with electronic-computer algorithms. The aim of the problem is to seek a vertex clique with the maximal weight sum in a given undirected graph. It is an extremely important problem in the field of optimal engineering schemes and control, with numerous practical applications. From a practical point of view, we give a parallel biological algorithm to solve the MWC problem. For a maximum weighted clique problem with m edges and n vertices, we use fixed-length DNA strands to represent the different vertices and edges, fully conduct the biochemical reactions, and find the solution to the MWC problem within a certain length range in O(n²) time, compared with the exponential time required by previous electronic-computer algorithms. We expand the applied scope of parallel biological computation and reduce the computational complexity of practical engineering problems. Meanwhile, we provide a meaningful reference for solving other complex problems.
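For contrast with the biological model, the exhaustive search it parallelizes can be sketched in a few lines of Python. The graph, weights, and function names below are invented for illustration and are not taken from the paper:

```python
from itertools import combinations

def max_weighted_clique(vertices, weights, edges):
    """Return (best_weight, best_clique) by exhaustive enumeration."""
    edge_set = {frozenset(e) for e in edges}
    best, best_clique = 0, ()
    for k in range(1, len(vertices) + 1):
        for subset in combinations(vertices, k):
            # A clique requires every pair of chosen vertices to be adjacent.
            if all(frozenset(p) in edge_set for p in combinations(subset, 2)):
                w = sum(weights[v] for v in subset)
                if w > best:
                    best, best_clique = w, subset
    return best, best_clique

# Tiny example graph (weights and edges invented for illustration).
vertices = [0, 1, 2, 3]
weights = {0: 2, 1: 3, 2: 4, 3: 1}
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
print(max_weighted_clique(vertices, weights, edges))  # → (9, (0, 1, 2))
```

The electronic search visits all 2^n subsets one by one; the DNA model's appeal is that the strands representing those subsets react simultaneously.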
Solving the SAT problem using a DNA computing algorithm based on ligase chain reaction.
Wang, Xiaolong; Bao, Zhenmin; Hu, Jingjie; Wang, Shi; Zhan, Aibin
2008-01-01
A new DNA computing algorithm based on a ligase chain reaction is demonstrated to solve an SAT problem. The proposed DNA algorithm can solve an n-variable, m-clause SAT problem in m steps, and the computation time required is O(3m+n). Instead of generating the full-solution DNA library, we start with an empty test tube and then generate solutions that partially satisfy the SAT formula. These partial solutions are then extended step by step by the ligation of new variables using Taq DNA ligase. Correct strands are amplified and false strands are pruned by a ligase chain reaction (LCR) as soon as they fail to satisfy the conditions. If we score and sort the clauses, we can use this algorithm to markedly reduce the number of DNA strands required throughout the computing process. In a computer simulation, the maximum number of DNA strands required was 2^(0.48n) when n=50, and the exponent ratio varied inversely with the number of variables n and the clause/variable ratio m/n. This algorithm is highly space-efficient and error-tolerant compared with conventional brute-force searching, and thus can be scaled up to solve large and hard SAT problems.
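The clause-by-clause extend-and-prune strategy can be mimicked in software. The following sketch (function name and example formula are hypothetical, not from the paper) keeps only the partial assignments that satisfy each clause as soon as its variables are set, in the spirit of the LCR pruning:

```python
from itertools import product

def lcr_style_sat(clauses):
    """Solve SAT clause by clause: extend partial assignments only with the
    new variables each clause introduces, and prune any assignment ("strand")
    that falsifies the clause as soon as all its variables are set."""
    partials = [dict()]        # start from the empty test tube
    assigned = set()
    for clause in clauses:
        new_vars = sorted({abs(lit) for lit in clause} - assigned)
        assigned |= set(new_vars)
        extended = []
        for p in partials:
            for values in product([False, True], repeat=len(new_vars)):
                q = {**p, **dict(zip(new_vars, values))}
                # Keep the strand only if it satisfies the current clause.
                if any(q[abs(lit)] == (lit > 0) for lit in clause):
                    extended.append(q)
        partials = extended
        if not partials:
            return None        # formula is unsatisfiable
    return partials[0]

# Toy formula: (x1 ∨ x2) ∧ (¬x1 ∨ x3) ∧ (¬x2 ∨ ¬x3), DIMACS-style literals.
clauses = [[1, 2], [-1, 3], [-2, -3]]
sol = lcr_style_sat(clauses)
print(sol)  # one satisfying assignment of x1..x3
```

As in the wet-lab algorithm, the strand population stays small when clauses prune early, which is why clause ordering matters.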
Experimental Realization of a One-way Quantum Computer Algorithm Solving Simon's Problem
Tame, M S; Di Franco, C; Wadsworth, W J; Rarity, J G
2014-01-01
We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's Problem - a black box period-finding problem which has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's Problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
Holst, T. L.; Thomas, S. D.; Kaynak, U.; Gundy, K. L.; Flores, J.; Chaderjian, N. M.
1985-01-01
Transonic flow fields about wing geometries are computed using an Euler/Navier-Stokes approach in which the flow field is divided into several zones. The flow field immediately adjacent to the wing surface is resolved with fine grid zones and solved using a Navier-Stokes algorithm. Flow field regions removed from the wing are resolved with less finely clustered grid zones and are solved with an Euler algorithm. Computational issues associated with this zonal approach, including data base management aspects, are discussed. Solutions are obtained that are in good agreement with experiment, including cases with significant wind tunnel wall effects. Additional cases with significant shock induced separation on the upper wing surface are also presented.
Alam Khan, Najeeb; Razzaq, Oyoon Abdul
2016-03-01
In the present work a wavelet approximation method is employed to solve fuzzy boundary value differential equations (FBVDEs). Essentially, a truncated Legendre wavelet series together with the Legendre wavelet operational matrix of derivative is used to convert an FBVDE into a simple computational problem by reducing it to a system of fuzzy algebraic linear equations. The capability of the scheme is investigated on a second-order FBVDE considered under generalized H-differentiability. Solutions are represented graphically, showing the competency and accuracy of this method.
Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian
2015-10-23
The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals. We propose a parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range in O(mn) time. We extend the application of DNA molecular operations and use their simultaneity to simplify the complexity of the computation.
de Kock, Willem D.; Harskamp, Egbert G.
2014-01-01
Teachers in primary education experience difficulties in teaching word problem solving in their mathematics classes. However, during controlled experiments with a metacognitive computer programme, students' problem-solving skills improved. Also without the supervision of researchers, metacognitive computer programmes can be beneficial in a natural…
Solving the Coupled System Improves Computational Efficiency of the Bidomain Equations
Southern, J.A.
2009-10-01
The bidomain equations are frequently used to model the propagation of cardiac action potentials across cardiac tissue. At the whole organ level, the size of the computational mesh required makes their solution a significant computational challenge. As the accuracy of the numerical solution cannot be compromised, efficiency of the solution technique is important to ensure that the results of the simulation can be obtained in a reasonable time while still encapsulating the complexities of the system. In an attempt to increase efficiency of the solver, the bidomain equations are often decoupled into one parabolic equation that is computationally very cheap to solve and an elliptic equation that is much more expensive to solve. In this study, the performance of this uncoupled solution method is compared with an alternative strategy in which the bidomain equations are solved as a coupled system. This seems counterintuitive as the alternative method requires the solution of a much larger linear system at each time step. However, in tests on two 3-D rabbit ventricle benchmarks, it is shown that the coupled method is up to 80% faster than the conventional uncoupled method, and that parallel performance is better for the larger coupled problem.
Directory of Open Access Journals (Sweden)
Hande Çamlı
2009-09-01
Full Text Available This paper aims to determine the effect of computer-aided instruction on students' academic performance in solving Lowest Common Multiple and Greatest Common Factor problems and multiplicative structures. The study was conducted in the second semester of 2008 for five weeks with a total of 102 sixth-grade students. The research was carried out experimentally, using a post-test control-group design. An academic level test was administered at the beginning of the study to compare the existing knowledge of the experimental and control groups; a post-test and software developed by the researchers about the topic were used as the data collection instruments. The mean scores on the academic success test and the post-test were analyzed using the t-test technique. The results of the study show that the use of computer support in teaching and learning Lowest Common Multiple and Greatest Common Factor problems and multiplicative structures in mathematics lessons may increase students' academic success.
The effect of introducing computers into an introductory physics problem-solving laboratory
McCullough, Laura Ellen
2000-10-01
Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted
Solving the Scheduling Problem in Computational Grid using Artificial Bee Colony Algorithm
Directory of Open Access Journals (Sweden)
Seyyed Mohsen Hashemi
2013-07-01
Full Text Available Scheduling tasks on computational grids is known to be an NP-complete problem. Scheduling tasks in grid computing means assigning tasks to resources such that the termination time, the average waiting time, and the number of required machines are optimized. Methods based on heuristic or meta-heuristic search have been proposed to obtain optimal solutions. The presented method tries to optimize all of the mentioned criteria with an artificial bee colony system, taking the precedence of tasks into consideration. Bee colony optimization is a swarm intelligence algorithm that can be used for optimization problems; it is based on the intelligent foraging behavior of honey bees. The results show that using bees for solving the scheduling problem in a computational grid yields better finish times and average waiting times.
Solving Large-Scale Computational Problems Using Insights from Statistical Physics
Energy Technology Data Exchange (ETDEWEB)
Selman, Bart [Cornell University
2012-02-29
Many challenging problems in computer science and related fields can be formulated as constraint satisfaction problems. Such problems consist of a set of discrete variables and a set of constraints between those variables, and represent a general class of so-called NP-complete problems. The goal is to find a value assignment to the variables that satisfies all constraints, generally requiring a search through an exponentially large space of variable-value assignments. Models for disordered systems, as studied in statistical physics, can provide important new insights into the nature of constraint satisfaction problems. Recently, work in this area has resulted in the discovery of a new method for solving such problems, called the survey propagation (SP) method. With SP, we can solve problems with millions of variables and constraints, an improvement of two orders of magnitude over previous methods.
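The variable/constraint formulation, and the exponential brute-force search that methods like SP avoid, can be illustrated with a toy example (the graph-coloring instance and all names below are invented, not from the report):

```python
from itertools import product

def solve_csp(variables, domain, constraints):
    """Brute-force search over the |domain|**|variables| assignment space,
    the exponential blow-up that methods like survey propagation sidestep."""
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(check(assignment) for check in constraints):
            return assignment
    return None

# Toy CSP: 3-colour a 4-cycle so that adjacent nodes get different colours.
variables = ["a", "b", "c", "d"]
neq = lambda u, v: (lambda s: s[u] != s[v])
constraints = [neq("a", "b"), neq("b", "c"), neq("c", "d"), neq("d", "a")]
solution = solve_csp(variables, [0, 1, 2], constraints)
print(solution)  # → {'a': 0, 'b': 1, 'c': 0, 'd': 1}
```

With four variables this search visits at most 3⁴ = 81 assignments; at a million variables the same space is astronomically large, which is the regime where message-passing methods such as SP become essential.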
Solving surface parameters of conic asphere mirror based on computer simulation
Huang, Chuan-ke; Wu, Yong-qian; Fan, Bin; Lei, Bai-ping
2016-09-01
The radius of curvature R and the conic constant k are important parameters of aspheres. Null testing or CGHs are usually used to evaluate the processing quality of aspheric mirrors during fabrication. When the null compensator has a problem, an additional method is required to ensure the accuracy of the paraxial radius of curvature and the conic constant. Based on the equation of a conic asphere, a computing model is established from which the paraxial radius of curvature R and conic constant k can be obtained, and a solving algorithm using singular value decomposition (SVD) is derived. Simulation results for a 1800 mm aspheric mirror are presented; the solving precision reaches R = 6120 ± 0.026 mm and k = -1.0194 ± 0.0008, so an effective supplement to null testing of aspheric mirrors is achieved.
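A fit of this kind can be sketched as follows. Assuming the standard conic surface equation, the sag relation can be rewritten as r² = 2Rz − (1+k)z², which is linear in R and (1+k) and can be solved by SVD-based least squares (NumPy's `lstsq` uses the SVD internally). The synthetic data and function name below are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

def fit_conic(r, z):
    """Fit R and k from sag data via the linearized conic equation
    r^2 = 2*R*z - (1 + k)*z^2, solved by SVD-based least squares."""
    A = np.column_stack([2 * z, -z**2])       # unknowns: [R, 1 + k]
    x, *_ = np.linalg.lstsq(A, r**2, rcond=None)
    return x[0], x[1] - 1.0                   # (R, k)

# Synthetic sag data generated from assumed "true" parameters.
R_true, k_true = 6120.0, -1.0194
r = np.linspace(50.0, 900.0, 200)
z = r**2 / (R_true * (1 + np.sqrt(1 - (1 + k_true) * r**2 / R_true**2)))
R_fit, k_fit = fit_conic(r, z)
```

On noise-free synthetic data the linearization is exact, so the fit recovers R and k to numerical precision; with measured interferometric data, weighting and outlier handling would of course be needed.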
Directory of Open Access Journals (Sweden)
Amosa Isiaka Gambari
2015-07-01
Full Text Available This study investigated the effectiveness of a computer-assisted Students' Team Achievement Division (STAD) cooperative learning strategy on physics problem solving, students' achievement, and retention. It also examined whether student performance would vary with gender. A purposive sampling technique was used to select senior secondary school year two (SS II) physics students from two schools. The schools were assigned to computer-assisted STAD and Individualized Computer Instruction (ICI) groups. 84 students from two intact classes participated in the study. A Computer-Assisted Learning Package (CALP) on physics and a Physics Achievement Test (PAT) were used as the treatment and test instruments, respectively. Analysis of Covariance and the Scheffé test were used for data analysis. Findings indicated that students taught physics with computer-supported STAD performed better than their counterparts in the ICI group. In addition, they had better retention than those in the ICI group. However, gender had no influence on students' performance. Based on the findings, it was recommended, among others, that physics teachers be encouraged to use computer-assisted cooperative instruction to enhance students' performance.
Applications of GSTARS Computer Models for Solving River and Reservoir Sedimentation Problems
Institute of Scientific and Technical Information of China (English)
YANG Chih Ted
2008-01-01
GSTARS (Generalized Sediment Transport model for Alluvial River Simulation) is a series of computer models developed by the U.S. Bureau of Reclamation while the author was employed by that agency. The stream tube concept is used in all GSTARS models, which allows us to solve one-dimensional equations for each stream tube independently and obtain a semi-two-dimensional variation of the hydraulic conditions along and across stream tubes for rivers and reservoirs. Sediment transport, scour, and deposition processes are simulated along each stream tube independently to give us a semi-three-dimensional variation of the bed geometry. Most sediment transport computer models assume that channel width is given and cannot change during the simulation process. GSTARS models apply the theory of minimum stream power to the determination of optimum channel width and channel geometry. The concepts of channel side stability and active, inactive, and armoring layers are used in all GSTARS models for realistic long-term simulation and prediction of the scour and deposition processes in rivers and reservoirs. GSTARS models have been applied in many countries for solving a wide range of river and reservoir sedimentation problems. Case studies are used to illustrate the applications of GSTARS computer models.
Elizondo, D.; Cappelaere, B.; Faure, Ch.
2002-04-01
Emerging tools for automatic differentiation (AD) of computer programs should be of great benefit for the implementation of many derivative-based numerical methods such as those used for inverse modeling. The Odyssée software, one such tool for Fortran 77 codes, has been tested on a sample model that solves a 2D non-linear diffusion-type equation. Odyssée offers both the forward and the reverse differentiation modes, that produce the tangent and the cotangent models, respectively. The two modes have been implemented on the sample application. A comparison is made with a manually-produced differentiated code for this model (MD), obtained by solving the adjoint equations associated with the model's discrete state equations. Following a presentation of the methods and tools and of their relative advantages and drawbacks, the performances of the codes produced by the manual and automatic methods are compared, in terms of accuracy and of computing efficiency (CPU and memory needs). The perturbation method (finite-difference approximation of derivatives) is also used as a reference. Based on the test of Taylor, the accuracy of the two AD modes proves to be excellent and as high as machine precision permits, a good indication of Odyssée's capability to produce error-free codes. In comparison, the manually-produced derivatives (MD) sometimes appear to be slightly biased, which is likely due to the fact that a theoretical model (state equations) and a practical model (computer program) do not exactly coincide, while the accuracy of the perturbation method is very uncertain. The MD code largely outperforms all other methods in computing efficiency, a subject of current research for the improvement of AD tools. Yet these tools can already be of considerable help for the computer implementation of many numerical methods, avoiding the tedious task of hand-coding the differentiation of complex algorithms.
An Application of Computer Vision Systems to Solve the Problem of Unmanned Aerial Vehicle Control
Directory of Open Access Journals (Sweden)
Aksenov Alexey Y.
2014-09-01
Full Text Available The paper considers an approach for applying computer vision systems to solve the problem of unmanned aerial vehicle control. Processing of the images obtained through the onboard camera is required for absolute positioning of the aerial platform (automatic landing and take-off, hovering, etc.). The proposed method combines the advantages of existing systems and provides the ability to perform hovering over a given point as well as exact take-off and landing. The limitations of the implemented methods are determined, and an algorithm is proposed to combine them in order to improve efficiency.
Experimental realization of a one-way quantum computer algorithm solving Simon's problem.
Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G
2014-11-14
We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
A computationally efficient method to solve the Takagi-Taupin equations for a large deformed crystal
Honkanen, Ari-Pekka; Huotari, Simo
2016-01-01
We present a treatise on solving the Takagi-Taupin equations in the case of a strain field with an additional, spatially slowly varying component (owing to, e.g., heat expansion or angular compression). We show that the presence of such a component in a typical case merely shifts the reflectivity curve as a function of wavelength or incidence angle, while having a negligible effect on its shape. On the basis of the derived result, we develop a computationally efficient method to calculate the reflectivity curve of a large deformed crystal. The validity of the method is demonstrated by comparing computed reflectivity curves with experimental ones for bent silicon wafers. Excellent agreement is observed.
Schacter, J.; Herl, H. E.; Chung, G. K. W. K.; Dennis, R. A.; O'Neil, H. F., Jr.
1999-01-01
Discussion of performance assessments that test for higher-order thinking and problem solving focuses on research by the National Center for Research on Evaluation, Standards, and Student Testing (CRESST) that assessed student problem solving using networked computers and the Web, where both performance and process data could be reported back to…
Xin, Yan Ping; Si, Luo; Hord, Casey; Zhang, Dake; Cetinas, Suleyman; Park, Joo Young
2012-01-01
The study explored the effects of a computer-assisted COnceptual Model-based Problem-Solving (COMPS) program on multiplicative word-problem-solving performance of students with learning disabilities or difficulties. The COMPS program emphasizes mathematical modeling with algebraic expressions of relations. Participants were eight fourth and fifth…
Indian Academy of Sciences (India)
P Chitra; P Venkatesh; R Rajaram
2011-04-01
The task scheduling problem in heterogeneous distributed computing systems is a multiobjective optimization problem (MOP). In heterogeneous distributed computing systems (HDCS), there is a possibility of processor and network failures, and this affects the applications running on the HDCS. To reduce the impact of failures on an application running on HDCS, scheduling algorithms must be devised which minimize not only the schedule length (makespan) but also the failure probability of the application (reliability). These objectives are conflicting, and it is not possible to minimize both at the same time. Thus, scheduling algorithms are needed that account for both schedule length and failure probability. Multiobjective evolutionary computation algorithms (MOEAs) are well-suited for multiobjective task scheduling in heterogeneous environments. Two multiobjective evolutionary algorithms, the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP) with non-dominated sorting, are developed and compared on various random task graphs and on a real-time numerical application graph. The metrics for evaluating the convergence and diversity of the non-dominated solutions obtained by the two algorithms are reported. The simulation results confirm that the proposed algorithms can solve the task scheduling problem at reduced computational times compared with the weighted-sum based biobjective algorithm in the literature.
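The core of non-dominated sorting is extracting the Pareto front of candidate schedules over the two minimization objectives. A minimal sketch (the (makespan, failure probability) values below are invented for illustration):

```python
def pareto_front(points):
    """Return the non-dominated points for two minimization objectives,
    e.g. (makespan, failure probability) pairs of candidate schedules.
    A point is dominated if another point is at least as good in both
    objectives and differs from it."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (makespan, failure probability) values for six schedules.
schedules = [(12, 0.30), (10, 0.45), (15, 0.20),
             (11, 0.40), (10, 0.50), (14, 0.25)]
print(sorted(pareto_front(schedules)))
# → [(10, 0.45), (11, 0.40), (12, 0.30), (14, 0.25), (15, 0.20)]
```

Here only (10, 0.50) is dominated (by (10, 0.45)); the remaining schedules form the trade-off curve from which a decision-maker picks a makespan/reliability compromise. Full NSGA-style sorting repeats this peeling to rank the whole population.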
Institute of Scientific and Technical Information of China (English)
Yang Liu; Guo-Fu Zhang; Zhao-Pin Su; Feng Yue; Jian-Guo Jiang
2016-01-01
Coalitional skill games (CSGs) are a simple model of cooperation in an uncertain environment where each agent has a set of skills that are required to accomplish a variety of tasks and each task requires a set of skills to be completed, but each skill is very hard to quantify and can only be qualitatively expressed. Thus far, many computational questions surrounding CSGs have been studied. However, to the best of our knowledge, the coalition structure generation problem (CSGP), as a central issue of CSGs, is extremely challenging and has not been well solved. To this end, two different computational intelligence algorithms are herein evaluated: binary particle swarm optimization (BPSO) and binary differential evolution (BDE). In particular, we develop the two stochastic search algorithms with two-dimensional binary encoding and a corresponding heuristic for individual repairs. After that, we discuss some fundamental properties of the proposed heuristic. Finally, we compare the improved BPSO and BDE with state-of-the-art algorithms for solving the CSGP in CSGs. The experimental results show that our algorithms can find the same near-optimal solutions as existing approaches but in substantially less time, especially for large problem sizes.
Aryal, Bijaya
2016-03-01
We have studied the impacts of web-based Computer Coaches on educational outputs and outcomes. This presentation will describe the technical and conceptual framework related to the Coaches and discuss undergraduate students' favorability of the Coaches. Moreover, its impacts on students' physics problem solving performance and on their conceptual understanding of physics will be reported. We used a qualitative research technique to collect and analyze interview data from 19 undergraduate students who used the Coaches in the interview setting. The empirical results show that the favorability and efficacy of the Computer Coaches differ considerably across students of different educational backgrounds, preparation levels, attitudes and epistemologies about physics learning. The interview data shows that female students tend to have more favorability supporting the use of the Coach. Likewise, our assessment suggests that female students seem to benefit more from the Coaches in their problem solving performance and in conceptual learning of physics. Finally, the analysis finds evidence that the Coach has potential for increasing efficiency in usage and for improving students' educational outputs and outcomes under its customized usage. This work was partially supported by the Center for Educational Innovation, Office of the Senior Vice President for Academic Affairs and Provost, University of Minnesota.
DNA computing model based on lab-on-a-chip and its application to solving the timetabling problem
Institute of Scientific and Technical Information of China (English)
Fengyue Zhang; Bo Liu; Wenbin Liu; Qiang Zhang
2008-01-01
The essential characteristic of DNA computation is its massive parallelism in obtaining and managing information. With the development of molecular biology techniques, the field of DNA computation has made great progress. By using an advanced biochip technique, lab-on-a-chip, a new DNA computing model is presented in this paper to solve a simple timetabling problem, which is a special version of the optimization problems. It also plays an important role in education and other industries. With a simulated biological experiment, the result suggested that DNA computation with lab-on-a-chip has the potential to solve a real, complex timetabling problem.
Zingerle, Philipp; Fecher, Thomas; Pail, Roland; Gruber, Thomas
2016-04-01
One of the major obstacles in modern global gravity field modelling is the seamless combination of lower-degree inhomogeneous gravity field observations (e.g. data from satellite missions) with (very) high degree homogeneous information (e.g. gridded and reduced gravity anomalies, beyond d/o 1000). Current approaches mostly combine such data only at the coefficient level, meaning that a spherical harmonic analysis is first done independently for both observation classes (resp. models), solving dense normal equations (NEQ) for the inhomogeneous model and block-diagonal NEQs for the homogeneous one. Obviously, such methods are unable to identify or eliminate effects such as spectral leakage due to band limitations of the models and the non-orthogonality of the spherical harmonic base functions. To counter such problems, a combination of both models on the NEQ level is desirable. Theoretically this can be achieved using NEQ stacking. Because of the higher maximum degree of the homogeneous model, a reordering of the coefficients is needed, which inevitably destroys the block-diagonal structure of the corresponding NEQ matrix and therefore also its simple sparsity. Hence, a special coefficient ordering is needed to create a new favourable sparsity pattern that leads to an efficient computational solving method. Such a pattern can be found in the so-called kite structure (Bosch, 1993), obtained when applying the kite ordering to the stacked NEQ matrix. In a first step it is shown what is needed to attain the kite (NEQ) system, how to solve it efficiently, and how to calculate the corresponding variance information from it. Further, because of the massive computational workload when operating on large kite systems (theoretically possible up to about max. d/o 100,000), the main emphasis is placed on the presentation of special distributed algorithms which may solve those systems in parallel on an arbitrary number of processes and are
Scilab software as an alternative low-cost computing in solving the linear equations problem
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used in both teaching and research. These packages include licensed (proprietary) and open-source (non-proprietary) software. One reason to use such a package is the complexity of mathematical functions (e.g., linear problems); moreover, the number of variables in linear and non-linear functions has increased. The aim of this paper was to reflect on key aspects of method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. In this paper, Scilab is used to propose activities related to mathematical models. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan elimination, the inverse matrix method, and lower-upper (LU) decomposition. The results of this study showed that routines for these numerical methods can be created and explored using Scilab procedures, and that these routines can then serve as teaching material for a course.
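As an illustration of the first of the four methods named in this abstract, here is a minimal, self-contained sketch (ours, not the paper's Scilab routines; Python stands in for Scilab) of Gaussian elimination with partial pivoting:

```python
# Minimal sketch (ours, not the paper's Scilab code) of Gaussian elimination
# with partial pivoting, one of the four methods the study implements.
def gauss_solve(A, b):
    """Solve Ax = b by forward elimination and back substitution."""
    n = len(A)
    # Augmented matrix [A | b]: row operations then update both sides at once.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot into row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

x = gauss_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
print(x)  # x ≈ [0.8, 1.4]
```

In Scilab itself, the backslash operator (`A \ b`) performs the equivalent solve; the routine above only spells out what such a teaching routine computes.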
Directory of Open Access Journals (Sweden)
Hong-an Yang
2014-01-01
We focus on solving the Stochastic Job Shop Scheduling Problem (SJSSP) with random processing times to minimize the expected sum of earliness and tardiness costs of all jobs. To further enhance the efficiency of the simulation optimization technique of embedding Evolutionary Strategy in Ordinal Optimization (ESOO), which is based on Monte Carlo simulation, we embed the Optimal Computing Budget Allocation (OCBA) technique into the exploration stage of ESOO to optimize the performance evaluation process by controlling the allocation of simulation runs. However, while pursuing a good set of schedules, “super individuals,” which can absorb most of the given computation while others hardly get any simulation budget, may emerge according to the allocation equation of OCBA. Consequently, the schedules cannot be evaluated exactly, and thus the probability of correct selection (PCS) tends to be low. Therefore, we modify OCBA to balance the computation allocation: (1) set a threshold on simulation runs to detect “super individuals” and (2) follow an exclusion mechanism to marginalize them. Finally, the proposed approach is applied to an SJSSP comprising 8 jobs on 8 machines with random processing times drawn from truncated normal, uniform, and exponential distributions, respectively. The results demonstrate that our method outperforms the ESOO method by achieving better solutions.
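To make the allocation idea concrete, the following is a simplified sketch (our assumptions, not the authors' exact OCBA equations or threshold rule) of budget allocation with a cap on "super individuals":

```python
# Simplified sketch of OCBA-style allocation with a threshold on "super
# individuals" (our assumptions; the paper's exact equations may differ).
# Assumes a minimization problem and budget <= cap * number of designs.
def capped_allocation(means, stds, budget, cap):
    best = min(range(len(means)), key=lambda i: means[i])
    # OCBA-like weights: w_i proportional to (std_i / gap_to_best)^2 for
    # non-best designs, and a combined square-root weight for the best one.
    w = [0.0 if i == best else (s / (m - means[best])) ** 2
         for i, (m, s) in enumerate(zip(means, stds))]
    w[best] = stds[best] * sum((wi / s) ** 2 for wi, s in zip(w, stds)) ** 0.5
    alloc = [budget * wi / sum(w) for wi in w]
    # Threshold step: clip shares above `cap` ("super individuals") and
    # redistribute the surplus uniformly over the remaining designs.
    surplus = sum(a - cap for a in alloc if a > cap)
    under = [i for i, a in enumerate(alloc) if a <= cap]
    alloc = [min(a, cap) for a in alloc]
    for i in under:
        alloc[i] += surplus / len(under)
    return alloc

a = capped_allocation([1.0, 1.2, 2.0], [0.5, 0.5, 0.5], budget=100, cap=60)
print([round(v, 1) for v in a])  # shares sum to 100, none above the cap
```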
Cloud Computing: solving Availability Problem in Future Framework for e-Governance
Directory of Open Access Journals (Sweden)
Dileep Kumar Gupta
2013-03-01
The cloud is a centralized system in which software and hardware together provide services to users on a pay-per-use basis. To compete globally, organizations constantly need better technology and proper utilization of resources to improve performance. Proper utilization of scarce resources requires sound management and better strategies to accomplish the task. Mukherjee and Sahoo [1] have proposed an effective framework for e-Governance based on the cloud computing concept, which would be intelligent as well as accessible to all. In this paper, we discuss the availability problem in the existing framework for e-Governance and provide a better solution to this problem.
Directory of Open Access Journals (Sweden)
Zhaocai Wang
2015-10-01
The unbalanced assignment problem (UAP) is to optimally assign n jobs to m individuals (m < n) such that minimum cost or maximum profit is obtained. It is a vitally important NP-complete problem in operations management and applied mathematics, with numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We design flexible-length DNA strands representing the different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range and in O(mn) time. We extend the application of DNA molecular operations and exploit their simultaneity to reduce the complexity of the computation.
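For reference, the search space that the DNA algorithm explores in parallel can be enumerated sequentially; this brute-force sketch (ours; the cost matrix is hypothetical, and side constraints of specific UAP variants are omitted) defines the optimum the molecular algorithm is after:

```python
# Reference enumeration (ours) of the search space the DNA algorithm explores
# in parallel: each of n jobs goes to one of m individuals, minimum total cost.
# Side constraints of specific UAP variants are omitted; the costs are made up.
from itertools import product

def solve_uap(cost):  # cost[j][i] = cost of assigning job j to individual i
    n, m = len(cost), len(cost[0])
    best = min(product(range(m), repeat=n),
               key=lambda a: sum(cost[j][a[j]] for j in range(n)))
    return best, sum(cost[j][best[j]] for j in range(n))

cost = [[4, 2], [1, 3], [5, 2]]  # n = 3 jobs, m = 2 individuals
print(solve_uap(cost))  # ((1, 0, 1), 5)
```

The enumeration costs O(m^n) time sequentially; the point of the DNA approach is that all m^n candidate strands are generated and filtered simultaneously.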
Andersen, Erling B.
A computer program for solving the conditional likelihood equations arising in the Rasch model for questionnaires is described. The estimation method and the computational problems involved are described in a previous research report by Andersen, but a summary of those results is given in two sections of this paper. A working example is also…
Using DNA to solve mathematical problems: Multiplication and 3-colorability computation
Wu, Gang
A functional machine is not only an assembly of parts, but also an assembly of processes. The processing of each part must obey laws appropriate to the properties of that part. For example, building any kind of computer entails selecting appropriate components and assembling their properties to function in computation. Here, we describe computation using a DNA strand as the basic unit, and we have used this unit to achieve the function of multiplication. We exploit the phenomenon of DNA hybridization, in which each strand can represent two individual units that pair to form a single unit. We represent the numbers to be multiplied in binary, with a different strand length representing each digit present in a number. In principle, all combinations of the digits are present in solution. Following hybridization, a collection of duplex molecules tailed by single-stranded ends is present. These intermediates are converted to fully duplex molecules by filling in the ends with DNA polymerase. The lengths present represent the digits present, and they may be separated by denaturing PAGE. The results give a series of bands for each power of two. The number of bands in the size domain for a particular power of two is converted to binary, and the contributions of all bands are then summed. Experimentally, this process always yields the correct answer. The combination of DNA and enzymes has also been used to produce a structure that provides the answer to a computational problem. A graph of 6 vertices and 9 edges was constructed to solve an instance of the three-colorability problem. The final construct is obtained by joining coded branched DNA junctions with sticky ends representing the colored vertices. The DNA structure corresponding to the solution of the problem was identified through its resistance to restriction endonuclease cleavage. The method uses a constant number of steps, independent of the size of the graph.
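The readout arithmetic described above can be simulated in a few lines; this sketch (ours, with assumed notation) counts one "band" per pair of set bits and sums the size classes:

```python
# Sketch (ours, with assumed notation) of the readout arithmetic: each pair of
# set bits (i in a, j in b) yields one duplex band whose length encodes
# 2^(i+j); summing the weighted band counts reproduces the product a*b.
def band_multiply(a, b):
    bands = {}  # power of two -> number of bands observed at that length
    for i, bit_a in enumerate(reversed(bin(a)[2:])):
        for j, bit_b in enumerate(reversed(bin(b)[2:])):
            if bit_a == '1' and bit_b == '1':
                bands[i + j] = bands.get(i + j, 0) + 1
    # The binary conversion with carries that a gel reader would perform is
    # implicit in this weighted sum.
    return sum(count * 2 ** power for power, count in bands.items())

print(band_multiply(13, 11))  # 143
```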
Does this computational theory solve the right problem? Marr, Gibson, and the goal of vision.
Warren, William H
2012-01-01
David Marr's book Vision attempted to formulate a thoroughgoing formal theory of perception. Marr borrowed much of the "computational" level from James Gibson: a proper understanding of the goal of vision, the natural constraints, and the available information are prerequisite to describing the processes and mechanisms by which the goal is achieved. Yet, as a research program leading to a computational model of human vision, Marr's program did not succeed. This article asks why, using the perception of 3D shape as a morality tale. Marr presumed that the goal of vision is to recover a general-purpose Euclidean description of the world, which can be deployed for any task or action. On this formulation, vision is underdetermined by information, which in turn necessitates auxiliary assumptions to solve the problem. But Marr's assumptions did not actually reflect natural constraints, and consequently the solutions were not robust. We now know that humans do not in fact recover Euclidean structure--rather, they reliably perceive qualitative shape (hills, dales, courses, ridges), which is specified by the second-order differential structure of images. By recasting the goals of vision in terms of our perceptual competencies, and doing the hard work of analyzing the information available under ecological constraints, we can reformulate the problem so that perception is determined by information and prior knowledge is unnecessary.
Rioz-Martínez, Ana; Roelfes, Gerard
2015-04-01
In the past decade, DNA-based hybrid catalysis has emerged as a promising novel approach to homogeneous (asymmetric) catalysis. A DNA hybrid catalyst comprises a transition metal complex that is covalently or supramolecularly bound to DNA. The chiral microenvironment and the second coordination sphere interactions provided by the DNA are key to achieving high enantioselectivities and, often, additional rate accelerations in catalysis. Current efforts are focused on improved designs, understanding the origin of the enantioselectivity and DNA-induced rate accelerations, expanding the catalytic scope of the concept, and further increasing the practicality of the method for applications in synthesis. Herein, the recent developments are reviewed and perspectives for the emerging field of DNA-based hybrid catalysis are discussed.
Institute of Scientific and Technical Information of China (English)
Qin Ni; Ch. Zillober; K. Schittkowski
2005-01-01
In this paper, we describe a method to solve large-scale structural optimization problems by sequential convex programming (SCP). A predictor-corrector interior point method is applied to solve the strictly convex subproblems. The SCP algorithm and the topology optimization approach are introduced. In particular, different strategies for solving certain linear systems of equations are analyzed. Numerical results are presented to show the efficiency of the proposed method for solving topology optimization problems and to compare different variants.
DNA Based Molecular Scale Nanofabrication
2015-12-04
water adsorption on DNA origami template and its impact on DNA-mediated chemical reactions. We also extended the concept of DNA-mediated reactions to...addition, we have expanded our efforts to include DNA-mediated HF etching of SiO2, DNA-mediated nanoimprinting lithography, DNA-based patterning of self...detailed kinetics study of DNA-mediated chemical reactions. Examples of such reactions include chemical vapor deposition (CVD) of inorganic oxide and HF
Özyurt, Özcan
2015-01-01
Problem solving is an indispensable part of engineering. Improving critical thinking dispositions for solving engineering problems is one of the objectives of engineering education. In this sense, knowing critical thinking and problem solving skills of engineering students is of importance for engineering education. This study aims to determine…
A Computational Method for Solving the Schrodinger Equation for a System of N Interacting Fermions*
Zettili, Nouredine
2002-03-01
We introduce here a computational method aimed at finding numerically the solutions of the one-dimensional Schrodinger equation for a system of N fermions. The method is based on a discretization scheme for the wave function as well as on the Numerov algorithm, which offers an approximate treatment of the second derivative using the three-point difference formula. After discretizing the wave function, we derive a recursion relation which allows us to integrate forward or backward in the spatial degree of freedom. In this way, by incorporating the boundary conditions of the system, we can calculate iteratively the wave function at different values of the spatial coordinate. This process also yields the energy levels of the system. The search for the energy levels will be carried out incrementally until the wave function converges to the correct value at each boundary. The numerical calculations can be pushed until the energy levels and the wave function are obtained with the desired accuracy. As an application, we consider a one-dimensional system of N fermions interacting via a schematic two-body interaction. We will carry out calculations on several fermionic systems. To assess quantitatively the accuracy of this computational method, we will compare its results with the exact results obtained when solving the Schrodinger equation for a harmonic oscillator potential and an infinite square well potential. To gain additional insight into the accuracy of the method, its results will also be compared with those obtained from a simple, exactly solvable model. *Supported by a research grant from Jacksonville State University.
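A minimal sketch of the ingredients named in this abstract (the Numerov three-point recursion plus an incremental search for the energy), applied to the infinite square well test case; the units, well, and implementation are our choices, not the authors' code:

```python
import math

# Sketch (ours) of the shooting approach the abstract describes: the Numerov
# three-point recursion integrates psi'' = -g(x) psi forward from x = 0, and
# the energy is adjusted until psi meets the boundary condition at x = 1.
# Units hbar = 2m = 1 are assumed, so E = k^2 and the exact ground state of
# the infinite square well on [0, 1] is E = pi^2.
def psi_end(E, n=1000):
    h = 1.0 / n
    g = E  # constant coefficient inside the well
    y_prev, y = 0.0, h  # psi(0) = 0, arbitrary small initial slope
    c = 1.0 + h * h * g / 12.0
    for _ in range(n - 1):
        y_prev, y = y, (2.0 * (1.0 - 5.0 * h * h * g / 12.0) * y - c * y_prev) / c
    return y  # psi(1); zero exactly when E is an eigenvalue

# Bisection on the boundary value brackets the ground-state energy.
lo, hi = 8.0, 11.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if psi_end(lo) * psi_end(mid) <= 0:
        hi = mid
    else:
        lo = mid
print(lo, math.pi ** 2)  # converges near pi^2 ≈ 9.8696
```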
Directory of Open Access Journals (Sweden)
José Antonio Martín H
Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In the life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing, global alignment of multiple genomes, identifying siblings, or discovery of dysregulated pathways. In almost all of these problems, there is the need to prove a hypothesis about a certain property of an object, which can be present if and only if the object adopts some particular admissible structure (an NP-certificate) or absent (no admissible structure). However, none of the standard approaches can discard the hypothesis when no solution is found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to "efficiently" solve the graph 3-coloring problem, an NP-complete problem. The proposed method provides certificates (proofs) in both cases, present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient), though parametric. The only requirement is sufficient computational power, which is controlled by the parameter α∈N. Nevertheless, it is proved here that the probability of requiring a value of α>k to obtain a solution for a random graph decreases exponentially: P(α>k) ≤ 2^(-(k+1)), making almost all problem instances tractable. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The experimental results obtained are in accordance with the theoretical expected results.
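The "present" half of the certificate pair is classical: an NP-certificate for 3-colorability can be verified in polynomial time, as this small sketch (ours, not the paper's algorithm) shows:

```python
# Verifier sketch (ours, not the paper's algorithm): a proposed coloring is a
# valid NP-certificate iff it uses at most three colors and no edge joins two
# vertices of the same color. This check runs in O(|V| + |E|) time.
def is_valid_3_coloring(edges, coloring):
    return (len(set(coloring.values())) <= 3
            and all(coloring[u] != coloring[v] for u, v in edges))

triangle = [(0, 1), (1, 2), (0, 2)]
print(is_valid_3_coloring(triangle, {0: 'r', 1: 'g', 2: 'b'}))  # True
print(is_valid_3_coloring(triangle, {0: 'r', 1: 'r', 2: 'b'}))  # False
```

The paper's contribution concerns the "absent" case, for which no such succinct certificate is known in general; the sketch only fixes what a positive certificate must satisfy.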
Institute of Scientific and Technical Information of China (English)
HUANG Jun; ZHU Tao; et al.
1996-01-01
The matrix analytic analysis of queues with complex arrival, vacation and service characteristics requires the solution of nonlinear matrix equations. The complexity and large dimensionality of the model require an efficient and smart algorithm for the solution. In this paper, we propose an efficient Adaptive Newton-Kantorovich (ANK) method for speeding up the algorithm solving the nonlinear matrix equation, which is an inevitable step in the analysis of queues with an embedded Markov chain such as the BMAP/SMSP/1/∞ queue or its discrete version. BMAP/SMSP/1/∞ is a queueing model with a Semi-Markov Service time Process (SMSP) and a Batch Markovian Arrival Process (BMAP). Numerical results are presented for the discrete case of the N-MMBP/D/1 queue, which arises in analyzing traffic aspects of computer communication networks, where MMBP is the Markov Modulated Bernoulli Process. Comparisons of Adaptive Newton-Kantorovich (ANK) with Modified Newton-Kantorovich (MNK) show that ANK saves 30% of CPU time when the number of users N is 50.
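For context, the nonlinear matrix equation in such models typically has the form G = A0 + A1·G + A2·G², and the baseline that Newton-Kantorovich-type methods accelerate is plain successive substitution. A toy 2x2 sketch with made-up matrices (ours, not the paper's ANK scheme):

```python
# Toy sketch (ours): solve G = A0 + A1*G + A2*G^2 by successive substitution,
# the slow baseline that Newton-Kantorovich-type methods accelerate. The 2x2
# matrices are hypothetical; rows of A0 + A1 + A2 sum to 1, and the drift is
# subcritical, so the minimal solution G is a stochastic matrix.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def madd(*Ms):
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

A0 = [[0.4, 0.1], [0.1, 0.4]]
A1 = [[0.2, 0.1], [0.2, 0.2]]
A2 = [[0.1, 0.1], [0.05, 0.05]]

G = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(200):  # fixed-point iteration from the zero matrix
    G = madd(A0, matmul(A1, G), matmul(A2, matmul(G, G)))

F = madd(A0, matmul(A1, G), matmul(A2, matmul(G, G)))
residual = max(abs(G[i][j] - F[i][j]) for i in range(2) for j in range(2))
print(residual < 1e-10)  # the iterate satisfies the equation to tolerance
```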
Farshid Mirzaee; Mohammad Komak Yari
2016-01-01
In this paper, we introduce the three-dimensional fuzzy differential transform method and utilize it to solve fuzzy partial differential equations. This technique is a successful method because it reduces such problems to solving a system of algebraic equations, so the problem can be solved directly. A considerable advantage of this method is that it obtains the analytical solution if the equation has an exact solution that is a polynomial function. Numerical examples are included to demonstrate th...
Hativa, Nira; Cohen, Dorit
1995-01-01
Two fourth-grade classes served in a study that employed computers for promoting autonomous learning processes through solving challenging problems adapted to students' aptitudes using the number line as an intuitive model. Findings showed that students have preinstructional intuitions and informal knowledge of negative numbers. (24 references)…
Nader-Grosbois, Nathalie; Lefevre, Nathalie
2011-01-01
This study compares self-regulation in 29 children with intellectual disability and 30 typically developing children, who solved tasks using physical materials or computers. Their cognitive, linguistic levels were assessed in order to match the children of both groups. In the presence of their mothers and fathers, the children were asked to…
Fessakis, G.; Gouli, E.; Mavroudi, E.
2013-01-01
Computer programming is considered an important competence for the development of higher-order thinking in addition to algorithmic problem solving skills. Its horizontal integration throughout all educational levels is considered worthwhile and attracts the attention of researchers. Towards this direction, an exploratory case study is presented…
The product contains user-friendly computer programs for solving sampling and related statistical problems. All have been updated as well and more programs have been added. Specific, detailed written instructions and examples built into the programs are provided so that the user ...
Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.
2016-12-01
Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who choose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new
Directory of Open Access Journals (Sweden)
Farshid Mirzaee
2016-06-01
In this paper, we introduce the three-dimensional fuzzy differential transform method and utilize it to solve fuzzy partial differential equations. This technique is a successful method because it reduces such problems to solving a system of algebraic equations, so the problem can be solved directly. A considerable advantage of this method is that it obtains the analytical solution if the equation has an exact solution that is a polynomial function. Numerical examples are included to demonstrate the validity and applicability of the method.
Towards high-performance symbolic computing using MuPAD as a problem solving environment
Sorgatz, A
1999-01-01
This article discusses the approach of developing MuPAD into an open and parallel problem solving environment for mathematical applications. It introduces the key technologies, domains and dynamic modules, and describes the current state of macro parallelism, which covers three fields of parallel programming: message passing, network variables and work groups. First parallel algorithms and examples of using the prototype of the MuPAD problem solving environment are demonstrated. (12 refs).
Liu, Lei; Wu, Hai-Chen
2016-12-05
Nanopore sensing is an attractive, label-free approach that can measure single molecules. Although initially proposed for rapid and low-cost DNA sequencing, nanopore sensors have been successfully employed in the detection of a wide variety of substrates. Early successes were mostly achieved based on two main strategies by 1) creating sensing elements inside the nanopore through protein mutation and chemical modification or 2) using molecular adapters to enhance analyte recognition. Over the past five years, DNA molecules started to be used as probes for sensing rather than substrates for sequencing. In this Minireview, we highlight the recent research efforts of nanopore sensing based on DNA-mediated characteristic current events. As nanopore sensing is becoming increasingly important in biochemical and biophysical studies, DNA-based sensing may find wider applications in investigating DNA-involving biological processes. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chiroplasmonic DNA-based nanostructures
Cecconello, Alessandro; Besteiro, Lucas V.; Govorov, Alexander O.; Willner, Itamar
2017-09-01
Chiroplasmonic properties of nanoparticles, organized using DNA-based nanostructures, have attracted both theoretical and experimental interest. Theory suggests that the circular dichroism spectra accompanying chiroplasmonic nanoparticle assemblies are controlled by the sizes, shapes, geometries and interparticle distances of the nanoparticles. In this Review, we present different methods to assemble chiroplasmonic nanoparticle or nanorod systems using DNA scaffolds, and we discuss the operations of dynamically reconfigurable chiroplasmonic nanostructures. The chiroplasmonic properties of the different systems are characterized by circular dichroism and further supported by high-resolution transmission electron microscopy or cryo-transmission electron microscopy imaging and theoretical modelling. We also outline the applications of chiroplasmonic assemblies, including their use as DNA-sensing platforms and as functional systems for information processing and storage. Finally, future perspectives in applying chiroplasmonic nanoparticles as waveguides for selective information transfer and their use as ensembles for chiroselective synthesis are discussed. Specifically, we highlight the upscaling of the systems to device-like configurations.
Antibody-controlled actuation of DNA-based molecular circuits
Engelen, Wouter; Meijer, Lenny H. H.; Somers, Bram; de Greef, Tom F. A.; Merkx, Maarten
2017-02-01
DNA-based molecular circuits allow autonomous signal processing, but their actuation has relied mostly on RNA/DNA-based inputs, limiting their application in synthetic biology, biomedicine and molecular diagnostics. Here we introduce a generic method to translate the presence of an antibody into a unique DNA strand, enabling the use of antibodies as specific inputs for DNA-based molecular computing. Our approach, antibody-templated strand exchange (ATSE), uses the characteristic bivalent architecture of antibodies to promote DNA-strand exchange reactions both thermodynamically and kinetically. Detailed characterization of the ATSE reaction allowed the establishment of a comprehensive model that describes the kinetics and thermodynamics of ATSE as a function of toehold length, antibody-epitope affinity and concentration. ATSE enables the introduction of complex signal processing in antibody-based diagnostics, as demonstrated here by constructing molecular circuits for multiplex antibody detection, integration of multiple antibody inputs using logic gates and actuation of enzymes and DNAzymes for signal amplification.
Energy Technology Data Exchange (ETDEWEB)
Heydari, M.H., E-mail: heydari@stu.yazd.ac.ir [Faculty of Mathematics, Yazd University, Yazd (Iran, Islamic Republic of); The Laboratory of Quantum Information Processing, Yazd University, Yazd (Iran, Islamic Republic of); Hooshmandasl, M.R., E-mail: hooshmandasl@yazd.ac.ir [Faculty of Mathematics, Yazd University, Yazd (Iran, Islamic Republic of); The Laboratory of Quantum Information Processing, Yazd University, Yazd (Iran, Islamic Republic of); Cattani, C., E-mail: ccattani@unisa.it [Department of Mathematics, University of Salerno, Via Ponte Don Melillo, 84084 Fisciano (Italy); Maalek Ghaini, F.M., E-mail: maalek@yazd.ac.ir [Faculty of Mathematics, Yazd University, Yazd (Iran, Islamic Republic of); The Laboratory of Quantum Information Processing, Yazd University, Yazd (Iran, Islamic Republic of)
2015-02-15
Because of the nonlinearity, closed-form solutions of many important stochastic functional equations are virtually impossible to obtain. Thus, numerical solutions are a viable alternative. In this paper, a new computational method based on the generalized hat basis functions together with their stochastic operational matrix of Itô-integration is proposed for solving nonlinear stochastic Itô integral equations in large intervals. In the proposed method, a new technique for computing nonlinear terms in such problems is presented. The main advantage of the proposed method is that it transforms problems under consideration into nonlinear systems of algebraic equations which can be simply solved. Error analysis of the proposed method is investigated and also the efficiency of this method is shown on some concrete examples. The obtained results reveal that the proposed method is very accurate and efficient. As two useful applications, the proposed method is applied to obtain approximate solutions of the stochastic population growth models and stochastic pendulum problem.
The Effect of Simulation Games on the Learning of Computational Problem Solving
Liu, Chen-Chung; Cheng, Yuan-Bang; Huang, Chia-Wen
2011-01-01
Simulation games are now increasingly applied to many subject domains as they allow students to engage in discovery processes, and may facilitate a flow learning experience. However, the relationship between learning experiences and problem solving strategies in simulation games still remains unclear in the literature. This study, thus, analyzed…
Real-Time Assessment of Problem-Solving of Physics Students Using Computer-Based Technology
Gok, Tolga
2012-01-01
The change in students' problem solving ability in upper-level course through the application of a technological interactive environment--Tablet PC running InkSurvey--was investigated in present study. Tablet PC/InkSurvey interactive technology allowing the instructor to receive real-time formative assessment as the class works through the problem…
Unifying Computer-Based Assessment across Conceptual Instruction, Problem-Solving, and Digital Games
Miller, William L.; Baker, Ryan S.; Rossi, Lisa M.
2014-01-01
As students work through online learning systems such as the Reasoning Mind blended learning system, they often are not confined to working within a single educational activity; instead, they work through various different activities such as conceptual instruction, problem-solving items, and fluency-building games. However, most work on assessing…
Solving American Option Pricing Models by the Front Fixing Method: Numerical Analysis and Computing
Directory of Open Access Journals (Sweden)
R. Company
2014-01-01
analysis of the method is provided. The method preserves positivity and monotonicity of the numerical solution. Consistency and stability properties of the scheme are studied. Explicit calculations avoid iterative algorithms for solving nonlinear systems. Theoretical results are confirmed by numerical experiments. Comparison with other approaches shows that the proposed method is accurate and competitive.
Scaffold Seeking: A Reverse Design of Scaffolding in Computer-Supported Word Problem Solving
Cheng, Hercy N. H.; Yang, Euphony F. Y.; Liao, Calvin C. Y.; Chang, Ben; Huang, Yana C. Y.; Chan, Tak-Wai
2015-01-01
Although well-designed scaffolding may assist students to accomplish learning tasks, its insufficient capability to dynamically assess students' abilities and to adaptively support them may result in the problem of overscaffolding. Our previous project has also shown that students using scaffolds to solve mathematical word problems for a long time…
A comparison of several computational techniques for solving some common aeronomic problems
Turco, R. P.; Whitten, R. C.
1974-01-01
Several numerical integration techniques for solving common aeronomic problems involving species rate equations are compared for speed and accuracy. A newer technique that defines families of species that are nearly conserved is found to be superior to an iterative technique when both methods are applied to simple test problems. The 'conservation' technique is also found to be more economical than the more complex Gear (1969) integration scheme for comparable accuracy.
An Efficient Method for Solving Spread Option Pricing Problem: Numerical Analysis and Computing
Directory of Open Access Journals (Sweden)
R. Company
2016-01-01
Full Text Available This paper deals with the numerical analysis and computing of the spread option pricing problem described by a partial differential equation in two spatial variables. Both European and American cases are treated. Taking advantage of a cross-derivative removing technique, an explicit difference scheme is developed that retains the benefits of the one-dimensional finite difference method, preserving positivity, accuracy, and computational time efficiency. Numerical results illustrate the effectiveness of the approach.
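The cross-derivative removing technique mentioned in the abstract can be sketched generically; the change of variables below is a standard illustration, not necessarily the exact transformation used by the authors. For a two-asset diffusion operator with volatilities σ₁, σ₂ and correlation ρ:

```latex
\mathcal{L}u \;=\; \tfrac12\sigma_1^2\,u_{xx} \;+\; \rho\,\sigma_1\sigma_2\,u_{xy} \;+\; \tfrac12\sigma_2^2\,u_{yy},
\qquad
\xi = \frac{x}{\sigma_1},\quad
\eta = \frac{1}{\sqrt{1-\rho^2}}\left(\frac{y}{\sigma_2}-\rho\,\frac{x}{\sigma_1}\right)
\;\;\Longrightarrow\;\;
\mathcal{L}u \;=\; \tfrac12\left(u_{\xi\xi}+u_{\eta\eta}\right).
```

After the substitution the two second-order terms decouple, so positivity-preserving one-dimensional explicit stencils can be applied along each transformed coordinate.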
Using Two Types of Computer Algebra Systems to Solve Maxwell Optics Problems
Kulyabov, D. S.
2016-01-01
To synthesize Maxwell optics systems, the mathematical apparatus of tensor and vector analysis is generally employed. This mathematical apparatus implies executing a great number of simple stereotyped operations, which are adequately supported by computer algebra systems. In this paper, we distinguish between two stages of working with a mathematical model: model development and model usage. Each of these stages implies its own computer algebra system. As a model problem, we consider the prob...
Solving the scattering of N photons on a two-level atom without computation
Roulet, Alexandre; Scarani, Valerio
2016-09-01
We propose a novel approach for solving the scattering of light onto a two-level atom coupled to a one-dimensional waveguide. First we express the physical quantity of interest in terms of Feynman diagrams and treat the atom as a non-saturable linear beamsplitter. By using the atomic response to our advantage, a relevant substitution is then made that captures the nonlinearity of the atom, and the final result is obtained in terms of simple integrals over the initial incoming wavepackets. The procedure is not limited to post-scattering quantities and allows one, for instance, to derive the atomic excitation during the scattering event.
Malkov, Ewgenij A.; Poleshkin, Sergey O.; Kudryavtsev, Alexey N.; Shershnev, Anton A.
2016-10-01
The paper presents the software implementation of a Boltzmann equation solver based on a deterministic finite-difference method. The solver allows one to carry out parallel computations of rarefied flows on a hybrid computational cluster with an arbitrary number of central processing units (CPUs) and graphics processing units (GPUs). Employment of GPUs leads to a significant acceleration of the computations, which enables us to simulate two-dimensional flows with high resolution in a reasonable time. The developed numerical code was validated by comparing the obtained solutions with Direct Simulation Monte Carlo (DSMC) data. For this purpose the supersonic flow past a flat plate at zero angle of attack is used as a test case.
Garrido, Jose
2011-01-01
… offers a solid first step into scientific and technical computing for those just getting started. … Through simple examples that are both easy to conceptualize and straightforward to express mathematically (something that isn't trivial to achieve), Garrido methodically guides readers from problem statement and abstraction through algorithm design and basic programming. His approach offers those beginning in a scientific or technical discipline something unique; a simultaneous introduction to programming and computational thinking that is very relevant to the practical application of computin
Directory of Open Access Journals (Sweden)
Tim ePalmer
2015-10-01
Full Text Available How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete.
Alternative DNA base pairing through metal coordination.
Clever, Guido H; Shionoya, Mitsuhiko
2012-01-01
Base-pairing in the naturally occurring DNA and RNA oligonucleotide duplexes is based on π-stacking, hydrogen bonding, and shape complementarity between the nucleobases adenine, thymine, guanine, and cytosine as well as on the hydrophobic-hydrophilic balance in aqueous media. This complex system of multiple supramolecular interactions is the product of a long-term evolutionary process and thus highly optimized to serve its biological functions such as information storage and processing. After the successful implementation of automated DNA synthesis, chemists have begun to introduce artificial modifications inside the core of the DNA double helix in order to study various aspects of base pairing, generate new base pairs orthogonal to the natural ones, and equip the biopolymer with entirely new functions. The idea of replacing the hydrogen bonding interactions with metal coordination between ligand-like nucleosides and suitable transition metal ions culminated in the development of a plethora of artificial base-pairing systems termed "metal base-pairs," which were shown to strongly enhance DNA duplex stability. Furthermore, they show great potential for the use of DNA as a molecular wire in nanoscale electronic architectures. Although single electrons have been shown to be transmitted through natural DNA over a distance of several base pairs, the high ohmic resistance of unmodified oligonucleotides was identified as a serious obstacle. By exchanging some or all of the Watson-Crick base pairs in DNA with metal complexes, this problem may be solved. In the future, these research efforts are expected to lead to DNA-like materials with superior conductivity for nano-electronic applications. Other fields of potential application such as DNA-based supramolecular architecture and catalysis may be strongly influenced by these developments as well. This text is meant to illustrate the basic concepts of metal-base pairing and give an outline of recent developments in this field.
Immune Algorithm for Solving the Optimization Problems of Computer Communication Networks
Institute of Scientific and Technical Information of China (English)
Anonymous
2000-01-01
The basic problem in optimizing communication networks is to assign a proper circuit to each origin-destination pair in the network so as to minimize the average network delay; the resulting network optimal route selection model is a multi-constrained 0-1 nonlinear programming problem. In this paper, a new stochastic optimization algorithm, the Immune Algorithm (IA), is applied to solve this optimization problem. The backbone network vBNS is chosen to illustrate the technique of evaluating delay in a virtual network. Finally, IA is compared with a Genetic Algorithm (GA)-based optimization method for communication networks; the results show that IA is better than GA at finding the global optimum.
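The clonal-selection mechanism typical of immune algorithms can be sketched in a few lines. The sketch below is illustrative only: it optimizes a toy OneMax bitstring objective rather than the paper's network-delay model, and all parameter values are invented for the example.

```python
import random

def affinity(bits):
    # Toy objective ("OneMax"): maximize the number of ones. This stands in
    # for the paper's network-delay objective (hypothetical substitute).
    return sum(bits)

def clonal_selection(n_bits=20, pop_size=10, n_gen=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=affinity, reverse=True)
        clones = []
        for rank, antibody in enumerate(pop):
            n_clones = max(1, pop_size // (rank + 1))   # fitter antibodies clone more
            for _ in range(n_clones):
                clone = antibody[:]
                rate = (rank + 1) / pop_size            # less fit clones mutate more
                for i in range(n_bits):
                    if rng.random() < rate / 2:
                        clone[i] ^= 1                   # hypermutation: flip the bit
                clones.append(clone)
        # Elitist selection: keep the best antibodies, plus one random newcomer
        # to maintain population diversity.
        pop = sorted(pop + clones, key=affinity, reverse=True)[:pop_size - 1]
        pop.append([rng.randint(0, 1) for _ in range(n_bits)])
    return max(pop, key=affinity)

best = clonal_selection()
```

The same clone-hypermutate-select loop applies to 0-1 route-selection encodings once `affinity` is replaced by a (penalized) delay objective.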
Cognitive processes in solving variants of computer-based problems used in logic teaching
Eysink, Tessa H.S.; Dijkstra, S.; Kuper, Jan
2001-01-01
The effect of two instructional variables, visualisation and manipulation of objects, in learning to use the logical connective, conditional, was investigated. Instructions for 66 first- year social science students were varied in the computer-based learning environment Tarski's World, designed for
Gender and Computer-Mediated Communication: Group Processes in Problem Solving.
Adrianson, L.
2001-01-01
Reports results from a study of university students in Sweden that investigated aspects of communicative processes using face-to-face and computer-mediated communication. Examined influences of gender on communication equality, social relations, and communicative processes and studied differences in self-awareness. Results showed few significant…
Newton, Kristie J.; Willard, Catherine; Teufel, Christopher
2014-01-01
The purpose of this study was to better understand how students with learning disabilities, including those who struggle specifically with mathematics, engage with fraction computation. In particular, we examined error patterns, the influence of like and unlike denominators on these patterns, and correct solution methods. Although skill-related…
Raybould, Barry
1990-01-01
Describes the design of an electronic performance support system (PSS) that was developed to help sales and support personnel access relevant information needed for good job performance. Highlights include expert systems, databases, interactive video discs, formatting information online, information retrieval techniques, HyperCard, computer-based…
Problem-Solving Inquiry-Oriented Biology Tasks Integrating Practical Laboratory and Computer.
Friedler, Yael; And Others
1992-01-01
Presents results of a study that examines the development and use of computer simulations for high school science instruction and for integrated laboratory and computerized tests that are part of the biology matriculation examination in Israel. Eleven implications for teaching are presented. (MDH)
Using Computers to Solve Mathematics by Junior Secondary School Students in Edo State Nigeria
Olusi, F. I.
2008-01-01
The purpose of this study was to determine the effect of computer aided instruction and traditional method of instruction on the junior secondary school students achievement in mathematics. Four research questions and hypotheses were stated and tested in the study. The design of the study was the pre-test post-test control group experimental…
Lee, Chun-Yi; Chen, Ming-Puu
2009-01-01
The purpose of this study was to investigate the effects of type of question prompt and level of prior knowledge on non-routine mathematical problem solving. A computer game was blended within the pattern reasoning tasks, along with question prompts, in order to demonstrate and enhance the connections between viable problem-solving strategies and…
Oliver, Kevin Matthew
National science standards call for increasing student exposure to inquiry and real-world problem solving. Students can benefit from open-ended learning environments that stress the engagement of real problems and the development of thinking skills and processes. The Internet is an ideal resource for context-bound problems with its seemingly endless supply of resources. Problems may arise, however, since young students are cognitively ill-prepared to manage open-ended learning and may have difficulty processing hypermedia. Computer tools were used in a qualitative case study with 12 eighth graders to determine how such tools might support the process of solving open-ended problems. A preliminary study proposition suggested students would solve open-ended problems more appropriately if they used tools in a manner consistent with higher-order critical and creative thinking. Three research questions sought to identify: how students used tools, the nature of science learning in open-ended environments, and any personal or environmental barriers affecting problem solving. The findings were mixed. The participants did not typically use the tools and resources effectively. They successfully collected basic information, but infrequently organized, evaluated, generated, and justified their ideas. While the students understood how to use most tools procedurally, they lacked strategic understanding of why tool use was necessary. Students scored average to high on assessments of general content understanding, but developed artifacts suggesting their understanding of specific micro problems was naive and rife with misconceptions. Process understanding was also inconsistent, with some students describing basic problem solving processes, but most students unable to describe how tools could support open-ended inquiry. Barriers to effective problem solving were identified in the study. Personal barriers included naive epistemologies, while environmental barriers included a
Computational issues of solving the 1D steady gradually varied flow equation
Directory of Open Access Journals (Sweden)
Artichowicz Wojciech
2014-09-01
Full Text Available In this paper the problem of multiple solutions of the steady gradually varied flow equation, in the form of an ordinary differential energy equation, is discussed from the viewpoint of its numerical solution. Using the Lipschitz theorem on the uniqueness of the solution of an initial value problem for an ordinary differential equation, it is shown that the steady gradually varied flow equation can have more than one solution. This fact implies that the nonlinear algebraic equation approximating the ordinary differential energy equation, which additionally coincides with the well-known standard step method usually applied for computing the flow profile, can have a variable number of roots. Consequently, more than one alternative solution corresponding to the same initial condition can be obtained. Using this property it is possible to compute a water flow profile passing through the critical stage.
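The root multiplicity discussed above can be illustrated with a toy specific-energy balance for a rectangular channel (all numbers are hypothetical and not taken from the paper): for a given specific energy above the minimum, the same nonlinear algebraic equation has one subcritical and one supercritical root.

```python
def specific_energy(h, q=4.0, g=9.81):
    """Specific energy E(h) = h + q^2/(2 g h^2) for a rectangular channel
    with unit-width discharge q (hypothetical example values)."""
    return h + q * q / (2.0 * g * h * h)

def bisect(f, a, b, tol=1e-12):
    """Plain bisection; assumes f(a) and f(b) have opposite signs."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if (f(m) < 0.0) == (fa < 0.0):
            a, fa = m, f(m)
        else:
            b = m
    return 0.5 * (a + b)

g, q, E_target = 9.81, 4.0, 2.0
h_crit = (q * q / g) ** (1.0 / 3.0)     # critical depth, where E(h) is minimal
f = lambda h: specific_energy(h) - E_target
# The same equation E(h) = E_target has one root on each side of the
# critical depth: the subcritical and supercritical depths.
h_sub = bisect(f, h_crit, 5.0)
h_sup = bisect(f, 0.05, h_crit)
```

A standard-step solver that does not control which branch its root finder lands on can therefore silently jump between the two profiles, which is exactly the ambiguity the paper analyzes.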
Solving Problems in Various Domains by Hybrid Models of High Performance Computations
Directory of Open Access Journals (Sweden)
Yurii Rogozhin
2014-03-01
Full Text Available This work presents a hybrid model of high-performance computations. The model is based on a membrane system (P system) in which some membranes may contain a quantum device that is triggered by the data entering the membrane. This model is intended to take advantage of both the biomolecular and quantum paradigms and to overcome some of their inherent limitations. The proposed approach is demonstrated on two selected problems: SAT and image retrieval.
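The generate-and-test parallelism that both the biomolecular and membrane paradigms exploit for SAT can be shown on a conventional CPU by enumerating all assignments and filtering them. This is only a sketch of the paradigm, not the paper's model; the formula and encoding are invented for the example.

```python
from itertools import product

# CNF formula as a list of clauses; literal +v means "variable v is true",
# -v means "variable v is false". Hypothetical example:
# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
clauses = [(1, -2), (2, 3), (-1, -3)]
n_vars = 3

def satisfies(assignment, clauses):
    # assignment[v - 1] holds the truth value of variable v
    return all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses)

# "Generate" all 2^n candidate assignments at once (playing the role of the
# membranes or DNA strands), then "filter" out those violating a clause.
solutions = [a for a in product([False, True], repeat=n_vars)
             if satisfies(a, clauses)]
```

On silicon this enumeration costs exponential time; the point of the hybrid model is that membranes and quantum devices explore such candidate spaces in parallel.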
Petraglia, Riccardo; Nicolaï, Adrien; Wodrich, Matthew D; Ceriotti, Michele; Corminboeuf, Clemence
2016-01-05
Computational studies of organic systems are frequently limited to static pictures that closely align with textbook style presentations of reaction mechanisms and isomerization processes. Of course, in reality chemical systems are dynamic entities where a multitude of molecular conformations exists on incredibly complex potential energy surfaces (PES). Here, we borrow a computational technique originally conceived to be used in the context of biological simulations, together with empirical force fields, and apply it to organic chemical problems. Replica-exchange molecular dynamics (REMD) permits thorough exploration of the PES. We combined REMD with density functional tight binding (DFTB), thereby establishing the level of accuracy necessary to analyze small molecular systems. Through the study of four prototypical problems: isomer identification, reaction mechanisms, temperature-dependent rotational processes, and catalysis, we reveal new insights and chemistry that likely would be missed using static electronic structure computations. The REMD-DFTB methodology at the heart of this study is powered by i-PI, which efficiently handles the interface between the DFTB and REMD codes.
Webb, D J
2012-01-01
Psychologists have long known that an expert in a field not only knows significantly more individual facts/skills than a novice but also has these facts/skills organized into a mental hierarchy that links the individual facts (at the bottom of the hierarchy) together with larger more-encompassing ideas (at the top of the hierarchy). In the Spring quarter of 2012, UC Davis offered 4 sections (about 180 students each) of the first quarter of introductory physics, Physics 9A, covering Newtonian mechanics. One of these sections is a "treatment" group and had the entire 10-week quarter's set of ideas introduced, largely qualitatively, in the first 6 weeks followed by the 4 weeks where students learn to use those ideas to solve the algebraically complicated problems that physicists prize. The other three sections were organized as usual. The treatment group and one of the other sections were taught by the author and were identical (same homework, discussion, lecture, and lab) except for the organization of the cont...
Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng
2014-08-01
The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks, even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves directly retrieving arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture.
A Computational Realization of a Semi-Lagrangian Method for Solving the Advection Equation
Directory of Open Access Journals (Sweden)
Alexander Efremov
2014-01-01
Full Text Available A parallel implementation of a method of the semi-Lagrangian type for the advection equation on a hybrid architecture computation system is discussed. The difference scheme with variable stencil is constructed on the base of an integral equality between the neighboring time levels. The proposed approach allows one to avoid the Courant-Friedrichs-Lewy restriction on the relation between time step and mesh size. The theoretical results are confirmed by numerical experiments. Performance of a sequential algorithm and several parallel implementations with the OpenMP and CUDA technologies in the C language has been studied.
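A minimal sketch of the semi-Lagrangian idea, assuming a 1D constant-coefficient advection equation on a periodic grid with linear interpolation (the paper's scheme uses a variable stencil built from an integral equality, which this toy version does not reproduce):

```python
import math

def semi_lagrangian_step(u_prev, c, dt, dx):
    """One semi-Lagrangian step for u_t + c*u_x = 0 on a periodic grid:
    trace the characteristic back from each node and interpolate the
    previous time level linearly at the departure point x_i - c*dt."""
    n = len(u_prev)
    shift = c * dt / dx                 # departure offset in cells; may exceed 1
    u_new = [0.0] * n
    for i in range(n):
        xd = i - shift                  # departure point in index coordinates
        j = math.floor(xd)              # left neighbour of the departure point
        w = xd - j                      # linear interpolation weight
        u_new[i] = (1.0 - w) * u_prev[j % n] + w * u_prev[(j + 1) % n]
    return u_new

# Courant number C = c*dt/dx = 2: an explicit Eulerian upwind scheme would be
# unstable here, while the semi-Lagrangian step has no CFL restriction.
n, dx, c, dt = 16, 1.0, 1.0, 2.0
u0 = [1.0 if 4 <= i < 8 else 0.0 for i in range(n)]
u1 = semi_lagrangian_step(u0, c, dt, dx)   # exact 2-cell shift in this case
```

Because the departure points here land exactly on grid nodes, the step reduces to a pure shift; for non-integer Courant numbers the interpolation weight `w` becomes fractional and the method remains stable, at the cost of some numerical diffusion.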
Maddrey, Elizabeth
Research in academia and industry continues to identify a decline in enrollment in computer science. One major component of this decline in enrollment is a shortage of female students. The primary reasons for the gender gap presented in the research include lack of computer experience prior to their first year in college, misconceptions about the field, negative cultural stereotypes, lack of female mentors and role models, subtle discriminations in the classroom, and lack of self-confidence (Pollock, McCoy, Carberry, Hundigopal, & You, 2004). Male students are also leaving the field due to misconceptions about the field, negative cultural stereotypes, and a lack of self-confidence. Analysis of first year attrition revealed that one of the major challenges faced by students of both genders is a lack of problem-solving skills (Beaubouef, Lucas & Howatt, 2001; Olsen, 2005; Paxton & Mumey, 2001). The purpose of this study was to investigate whether specific, non-mathematical problem-solving instruction as part of introductory programming courses significantly increased computer programming self-efficacy and achievement of students. The results of this study showed that students in the experimental group had significantly higher achievement than students in the control group. While this shows statistical significance, due to the effect size and disordinal nature of the data between groups, care has to be taken in its interpretation. The study did not show significantly higher programming self-efficacy among the experimental students. There was not enough data collected to statistically analyze the effect of the treatment on self-efficacy and achievement by gender. However, differences in means were observed between the gender groups, with females in the experimental group demonstrating a higher than average degree of self-efficacy when compared with males in the experimental group and both genders in the control group. These results suggest that the treatment from this
A New DNA-based Logical Gate Comes into Being
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
A cross-disciplinary research team, headed by Prof. FAN Chunhai from the CAS Shanghai Institute of Applied Physics, Prof. HE Lin, a CAS Member, and Prof. ZHANG Zhizhou at the Bio-X Research Center under Shanghai Jiao Tong University (SJTU), succeeded in developing a new type of logic gate by applying a deoxyribozyme (DNAzyme), adding a new brick to the groundwork of DNA-based computation. The related research results were reported in the journal Angew. Chem. Int. Ed., 2006, 45, 1759.
Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly
2014-05-01
Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, the theory of numerical methods, numerical algebra, and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work, results on the development of a special database for the ICS "INM RAS - Black Sea" are presented: the input information for the ICS is discussed and some special data processing procedures are described. Results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model").
References:
1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 5-31.
2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 69-94.
3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of the Black Sea and the Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No. 1, pp. 95-111.
4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black Sea" // Ecological
Stevens, R H; Najafi, K
1993-04-01
Artificial neural networks were trained by supervised learning to recognize the test selection patterns associated with students' successful solutions to seven immunology computer-based simulations. New test selection patterns evaluated by the trained neural network were correctly classified as successful or unsuccessful solutions to the problem > 90% of the time. Examination of the neural network's output weights after each test selection revealed a progressive and selective increase for the relevant problem, suggesting that a successful solution is represented by the neural network as the accumulation of relevant tests. Unsuccessful problem solutions were classified by the neural network software into two patterns of students' performance. The first pattern was characterized by low neural network output weights for all seven problems, reflecting extensive searching and lack of recognition of relevant information. In the second pattern, the output weights from the neural network were biased toward one of the remaining six incorrect problems, suggesting that the student misrepresented the current problem as an instance of a previous problem. Finally, neural network analysis could detect cases where the students switched hypotheses during the problem solving exercises.
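A heavily simplified sketch of the classification idea: test-selection patterns encoded as binary vectors, with a single logistic unit standing in for the study's trained networks. All data, the six-test encoding, and the parameter values below are invented for illustration.

```python
import math

# Hypothetical training data: each row flags which of 6 diagnostic tests a
# student ordered (1 = selected); label 1 = successful solution.
X = [[1, 1, 0, 0, 1, 0], [1, 1, 1, 0, 0, 0], [1, 0, 1, 0, 1, 0],  # successful
     [0, 1, 0, 1, 1, 1], [0, 0, 1, 1, 0, 1], [1, 0, 0, 1, 1, 1]]  # unsuccessful
y = [1, 1, 1, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Stochastic gradient descent on the log-loss of one logistic unit."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                      # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = train(X, y)
predict = lambda x: 1 if sigmoid(sum(wj * xj
                                     for wj, xj in zip(w, x)) + b) > 0.5 else 0
```

After training, the learned weights play the role of the study's "output weights": tests whose selection correlates with success acquire positive weight, so accumulating relevant tests pushes the unit toward the "successful" class.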
Energy Technology Data Exchange (ETDEWEB)
Gritzo, L.A.; Skocypec, R.D. [Sandia National Labs., Albuquerque, NM (United States); Tong, T.W. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mechanical and Aerospace Engineering
1995-01-11
Radiation in participating media is an important transport mechanism in many physical systems. The simulation of complex radiative transfer has not effectively exploited high-performance computing capabilities. In response to this need, a workshop attended by members active in the high-performance computing community, members active in the radiative transfer community, and members from closely related fields was held to identify how high-performance computing can be used effectively to solve the transport equation and advance the state-of-the-art in simulating radiative heat transfer. This workshop was held on March 29-30, 1994 in Albuquerque, New Mexico and was conducted by Sandia National Laboratories. The objectives of this workshop were to provide a vehicle to stimulate interest and new research directions within the two communities to exploit the advantages of high-performance computing for solving complex radiative heat transfer problems that are otherwise intractable.
Kinsella, John J.
1970-01-01
Discussed are the nature of a mathematical problem, problem solving in the traditional and modern mathematics programs, problem solving and psychology, research related to problem solving, and teaching problem solving in algebra and geometry. (CT)
DNA-based applications in nanobiotechnology.
Abu-Salah, Khalid M; Ansari, Anees A; Alrokayan, Salman A
2010-01-01
Biological molecules such as deoxyribonucleic acid (DNA) have shown great potential in fabrication and construction of nanostructures and devices. The very properties that make DNA so effective as genetic material also make it a very suitable molecule for programmed self-assembly. The use of DNA to assemble metals or semiconducting particles has been extended to construct metallic nanowires and functionalized nanotubes. This paper highlights some important aspects of conjugating the unique physical properties of dots or wires with the remarkable recognition capabilities of DNA which could lead to miniaturizing biological electronics and optical devices, including biosensors and probes. Attempts to use DNA-based nanocarriers for gene delivery are discussed. In addition, the ecological advantages and risks of nanotechnology including DNA-based nanobiotechnology are evaluated.
Communication: Electron ionization of DNA bases
Rahman, M. A.; Krishnakumar, E.
2016-04-01
No reliable experimental data exist for the partial and total electron ionization cross sections of DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.
Nader-Grosbois, Nathalie; Lefevre, Nathalie
2012-01-01
This study compared mothers and fathers' regulation with respect to 29 children with intellectual disability (ID) and 30 typically developing (TD) children, matched on their mental age (MA), as they solved eight tasks using physical materials and computers. Seven parents' regulatory strategies were coded as they supported their child's…
Serin, Oguz
2011-01-01
This study aims to investigate the effects of the computer-based instruction on the achievements and problem solving skills of the science and technology students. This is a study based on the pre-test/post-test control group design. The participants of the study consist of 52 students; 26 in the experimental group, 26 in the control group. The…
Uysal, Murat Pasa
2014-01-01
The introductory computer programming (CP) course has been taught for three decades in the faculty. Besides pursuing CP technology, one major goal has been enhancing learners' problem-solving (PS) skills. However, the current situation has implied that this might not be the case. Therefore, a research was conducted to investigate the effects of a…
Ismail, Mohd Nasir; Ngah, Nor Azilah; Umar, Irfan Naufal
2010-01-01
The purpose of the study is to investigate the effects of mind mapping with cooperative learning (MMCL) and cooperative learning (CL) on: (a) programming performance; (b) problem solving skill; and (c) metacognitive knowledge among computer science students in Malaysia. The moderating variable is the students' logical thinking level with two…
Twyman, Todd; Tindal, Gerald
2006-01-01
The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…
Gossner, Johannes
2014-01-01
Central venous port systems are now routinely used in oncology. The non-functioning port system is a common issue in radiology departments. Fluoroscopy is a first-line imaging modality. The potential usefulness of computed tomography as a problem-solving tool in three complex cases with non-radiopaque central venous port systems is presented.
Clariana, Roy B.; Engelmann, Tanja; Yu, Wu
2013-01-01
Problem solving likely involves at least two broad stages, problem space representation and then problem solution (Newell and Simon, Human problem solving, 1972). The metric centrality that Freeman ("Social Networks" 1:215-239, 1978) implemented in social network analysis is offered here as a potential measure of both. This development research…
Institute of Scientific and Technical Information of China (English)
KARIYAWASAM K A; TURNER S J; HILL G J
2012-01-01
This paper looks at students' views of the usefulness of a problem solving and programming module in the first year of a 3-year undergraduate program. The School of Science and Technology, University of Northampton, UK has been investigating the teaching of problem solving over the last seven years, including whether a more visual approach has any benefits (the visual programming includes both 2D and graphical user interfaces). Whilst the authors have discussed problem solving and programming in the past, this paper considers the students' perspective, drawing on research collected and collated by a student researcher under a new initiative within the University. All students interviewed either had completed the module within the two years of the survey or were completing the problem-solving module in their first year.
DNA-Based Vaccine Protects Against Zika in Animal Study
In animals infected with Zika virus, the synthetic DNA-based vaccine was 100 percent effective in protecting …
DNA-Based Vaccine Guards Against Zika in Monkey Study
THURSDAY, Sept. 22, 2016 (HealthDay News) -- An experimental DNA-based vaccine protected monkeys from infection with the …
National Research Council Canada - National Science Library
Yaritza Tardo Fernández; Alexander Gorina Sánchez; Isabel Alonso Berenguer; Antonio Salgado Castillo
2013-01-01
The cultural, technological and eminently social character of the computer programming problems solving process, joined with the complexity and difficulties detected in their teaching, has contributed...
DEFF Research Database (Denmark)
Salvatore, Princia; Nazmutdinov, Renat R.; Ulstrup, Jens
2015-01-01
…, accompanied by a pair of strong voltammetry peaks in the double-layer region in acid solutions. Adsorption of the DNA bases gives featureless voltammograms with lower double-layer capacitance, suggesting that all the bases are chemisorbed on the Au(110) surface. Further investigation of the surface structures of the adlayers of the four DNA bases by EC-STM disclosed lifting of the Au(110) reconstruction, specific molecular packing in dense monolayers, and pH dependence of the A and G adsorption. DFT computations based on a cluster model for the Au(110) surface were performed to investigate the adsorption energy and geometry of the DNA bases in different adsorbate orientations. The optimized geometry is further used to compute models for STM images, which are compared with the recorded STM images. This has provided insight into the physical nature of the adsorption. The specific orientations of A, C, G, and T on Au(110)…
Directory of Open Access Journals (Sweden)
Jale İPEK
2013-12-01
Full Text Available The aim of this research is to determine the effect of the STAR strategy on 2nd grade students' academic achievement and problem-solving skills in computer-assisted mathematics instruction. The study group comprised 30 students attending the 2nd grade of a primary school in Aydın in the 2010-2011 school year. The research took place over 7 weeks. Three data collection tools were used: an "Academic Achievement Test", a "Problem Solving Achievement Test" and "The Evaluation Form of Problem Solving Skills". At the end of the research, students' views about computer-assisted mathematics instruction were also evaluated, and it was examined whether the differences between pre-test and post-test scores were statistically significant. According to the results, the instruction carried out with the STAR strategy produced a positive increase in academic achievement and problem-solving skills.
Games that Enlist Collective Intelligence to Solve Complex Scientific Problems
Directory of Open Access Journals (Sweden)
Stephen Burnett
2015-09-01
Full Text Available There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article.
Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.
Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard
2016-03-01
There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article.
Kester, Liesbeth; Kirschner, Paul A.; Van Merriënboer, Jeroen
2007-01-01
This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram.
DNA-based assembly lines and nanofactories.
Simmel, Friedrich C
2012-08-01
With the invention of the DNA origami technique, DNA self-assembly has reached a new level of sophistication. DNA can now be used to arrange molecules and other nanoscale components into almost arbitrary geometries-in two and even three dimensions and with nanometer precision. One exciting prospect is the realization of dynamic systems based on DNA, in which chemical reactions are precisely controlled by the spatial arrangement of components, ultimately resulting in nanoscale analogs of molecular assembly lines or 'nanofactories'. This review will discuss recent progress toward this goal, ranging from DNA-templated synthesis over artificial DNA-based enzyme cascades to first examples of 'molecular robots'. Copyright © 2012. Published by Elsevier Ltd.
Yadav, Aman; Hong, Hai; Stephenson, Chris
2016-01-01
The recent focus on computational thinking as a key 21st century skill for all students has led to a number of curriculum initiatives to embed it in K-12 classrooms. In this paper, we discuss the key computational thinking constructs, including algorithms, abstraction, and automation. We further discuss how these ideas are related to current…
Design of a computer learning environment based on problem solving
Institute of Scientific and Technical Information of China (English)
杨辉军
2014-01-01
A learning environment built to improve learners' problem-solving ability must not only exhibit the characteristics of a learning environment in the traditional sense, but also provide resources specifically aimed at developing learners' problem-solving skills. Compared with a traditional learning environment, a computer learning environment based on problem solving has significant advantages: it is organized entirely around the learner, emphasizes the learner's central role in learning, and serves the fundamental goal of cultivating learners' problem-solving ability.
Algebraic characterization of RNA operations for DNA-based computation
Institute of Scientific and Technical Information of China (English)
LI Shuchao
2004-01-01
Any RNA strand can be represented by a word in the language X*, where X={A,C,G,U}. By encoding A as 010, C as 000, G as 111, and U as 101, the RNA operations can be treated, in algebraic terms, as concatenation, union, reverse, and complement. Concatenation and union play the roles of multiplication and addition over certain algebraic structures, respectively, while the remaining operations turn out to be homomorphisms or anti-homomorphisms of these structures. Using this technique, we find the relationships among these RNA operations.
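The encoding above has a convenient property worth illustrating: A (010) and U (101), like C (000) and G (111), are bitwise complements, so Watson-Crick complementation becomes bitwise NOT on codewords, and reverse-complement acts as an anti-homomorphism (it reverses concatenation order). A minimal sketch (the helper names are ours, not from the paper):

```python
# Encoding from the paper: each base maps to a 3-bit codeword.
ENC = {"A": "010", "C": "000", "G": "111", "U": "101"}
COMP = {"A": "U", "U": "A", "C": "G", "G": "C"}

def encode(strand):
    """Concatenation of codewords: a homomorphism from strands to {0,1}*."""
    return "".join(ENC[b] for b in strand)

def bitwise_not(bits):
    return "".join("1" if b == "0" else "0" for b in bits)

def complement(strand):
    return "".join(COMP[b] for b in strand)

def rc(strand):
    """Reverse-complement: an anti-homomorphism on strands."""
    return complement(strand)[::-1]

# Complementation commutes with encoding: encode(comp(s)) == NOT(encode(s)).
s, t = "ACGU", "GGAU"
assert encode(complement(s)) == bitwise_not(encode(s))

# Anti-homomorphism property: rc(s + t) == rc(t) + rc(s).
assert rc(s + t) == rc(t) + rc(s)
```

The assertions restate the algebraic identities in executable form; they hold for any strands over {A,C,G,U}.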
Kester, Liesbeth; Kirschner, Paul A; van Merriënboer, Jeroen J G
2005-03-01
This study compared the effects of two information presentation formats on learning to solve problems in electrical circuits. In one condition, the split-source format, information relating to procedural aspects of the functioning of an electrical circuit was not integrated in a circuit diagram, while information in the integrated format condition was integrated in the circuit diagram. It was hypothesized that learners in the integrated format would achieve better test results than the learners in the split-source format. Equivalent-test problem and transfer-test problem performance were studied. Transfer-test scores confirmed the hypothesis, though no differences were found on the equivalent-test scores.
Organization Of Experimental Computing in GeoGebra 5.0 in Solving Problems of Probability Theory
Directory of Open Access Journals (Sweden)
Elena Semenikhina
2015-03-01
Full Text Available The article analyzes the use of various mathematics software packages in the study of stochastics. It develops the idea of dynamically visualizing the results of random experiments using the example of the classical meeting problem, which can be solved in two ways: with the statistical definition of probability, based on random experiments, and traditionally, with the geometric definition of probability. Several tasks through which this visualization of random-experiment results can be implemented are offered, with instructions.
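The two solution routes the article contrasts can also be sketched outside GeoGebra. Assuming the usual statement of the meeting problem (each person arrives uniformly at random within a 60-minute window and waits 15 minutes), the statistical estimate converges to the geometric answer 1 − (45/60)² = 7/16:

```python
import random

def meeting_probability(trials, window=60.0, wait=15.0, seed=1):
    """Statistical definition: frequency of |X - Y| <= wait over random trials."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.uniform(0, window)
        y = rng.uniform(0, window)
        if abs(x - y) <= wait:
            hits += 1
    return hits / trials

# Geometric definition: favourable area over total area of the 60x60 square.
geometric = 1 - (45 / 60) ** 2          # 7/16 = 0.4375

estimate = meeting_probability(200_000)
assert abs(estimate - geometric) < 0.01
```

With 200,000 trials the standard error is about 0.001, so the two definitions agree to roughly two decimal places.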
Solving TSP based on a ternary optical computer
Institute of Scientific and Technical Information of China (English)
沈云付; 樊孝领
2011-01-01
Exploiting the massive parallelism of the ternary optical computer, this paper studies the traveling salesman problem with a given number of cities. The traveling salesman problem is first preprocessed and transformed into the modified signed-digit (MSD) representation. Then, based on the large digit-width of the ternary optical computer and the carry-free nature of MSD addition, a corresponding calculation method is established and the problem is solved on a self-developed ternary optical processor system. Experiments show that, for the same amount of data, the ternary optical computer can solve the traveling salesman problem in fewer computational steps than an electronic computer, demonstrating its potential advantages.
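For readers unfamiliar with the number format: an MSD representation writes an integer with digits drawn from {-1, 0, 1}, which is what makes carry-free optical addition possible. A minimal round-trip sketch using the non-adjacent form, one common MSD encoding (the helper names are ours, not from the paper):

```python
def to_msd(n):
    """Convert a non-negative integer to non-adjacent-form MSD digits
    (least significant first), each digit in {-1, 0, 1}."""
    digits = []
    while n != 0:
        if n % 2 == 1:
            d = 2 - (n % 4)   # +1 or -1, chosen so the next digit is 0
        else:
            d = 0
        digits.append(d)
        n = (n - d) // 2
    return digits

def from_msd(digits):
    """Recover the integer: sum of d_i * 2**i."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

for n in range(200):
    ds = to_msd(n)
    assert from_msd(ds) == n
    assert all(d in (-1, 0, 1) for d in ds)
```

The carry-free MSD adder itself belongs to the optical hardware; its digit-transform rules are not reproduced here.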
2015-07-14
These experimental and computational studies demonstrate a sizable impact of team communication structure and … was conducted to examine the effects of examples on design processes, an issue that has been the focus of a good bit of attention. The analysis … computational modelling and human experimental investigations have provided some converging findings about factors that determine the efficacy of team …
Le Roy, Robert J.
2017-01-01
This paper describes program LEVEL, which can solve the radial or one-dimensional Schrödinger equation and automatically locate either all of, or a selected number of, the bound and/or quasibound levels of any smooth single- or double-minimum potential, and calculate inertial rotation and centrifugal distortion constants and various expectation values for those levels. It can also calculate Franck-Condon factors and other off-diagonal matrix elements, either between levels of a single potential or between levels of two different potentials. The potential energy function may be defined by any one of a number of analytic functions, or by a set of input potential function values which the code will interpolate over and extrapolate beyond to span the desired range.
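LEVEL itself is a full-featured code; as a rough illustration of the kind of eigenvalue problem it automates, the following Numerov shooting sketch (our own, not taken from LEVEL) locates the lowest bound level of a harmonic potential, whose exact eigenvalue in scaled units (ħ = m = ω = 1) is E = 0.5:

```python
import numpy as np

def shoot(E, x, V):
    """Numerov integration of psi'' = 2(V - E) psi from the left edge;
    returns psi at the right edge (zero exactly at an eigenvalue)."""
    h = x[1] - x[0]
    f = 2.0 * (V - E)
    c = 1.0 - h * h * f / 12.0
    psi = np.zeros_like(x)
    psi[0], psi[1] = 0.0, 1e-8
    for i in range(1, len(x) - 1):
        psi[i + 1] = ((2.0 + 5.0 * h * h * f[i] / 6.0) * psi[i]
                      - c[i - 1] * psi[i - 1]) / c[i + 1]
    return psi[-1]

x = np.linspace(-6.0, 6.0, 1201)
V = 0.5 * x * x                      # harmonic oscillator potential

# Bisect on the sign change of the boundary value to bracket the eigenvalue.
lo, hi = 0.3, 0.7
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if shoot(lo, x, V) * shoot(mid, x, V) > 0:
        lo = mid
    else:
        hi = mid
E0 = 0.5 * (lo + hi)                 # converges to 0.5
```

The same shoot-and-bisect loop, with a different potential array, finds any bound level once a sign change is bracketed; LEVEL additionally handles quasibound levels, expectation values, and Franck-Condon factors.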
Engelmann, Tanja; Hesse, Friedrich W.
2010-01-01
For collaboration in learning situations, it is important to know what the collaborators know. However, developing such knowledge is difficult, especially for newly formed groups participating in a computer-supported collaboration. The solution for this problem described in this paper is to provide to group members access to the knowledge…
Gürbüz, Hasan; Evlioglu, Bengisu; Erol, Çigdem Selçukcan; Gülseçen, Hulusi; Gülseçen, Sevinç
2017-01-01
Computer-based games as developments in information technology seem to grow and spread rapidly. Using of these games by children and teenagers have increased. The presence of more beneficial and educational games in contrast to the violent and harmful games is remarkable. Many scientific studies have indicated that the useful (functional) games…
Hirvijoki, Eero; Äkäslompolo, Simppa; Varje, Jari; Koskela, Tuomas; Miettunen, Juho
2015-01-01
This paper explains how to obtain the distribution function of minority ions in tokamak plasmas using the Monte Carlo method. Since the emphasis is on energetic ions, the guiding-center transformation is outlined, including also the transformation of the collision operator. Even within the guiding-center formalism, the fast particle simulations can still be very CPU intensive and, therefore, we introduce the reader also to the world of high-performance computing. The paper is concluded with a few examples where the presented method has been applied.
Detecting Chemically Modified DNA Bases Using Surface Enhanced Raman Spectroscopy.
Barhoumi, Aoune; Halas, Naomi J
2011-12-15
Post-translational modifications of DNA (changes in the chemical structure of individual bases that occur without changes in the DNA sequence) are known to alter gene expression. They are believed to result in frequently deleterious phenotypic changes, such as cancer. Methylation of adenine, methylation and hydroxymethylation of cytosine, and guanine oxidation are the primary DNA base modifications identified to date. Here we show it is possible to use surface enhanced Raman spectroscopy (SERS) to detect these primary DNA base modifications. SERS detection of modified DNA bases is label-free and requires minimal additional sample preparation, reducing the possibility of additional chemical modifications induced prior to measurement. This approach shows the feasibility of DNA base modification assessment as a potentially routine analysis that may be further developed for clinical diagnostics.
How stable are the mutagenic tautomers of DNA bases?
Brovarets’ O. O.; Hovorun D. M.
2010-01-01
Aim. To determine the lifetime of the mutagenic tautomers of DNA base pairs through the investigation of the physicochemical mechanisms of their intramolecular proton transfer. Methods. Non-empirical quantum chemistry, the analysis of the electron density by means of Bader’s atom in molecules (AIM) theory and physicochemical kinetics were used. Results. Physicochemical character of the transition state of the intramolecular tautomerisation of DNA bases was investigated, the lifetime of mutage...
Nader-Grosbois, Nathalie; Lefèvre, Nathalie
2012-01-01
This study compared mothers and fathers' regulation with respect to 29 children with intellectual disability (ID) and 30 typically developing (TD) children, matched on their mental age (MA), as they solved eight tasks using physical materials and computers. Seven parents' regulatory strategies were coded as they supported their child's identification of the objective, planning, attention, motivation, joint attention, behaviour regulation and evaluation. Children's performance was scored. Regulation by the parents of the two groups did not differ significantly, regardless of the medium, except that the degree of parental regulation of the child's behaviour was greater in the ID group than in the TD group. In tasks involving the computer, we observed a higher degree of regulation of children's planning and a lower degree of regulation of their evaluation for the two groups. The parents displayed significantly less regulation with respect to the children with the highest MA than towards the children with the lowest MA, in each group. There was a significant interaction effect of medium and children's MA on overall parents' regulation and on their support of identification of objective and of planning. Most parental strategies were negatively linked with ID and TD children's performance in tasks. In both groups, with control for MA, parental support with the identification of the objective, with planning and with attention was negatively linked to the corresponding self-regulatory strategies of the children with each medium; however, parents' joint attention was positively linked with children's joint attention.
Solving large finite element systems by GPU computation
Institute of Scientific and Technical Information of China (English)
刘小虎; 胡耀国; 符伟
2012-01-01
Some techniques for applying GPU (Graphics Processing Units) computation in FEM (Finite Element Method) were investigated in this paper, including parallel calculation of element stiffness matrices and global stiffness matrix assembly, unstructured sparse matrix-vector multiplication, and large-scale linear system solving. A FEM code was implemented on the CUDA (Compute Unified Device Architecture) platform and tested on an nVidia GeForce GPU device. The system stiffness matrix was stored in graphics memory in CSR (Compressed Sparse Row) format and assembled via element coloring. The conjugate gradient method was used to solve the FEM linear system iteratively. For the truss and 2D examples, the GPU-based FEM code gained speedups of up to 9.5x and 6.5x, respectively. The GPU speedups were found to be roughly linear in the system's DOFs (Degrees Of Freedom), and peak GFLOP/s increased approximately 10-fold compared with the CPU's.
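The two kernels the paper combines — CSR sparse matrix-vector products inside a conjugate gradient loop — can be sketched on the CPU in a few lines. This is a generic illustration, not the paper's CUDA code; a small 1D Laplacian stands in for an assembled stiffness matrix:

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x with A stored in CSR (values, column indices, row pointers)."""
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        start, end = indptr[row], indptr[row + 1]
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y

def conjugate_gradient(data, indices, indptr, b, tol=1e-10, max_iter=500):
    """Unpreconditioned CG for a symmetric positive definite CSR matrix."""
    x = np.zeros_like(b)
    r = b - csr_matvec(data, indices, indptr, x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = csr_matvec(data, indices, indptr, p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Assemble a 1D Laplacian (tridiagonal 2, -1) of size n directly in CSR form.
n = 50
data, indices, indptr = [], [], [0]
for i in range(n):
    for j, v in ((i - 1, -1.0), (i, 2.0), (i + 1, -1.0)):
        if 0 <= j < n:
            data.append(v)
            indices.append(j)
    indptr.append(len(data))
data, indices, indptr = map(np.array, (data, indices, indptr))

b = np.ones(n)
x = conjugate_gradient(data, indices, indptr, b)
residual = np.linalg.norm(b - csr_matvec(data, indices, indptr, x))
assert residual < 1e-8
```

On the GPU, the row loop of `csr_matvec` and the vector updates of CG are the parts that parallelize, which is where the paper's speedups come from.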
Pol, Henk J.; Harskamp, Egbert G.; Suhre, Cor J. M.
Many students experience difficulties in solving applied physics problems. Researchers claim that the development of strategic knowledge (analyze, explore, plan, implement, verify) is just as necessary for solving problems as the development of content knowledge. In order to improve these
Computations of Wall Distances by Solving a Transport Equation
Institute of Scientific and Technical Information of China (English)
徐晶磊; 阎超; 范晶晶
2011-01-01
Motivated by the high cost of computing wall distances, which still play a key role in modern turbulence modeling, the approach of solving partial differential equations is considered. An Euler-like transport equation is constructed from the Eikonal equation, so that the efficient numerical schemes and code components developed for solving transport equations such as the Euler and Navier-Stokes equations can be reused directly. A detailed implementation of the transport equation in Cartesian coordinates is given, based on the in-house code MI-CFD (CFD for missiles) of BUAA. With implicit LU-SGS time advancement and upwind spatial discretization, the equation is found to converge robustly and rapidly. To preserve accuracy, the grid metric coefficients must also be computed by upwind interpolation. Special treatments of the initial and boundary conditions are discussed. The wall distance solver is successfully applied to several complex geometries with 1-1 blocking or overset grids.
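The underlying Eikonal equation |∇d| = 1 with d = 0 on the wall can also be solved directly on a grid. The following fast-sweeping sketch is a standard alternative formulation, not the paper's Euler-like transport method; it recovers the distance from a straight wall along the bottom of a 2D box:

```python
import numpy as np

def wall_distance(wall_mask, h, sweeps=4):
    """Fast-sweeping solver for |grad d| = 1 with d = 0 on wall cells."""
    d = np.where(wall_mask, 0.0, 1e10)
    ny, nx = d.shape
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(sweeps):
        for ys, xs in orders:
            for i in ys:
                for j in xs:
                    if wall_mask[i, j]:
                        continue
                    # Upwind neighbours in each grid direction.
                    a = min(d[i - 1, j] if i > 0 else 1e10,
                            d[i + 1, j] if i < ny - 1 else 1e10)
                    b = min(d[i, j - 1] if j > 0 else 1e10,
                            d[i, j + 1] if j < nx - 1 else 1e10)
                    if abs(a - b) >= h:           # effectively 1D update
                        cand = min(a, b) + h
                    else:                          # 2D quadratic update
                        cand = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    d[i, j] = min(d[i, j], cand)
    return d

h = 0.05
ny = nx = 21
wall = np.zeros((ny, nx), dtype=bool)
wall[0, :] = True                      # wall along the bottom row
d = wall_distance(wall, h)
# For a flat wall the exact distance is the vertical coordinate i*h.
err = max(abs(d[i, j] - i * h) for i in range(ny) for j in range(nx))
assert err < 1e-6
```

The transport-equation route of the paper trades this grid-sweeping logic for the upwind machinery an existing CFD solver already has, which is what makes code reuse attractive.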
Problem Solving Using Microcomputers.
Demana, Franklin; Waits, Bert
1987-01-01
It is argued that microcomputer technology has evolved to the stage that it should be routinely used by mathematics students at all levels. It is shown how the use of microcomputers can change the way problems are solved. Computer-generated graphics are highlighted. (PK)
DNA base excision repair nanosystem engineering: model development.
Sokhansanj, B A
2005-01-01
DNA base damage results from a combination of endogenous sources (normal metabolism, increased metabolism due to obesity, stress from diseases such as arthritis and diabetes, and ischemia) and the environment (ingested toxins, ionizing radiation, etc.). If unrepaired, DNA base damage can lead to diminished cell function, and potentially to diseases and eventually mutations that lead to cancer. Sophisticated DNA repair mechanisms have evolved in all living cells to preserve the integrity of inherited genetic information and transcriptional control. Understanding a system like DNA repair is greatly enhanced by using engineering methods, in particular modeling interactions and using predictive simulation to analyze the impact of perturbations. We describe the use of such a "nanosystem engineering" approach to analyze the DNA base excision repair pathway in human cells, and use simulation to predict the impact of varying enzyme concentration on DNA repair capacity.
How stable are the mutagenic tautomers of DNA bases?
Directory of Open Access Journals (Sweden)
Brovarets’ O. O.
2010-02-01
Full Text Available Aim. To determine the lifetime of the mutagenic tautomers of DNA base pairs through the investigation of the physicochemical mechanisms of their intramolecular proton transfer. Methods. Non-empirical quantum chemistry, the analysis of the electron density by means of Bader's atoms in molecules (AIM) theory and physicochemical kinetics were used. Results. The physicochemical character of the transition state of the intramolecular tautomerisation of DNA bases was investigated, and the lifetime of the mutagenic tautomers was calculated. Conclusions. The lifetime of the DNA base mutagenic tautomers exceeds the typical time of DNA replication in the cell (~10³ s) by 3-10 orders of magnitude. This fact confirms that the postulate on which the Watson-Crick tautomeric hypothesis of spontaneous transitions is grounded is adequate. The absence of intramolecular H-bonds in the canonical and mutagenic tautomeric forms determines their high stability.
A quantum theoretical study of reactions of methyldiazonium ion with DNA base pairs
Energy Technology Data Exchange (ETDEWEB)
Shukla, P.K. [Department of Physics, Assam University, Silchar 788 011 (India); Ganapathy, Vinay [Department of Physics, Banaras Hindu University, Varanasi 221 005 (India); Mishra, P.C., E-mail: pcmishra_in@yahoo.com [Department of Physics, Banaras Hindu University, Varanasi 221 005 (India)
2011-09-22
Graphical abstract: Reactions of the methyldiazonium ion at the different sites of the DNA bases in the Watson-Crick GC and AT base pairs were investigated employing density functional and second-order Møller-Plesset (MP2) perturbation theories. Highlights: Methylation of the DNA bases is important as it can cause mutation and cancer. Methylation reactions of the GC and AT base pairs with CH₃N₂⁺ were not studied earlier theoretically. Experimental observations have been explained using theoretical methods. Abstract: Methylation of the DNA bases in the Watson-Crick GC and AT base pairs by the methyldiazonium ion was investigated employing density functional and second-order Møller-Plesset (MP2) perturbation theories. Methylation at the N3, N7 and O6 sites of guanine, the N1, N3 and N7 sites of adenine, the O2 and N3 sites of cytosine, and the O2 and O4 sites of thymine was considered. The computed reactivities for methylation follow the order N7(guanine) > N3(adenine) > O6(guanine), which is in agreement with experiment. The base pairing in DNA is found to play a significant role with regard to the reactivities of the different sites.
DNA-based technology helps people solve problems. It can be used to correctly match organ donors with recipients, identify victims of natural and man-made disasters, and detect bacteria and other organisms that may pollute air, soil, food, or water.
Pricing Mechanism of TSP Solving Service in Cloud Computing
Institute of Scientific and Technical Information of China (English)
曾栩鸿; 曾国荪
2011-01-01
The traveling salesman problem (TSP) is a typical path optimization problem; similar problems and applications arise in urban transportation planning, logistics transport, and communication network configuration. However, solving TSP is NP-hard: when the problem scale is large, a large-scale parallel computing environment such as a cloud computing platform must be used, and a feasible solution is obtained at considerable computational cost. Taking TSP as a concrete example, this paper studies the pricing mechanism of cloud computing services. In general, a pricing mechanism should be fair, flexible, dynamic, and adaptive. From the standpoint of fairness and reasonableness, two main factors affect the pricing of a computing service. The first is the difficulty of the problem, including time complexity, space complexity, and the size of the input and output data. The second is the quality of the solving service, i.e., the service contract, including solution accuracy, response time, and resource requirements, which can serve as service level agreement (SLA) indicators. On this basis, a new pricing mechanism for cloud computing services, CloudPricing, is proposed. The mechanism gives general and specific pricing principles together with corresponding pricing formulas. A concrete pricing case analysis is carried out for TSP solving, which offers a reference for pricing NP-hard problem-solving services in cloud computing.
Angeli, Charoula
2013-01-01
An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…
Leh, Jayne
2011-01-01
Substantial evidence indicates that teacher-delivered schema-based instruction (SBI) facilitates significant increases in mathematics word problem solving (WPS) skills for diverse students; however research is unclear whether technology affordances facilitate superior gains in computer-mediated (CM) instruction in mathematics WPS when compared to…
Problem Solving with General Semantics.
Hewson, David
1996-01-01
Discusses how to use general semantics formulations to improve problem solving at home or at work--methods come from the areas of artificial intelligence/computer science, engineering, operations research, and psychology. (PA)
Pol, Henk J.; Harskamp, Egbert G.; Suhre, Cor J. M.
2008-01-01
Many students experience difficulties in solving applied physics problems. Researchers claim that the development of strategic knowledge (analyze, explore, plan, implement, verify) is just as necessary for solving problems as the development of content knowledge. In order to improve these problem-solving…
Charge Transport across DNA-Based Three-Way Junctions.
Young, Ryan M; Singh, Arunoday P N; Thazhathveetil, Arun K; Cho, Vincent Y; Zhang, Yuqi; Renaud, Nicolas; Grozema, Ferdinand C; Beratan, David N; Ratner, Mark A; Schatz, George C; Berlin, Yuri A; Lewis, Frederick D; Wasielewski, Michael R
2015-04-22
DNA-based molecular electronics will require charges to be transported from one site within a 2D or 3D architecture to another. While this has been shown previously in linear, π-stacked DNA sequences, the dynamics and efficiency of charge transport across a DNA three-way junction (3WJ) have yet to be determined. Here, we present an investigation of hole transport and trapping across DNA-based three-way junction systems by a combination of femtosecond transient absorption spectroscopy and molecular dynamics simulations. Hole transport across the junction is proposed to be gated by conformational fluctuations in the ground state, which bring the transiently populated hole-carrier nucleobases into better-aligned geometries on the nanosecond time scale, thus modulating the π-π electronic coupling along the base-pair sequence.
Envisioning the molecular choreography of DNA base excision repair.
Parikh, S S; Mol, C D; Hosfield, D J; Tainer, J A
1999-02-01
Recent breakthroughs integrate individual DNA repair enzyme structures, biochemistry and biology to outline the structural cell biology of the DNA base excision repair pathways that are essential to genome integrity. Thus, we are starting to envision how the actions, movements, steps, partners and timing of DNA repair enzymes, which together define their molecular choreography, are elegantly controlled by both the nature of the DNA damage and the structural chemistry of the participating enzymes and the DNA double helix.
Progress of DNA-based Methods for Species Identification
Institute of Scientific and Technical Information of China (English)
HU Zhen; ZHANG Su-hua; WANG Zheng; BIAN Ying-nan; LI Cheng-tao
2015-01-01
Species identification of biological samples is widely used in fields such as forensic science and the food industry, and a variety of accurate and reliable methods have been developed in recent years. The current review presents common target genes and screening criteria suitable for species identification, and describes various DNA-based molecular biology methods for species identification. Additionally, it discusses the future development of species identification combined with real-time PCR and sequencing technologies.
Interactive problem solving using LOGO
Boecker, Heinz-Dieter; Fischer, Gerhard
2014-01-01
This book is unique in that its stress is not on the mastery of a programming language, but on the importance and value of interactive problem solving. The authors focus on several specific interest worlds: mathematics, computer science, artificial intelligence, linguistics, and games; however, their approach can serve as a model that may be applied easily to other fields as well. Those who are interested in symbolic computing will find that Interactive Problem Solving Using LOGO provides a gentle introduction from which one may move on to other, more advanced computational frameworks or more
Controlling charge current through a DNA based molecular transistor
Behnia, S.; Fathizadeh, S.; Ziaei, J.
2017-01-01
Molecular electronics is complementary to silicon-based electronics and may enable electronic functions that are difficult to obtain with conventional technology. We consider a DNA-based molecular transistor and study its transport properties. An appropriate DNA sequence for the central chain of the molecular transistor and the functional interval of applied voltages are obtained. The I-V characteristic diagram shows the rectifying behavior as well as the negative differential resistance phenomenon of the DNA transistor. We observe nearly periodic behavior in the current flowing through the DNA. There is a critical gate voltage for each applied bias, above which the electrical current is always positive.
Main features of DNA-based immunization vectors
Directory of Open Access Journals (Sweden)
V. Azevedo
1999-02-01
Full Text Available DNA-based immunization has initiated a new era of vaccine research. One of the main goals of gene vaccine development is the control of the levels of expression in vivo for efficient immunization. Modifying the vector to modulate expression or immunogenicity is of critical importance for the improvement of DNA vaccines. The most frequently used vectors for genetic immunization are plasmids. In this article, we review some of the main elements relevant to their design such as strong promoter/enhancer region, introns, genes encoding antigens of interest from the pathogen (how to choose and modify them, polyadenylation termination sequence, origin of replication for plasmid production in Escherichia coli, antibiotic resistance gene as selectable marker, convenient cloning sites, and the presence of immunostimulatory sequences (ISS that can be added to the plasmid to enhance adjuvanticity and to activate the immune system. In this review, the specific modifications that can increase overall expression as well as the potential of DNA-based vaccination are also discussed.
Huang, C. J.; Motard, R. L.
1978-01-01
The computing equipment in the engineering systems simulation laboratory of the University of Houston Cullen College of Engineering is described and its advantages are summarized. The application of computer techniques in aerospace-related research, psychology, and chemical, civil, electrical, industrial, and mechanical engineering is described in abstracts of 84 individual projects and in reprints of published reports. The research supports programs in acoustics, energy technology, systems engineering, and environmental management as well as aerospace engineering.
A Rewritable, Random-Access DNA-Based Storage System
Tabatabaei Yazdi, S. M. Hossein; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica
2015-09-01
We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile media suitable for both ultrahigh density archival and rewritable storage applications.
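The random-access idea above can be illustrated with a toy in-silico model: each data block is synthesized with a unique address prefix, so one block can be selected (as address-specific PCR primers would do in the wet lab) without decoding the whole archive. The 2-bits-per-base mapping and the `store`/`retrieve` helpers below are illustrative assumptions; the paper's actual scheme uses constrained coding, which this sketch does not reproduce.

```python
BASES = "ACGT"

def bytes_to_dna(data: bytes) -> str:
    """Map each 2-bit group to one base (00->A, 01->C, 10->G, 11->T)."""
    return "".join(BASES[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def dna_to_bytes(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

def store(block_id: int, payload: bytes) -> str:
    """Prefix the payload with a unique 8-base address derived from the id."""
    return bytes_to_dna(block_id.to_bytes(2, "big")) + bytes_to_dna(payload)

def retrieve(pool: list, block_id: int) -> bytes:
    """Select the strand whose address matches, as address-specific PCR
    primers would select it from the physical DNA pool."""
    address = bytes_to_dna(block_id.to_bytes(2, "big"))
    for strand in pool:
        if strand.startswith(address):
            return dna_to_bytes(strand[len(address):])
    raise KeyError(block_id)

pool = [store(1, b"UIUC"), store(2, b"MIT")]
assert retrieve(pool, 2) == b"MIT"  # one block read without decoding the rest
```

The address prefix plays the role of the PCR primer binding site; rewriting a block would correspond to replacing the strand carrying that address.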
Application of DNA-based methods in forensic entomology.
Wells, Jeffrey D; Stevens, Jamie R
2008-01-01
A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.
A universal, photocleavable DNA base: nitropiperonyl 2'-deoxyriboside.
Pirrung, M C; Zhao, X; Harris, S V
2001-03-23
A universal, photochemically cleavable DNA base analogue would add desirable versatility to a number of methods in molecular biology. A novel C-nucleoside, nitropiperonyl deoxyriboside (NPdR, P), has been investigated for this purpose. NPdR can be converted to its 5'-DMTr-3'-CE-phosphoramidite and was incorporated into pentacosanucleotides by conventional synthesis techniques. The destabilizing effect on hybrid formation with a complementary strand when this P base opposes A, T, and G was found to be 3-5 kcal/mol, but 9 kcal/mol when it opposes C. Brief irradiation (lambda > 360 nm, 20 min) of DNA containing the P base and piperidine treatment causes strand cleavage giving the 3'- and 5'-phosphates. Two significant recent interests, universal/non-hydrogen-bonding base analogues and photochemical backbone cleavage, have thus been combined in a single molecule that serves as a light-based DNA scissors.
Spectroscopic investigation on the telomeric DNA base sequence repeat
Institute of Scientific and Technical Information of China (English)
Anonymous
2002-01-01
Telomeres are protein-DNA complexes at the ends of linear chromosomes, which protect chromosomal integrity and maintain cellular replicative capacity. From single-cell organisms to higher animals and plants, the structures and functions of telomeres are highly conserved. In human and vertebrate cells, the telomeric DNA base sequence is (TTAGGG)n. In the present work, we obtained absorption and fluorescence spectra from seven synthesized oligonucleotides simulating the telomeric DNA system and calculated their relative fluorescence quantum yields, from which telomeric DNA characteristics are predicted and, possibly, the shortening of telomeric sequences during cell division can be followed. The sequence exhibiting a low relative fluorescence quantum yield and remarkable excitation-energy internal conversion tallies with the telomeric sequence (TTAGGG)n. This result shows that telomeric DNA has a strong non-radiative, internal-conversion capability.
Magnetic Propulsion of Microswimmers with DNA-Based Flagellar Bundles.
Maier, Alexander M; Weig, Cornelius; Oswald, Peter; Frey, Erwin; Fischer, Peer; Liedl, Tim
2016-02-10
We show that DNA-based self-assembly can serve as a general and flexible tool to construct artificial flagella of several micrometers in length and only tens of nanometers in diameter. By attaching the DNA flagella to biocompatible magnetic microparticles, we provide a proof of concept demonstration of hybrid structures that, when rotated in an external magnetic field, propel by means of a flagellar bundle, similar to self-propelling peritrichous bacteria. Our theoretical analysis predicts that flagellar bundles that possess a length-dependent bending stiffness should exhibit a superior swimming speed compared to swimmers with a single appendage. The DNA self-assembly method permits the realization of these improved flagellar bundles in good agreement with our quantitative model. DNA flagella with well-controlled shape could fundamentally increase the functionality of fully biocompatible nanorobots and extend the scope and complexity of active materials.
Ab initio Study of Naptho-Homologated DNA Bases
Energy Technology Data Exchange (ETDEWEB)
Sumpter, Bobby G [ORNL; Vazquez-Mayagoitia, Alvaro [ORNL; Huertas, Oscar [Universitat de Barcelona; Fuentes-Cabrera, Miguel A [ORNL; Orozco, Modesto [Institut de Recerca Biomedica, Parc Cientific de Barcelona, Barcelona, Spain; Luque, Javier [Universitat de Barcelona
2008-01-01
Naptho-homologated DNA bases have been recently used to build a new type of size-expanded DNA known as yyDNA. We have used theoretical techniques to investigate the structure, tautomeric preferences, base-pairing ability, stacking interactions, and HOMO-LUMO gaps of the naptho-bases. The structure of these bases is found to be similar to that of their benzo-fused predecessors (y-bases) with respect to the planarity of the aromatic rings and amino groups. Tautomeric studies reveal that the canonical-like forms of naptho-thymine (yyT) and naptho-adenine (yyA) are the most stable tautomers, leading to hydrogen-bonded dimers with the corresponding natural nucleobases that mimic Watson-Crick pairing. However, the canonical-like species of naptho-guanine (yyG) and naptho-cytosine (yyC) are not the most stable tautomers, and the most favorable hydrogen-bonded dimers involve wobble-like pairings. The expanded size of the naptho-bases leads to stacking interactions notably larger than those found for the natural bases, which should presumably make a dominant contribution to modulating the structure of yyDNA duplexes. Finally, the HOMO-LUMO gap of the naptho-bases is smaller than that of their benzo-base counterparts, indicating that size expansion of DNA bases is an efficient way of reducing their HOMO-LUMO gap. These results are examined in light of the available experimental evidence reported for yyT and yyC.
Techniques for Solving Sudoku Puzzles
Chi, Eric C
2012-01-01
Solving Sudoku puzzles is one of the most popular pastimes in the world. Puzzles range in difficulty from easy to very challenging; the hardest puzzles tend to have the most empty cells. The current paper compares the performance of three computer algorithms in solving puzzles. Backtracking, simulated annealing, and alternating projections are generic methods for attacking combinatorial optimization problems. Our results favor backtracking. It infallibly solves Sudoku puzzles or deduces that a unique solution does not exist. However, backtracking does not scale well in high-dimensional combinatorial optimization. Hence, it is useful to expose statistics students to the other two solution techniques in a concrete setting. Simulated annealing shares a common structure with MCMC (Markov chain Monte Carlo) and enjoys wide applicability. The method of alternating projections solves the feasibility problem in convex programming. Converting a discrete optimization problem into a continuous optimization problem opens...
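Of the three methods compared, backtracking is the simplest to state: fill the first empty cell with each legal digit in turn, recurse, and undo on failure. A minimal sketch (not the paper's implementation):

```python
def valid(grid, r, c, d):
    """True if digit d may be placed at (r, c) without a row/column/box clash."""
    if any(grid[r][j] == d for j in range(9)):
        return False
    if any(grid[i][c] == d for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != d for i in range(3) for j in range(3))

def solve_sudoku(grid):
    """Backtracking: fill the first empty cell (0) with each legal digit,
    recurse, and undo the move on failure. Solves the grid in place."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in range(1, 10):
                    if valid(grid, r, c, d):
                        grid[r][c] = d
                        if solve_sudoku(grid):
                            return True
                        grid[r][c] = 0  # undo and try the next digit
                return False  # no digit fits here: backtrack
    return True  # no empty cells remain: solved

puzzle = [  # a well-known easy instance; 0 marks an empty cell
    [5, 3, 0, 0, 7, 0, 0, 0, 0],
    [6, 0, 0, 1, 9, 5, 0, 0, 0],
    [0, 9, 8, 0, 0, 0, 0, 6, 0],
    [8, 0, 0, 0, 6, 0, 0, 0, 3],
    [4, 0, 0, 8, 0, 3, 0, 0, 1],
    [7, 0, 0, 0, 2, 0, 0, 0, 6],
    [0, 6, 0, 0, 0, 0, 2, 8, 0],
    [0, 0, 0, 4, 1, 9, 0, 0, 5],
    [0, 0, 0, 0, 8, 0, 0, 7, 9],
]
assert solve_sudoku(puzzle)  # this instance has a solution
```

This exhaustive search is what makes the method infallible on 9x9 grids, and also why it scales poorly in high-dimensional combinatorial optimization.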
White, Joey
The applicability of the dataflow architecture to a telemetry simulation is examined with particular reference to the problem of interfacing the simulation with an engineering model flight computer. The discussion covers the transport loop lag problem, simulation moding and control, the dataflow architecture solution, telemetry formatting and serialization, uplink command synchronization and reception, command validation and routing, and on-board computer interface and telemetry data request/response processing. The concepts discussed here have been developed for application on a training simulation for the NASA Orbital Maneuvering Vehicle.
Wahl, Sharon C.
Nursing educators and administrators are concerned about medication errors made by students which jeopardize patient safety. The inability to conceptualize and calculate medication dosages, often related to math anxiety, is implicated in such errors. A computer-assisted instruction (CAI) program is seen as a viable method of allowing students to…
A DNA-based semantic fusion model for remote sensing data.
Directory of Open Access Journals (Sweden)
Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H
2013-01-01
Full Text Available Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage the widespread knowledge. Nowadays, many researchers use ontology to collect and organize data's semantic information in order to maximize research productivity. In this paper, we firstly describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner and an individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved, i.e., the cluster-based multi-processes program can reduce the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB source data files. Moreover, the size of result file recording DNA sequences is 54.51 GB for parallel C program compared with 57.89 GB for sequential Perl. This shows that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the process of integration of the knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. This process depends solely on ligation reaction and screening operations instead of the ontology.
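The bit-wise conversion step can be sketched as follows. The 0 -> 'A', 1 -> 'T' mapping is an assumption (the abstract only says an individual bit becomes a base), and a thread pool stands in for the paper's cluster-based multi-process program to keep the sketch self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def bits_to_dna(chunk: bytes) -> str:
    """Convert each bit to one base; 0 -> 'A', 1 -> 'T' is an assumed mapping."""
    return "".join("T" if (byte >> shift) & 1 else "A"
                   for byte in chunk for shift in range(7, -1, -1))

def encode_parallel(data: bytes, workers: int = 4) -> str:
    """Split the input into chunks and convert them concurrently, mirroring
    the parallel encoding strategy described in the abstract."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return "".join(pool.map(bits_to_dna, chunks))
```

Because each chunk is converted independently, the work is embarrassingly parallel, which is why the authors' cluster version cuts the conversion time by more than an order of magnitude.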
Applications of nanoparticles for DNA based rabies vaccine.
Shah, Muhammad Ali A; Khan, Sajid Umar; Ali, Zeeshan; Yang, Haowen; Liu, Keke; Mao, Lanlan
2014-01-01
Rabies is a fatal encephalomyelitis. Most cases occur in developing countries and are transmitted by dogs. Cell culture vaccines are associated with high cost and therefore have not replaced the unsafe brain-derived vaccines, which can still be seen in use in developing countries. Moreover, there will be a need for vaccines against rabies-related viruses against which classical vaccines are not always effective. The worldwide incidence of rabies and the inability of currently used vaccination strategies to provide highly potent and cost-effective therapy indicate the need for alternative control strategies. DNA vaccines have emerged as the safest vaccines and the best remedy for complicated diseases like hepatitis, HIV, and rabies. A number of recombinant DNA vaccines are now being developed against several diseases such as AIDS and malaria. DNA vaccination can therefore be a valuable alternative for the production of cheaper rabies vaccines effective against a larger spectrum of viruses. In this review we report published data on DNA-based immunization with sequences encoding rabies antigens, with special reference to nanotechnology.
DNA based random key generation and management for OTP encryption.
Zhang, Yunpeng; Liu, Xin; Sun, Manhui
2017-09-01
One-time pad (OTP) is a principle of key generation applied to stream ciphering that offers total privacy. The OTP encryption scheme has been proved unbreakable in theory but is difficult to realize in practical applications. Because OTP encryption specifically requires absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. DNA's storage capability can be used to build one-time-pad structures with pseudo-random number generation and indexing in order to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to sender and receiver to combine the secret key represented by a DNA sequence with the T vector, we generate the DNA bio-hiding secret key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed biological experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability.
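The core OTP property, that XOR with a truly random pad at least as long as the message is information-theoretically secure and that the same operation decrypts, can be sketched in a few lines. Deriving the keystream from a DNA sequence at 2 bits per base is an illustrative assumption, and `secrets` stands in for the biological randomness source used in the paper.

```python
import secrets

BASES = "ACGT"

def random_dna_key(n_bases: int) -> str:
    """A random DNA sequence serving as the pad (assumed randomness source)."""
    return "".join(secrets.choice(BASES) for _ in range(n_bases))

def dna_key_to_bytes(key: str) -> bytes:
    """Pack the key at 2 bits per base (A=00, C=01, G=10, T=11), 4 bases/byte."""
    out = bytearray()
    for i in range(0, len(key) - len(key) % 4, 4):
        byte = 0
        for base in key[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

def otp(data: bytes, keystream: bytes) -> bytes:
    """XOR with the pad; the identical operation encrypts and decrypts."""
    assert len(keystream) >= len(data), "pad must be at least message-length"
    return bytes(b ^ k for b, k in zip(data, keystream))

message = b"attack at dawn"
pad = dna_key_to_bytes(random_dna_key(4 * len(message)))
ciphertext = otp(message, pad)
assert otp(ciphertext, pad) == message  # round trip recovers the plaintext
```

The hard part the paper addresses is not this XOR step but generating and transmitting the pad securely, which is where the recombinant-plasmid bio-hiding comes in.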
Porzel, Robert
2011-01-01
This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.
Kereszturya, László; Rajczya, Katalin; Lászikb, András; Gyódia, Eva; Pénzes, Mária; Falus, András; Petrányia, Gyõzõ G
2002-03-01
In cases of disputed paternity, the scientific goal is to promote either the exclusion of a falsely accused man or the affiliation of the alleged father. Until now, in addition to anthropologic characteristics, the determination of genetic markers has included human leukocyte antigen gene variants, erythrocyte antigens, and serum proteins. Recombinant DNA techniques have provided a new set of highly variable genetic markers based on DNA nucleotide sequence polymorphism. From the practical standpoint, the application of these techniques to paternity testing provides greater versatility than do conventional genetic marker systems. The use of methods to detect the polymorphism of human leukocyte antigen loci significantly increases the chance of validating ambiguous results in paternity testing. The outcomes of 2384 paternity cases investigated by serologic and/or DNA-based human leukocyte antigen typing were statistically analyzed. Different cases solved by DNA typing are presented, involving cases with one or two accused men, exclusions and nonexclusions, and tests of the paternity of a deceased man. The results provide evidence for the advantage of the combined application of various techniques in forensic diagnostics and emphasize the outstanding possibilities of DNA-based assays. Representative examples demonstrate the strength of combined techniques in paternity testing.
Rosati, Fiora; Boersma, Arnold J.; Klijn, Jaap E.; Meetsma, Auke; Feringa, Ben L.; Roelfes, Gerard
2009-01-01
The recently developed concept of DNA-based asymmetric catalysis involves the transfer of chirality from the DNA double helix in reactions using a noncovalently bound catalyst. To date, two generations of DNA-based catalysts have been reported that differ in the design of the ligand for the metal.
CERN. Geneva
2008-01-01
What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...
Solving global optimization problems on GPU cluster
Barkalov, Konstantin; Gergel, Victor; Lebedev, Ilya
2016-06-01
The paper contains the results of investigation of a parallel global optimization algorithm combined with a dimension reduction scheme. This allows solving multidimensional problems by means of reducing to data-independent subproblems with smaller dimension solved in parallel. The new element implemented in the research consists in using several graphic accelerators at different computing nodes. The paper also includes results of solving problems of well-known multiextremal test class GKLS on Lobachevsky supercomputer using tens of thousands of GPU cores.
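The dimension-reduction idea, splitting a multidimensional problem into data-independent lower-dimensional subproblems solved in parallel, can be illustrated without a GPU. The test function and the thread pool (standing in for GPU cores at different nodes) below are assumptions for illustration, not the GKLS class or the authors' algorithm.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def f(x, y):
    """A multiextremal 2-D test function (an illustrative stand-in for GKLS)."""
    return math.sin(3 * x) * math.cos(4 * y) + 0.1 * (x * x + y * y)

def solve_subproblem(x, n=2001):
    """1-D subproblem: minimise f(x, .) over y in [-2, 2] on a dense grid."""
    return min(f(x, -2 + 4 * j / (n - 1)) for j in range(n)), x

def global_min(n_slices=101):
    """Fix x on a grid; the resulting 1-D subproblems are data-independent,
    so they can run concurrently (threads stand in for GPU cores here)."""
    xs = [-2 + 4 * i / (n_slices - 1) for i in range(n_slices)]
    with ThreadPoolExecutor() as pool:
        return min(pool.map(solve_subproblem, xs))
```

Because the subproblems share no data, the same decomposition maps naturally onto many accelerators; the paper's scheme additionally uses a principled dimension-reduction rather than this naive grid slicing.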
Mathematical Problem Solving: A Review of the Literature.
Funkhouser, Charles
The major perspectives on problem solving of the twentieth century are reviewed--associationism, Gestalt psychology, and cognitive science. The results of the review on teaching problem solving and the uses of computers to teach problem solving are included. Four major issues related to the teaching of problem solving are discussed: (1)…
Directory of Open Access Journals (Sweden)
Yaritza Tardo Fernández
2013-02-01
Full Text Available The eminently social, cultural, and technological character of the computer programming problem-solving process, together with the complexity and difficulties detected in its teaching, has heightened concern about the study of the processes of communication, transmission, and understanding of computer programming, and has attracted the attention of a wide scientific community in correspondence with the growing development the field is reaching at present. This paper therefore aims to reveal, from a didactic point of view, the integrating axes of an algorithmic logic that resolves the contradiction revealed in the formative process between mathematical modeling and its algorithmic systematization, so as to empower the efficient performance of Computer Science and Computer Engineering professionals. In this sense, a new didactic proposal is put forward, consisting of an algorithmic logic in which the essential processes that should be carried out to solve computer programming problems are specified and explained. Based on the theoretical foundations, we conclude that these processes constitute didactic moments required to resolve the aforementioned contradiction.
Artifacts associated with the measurement of oxidized DNA bases.
Cadet, J; Douki, T; Ravanat, J L
1997-10-01
In this paper we review recent aspects of the measurement of oxidized DNA bases, currently a matter of debate. There has long been an interest in determining the level of oxidized bases in cellular DNA under both normal and oxidative stress conditions. In this respect, the situation is confusing because variations as large as two orders of magnitude have been reported for the yield of formation of 8-oxo-7,8-dihydroguanine (8-oxoGua) in similar DNA samples. However, recent findings clearly show that the application of several assays, such as gas chromatography-mass spectrometry (GC-MS) and 32P-postlabeling, may lead to a significant overestimation of the level of oxidized bases in cellular DNA. In particular, the silylation step, which is required to make the samples volatile for GC-MS analysis, has been shown to induce oxidation of normal bases at a level of about one oxidized base per 10^4 normal bases. This has been found to be a general process that applies in particular to 8-oxoGua, 8-oxo-7,8-dihydroadenine, 5-hydroxycytosine, 5-(hydroxymethyl)uracil, and 5-formyluracil. Interestingly, prepurification of the oxidized bases from the DNA hydrolysate prior to the derivatization reaction prevents artefactual oxidation. Under these conditions, the level of oxidized bases measured by GC-MS is similar to that obtained by HPLC associated with electrochemical detection (HPLC-EC). It should be added that, using appropriate conditions of extraction and enzymatic digestion of DNA, the level of 8-oxo-7,8-dihydro-2'-deoxyguanosine in control cellular DNA has been found to be about fivefold lower than in earlier HPLC-EC measurements. Similar conclusions were reached by measuring formamidopyrimidine-DNA glycosylase-sensitive sites as revealed by the single-cell gel electrophoresis (comet) assay.
How to make DNA count: DNA-based diagnostic tools in veterinary parasitology.
Hunt, P W; Lello, J
2012-05-04
Traditional methods for the diagnosis of parasitic helminth infections of livestock have a number of limitations, such as the inability to distinguish mixed-species infections, a heavy reliance on technical experience and also sub-sampling errors. Some of these limitations may be overcome through the development of rapid and accurate DNA-based tests. For example, DNA-based tests can specifically detect individual species in a mixed infection at either the larval or egg stages, in the absence of morphological differences among species. Even so, some diagnostic problems remain the same, irrespective of whether a DNA-based or traditional method is used. For example, sub-sampling errors from an aggregated distribution are likely to persist. It is proposed, however, that DNA-based diagnostic technologies offer an opportunity to expand diagnostic capabilities, and are discussed in the current review. The future introduction of DNA-based diagnostic technologies into routine diagnostic settings will also be discussed.
Some Applications of Algebraic System Solving
Roanes-Lozano, Eugenio
2011-01-01
Technology and, in particular, computer algebra systems, allows us to change both the way we teach mathematics and the mathematical curriculum. Curiously enough, unlike what happens with linear system solving, algebraic system solving is not widely known. The aim of this paper is to show that, although the theory lying behind the "exact…
Institute of Scientific and Technical Information of China (English)
单美贤
2015-01-01
The way students solve problems together in Computer-Supported Collaborative Learning (CSCL) environments depends on the relationships among three dimensions of collective activity: the cognitive, socio-relational, and affective dimensions, which are interconnected components that mutually support one another. Cognitive aspects have traditionally been considered primary in collaborative learning research, but in recent years more and more researchers have recognized the importance of affective aspects in CSCL environments. After reviewing the concepts of emotion, affect, and emotion in learning, this paper discusses the affective dimension of collaborative problem solving in CSCL. It argues that affinity among team members, shared mental models/shared understanding, and emotion management are key to collaborative problem solving in CSCL environments: affinity among team members is the basis for building a good relational space, shared mental models make communication during collaboration genuinely open, and emotion management maintains team consensus and stable emotions to promote effective collaborative interaction. Finally, the paper analyzes learners' collaborative affective behavior from the two aspects of affect awareness and affect feedback, with a view to providing better affective support for collaborative problem solving in CSCL.
Engineering bacteria to solve the Burnt Pancake Problem
Directory of Open Access Journals (Sweden)
Rosemond Sabriya
2008-05-01
Full Text Available Abstract Background: We investigated the possibility of executing DNA-based computation in living cells by engineering Escherichia coli to address a classic mathematical puzzle called the Burnt Pancake Problem (BPP). The BPP is solved by sorting a stack of distinct objects (pancakes) into proper order and orientation using the minimum number of manipulations. Each manipulation reverses the order and orientation of one or more adjacent objects in the stack. We have designed a system that uses site-specific DNA recombination to mediate inversions of genetic elements that represent pancakes within plasmid DNA. Results: Inversions (or "flips") of the DNA-fragment pancakes are driven by the Salmonella typhimurium Hin/hix DNA recombinase system, which we reconstituted as a collection of modular genetic elements for use in E. coli. Our system sorts DNA segments by inversions to produce different permutations of a promoter and a tetracycline-resistance coding region; E. coli cells become antibiotic resistant when the segments are properly sorted. Hin recombinase can mediate all possible inversion operations on adjacent flippable DNA fragments. Mathematical modeling predicts that the system reaches equilibrium after very few flips, where equal numbers of permutations are randomly sorted and unsorted. Semiquantitative PCR analysis of in vivo flipping suggests that inversion products accumulate on a time scale of hours or days rather than minutes. Conclusion: The Hin/hix system is a proof-of-concept demonstration of in vivo computation with the potential to be scaled up to accommodate larger and more challenging problems. Hin/hix may provide a flexible new tool for manipulating transgenic DNA in vivo.
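In silico, the BPP itself is easy to state: a pancake is a signed integer (the sign records which side is burnt), a flip reverses a prefix of the stack and inverts its signs, and the goal is the identity stack with all signs positive. A breadth-first search over signed permutations finds the minimum number of flips, the quantity the bacterial system explores in parallel; this sketch is independent of the Hin/hix implementation.

```python
from collections import deque

def flip(stack, k):
    """Flip the top k pancakes: reverse their order and invert each
    orientation sign (burnt side up <-> burnt side down)."""
    return tuple(-p for p in reversed(stack[:k])) + stack[k:]

def min_flips(stack):
    """Breadth-first search over signed permutations for the minimum number
    of flips reaching (1, 2, ..., n) with all burnt sides down (positive)."""
    goal = tuple(range(1, len(stack) + 1))
    seen = {stack}
    frontier = deque([(stack, 0)])
    while frontier:
        state, depth = frontier.popleft()
        if state == goal:
            return depth
        for k in range(1, len(state) + 1):
            nxt = flip(state, k)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))

print(min_flips((2, 1)))  # 3: e.g. (2,1) -> (-2,1) -> (-1,2) -> (1,2)
```

The state space has n! * 2^n configurations, which is why exhaustive search becomes infeasible quickly and why a massively parallel in vivo search is attractive.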
Physics: Quantum problems solved through games
Maniscalco, Sabrina
2016-04-01
Humans are better than computers at performing certain tasks because of their intuition and superior visual processing. Video games are now being used to channel these abilities to solve problems in quantum physics. See Letter p.210
Improving mathematical problem solving : A computerized approach
Harskamp, EG; Suhre, CJM
2006-01-01
Mathematics teachers often experience difficulties in teaching students to become skilled problem solvers. This paper evaluates the effectiveness of two interactive computer programs for high school mathematics problem solving. Both programs present students with problems accompanied by instruction
Problem Solving and Reasoning.
1984-02-01
Acquisition of Problem-Solving Skill. An important question is how the knowledge required for solving problems in a domain such as geometry is... Neves, D. M. (1981). Acquisition of problem-solving skill. In J. R. Anderson (Ed.), Cognitive skills and their acquisition. Hillsdale, NJ: Erlbaum... Voss, J. F., Greene, T. R., Post, T. A., & Penner, B. C. (1983). Problem solving skill in the social sciences. In G. H. Bower (Ed.), The
M. Kasemann
Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...
Institute of Scientific and Technical Information of China (English)
文立华; 张京妹; 孙进才
2001-01-01
Traditional methods for solving acoustic problems in engineering often require the solution of a non-symmetric full matrix, whose dimension may exceed 10 000, so the computational cost becomes quite high. To overcome this serious shortcoming, we propose a new periodic-wavelet approach for the Helmholtz integral-equation solution of two-dimensional acoustic radiation and scattering over a curved computational domain. We expand the boundary quantities in terms of periodic, orthogonal wavelets and obtain the algebraic equations needed for solving acoustic problems with Dirichlet, Neumann and mixed boundary conditions; the coefficients are evaluated with the fast wavelet transform. The advantage of the new approach is a highly sparse matrix system, which greatly improves computational efficiency. We compare the numerical results obtained with the new approach against the boundary element method and analytical solutions; the results, given in Table 1, show that the new approach converges rapidly and is of good accuracy, and that for the same accuracy it requires far fewer unknowns than the boundary element method.
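For context, the standard direct boundary integral formulation of the 2D Helmholtz problem that such methods discretize is (for a point $x$ on a smooth boundary $\Gamma$, with outward normal $n_y$; this is the textbook form, the paper's own notation may differ):

```latex
\frac{1}{2}\,\varphi(x)
  + \int_\Gamma \varphi(y)\,\frac{\partial G(x,y)}{\partial n_y}\, \mathrm{d}\Gamma_y
  = \int_\Gamma \frac{\partial \varphi(y)}{\partial n_y}\, G(x,y)\, \mathrm{d}\Gamma_y,
\qquad
G(x,y) = \frac{i}{4}\, H_0^{(1)}\!\bigl(k\,|x-y|\bigr),
```

where $H_0^{(1)}$ is the Hankel function of the first kind and $k$ the wavenumber. Collocating this equation with boundary elements yields the dense non-symmetric matrix mentioned above; expanding $\varphi$ and $\partial\varphi/\partial n$ in a periodic wavelet basis instead yields the sparse system that the paper exploits.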
Zhang, Yunpeng; Wang, Zhiwen; Wang, Zhenzhen; Liu, Xin; Yuan, Xiaojing
2017-09-27
Researchers have gained a deeper understanding of DNA-based encryption and its effectiveness in enhancing information security in recent years. However, there are many theoretical and technical issues about DNA-based encryption that need to be addressed before it can be used effectively in the field of security. Currently, the most popular DNA-based encryption schemes are based on traditional cryptography combined with existing DNA technology; these schemes are not completely based on DNA computing and biotechnology. Herein, inspired by nature, an encryption scheme based on DNA has been developed, which rests on two fundamental biological axioms about DNA sequencing: 1) DNA sequencing is difficult without knowing the correct sequencing primers and probes, and 2) without knowing the correct probe, it is difficult to precisely decipher and sequence the information of unknown, mixed DNA/peptide nucleic acid (PNA) probes, which differ only in nucleotide sequence, arranged on DNA chips (microarrays). In essence, when creating DNA-based encryption by means of biological technologies such as DNA chips and polymerase chain reaction (PCR) amplification, the encryption method discussed herein cannot be decrypted unless the DNA/PNA probe or PCR amplification primers are known. The biological analysis, mathematical analysis, and simulation results demonstrate the feasibility of the method, which provides much stronger security and reliability than traditional encryption methods. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
M. Kasemann
Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...
I. Fisk
2011-01-01
Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...
P. McBride
The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...
Brovarets', O O
2013-01-01
At the MP2/6-311++G(2df,pd)//B3LYP/6-311++G(d,p) level of theory it was established for the first time that the Löwdin's G*.C* DNA base pair formed by the mutagenic tautomers can acquire, like the A-T Watson-Crick DNA base pair, four biologically important configurations, namely: Watson-Crick, reverse Watson-Crick, Hoogsteen and reverse Hoogsteen. This fact demonstrates a rather unexpected role of the tautomerisation of one of the Watson-Crick DNA base pairs, in particular via double proton transfer: it is exactly the G.C-->G*.C* tautomerisation that allows one to overcome steric hindrances to the implementation of the above-mentioned configurations. Geometric, electron-topological and energetic properties of the H-bonds that stabilise the studied pairs, as well as the energetic characteristics of the latter, are presented.
The essential component in DNA-based information storage system: robust error-tolerating module
Directory of Open Access Journals (Sweden)
Aldrin Kay-Yuen eYim
2014-11-01
Full Text Available The size of digital data is ever increasing and is expected to grow to 40,000 EB by 2020, yet the estimated global information storage capacity in 2011 was less than 300 EB, indicating that most of the data are transient. DNA, as a very stable nano-molecule, is an ideal massive storage device for long-term data archives. The two most notable illustrations are from Church et al. and Goldman et al., whose approaches are well optimized for most sequencing platforms: short synthesized DNA fragments without homopolymers. Here we suggest improvements in error-handling methodology that could enable the integration of DNA-based computational processes, e.g. algorithms based on self-assembly of DNA. As a proof of concept, a picture of 438 bytes was encoded into DNA with a Low-Density Parity-Check error-correction code. We salvaged a significant portion of the sequencing reads carrying mutations generated during DNA synthesis and sequencing, and successfully reconstructed the entire picture. A modular programming framework, DNAcodec, with an XML-based data format was also introduced. Our experiments demonstrate the practicability of long DNA message recovery with high error tolerance, which opens the field to biocomputing and synthetic biology.
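The basic idea of DNA data storage, binary data mapped onto nucleotide sequences, can be sketched in a few lines. The naive two-bits-per-base mapping below is illustrative only; it is not the DNAcodec scheme, and real schemes such as Goldman et al.'s additionally forbid homopolymer runs and add error-correcting redundancy (here LDPC) on top:

```python
BASES = "ACGT"

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, two bits per base
    (A=00, C=01, G=10, T=11), most significant bits first."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):
            out.append(BASES[(b >> shift) & 0b11])
    return "".join(out)

def decode(seq: str) -> bytes:
    """Invert encode(): pack each group of four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for base in seq[i:i + 4]:
            b = (b << 2) | BASES.index(base)
        out.append(b)
    return bytes(out)

msg = b"DNA"
print(encode(msg))  # → CACACATGCAAC
assert decode(encode(msg)) == msg
```

Note how the mapping happily emits runs like "CAAC"; avoiding such homopolymers (which confuse sequencers) is precisely why published codes use rotating base-3 alphabets instead of this direct base-4 one.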
I. Fisk
2013-01-01
Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...
Atkinson, Paul
2011-01-01
The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...
On an algorithm for solving parabolic and elliptic equations
D'Ascenzo, N.; Saveliev, V. I.; Chetverushkin, B. N.
2015-08-01
The present-day rapid growth of computer power, in particular, parallel computing systems of ultrahigh performance requires a new approach to the creation of models and solution algorithms for major problems. An algorithm for solving parabolic and elliptic equations is proposed. The capabilities of the method are demonstrated by solving astrophysical problems on high-performance computer systems with massive parallelism.
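As a point of reference for what such solvers do, here is a conventional explicit finite-difference march for the simplest parabolic model problem, the 1D heat equation u_t = α·u_xx. This is a generic textbook scheme for illustration, not the algorithm proposed in the paper (whose point is precisely to relax the restrictive stability limit r = α·Δt/Δx² ≤ 1/2 shown below on massively parallel hardware):

```python
import math

def heat_explicit(u, alpha, dx, dt, steps):
    """March u_t = alpha * u_xx with the explicit central-difference stencil,
    fixed (Dirichlet) boundary values. Stable only for alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    for _ in range(steps):
        u = ([u[0]]
             + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                for i in range(1, len(u) - 1)]
             + [u[-1]])
    return u

# Initial profile sin(pi*x) on [0,1] decays like exp(-pi**2 * alpha * t).
n = 21
dx = 1.0 / (n - 1)
u0 = [math.sin(math.pi * i * dx) for i in range(n)]
u = heat_explicit(u0, alpha=1.0, dx=dx, dt=1e-3, steps=100)
print(max(u))  # at t = 0.1, close to exp(-pi**2 * 0.1) ≈ 0.37
```

The per-step cost is trivially parallel across grid points, but the tiny admissible Δt is what makes explicit parabolic solvers expensive; schemes like the one in the paper are designed around exactly this bottleneck.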
I. Fisk
2010-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...
M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley
Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...
Recent technological advances have driven rapid development of DNA-based methods designed to facilitate detection and monitoring of invasive species in aquatic environments. These tools promise to significantly alleviate difficulties associated with traditional monitoring approac...
Laughlin, Patrick R
2011-01-01
Experimental research by social and cognitive psychologists has established that cooperative groups solve a wide range of problems better than individuals. Cooperative problem solving groups of scientific researchers, auditors, financial analysts, air crash investigators, and forensic art experts are increasingly important in our complex and interdependent society. This comprehensive textbook--the first of its kind in decades--presents important theories and experimental research about group problem solving. The book focuses on tasks that have demonstrably correct solutions within mathematical
Microwave-induced inactivation of DNA-based hybrid catalyst in asymmetric catalysis.
Zhao, Hua; Shen, Kai
2016-03-01
DNA-based hybrid catalysts have gained strong interest in asymmetric reactions. However, to maintain high enantioselectivity, these reactions are usually conducted at relatively low temperatures. We found that microwave irradiation inactivates the DNA-based hybrid catalyst even at low temperatures (such as 5 °C). Circular dichroism (CD) spectra and gel electrophoresis of DNA suggest that microwave exposure degrades DNA molecules and disrupts DNA double-stranded structures, causing changes in the DNA-metal ligand binding properties and thus poor DNA catalytic performance.
Tree Searching and Student Problem Solving
Alderman, Donald L.
1978-01-01
Tree searching was applied as a computer model of simple addition sentences. Results indicated that the number of problem reductions performed in tree searching accounted for most of the variance across problems in student error rate and solution time. The technique constitutes a computer test for the adequacy of a problem solving prescription.…
Solving the drift control problem
Directory of Open Access Journals (Sweden)
Melda Ormeci Matoglu
2015-12-01
Full Text Available We model the problem of managing capacity in a build-to-order environment as a Brownian drift control problem. We formulate a structured linear program that models a practical discretization of the problem and exploit a strong relationship between relative value functions and dual solutions to develop a functional lower bound for the continuous problem from a dual solution to the discrete problem. Refining the discretization proves a functional strong duality for the continuous problem. The linear programming formulation is so badly scaled, however, that solving it is beyond the capabilities of standard solvers. By demonstrating the equivalence between strongly feasible bases and deterministic unichain policies, we combinatorialize the pivoting process and by exploiting the relationship between dual solutions and relative value functions, develop a mechanism for solving the LP without ever computing its coefficients. Finally, we exploit the relationship between relative value functions and dual solutions to develop a scheme analogous to column generation for refining the discretization so as to drive the gap between the discrete approximation and the continuous problem to zero quickly while keeping the LP small. Computational studies show our scheme is much faster than simply solving a regular discretization of the problem both in terms of finding a policy with a low average cost and in terms of providing a lower bound on the optimal average cost.
I. Fisk
2010-01-01
Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...
P. McBride
It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...
M. Kasemann
Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...
M. Kasemann
CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...
I. Fisk
2011-01-01
Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...
I. Fisk
2012-01-01
Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...
Intelligent DNA-based molecular diagnostics using linked genetic markers
Energy Technology Data Exchange (ETDEWEB)
Pathak, D.K.; Perlin, M.W.; Hoffman, E.P.
1994-12-31
This paper describes a knowledge-based system for molecular diagnostics, and its application to fully automated diagnosis of X-linked genetic disorders. Molecular diagnostic information is used in clinical practice for determining genetic risks, such as carrier determination and prenatal diagnosis. Initially, blood samples are obtained from related individuals, and PCR amplification is performed. Linkage-based molecular diagnosis then entails three data analysis steps. First, for every individual, the alleles (i.e., DNA composition) are determined at specified chromosomal locations. Second, the flow of genetic material among the individuals is established. Third, the probability that a given individual is either a carrier of the disease or affected by the disease is determined. The current practice is to perform each of these three steps manually, which is costly, time consuming, labor-intensive, and error-prone. As such, the knowledge-intensive data analysis and interpretation supersede the actual experimentation effort as the major bottleneck in molecular diagnostics. By examining the human problem solving for the task, we have designed and implemented a prototype knowledge-based system capable of fully automating linkage-based molecular diagnostics in X-linked genetic disorders, including Duchenne Muscular Dystrophy (DMD). Our system uses knowledge-based interpretation of gel electrophoresis images to determine individual DNA marker labels, a constraint satisfaction search for consistent genetic flow among individuals, and a blackboard-style problem solver for risk assessment. We describe the system's successful diagnosis of DMD carrier and affected individuals from raw clinical data.
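The third step, risk assessment, ultimately reduces to a Bayesian update of a pedigree-derived prior by the observed evidence. The numbers below are a hypothetical textbook X-linked example, not data from the paper's system:

```python
def carrier_posterior(prior, lik_carrier, lik_noncarrier):
    """Bayes' rule for carrier risk:
    P(carrier | evidence) =
        P(carrier) * P(evidence | carrier) / P(evidence)."""
    num = prior * lik_carrier
    return num / (num + (1.0 - prior) * lik_noncarrier)

# Hypothetical example: the daughter of an obligate DMD carrier has a
# prior carrier risk of 1/2. She has two unaffected sons; each son is
# unaffected with probability 1/2 if she is a carrier, and with
# probability ~1 otherwise (ignoring new mutations).
risk = carrier_posterior(prior=0.5,
                         lik_carrier=0.5 * 0.5,
                         lik_noncarrier=1.0)
print(risk)  # → 0.2
```

In the full system this update is conditioned additionally on the linked-marker genotypes produced by the first two steps (allele calling and genetic-flow search), which sharpen the likelihood terms well beyond this toy calculation.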
Matthias Kasemann
Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...
M. Kasemann
Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transfer¬ring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...
P. MacBride
The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...
Contributions from I. Fisk
2012-01-01
Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...
I. Fisk
2012-01-01
Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...
2010-01-01
Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...
I. Fisk
2013-01-01
Computing operation has been at a lower level as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...
An Open Environment for Cooperative Equational Solving
Institute of Scientific and Technical Information of China (English)
(no author listed)
2001-01-01
We describe a system called CFLP which aims at the integration of the best features of functional logic programming (FLP), cooperative constraint solving (CCS), and distributed computing. FLP provides support for defining one's own abstractions over a constraint domain in an easy and comfortable way, whereas CCS is employed to solve systems of mixed constraints by iterating specialized constraint solving methods in accordance with a well defined strategy. The system is a distributed implementation of a cooperative constraint functional logic programming scheme that combines higher-order lazy narrowing with cooperative constraint solving. The model takes advantage of the existence of several constraint solving resources located in a distributed environment (e.g., a network of computers), which communicate asynchronously via message passing. To increase the openness of the system, we are redesigning CFLP based on CORBA. We discuss some design and implementation issues of the system.
Capturing Problem-Solving Processes Using Critical Rationalism
Chitpin, Stephanie; Simon, Marielle
2012-01-01
The examination of problem-solving processes continues to be a current research topic in education. Knowing how to solve problems is not only a key aspect of learning mathematics but is also at the heart of cognitive theories, linguistics, artificial intelligence, and computer science. Problem solving is a multistep, higher-order cognitive task…
I. Fisk
2011-01-01
Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...
Solving the Schroedinger equation using Smolyak interpolants.
Avila, Gustavo; Carrington, Tucker
2013-10-07
In this paper, we present a new collocation method for solving the Schroedinger equation. Collocation has the advantage that it obviates integrals. All previous collocation methods have, however, the crucial disadvantage that they require solving a generalized eigenvalue problem. By combining Lagrange-like functions with a Smolyak interpolant, we devise a collocation method that does not require solving a generalized eigenvalue problem. We exploit the structure of the grid to develop an efficient algorithm for evaluating the matrix-vector products required to compute energy levels and wavefunctions. Energies systematically converge as the number of points and basis functions are increased.
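The structural point of the abstract above can be sketched in generic collocation notation (a schematic formulation, not the authors' exact equations): standard collocation yields a generalized eigenvalue problem, which Lagrange-like functions reduce to a standard one.

```latex
% Expand \psi(x) = \sum_j c_j \varphi_j(x) and enforce the Schroedinger
% equation at collocation points x_i:
\sum_j c_j \,(\hat{H}\varphi_j)(x_i) = E \sum_j c_j \,\varphi_j(x_i)
\;\Longrightarrow\;
\mathbf{H}\,\mathbf{c} = E\,\mathbf{B}\,\mathbf{c},
\qquad B_{ij} = \varphi_j(x_i).
% With Lagrange-like functions satisfying \varphi_j(x_i) = \delta_{ij},
% B = I and the generalized problem becomes the standard eigenvalue
% problem \mathbf{H}\,\mathbf{c} = E\,\mathbf{c}.
```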
Institute of Scientific and Technical Information of China (English)
岳科来
2016-01-01
There are a considerable number of design philosophies and design methods in the world, but today I would like to introduce a new design problem-solving system that comes from the Chinese traditional religion of Dao.
Singh, Chandralekha
2016-01-01
One finding of cognitive research is that people do not automatically acquire usable knowledge by spending lots of time on task. Because students' knowledge hierarchy is more fragmented, "knowledge chunks" are smaller than those of experts. The limited capacity of short term memory makes the cognitive load high during problem solving tasks, leaving few cognitive resources available for metacognition. The abstract nature of the laws of physics and the chain of reasoning required to draw meaningful inferences makes these issues critical. In order to help students, it is crucial to consider the difficulty of a problem from the perspective of students. We are developing and evaluating interactive problem-solving tutorials to help students in the introductory physics courses learn effective problem-solving strategies while solidifying physics concepts. The self-paced tutorials can provide guidance and support for a variety of problem solving techniques, and opportunity for knowledge and skill acquisition.
M. Kasemann
CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...
Miller, Michael B
2010-01-01
Solving tooth sensitivity requires both you and the patients to be resilient and to understand that if one approach doesn't work, you can try another one that is non-invasive or, at worst, minimally invasive. Much like the clinician who posted the original question, I strongly believe that it is our responsibility to convince patients that jumping to a radical solution could be totally unnecessary--and expensive-- and still might not solve the problem.
An investigation on solving cooperative problem solving
Directory of Open Access Journals (Sweden)
Masoumeh Sadat Abtahi
2014-03-01
Full Text Available One of the most important techniques to improve teaching skills is to use the cooperative problem solving (CPS) approach. Implementing CPS techniques in elementary schools helps us train more creative generations. This paper presents an empirical investigation to find out how much elementary teachers use CPS techniques at different schools located in the city of Zanjan, Iran. The study designs a questionnaire and distributes it among 90 volunteers out of 120 teachers who were enrolled in elementary schools. The study analyzes the data using some basic statistics and the result indicates that teachers maintain an average CPS score of 39.37, which is well above the average level. The study provides some guidelines for exploring teachers' CPS capabilities.
Ab initio research on DNA base alkylation by the β-position metabolite of methylethylnitrosamine
Institute of Scientific and Technical Information of China (English)
ZHAO Lijiao; ZHONG Rugang; YUAN Xiaolong; CUI Yasong; DAI Qianhuan
2004-01-01
Ab initio calculation is carried out to study the different supposed mechanisms of DNA base alkylation by β-sulphate-nitrosamines at the RHF/6-31G(d) and MP2/6-31G(d) levels. Full geometric structure optimization is done for all reactants, intermediates, products and transition states. The activation energy and IRC are obtained. The results show that the anchimeric assistance effect promotes the alkylation of DNA bases by β-sulphate-nitrosamines. Solvent calculation is carried out with the Onsager model of the SCRF method at the same level. The results indicate that the activation energy is decreased obviously in water.
Trading a Problem-solving Task
Matsubara, Shigeo
This paper focuses on a task allocation problem, especially cases where the task is to find a solution in a search problem or a constraint satisfaction problem. If the search problem is hard to solve, a contractor may fail to find a solution. Here, the more computational resources such as the CPU time the contractor invests in solving the search problem, the more a solution is likely to be found. This brings about a new problem that a contractee has to find an appropriate level of the quality in a task achievement as well as to find an efficient allocation of a task among contractors. For example, if the contractee asks the contractor to find a solution with certainty, the payment from the contractee to the contractor may exceed the contractee's benefit from obtaining a solution, which discourages the contractee from trading a task. However, solving this problem is difficult because the contractee cannot ascertain the contractor's problem-solving ability such as the amount of available resources and knowledge (e.g. algorithms, heuristics) or monitor what amount of resources are actually invested in solving the allocated task. To solve this problem, we propose a task allocation mechanism that is able to choose an appropriate level of the quality in a task achievement and prove that this mechanism guarantees that each contractor reveals its true information. Moreover, we show that our mechanism can increase the contractee's utility compared with a simple auction mechanism by using computer simulation.
Learning Matlab a problem solving approach
Gander, Walter
2015-01-01
This comprehensive and stimulating introduction to Matlab, a computer language now widely used for technical computing, is based on an introductory course held at Qian Weichang College, Shanghai University, in the fall of 2014. Teaching and learning a substantial programming language aren’t always straightforward tasks. Accordingly, this textbook is not meant to cover the whole range of this high-performance technical programming environment, but to motivate first- and second-year undergraduate students in mathematics and computer science to learn Matlab by studying representative problems, developing algorithms and programming them in Matlab. While several topics are taken from the field of scientific computing, the main emphasis is on programming. A wealth of examples are completely discussed and solved, allowing students to learn Matlab by doing: by solving problems, comparing approaches and assessing the proposed solutions.
Solving the factorization problem with P systems
Institute of Scientific and Technical Information of China (English)
Alberto Leporati; Claudio Zandron; Giancarlo Mauri
2007-01-01
P systems have been used many times to tackle computationally difficult problems, such as NP-complete decision problems and NP-hard optimization problems. In this paper we focus our attention on another computationally intractable problem: factorization. In particular, we first propose a simple method to encode binary numbers using multisets. Then, we describe three families of P systems: the first two allow one to add and to multiply two binary-encoded numbers, respectively, and the third solves the factorization problem.
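The binary-multiset encoding idea can be sketched in conventional code (a toy illustration; the symbol names are hypothetical and not the paper's P system notation):

```python
# Hypothetical sketch of encoding binary numbers as multisets, in the spirit
# of membrane (P system) constructions: bit i of n is represented by an
# object ("b", i, bit). Object names here are illustrative only.
from collections import Counter

def encode(n: int, width: int) -> Counter:
    """Encode n as a multiset of (symbol, position, bit-value) objects."""
    return Counter(("b", i, (n >> i) & 1) for i in range(width))

def decode(ms: Counter) -> int:
    """Recover the integer from the multiset."""
    return sum(bit << pos for (_, pos, bit) in ms)

m = encode(13, 4)            # 13 = 1101 in binary -> four objects
assert decode(m) == 13
```

A real P system would then manipulate such multisets with rewriting rules inside membranes; here the encoding/decoding pair only shows that the representation is lossless.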
TAA Polyepitope DNA-Based Vaccines: A Potential Tool for Cancer Therapy
Directory of Open Access Journals (Sweden)
Roberto Bei
2010-01-01
Full Text Available DNA-based cancer vaccines represent an attractive strategy for inducing immunity to tumor associated antigens (TAAs) in cancer patients. The demonstration that the delivery of a recombinant plasmid encoding epitopes can lead to epitope production, processing, and presentation to CD8+ T-lymphocytes, and the advantage of using a single DNA construct encoding multiple epitopes of one or more TAAs to elicit a broad spectrum of cytotoxic T-lymphocytes, have encouraged the development of a variety of strategies aimed at increasing the immunogenicity of TAA polyepitope DNA-based vaccines. The polyepitope DNA-based cancer vaccine approach can (a) circumvent the variability of peptide presentation by tumor cells, (b) allow the introduction in the plasmid construct of multiple immunogenic epitopes including heteroclitic epitope versions, and (c) permit the enrollment of patients with different major histocompatibility complex (MHC) haplotypes. This review will discuss the rationale for using the TAA polyepitope DNA-based vaccination strategy and recent results corroborating the usefulness of DNA encoding polyepitope vaccines as a potential tool for cancer therapy.
Potential for DNA-based ID of Great Lakes fauna: Species inventories vs. barcode libraries
DNA-based identification of mixed-organism samples offers the potential to greatly reduce the need for resource-intensive morphological identification, which would be of value both to biotic condition assessment and non-native species early-detection monitoring. However the abil...
DNA-based approaches to identify forest fungi in Pacific Islands: A pilot study
Anna E. Case; Sara M. Ashiglar; Phil G. Cannon; Ernesto P. Militante; Edwin R. Tadiosa; Mutya Quintos-Manalo; Nelson M. Pampolina; John W. Hanna; Fred E. Brooks; Amy L. Ross-Davis; Mee-Sook Kim; Ned B. Klopfenstein
2013-01-01
DNA-based diagnostics have been successfully used to characterize diverse forest fungi (e.g., Hoff et al. 2004, Kim et al. 2006, Glaeser & Lindner 2011). DNA sequencing of the internal transcribed spacer (ITS) and large subunit (LSU) regions of nuclear ribosomal DNA (rDNA) has proved especially useful (Sonnenberg et al. 2007, Seifert 2009, Schoch et al. 2012) for...
DNA-based identification of Armillaria isolates from peach orchards in Mexico state
Ruben Damian Elias Roman; Ned B. Klopfenstein; Dionicio Alvarado Rosales; Mee-Sook Kim; Anna E. Case; Sara M. Ashiglar; John W. Hanna; Amy L. Ross-Davis; Remigio A. Guzman Plazola
2012-01-01
A collaborative project between the Programa de Fitopatología, Colegio de Postgraduados, Texcoco, Estado de México and the USDA Forest Service - RMRS, Moscow Forest Pathology Laboratory began this year (2011) to assess which species of Armillaria are causing widespread and severe damage to peach orchards in México state, Mexico. We are employing a DNA-based...
Jiji, Sun; Xiaoxu, Zhao; Lihua, Qiao; Shuang, Mei; Zhipeng, Nie; Qinghai, Zhang; Yanchun, Ji; Pingping, Jiang; Min-Xin, Guan
2016-07-20
Mitochondrial DNA (mtDNA) mutations cause a variety of mitochondrial DNA-based diseases, which have been studied using lymphoblastoid cell lines (LCLs) and transmitochondrial cybrids. Individual genetic information is preserved permanently in LCLs, while the development of transmitochondrial cybrids provides an ex vivo cellular platform to study the molecular mechanisms of mitochondrial DNA-based diseases. The cytoplasmic donor cells for previous transmitochondrial cybrids came directly from patient tissue or platelets. Here, we describe in detail the principles, methods and techniques to establish LCLs from frozen peripheral blood harboring the mitochondrial 4401G > A mutation by infection with Epstein-Barr virus, and then to generate cybrids using ρ(0) 206 cells and the LCLs. The process of establishing these two cellular models was summarized in four steps as follows: (1) generation of LCLs; (2) transformation; (3) selection; (4) verification. To faithfully represent the function of the mtDNA mutation, we analyzed and identified the sites of mtDNA mutations and the copy numbers of each cellular model, as well as the karyotype of the transmitochondrial cybrids. Clones with consistent parameters were selected for preservation and future analysis of the function of point mutations of mtDNA. Although these two cellular models play important roles in understanding the molecular mechanism of mitochondrial DNA-based diseases at the cellular level, their limitations should be considered when elucidating the tissue specificity of mitochondrial DNA-based diseases.
DNA-based identification and phylogeny of North American Armillaria species
Amy L. Ross-Davis; John W. Hanna; Ned B. Klopfenstein
2011-01-01
Because Armillaria species display different ecological behaviors across diverse forest ecosystems, it is critical to identify Armillaria species accurately for any assessment of forest health. To further develop DNA-based identification methods, partial sequences of the translation elongation factor-1 alpha (EF-1α) gene were used to examine the phylogenetic...
Solving Problems through Circles
Grahamslaw, Laura; Henson, Lisa H.
2015-01-01
Several problem-solving interventions that utilise a "circle" approach have been applied within the field of educational psychology, for example, Circle Time, Circle of Friends, Sharing Circles, Circle of Adults and Solution Circles. This research explored two interventions, Solution Circles and Circle of Adults, and used thematic…
Problem Solving Techniques Seminar.
Massachusetts Career Development Inst., Springfield.
This booklet is one of six texts from a workplace literacy curriculum designed to assist learners in facing the increased demands of the workplace. Six problem-solving techniques are developed in the booklet to assist individuals and groups in making better decisions: problem identification, data gathering, data analysis, solution analysis,…
DEFF Research Database (Denmark)
Foss, Kirsten; Foss, Nicolai Juul
2006-01-01
as a general approach to problem solving. We apply these Simonian ideas to organisational issues, specifically new organisational forms. These ideas allow us to develop a morphology of new organisational forms and to point to some design problems that characterise these forms....
Mathematics as Problem Solving.
Soifer, Alexander
This book contains about 200 problems. It is suggested that it be used by students, teachers or anyone interested in exploring mathematics. In addition to a general discussion on problem solving, there are problems concerned with number theory, algebra, geometry, and combinatorics. (PK)
Ayrinhac, Simon
2014-01-01
We present in this work a demonstration of the maze-solving problem with electricity. Electric current flowing in a maze as a printed circuit produces Joule heating and the right way is instantaneously revealed with infrared thermal imaging. The basic properties of electric current can be discussed in this context, with this challenging question:…
Universal Design Problem Solving
Sterling, Mary C.
2004-01-01
Universal design is made up of four elements: accessibility, adaptability, aesthetics, and affordability. This article addresses the concept of universal design problem solving through experiential learning for an interior design studio course in postsecondary education. Students' experiences with clients over age 55 promoted an understanding of…
Newton type methods for solving nonsmooth equations
Institute of Scientific and Technical Information of China (English)
Gao Yan
2005-01-01
Numerical methods for the solution of nonsmooth equations are studied. A new subdifferential for a locally Lipschitzian function is proposed. Based on this subdifferential, Newton methods for solving nonsmooth equations are developed and their convergence is shown. Since this subdifferential is easy to compute, the present Newton methods can be executed easily in some applications.
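A generalized-derivative Newton iteration of this flavour can be sketched for a scalar nonsmooth equation (a generic semismooth Newton toy, not the paper's specific subdifferential):

```python
# A minimal semismooth Newton sketch: solve f(x) = |x - 2| + x - 3 = 0
# using one element g(x) of a generalized derivative in place of f'(x).
def f(x):
    return abs(x - 2.0) + x - 3.0

def g(x):
    # an element of the Clarke subdifferential of f at x
    return (1.0 if x >= 2.0 else -1.0) + 1.0

def newton(x, tol=1e-12, itmax=50):
    for _ in range(itmax):
        fx = f(x)
        if abs(fx) < tol:
            break
        d = g(x)
        if d == 0.0:          # subgradient 0 (f is flat for x < 2): step right
            x += 0.5
            continue
        x -= fx / d
    return x

root = newton(5.0)
print(round(root, 6))  # → 2.5
```

Starting from x = 5 the iteration lands on the root x = 2.5 in one step, since f is affine on [2, ∞); the zero-subgradient guard handles the flat piece of f.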
Institute of Scientific and Technical Information of China (English)
(no author listed)
2002-01-01
This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.
Berezin, I S
1965-01-01
Computing Methods, Volume 2 is a five-chapter text that presents the numerical methods of solving sets of several mathematical equations. This volume includes computation sets of linear algebraic equations, high degree equations and transcendental equations, numerical methods of finding eigenvalues, and approximate methods of solving ordinary differential equations, partial differential equations and integral equations.The book is intended as a text-book for students in mechanical mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the
Quantum Computing for Computer Architects
Metodi, Tzvetan
2011-01-01
Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore
DEFF Research Database (Denmark)
Hansen, David
2012-01-01
Many industrial production work systems have increased in complexity, and their new business models compete on innovation, rather than low cost. At a medical device production facility committed to Lean Production, a research project was carried out to use Appreciative Inquiry to better engage employee strengths in continuous improvements of the work system. The research question was: “How can Lean problem solving and Appreciative Inquiry be combined for optimized work system innovation?” The research project was carried out as a co-creation process with close cooperation between researchers and participants and was documented by qualitative methods. This paper presents an academic literature review on Appreciative Inquiry and problem solving for continuous improvements that did not reveal successful attempts in combining the two. Both the literature and the empirical study showed one of the main...
Creativity and Problem Solving
Directory of Open Access Journals (Sweden)
René Victor Valqui Vidal
2004-12-01
Full Text Available This paper presents some modern and interdisciplinary concepts about creativity and creative processes of special relevance for Operational Research workers. Central publications in the area Creativity-Operational Research are shortly reviewed. Some creative tools and the Creative Problem Solving approach are also discussed. Finally, some applications of these concepts and tools are outlined. Some central references are presented for further study of themes related to creativity or creative tools.
Creativity and Problem Solving
DEFF Research Database (Denmark)
Vidal, Rene Victor Valqui
2004-01-01
This paper presents some modern and interdisciplinary concepts about creativity and creative processes of special relevance for Operational Research workers. Central publications in the area Creativity-Operational Research are shortly reviewed. Some creative tools and the Creative Problem Solving approach are also discussed. Finally, some applications of these concepts and tools are outlined. Some central references are presented for further study of themes related to creativity or creative tools.
Hansen, David
2012-01-01
Many industrial production work systems have increased in complexity, and their new business models compete on innovation, rather than low cost. At a medical device production facility committed to Lean Production, a research project was carried out to use Appreciative Inquiry to better engage employee strengths in continuous improvements of the work system. The research question was: “How can Lean problem solving and Appreciative Inquiry be combined for optimized work system innovation?” The r...
2017-03-01
the "SAS+-level Changes" section. Many modern heuristics use a technique called "delete-relaxation". Delete relaxation does not handle counts and...use the same algorithm, representation, and use the same technique to generate their heuristics. The drawback of this is that there is almost never...one problem-solving technique, one representation, or one way to create heuristics that works well on all problems/domains. There is a tradeoff
Programming languages for business problem solving
Wang, Shouhong
2007-01-01
It has become crucial for managers to be computer literate in today's business environment. It is also important that those entering the field acquire the fundamental theories of information systems, the essential practical skills in computer applications, and the desire for life-long learning in information technology. Programming Languages for Business Problem Solving presents a working knowledge of the major programming languages, including COBOL, C++, Java, HTML, JavaScript, VB.NET, VBA, ASP.NET, Perl, PHP, XML, and SQL, used in the current business computing environment. The book examin
Multiscale empirical interpolation for solving nonlinear PDEs
Calo, Victor M.
2014-12-01
In this paper, we propose a multiscale empirical interpolation method for solving nonlinear multiscale partial differential equations. The proposed method combines empirical interpolation techniques and local multiscale methods, such as the Generalized Multiscale Finite Element Method (GMsFEM). To solve nonlinear equations, the GMsFEM is used to represent the solution on a coarse grid with multiscale basis functions computed offline. Computing the GMsFEM solution involves calculating the system residuals and Jacobians on the fine grid. We use empirical interpolation concepts to evaluate these residuals and Jacobians of the multiscale system with a computational cost which is proportional to the size of the coarse-scale problem rather than the fully-resolved fine scale one. The empirical interpolation method uses basis functions which are built by sampling the nonlinear function we want to approximate a limited number of times. The coefficients needed for this approximation are computed in the offline stage by inverting an inexpensive linear system. The proposed multiscale empirical interpolation techniques: (1) divide computing the nonlinear function into coarse regions; (2) evaluate contributions of nonlinear functions in each coarse region taking advantage of a reduced-order representation of the solution; and (3) introduce multiscale proper-orthogonal-decomposition techniques to find appropriate interpolation vectors. We demonstrate the effectiveness of the proposed methods on several nonlinear multiscale PDEs that are solved with Newton's methods and fully-implicit time marching schemes. Our numerical results show that the proposed methods provide a robust framework for solving nonlinear multiscale PDEs on a coarse grid with bounded error and significant computational cost reduction.
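The index-selection step of empirical interpolation can be sketched in a few lines (a generic DEIM-style illustration; the model function, snapshot family, and basis size are invented for the example, and the GMsFEM coupling is omitted):

```python
# A compact DEIM-style sketch: build a POD basis from snapshots of a
# parameterized nonlinear function, greedily pick interpolation points,
# then approximate a new evaluation from only a few sampled entries.
import numpy as np

def deim_indices(U):
    """Greedy DEIM point selection from the columns of the basis U."""
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, U.shape[1]):
        c = np.linalg.solve(U[np.ix_(idx, range(j))], U[idx, j])
        r = U[:, j] - U[:, :j] @ c          # residual of interpolating col j
        idx.append(int(np.argmax(np.abs(r))))
    return np.array(idx)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
snaps = np.column_stack([np.exp(-mu * x) * np.sin(np.pi * x)
                         for mu in rng.uniform(0.5, 3.0, 40)])
U, _, _ = np.linalg.svd(snaps, full_matrices=False)
U = U[:, :6]                                 # keep 6 POD modes
P = deim_indices(U)

f_new = np.exp(-1.7 * x) * np.sin(np.pi * x) # parameter not in the snapshots
f_hat = U @ np.linalg.solve(U[P, :], f_new[P])
err = np.max(np.abs(f_hat - f_new))
print(err)  # small: only 6 of 200 entries of f_new were sampled
```

The point mirrors the abstract: the cost of evaluating the nonlinear term scales with the number of interpolation points (here 6), not with the fine-grid dimension (here 200).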
MULTILEVEL AUGMENTATION METHODS FOR SOLVING OPERATOR EQUATIONS
Institute of Scientific and Technical Information of China (English)
Chen Zhongying; Wu Bin; Xu Yuesheng
2005-01-01
We introduce multilevel augmentation methods for solving operator equations based on direct sum decompositions of the range space of the operator and the solution space of the operator equation and a matrix splitting scheme. We establish a general setting for the analysis of these methods, showing that the methods yield approximate solutions of the same convergence order as the best approximation from the subspace. These augmentation methods allow us to develop fast, accurate and stable nonconventional numerical algorithms for solving operator equations. In particular, for second kind equations, special splitting techniques are proposed to develop such algorithms. These algorithms are then applied to solve the linear systems resulting from matrix compression schemes using wavelet-like functions for solving Fredholm integral equations of the second kind. For this special case, a complete analysis for computational complexity and convergence order is presented. Numerical examples are included to demonstrate the efficiency and accuracy of the methods. In these examples we use the proposed augmentation method to solve large scale linear systems resulting from the recently developed wavelet Galerkin methods and fast collocation methods applied to integral equations of the second kind. Our numerical results confirm that this augmentation method is particularly efficient for solving large scale linear systems induced from wavelet compression schemes.
Parallel Processing Algorithms for Solving Factorization and Knapsack Problems
Directory of Open Access Journals (Sweden)
G.Aloy Anuja Mary
2012-03-01
Full Text Available Quantum and evolutionary computation are new forms of computing, each with a unique paradigm for designing algorithms. Shor's algorithm is based on quantum concepts such as qubits, superposition, and interference; it solves the factoring problem, which will have a great impact on cryptography once quantum computers become a reality. The genetic algorithm is a computational paradigm based on natural evolution, including survival of the fittest, reproduction, and mutation; it is used here to solve the NP-hard knapsack problem. Both algorithms achieve speedup in computation by exploiting parallelism in processing.
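A minimal genetic algorithm for the 0/1 knapsack problem mentioned above might look as follows. The instance data, population size, and operators are illustrative assumptions, not taken from the paper:

```python
import random

# Toy 0/1 knapsack instance (invented values/weights, capacity 50)
values  = [60, 100, 120, 75, 40]
weights = [10,  20,  30, 15,  5]
CAP = 50

def fitness(bits):
    w = sum(w_ for b, w_ in zip(bits, weights) if b)
    v = sum(v_ for b, v_ in zip(bits, values) if b)
    return v if w <= CAP else 0          # infeasible chromosomes score 0

def evolve(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in values] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # selection: fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(values))   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(child))         # single-point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Each of the pop_size fitness evaluations per generation is independent, which is the parallelism the abstract alludes to.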
Solve the Master Equation in Python
Fan, Wei; Chen, Bing; Ye, Qianqian
2011-01-01
A brief introduction to the Python computing environment is given. By solving the master equation encountered in quantum transport, we give an example of how to solve ODE problems in Python. The ODE solvers used are the ZVODE routine in SciPy and the bsimp solver in GSL. For the former, the equation can be kept in its complex-valued form, while for the latter it has to be rewritten in a real-valued form. The focus is on the detailed workflow of the implementation process, rather than on the syntax of the Python language, in the hope of helping readers simulate their own models in Python.
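The ZVODE route mentioned above can be exercised with a one-line complex ODE standing in for a single master-equation matrix element. This is a minimal sketch; the model dy/dt = -iωy and its parameters are assumptions for illustration, not the paper's transport model:

```python
import numpy as np
from scipy.integrate import ode

w = 2.0  # assumed angular frequency for the toy model

def rhs(t, y):
    # complex-valued right-hand side: dy/dt = -i*w*y
    return -1j * w * y

solver = ode(rhs).set_integrator('zvode', atol=1e-10, rtol=1e-10)
solver.set_initial_value([1.0 + 0j], 0.0)
solver.integrate(1.0)

y_num = solver.y[0]
y_exact = np.exp(-1j * w * 1.0)   # analytic solution for comparison
```

The point of ZVODE is visible here: the equation is integrated directly in complex arithmetic, with no splitting into real and imaginary parts as GSL's bsimp would require.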
Studies and Applications of DNA-based Identity Testing in Forensic Science
Institute of Scientific and Technical Information of China (English)
李生斌; 阎春霞; 赖江华; 汪建; 杨焕明
2001-01-01
With advances in the study of DNA polymorphism in the human genome, radical changes have taken place in forensic individual identification and paternity testing. This paper discusses research progress on new genetic markers and the various DNA-based identification techniques used in forensic science, their application prospects, and the problems that urgently remain to be solved.
Hydrogen bond disruption in DNA base pairs from ¹⁴C transmutation.
Sassi, Michel; Carter, Damien J; Uberuaga, Blas P; Stanek, Christopher R; Mancera, Ricardo L; Marks, Nigel A
2014-09-04
Recent ab initio molecular dynamics simulations have shown that radioactive carbon does not normally fragment DNA bases when it decays. Motivated by this finding, density functional theory and Bader analysis have been used to quantify the effect of C → N transmutation on hydrogen bonding in DNA base pairs. We find that ¹⁴C decay has the potential to significantly alter hydrogen bonds in a variety of ways, including direct proton shuttling (thymine and cytosine), thermally activated proton shuttling (guanine), and hydrogen bond breaking (cytosine). Transmutation substantially modifies both the absolute and relative strengths of the hydrogen bonding pattern, and in two instances (adenine and cytosine) the density at the critical point indicates development of mild covalent character. Since hydrogen bonding is an important component of Watson-Crick pairing, these ¹⁴C-induced modifications, while infrequent, may trigger errors in DNA transcription and replication.
An Efficient Approach in Analysis of DNA Base Calling Using Neural Fuzzy Model
2017-01-01
This paper addresses the issue of true representation and provides a reliable measure for analyzing DNA base calling. The implemented method deals with data-set quality in DNA sequencing analysis, investigating the use of neuro-fuzzy techniques to predict a confidence value for each base in DNA base calling. The simulation model is an ANFIS design containing three subsystems and a main system: the three subsystems extract three features, and the main system uses those features to predict the confidence value for each base. The approach achieves effective results with high performance. PMID:28261268
Electronic properties and assembly of DNA-based molecules on gold surfaces
DEFF Research Database (Denmark)
Salvatore, Princia
Electrochemical methods (cyclic voltammetry and in situ STM) and SERS were used, respectively. These studies proved adsorption of the DNA bases (adenine, cytosine, guanine, and thymine) on the gold substrate, disclosing distinct adsorption patterns for each of them, with new insight into nucleobase assembly on a freshly cleaned Au(110) surface (not nearly as widely employed as Au(111) surfaces). A highly base-specific voltammetric peak was observed in the presence of spermidine ions; a capacitive origin was attributed to this peak, and a novel route to detection of hybridization and base-pair mismatches was proposed on the basis of the high sensitivity to base-pair mismatches shown by such ON-based monolayers. In particular, SERS offered a valuable and rapid way of characterising interactions between the DNA-based molecules and the NP surface, with no need for complex sample preparation.
Universal spectrum for DNA base CG frequency distribution in Takifugu rubripes (Puffer fish) genome
Selvam, A M
2007-01-01
The frequency distributions of the DNA bases A, C, G, and T exhibit fractal fluctuations, namely a zigzag pattern of increases followed by decreases of all orders of magnitude along the length of the DNA molecule. Self-similar fractal fluctuations are ubiquitous in the space-time fluctuations of dynamical systems in nature. The power spectra of fractal fluctuations exhibit an inverse power-law form, signifying long-range space-time correlations such that there is two-way communication between local (small-scale) and global (large-scale) perturbations. In this paper it is shown that the DNA base CG frequency distribution in Takifugu rubripes (puffer fish) Genome Release 4 exhibits the universal inverse power-law form of the statistical normal distribution, consistent with a general systems theory model prediction of quantum-like chaos governing fractal space-time distributions. The model predictions are (i) a quasicrystalline Penrose tiling pattern for the nested coiled structure, thereby achieving maximum packing efficiency for the DNA m...
Universal spectrum for DNA base C+G concentration variability in Human chromosome Y
Selvam, A M
2004-01-01
The spatial distribution of the DNA base sequence A, C, G, and T exhibits self-similar fractal fluctuations, and the corresponding power spectra follow an inverse power-law form, which implies the following: (1) a scale-invariant eddy continuum, namely, the amplitudes of component eddies are related to each other by a scale factor alone; in general, the scale factor is different for different scale ranges and indicates a multifractal structure for the spatial distribution of the DNA base sequence; (2) long-range spatial correlations of the eddy fluctuations. A multifractal structure of space-time fluctuations and the associated inverse power-law form of the power spectra are generic to spatially extended dynamical systems in nature and are a signature of self-organized criticality. The exact physical mechanism for the observed self-organized criticality is not yet identified. The author has developed a general systems theory where quantum mechanical laws emerge as self-consistent explanations for the observed long-range space-time...
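The power-spectrum analysis described in these two records can be reproduced for a null model with a few lines of NumPy. The sketch below computes the spectrum of the C+G indicator of an i.i.d. random sequence, for which the spectrum is flat; the papers' claim is that real genomes instead show an inverse power-law (1/f-like) form. The sequence length and seed are arbitrary choices:

```python
import numpy as np

# Null model: an uncorrelated random base sequence
rng = np.random.default_rng(42)
seq = rng.choice(list("ACGT"), size=4096)
cg = np.isin(seq, ["C", "G"]).astype(float)   # C+G indicator along the sequence

fluct = cg - cg.mean()                        # remove the mean concentration
power = np.abs(np.fft.rfft(fluct)) ** 2       # periodogram
freq = np.fft.rfftfreq(len(fluct))

# Log-log slope estimates the spectral exponent: near 0 for white noise,
# negative for a long-range correlated (inverse power-law) sequence.
mask = freq > 0
slope = np.polyfit(np.log(freq[mask]), np.log(power[mask] + 1e-12), 1)[0]
```

Running the same pipeline on a real chromosome's C+G indicator is how one would test the claimed departure from this flat baseline.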
Huang, Shuo; Chang, Shuai; He, Jin; Zhang, Peiming; Liang, Feng; Tuchband, Michael; Li, Shengqing; Lindsay, Stuart
2010-12-09
The DNA bases interact strongly with gold electrodes, complicating efforts to measure the tunneling conductance through hydrogen-bonded Watson-Crick base pairs. When bases are embedded in a self-assembled alkane-thiol monolayer to minimize these interactions, new features appear in the tunneling data. These new features track the predictions of density-functional calculations quite well, suggesting that they reflect tunnel conductance through hydrogen-bonded base pairs.
DNA Bases Thymine and Adenine in Bio-Organic Light Emitting Diodes
2014-11-24
DNA Bases Thymine and Adenine in Bio-Organic Light Emitting Diodes. Eliot F. Gomez, Vishak Venkatraman, James G. Grote & Andrew J. Steckl. We report on the use of nucleic acid bases (NBs) in organic light emitting diodes (OLEDs). NBs are small molecules that are the basic... polymer has been a frequent natural material integrated in electronic devices. DNA has been used in organic light-emitting diodes (OLEDs).
DNA-based dye lasers: progress in this half a decade
Kawabe, Yutaka
2016-09-01
After the invention of DNA-surfactant films and the proposal by Ogata of doping dyes into them, many applications have been demonstrated. Among them, the tunable thin-film laser is one of the most attractive functional devices. Development and progress in DNA-based lasers following our first observation of amplified spontaneous emission (ASE) were reviewed in an earlier paper published in 2011. In this proceeding, progress over the subsequent half decade is described.
The impact of chimerism in DNA-based forensic sex determination analysis
George, Renjith; Donald, Preethy Mary; Nagraj, Sumanth Kumbargere; Idiculla, Jose Joy; Hj Ismail, Rashid
2013-01-01
Sex determination is the most important step in personal identification in forensic investigations. DNA-based sex determination analysis is comparatively more reliable than the other conventional methods of sex determination analysis. Advanced technology like real-time polymerase chain reaction (PCR) offers accurate and reproducible results and is at the level of legal acceptance. But still there are situations like chimerism where an individual possesses both male and female specific factors together in their body. Sex determination analysis in such cases can give erroneous results. This paper discusses the phenomenon of chimerism and its impact on sex determination analysis in forensic investigations.
1982-10-01
Planning and Problem Solving, by Paul R. Cohen: Chapter XV of Volume III of the Handbook of Artificial Intelligence, edited by Paul R. Cohen and Edward A. Feigenbaum. The chapter was written by Paul R. Cohen, with contributions by Stephen...
DNA-based cryptographic methods for data hiding in DNA media.
Marwan, Samiha; Shawish, Ahmed; Nagaty, Khaled
2016-12-01
Information security can be achieved using cryptography, steganography, or a combination of the two, where data is first encrypted using any of the available cryptographic techniques and then hidden in a hiding medium. Recently, genomic DNA has been introduced as a hiding medium, known as DNA steganography, owing to its notable ability to hide huge data sets with a high level of randomness and hence security. Despite the numerous cryptographic techniques available, to our knowledge only the Vigenère cipher and the DNA-based Playfair cipher have been combined with DNA steganography, which leaves room for investigating other techniques and developing new improvements. This paper presents a comprehensive analysis of the DNA-based Playfair, Vigenère, RSA, and AES ciphers, each combined with a DNA hiding technique. The conducted analysis reports the performance of each combined technique in terms of security, speed, and hiding capacity, in addition to both key size and data size. Moreover, this paper proposes a modification of the current combined DNA-based Playfair cipher technique, which makes it not only simple and fast but also provides a significantly higher hiding capacity and security. Extensive experimental studies confirm this outstanding performance in comparison with all the discussed combined techniques.
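The encrypt-then-hide pipeline described above can be illustrated with a toy cipher: XOR-encrypt the payload, then encode the ciphertext as DNA bases at two bits per base. This is a didactic sketch, not one of the ciphers benchmarked in the paper:

```python
# 2-bit binary-to-base mapping (a common convention, assumed here)
B2DNA = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DNA2B = {v: k for k, v in B2DNA.items()}

def encrypt_to_dna(data: bytes, key: bytes) -> str:
    # Step 1: XOR stream cipher (toy); Step 2: 4 bases per ciphertext byte
    cipher = bytes(d ^ key[i % len(key)] for i, d in enumerate(data))
    return "".join(B2DNA[(byte >> s) & 0b11]
                   for byte in cipher for s in (6, 4, 2, 0))

def decrypt_from_dna(dna: str, key: bytes) -> bytes:
    # Reassemble each byte from 4 bases, then undo the XOR
    cipher = bytes(
        (DNA2B[dna[i]] << 6) | (DNA2B[dna[i + 1]] << 4)
        | (DNA2B[dna[i + 2]] << 2) | DNA2B[dna[i + 3]]
        for i in range(0, len(dna), 4))
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(cipher))

strand = encrypt_to_dna(b"secret", b"key")
```

A real scheme would replace the XOR step with AES or RSA (as the paper compares) and hide the strand inside a carrier genome rather than transmit it directly.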
Alternative options for DNA-based experimental therapy of β-thalassemia.
Gambari, Roberto
2012-04-01
Beta-thalassemias are caused by more than 200 mutations of the β-globin gene, leading to low or absent production of adult hemoglobin. Achievements have been made with innovative therapeutic strategies for β-thalassemias, based on research conducted at the levels of gene structure, transcription, mRNA processing and protein synthesis. The objective of this review is to describe the development of therapeutic strategies employing viral and non-viral DNA-based approaches for treatment of β-thalassemia. Modification of β-globin gene expression in β-thalassemia cells has been achieved by gene therapy, correction of the mutated β-globin gene and RNA repair. In addition, cellular therapy has been proposed for β-thalassemia, including reprogramming of somatic cells to generate induced pluripotent stem cells to be genetically corrected. Based on the concept that increased production of fetal hemoglobin (HbF) is beneficial in β-thalassemia, DNA-based approaches to increase HbF production have been optimized, including treatment of target cells with lentiviral vectors carrying γ-globin genes. Finally, DNA-based targeting of α-globin gene expression has been applied to reduce the excess of α-globin production by β-thalassemia cells, one of the major causes of the clinical phenotype.
Institute of Scientific and Technical Information of China (English)
Anonymous
2004-01-01
The basic ideas and principles of granular computing (GrC) have been studied, explicitly or implicitly, in many fields in isolation. With the recent renewed and fast-growing interest, it is time to extract the commonality from this diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives: granular computing as structured thinking, and granular computing as structured problem solving. From the philosophical perspective, or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of this philosophy at the application level deals with structured problem solving.
Solving Differential Equations in R: Package deSolve
Soetaert, K.E.R.; Petzoldt, T.; Setzer, R.W.
2010-01-01
In this paper we present the R package deSolve for solving initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1, and partial differential equations (PDE), the latter solved using the method-of-lines approach.
Solving jigsaw puzzles using image features
DEFF Research Database (Denmark)
Nielsen, Ture R.; Drewsen, Peter; Hansen, Klaus
2008-01-01
In this article, we describe a method for automatically solving the jigsaw puzzle problem based on image features instead of the shapes of the pieces. The image features are used to obtain an accurate measure of edge similarity for use in a new edge-matching algorithm. The algorithm is used in a general puzzle-solving method based on a greedy algorithm previously proved successful, and exploits the divide-and-conquer paradigm to reduce the combinatorially complex problem by classifying the puzzle pieces and comparing pieces drawn from the same group. We have been able to solve computer-generated puzzles of 320 pieces as well as a real puzzle of 54 pieces by exclusively using image information. Additionally, we investigate a new scalable... The paper includes a brief preliminary investigation of some image features used in the classification.
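The edge-similarity measure at the heart of such methods can be sketched as a sum of squared differences between abutting pixel columns. Everything below (the synthetic data, piece sizes, and the SSD score itself) is an illustrative assumption, not the paper's actual feature set:

```python
import numpy as np

def edge_ssd(left_piece, right_piece):
    # Compare the rightmost column of one piece with the leftmost of another
    return float(((left_piece[:, -1, :] - right_piece[:, 0, :]) ** 2).sum())

def best_right_neighbour(piece, candidates):
    scores = [edge_ssd(piece, c) for c in candidates]
    return int(np.argmin(scores))

# Synthetic check: split a smooth "image" into two tiles; the true
# neighbour should beat an unrelated decoy tile.
rng = np.random.default_rng(0)
img = np.cumsum(rng.standard_normal((32, 64, 3)), axis=1)  # smooth rows
left, true_right = img[:, :32], img[:, 32:]
decoy = rng.standard_normal((32, 32, 3)) * 10
idx = best_right_neighbour(left, [decoy, true_right])      # expect index 1
```

Because adjacent columns of a natural image are strongly correlated, the true neighbour's abutting column is far closer to the piece's edge than a random tile's, which is the core assumption of image-feature matching.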
High School Students' Use of Meiosis When Solving Genetics Problems.
Wynne, Cynthia F.; Stewart, Jim; Passmore, Cindy
2001-01-01
Paints a different picture of students' reasoning with meiosis as they solved complex, computer-generated genetics problems, some of which required them to revise their understanding of meiosis in response to anomalous data. Students were able to develop a rich understanding of meiosis and can utilize that knowledge to solve genetics problems.…
(Eds.). Resource-bounded problem solving (Dagstuhl Seminar 14341)
Haxhimusa, Y.; Rooij, I.J.E.I. van; Varma, S.; Wareham, H.T.
2014-01-01
This report documents the program and the outcomes of Dagstuhl Seminar 14341 'Resource-bounded Problem Solving'. This seminar is a successor to Dagstuhl Seminar 11351: 'Computer Science & Problem Solving: New Foundations', held in August 2011, which was the first Dagstuhl event to bring together com...
Examining Multiscale Movement Coordination in Collaborative Problem Solving
DEFF Research Database (Denmark)
Wiltshire, Travis; Steffensen, Sune Vork
2017-01-01
During collaborative problem solving (CPS), coordination occurs at different spatial and temporal scales. This multiscale coordination should, at least on some scales, play a functional role in facilitating effective collaboration outcomes. To evaluate this, we conducted a study of computer...
Genetic Algorithm for Solving Simple Mathematical Equality Problem
Hermawanto, Denny
2013-01-01
This paper explains the genetic algorithm for novices in the field. The basic philosophy of the genetic algorithm and its flowchart are described, and a step-by-step numerical computation of the genetic algorithm for solving a simple mathematical equality problem is briefly explained.
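A stripped-down evolutionary search for the kind of toy equality such tutorials target (here a + 2b + 3c + 4d = 30, assumed for illustration) can be written as a mutate-and-keep-if-no-worse loop, the simplest relative of the full genetic algorithm:

```python
import random

def error(x):
    # How far the candidate (a, b, c, d) is from satisfying the equality
    a, b, c, d = x
    return abs(a + 2 * b + 3 * c + 4 * d - 30)

rng = random.Random(7)
x = [rng.randint(0, 30) for _ in range(4)]   # random initial chromosome
while error(x) != 0:
    y = list(x)
    i = rng.randrange(4)
    y[i] = max(0, y[i] + rng.choice([-1, 1]))  # mutate one gene by +/-1
    if error(y) <= error(x):                   # keep if no worse
        x = y
```

Since changing the first gene by one always moves the error by one, an improving mutation always exists and the loop terminates with an exact solution.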
Fostering Information Problem Solving Skills Through Completion Problems and Prompts
Frerejean, Jimmy; Brand-Gruwel, Saskia; Kirschner, Paul A.
2012-01-01
Frerejean, J., Brand-Gruwel, S., & Kirschner, P. A. (2012, September). Fostering Information Problem Solving Skills Through Completion Problems and Prompts. Poster presented at the EARLI SIG 6 & 7 "Instructional Design" and "Learning and Instruction with Computers", Bari, Italy.
Towards molecular computers that operate in a biological environment
Kahan, Maya; Gil, Binyamin; Adar, Rivka; Shapiro, Ehud
2008-07-01
Even though electronic computers are the only computer species we are accustomed to, the mathematical notion of a programmable computer has nothing to do with electronics. In fact, Alan Turing's notional computer [A.M. Turing, On computable numbers, with an application to the Entscheidungsproblem, Proc. Lond. Math. Soc. 42 (1936) 230-265], which marked in 1936 the birth of modern computer science and still stands at its heart, has greater similarity to natural biomolecular machines such as the ribosome and polymerases than to electronic computers. This similarity led to the investigation of DNA-based computers [C.H. Bennett, The thermodynamics of computation - Review, Int. J. Theoret. Phys. 21 (1982) 905-940; L.M. Adleman, Molecular computation of solutions to combinatorial problems, Science 266 (1994) 1021-1024]. Although parallelism, sequence-specific hybridization and storage capacity, inherent to DNA and RNA molecules, can be exploited in molecular computers to solve complex mathematical problems [Q. Ouyang, et al., DNA solution of the maximal clique problem, Science 278 (1997) 446-449; R.J. Lipton, DNA solution of hard computational problems, Science 268 (1995) 542-545; R.S. Braich, et al., Solution of a 20-variable 3-SAT problem on a DNA computer, Science 296 (2002) 499-502; Q. Liu, et al., DNA computing on surfaces, Nature 403 (2000) 175-179; D. Faulhammer, et al., Molecular computation: RNA solutions to chess problems, Proc. Natl. Acad. Sci. USA 97 (2000) 1385-1389; C. Mao, et al., Logical computation using algorithmic self-assembly of DNA triple-crossover molecules, Nature 407 (2000) 493-496; A.J. Ruben, et al., The past, present and future of molecular computing, Nat. Rev. Mol. Cell. Biol. 1 (2000) 69-72], we believe that the more significant potential of molecular computers lies in their ability to interact directly with a biochemical environment such as the bloodstream and living cells. From this perspective, even simple molecular computations may have...
Pol, Henk J.; Harskamp, Egbert G.; Suhre, Cor J. M.; Goedhart, Martin J.
2009-01-01
This study investigates the effectiveness of computer-delivered hints in relation to problem-solving abilities in two alternative indirect instruction schemes. In one instruction scheme, hints are available to students immediately after they are given a new problem to solve as well as after they have...
Solved problems in electromagnetics
Salazar Bloise, Félix; Bayón Rojo, Ana; Gascón Latasa, Francisco
2017-01-01
This book presents the fundamental concepts of electromagnetism through problems, with a brief theoretical introduction at the beginning of each chapter. The book has a strong didactic character: it explains all the mathematical steps and the theoretical concepts connected with the development of each problem, and guides the reader through the employed procedures so as to learn to solve the exercises independently. The exercises are structured in a similar way: the chapters begin with easy problems and increase progressively in level of difficulty. This book is written for students of physics and engineering in the framework of the new European Plans of Study for Bachelor and Master, and also for tutors and lecturers.
A Problem Solving Environment Based on CORBA
Directory of Open Access Journals (Sweden)
David Lancaster
2001-01-01
Full Text Available We have investigated aspects of the design of Problem Solving Environments (PSEs) by constructing a prototype using CORBA as middleware. The two issues we are mainly concerned with are the use of non-trivial CORBA interfaces (containing more than just a start method) for the computational components, and the provision of interactivity using the same mechanisms used for flow control. After describing the design decisions that allow us to investigate these issues, and contrasting them with alternatives, we describe the architecture of the prototype and its use in the context of a study of photonic materials. We argue that having several methods on a component interface can mitigate performance problems that may arise when trying to solve problems in PSEs based on small components. We describe how our mechanism allows a high degree of computational steering over all components.
Solving Integer Programming by Evolutionary Soft Agent
Institute of Scientific and Technical Information of China (English)
Yin Jian
2003-01-01
Many practical problems in commerce and industry involve finding the best way to allocate scarce resources among competing activities. This paper focuses on the problem of integer programming and describes an evolutionary soft-agent model to solve it. In the proposed model, an agent is composed of three components: goal, environment, and behavior. Experiments show the model has the characteristics of parallel computing and goal-driven search.
Hyndman, D E
2013-01-01
Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of the computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and the use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl...
Dual functions of ASCIZ in the DNA base damage response and pulmonary organogenesis.
Directory of Open Access Journals (Sweden)
Sabine Jurado
2010-10-01
Full Text Available Zn²⁺-finger proteins comprise one of the largest protein superfamilies, with diverse biological functions. The ATM substrate Chk2-interacting Zn²⁺-finger protein (ASCIZ; also known as ATMIN and ZNF822) was originally linked to functions in the DNA base damage response and has also been proposed to be an essential cofactor of the ATM kinase. Here we show that absence of ASCIZ leads to p53-independent late-embryonic lethality in mice. Asciz-deficient primary fibroblasts exhibit increased sensitivity to the DNA base damaging agents MMS and H2O2, but Asciz deletion or knockdown does not affect ATM levels and activation in mouse, chicken, or human cells. Unexpectedly, Asciz-deficient embryos also exhibit severe respiratory tract defects, with complete pulmonary agenesis and severe tracheal atresia. Nkx2.1-expressing respiratory precursors are still specified in the absence of ASCIZ, but fail to segregate properly within the ventral foregut; as a consequence, lung buds never form and separation of the trachea from the oesophagus stalls early. Comparison of phenotypes suggests that ASCIZ functions between the Wnt2-2b/β-catenin and FGF10/FGF-receptor 2b signaling pathways in the mesodermal/endodermal crosstalk regulating early respiratory development. We also find that ASCIZ can activate expression of reporter genes via its SQ/TQ-cluster domain in vitro, suggesting that it may exert its developmental functions as a transcription factor. Altogether, the data indicate that, in addition to its role in the DNA base damage response, ASCIZ has separate developmental functions as an essential regulator of respiratory organogenesis.
A DNA-based system for selecting and displaying the combined result of two input variables
DEFF Research Database (Denmark)
Liu, Huajie; Wang, Jianbang; Song, S
2015-01-01
Oligonucleotide-based technologies for biosensing or bio-regulation produce huge amounts of rich, high-dimensional information. There is a consequent need for flexible means to combine diverse pieces of such information to form useful derivative outputs, and to display those immediately. Here we demonstrate this capability in a DNA-based system that takes two input numbers, represented in DNA strands, and returns the result of their multiplication, writing this as a number in a display. Unlike a conventional calculator, this system operates by selecting the result from a library of solutions rather...
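The select-don't-compute principle described above has a simple software analogue: the answer is looked up in a precomputed library keyed by the two inputs, rather than calculated at run time. A plain dict stands in for the strand library here; this is purely illustrative:

```python
# Precomputed "library of solutions": one entry per input pair,
# analogous to one pre-synthesised strand per possible product.
library = {(a, b): a * b for a in range(1, 10) for b in range(1, 10)}

def display(a, b):
    # Selection, not computation: the pair of inputs addresses the answer
    return library[(a, b)]
```

The cost of this design is that the library must cover every admissible input pair in advance, which is exactly the trade-off a strand library makes.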
DFT Description of Intermolecular Forces between 9-Aminoacridines and DNA Base Pairs
Directory of Open Access Journals (Sweden)
Sandra Cotes Oyaga
2013-01-01
Full Text Available The B3LYP method with the 6-31G* basis set was used to predict the geometries of five 9-aminoacridines (9-AA) 1(a–e), DNA base pairs, and the respective complexes. Polarizabilities, charge distribution, frontier molecular orbitals (FMO), and dipole moments were used to analyze the nature of the interactions that allow reasonable drug diffusion levels. The results showed that charge delocalization, high polarizabilities, and high dipole moments play an important role in intermolecular interactions with DNA. The interactions of 9-AA 1(a–e) with GC are the strongest; 9-AA 1(d) displayed the strongest interaction and 9-AA 1(b) the weakest.
2010-03-04
efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us... outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application... on which the quantum computer depends, causing the quantum-mechanically destructive process known as decoherence. Decoherence comes in several forms.
An Integer Programming Approach to Solving Tantrix on Fixed Boards
Directory of Open Access Journals (Sweden)
Yushi Uno
2012-03-01
Full Text Available Tantrix (Tantrix® is a registered trademark of Colour of Strategy Ltd. in New Zealand, and of TANTRIX JAPAN in Japan, respectively, under the license of M. McManaway, the inventor) is a puzzle in which one makes a loop by connecting lines drawn on hexagonal tiles, and the objective of this research is to solve it by computer. For this purpose, we first pose the problem of solving Tantrix as making a loop on a given fixed board. We then formulate it as an integer program by describing the rules of Tantrix as constraints, and solve it with a mathematical programming solver to obtain a solution. As a result, we establish a formulation that can solve Tantrix of moderate size, and even when solutions are invalid under only the elementary constraints, we resolve this by introducing additional constraints and re-solving. With this approach we succeeded in solving Tantrix of size up to 60.
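The rules-as-constraints formulation can be illustrated on a toy 0/1 integer program, solved here by exhaustive enumeration so the sketch needs no solver library (a real MIP solver uses branch-and-bound instead; the objective and constraint are invented):

```python
from itertools import product

# Toy binary IP:  maximise 5*x1 + 4*x2 + 3*x3
#                 subject to 2*x1 + 3*x2 + 1*x3 <= 4,  x binary.
# Enumeration over all 2^3 assignments plays the role of the solver.
best, best_val = None, float("-inf")
for x in product((0, 1), repeat=3):
    if 2 * x[0] + 3 * x[1] + 1 * x[2] <= 4:       # the "rule" as a constraint
        val = 5 * x[0] + 4 * x[1] + 3 * x[2]      # the objective
        if val > best_val:
            best, best_val = x, val
```

For Tantrix-sized formulations enumeration is hopeless, which is why the paper hands the same constraint description to a mathematical programming solver.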
The Five Immune Forces Impacting DNA-Based Cancer Immunotherapeutic Strategy
Directory of Open Access Journals (Sweden)
Suneetha Amara
2017-03-01
Full Text Available The DNA-based vaccine strategy is increasingly recognized as a viable cancer treatment approach. Strategies to enhance immunogenicity utilizing tumor-associated antigens have been investigated in several pre-clinical and clinical studies. The promising outcomes of these studies suggest that DNA-based vaccines induce potent T-cell effector responses while causing only minimal side-effects in cancer patients. However, the immune-evasive tumor microenvironment remains an important hindrance to long-term vaccine success. Several approaches to overcome the immune-inhibitory effects of the tumor microenvironment are currently under various stages of study, including, but not limited to, identification of neoantigens, mutanome studies, designing fusion plasmids, vaccine adjuvant modifications, and co-treatment with immune-checkpoint inhibitors. In this review, we follow the analogy of a Porter's five forces analysis, commonly used in business models, to analyze the various immune forces that determine the potential success and sustainable positive outcomes of DNA vaccination using non-viral tumor-associated antigens in the treatment of cancer.
Effective inhibition of human cytomegalovirus gene expression by DNA-based external guide sequences
Institute of Scientific and Technical Information of China (English)
Zhifeng Zeng; Hongjian Li; Yueqing Li; Yanwei Cui; Qi Zhou; Yi Zou; Guang Yang; Tianhong Zhou
2009-01-01
To investigate whether a 12-nucleotide DNA-based miniEGS can silence the expression of the human cytomegalovirus (HCMV) UL49 gene efficiently, a HeLa cell line stably expressing the UL49 gene was constructed and the putative miniEGSs (UL49-miniEGSs) were assayed in this stable cell line. Quantitative RT-PCR and western blot results showed a 67% reduction in UL49 expression in HeLa cells transfected with UL49-miniEGSs, significantly different from the mock and control miniEGSs (TK-miniEGSs), which gave 1% and 7%, respectively. To further confirm the gene silencing directed by UL49-miniEGSs with human RNase P, a mutant of UL49-miniEGSs was constructed and a modified 5'RACE was carried out. The data showed that the inhibition of UL49 gene expression directed by UL49-miniEGSs was RNase P-dependent and that the cleavage of UL49 mRNA by RNase P was site-specific. Notably, the length of a DNA-based miniEGS that can silence gene expression efficiently is only 12 nt, significantly less than in any other oligonucleotide-based method of gene inactivation known so far. MiniEGSs may represent novel gene-targeting agents for the inhibition of viral genes and other human disease-related gene expression.
Isolation of a small molecule inhibitor of DNA base excision repair.
Madhusudan, Srinivasan; Smart, Fiona; Shrimpton, Paul; Parsons, Jason L; Gardiner, Laurence; Houlbrook, Sue; Talbot, Denis C; Hammonds, Timothy; Freemont, Paul A; Sternberg, Michael J E; Dianov, Grigory L; Hickson, Ian D
2005-01-01
The base excision repair (BER) pathway is essential for the removal of DNA bases damaged by alkylation or oxidation. A key step in BER is the processing of an apurinic/apyrimidinic (AP) site intermediate by an AP endonuclease. The major AP endonuclease in human cells (APE1, also termed HAP1 and Ref-1) accounts for >95% of the total AP endonuclease activity, and is essential for the protection of cells against the toxic effects of several classes of DNA damaging agents. Moreover, APE1 overexpression has been linked to radio- and chemo-resistance in human tumors. Using a newly developed high-throughput screen, several chemical inhibitors of APE1 have been isolated. Amongst these, CRT0044876 was identified as a potent and selective APE1 inhibitor. CRT0044876 inhibits the AP endonuclease, 3'-phosphodiesterase and 3'-phosphatase activities of APE1 at low micromolar concentrations, and is a specific inhibitor of the exonuclease III family of enzymes to which APE1 belongs. At non-cytotoxic concentrations, CRT0044876 potentiates the cytotoxicity of several DNA base-targeting compounds. This enhancement of cytotoxicity is associated with an accumulation of unrepaired AP sites. In silico modeling studies suggest that CRT0044876 binds to the active site of APE1. These studies provide both a novel reagent for probing APE1 function in human cells, and a rational basis for the development of APE1-targeting drugs for antitumor therapy.
Structuring polymers for delivery of DNA-based therapeutics: updated insights.
Gupta, Madhu; Tiwari, Shailja; Vyas, Suresh
2012-01-01
Gene therapy offers greater opportunities for treating numerous incurable diseases from genetic disorders, infections, and cancer. However, development of appropriate delivery systems could be one of the most important factors to overcome numerous biological barriers for delivery of various therapeutic molecules. A number of nonviral polymer-mediated vectors have been developed for DNA delivery and offer the potential to surmount the associated problems of their viral counterpart. To address the concerns associated with safety issues, a wide range of polymeric vectors are available and have been utilized successfully to deliver their therapeutics in vivo. Today's research is mainly focused on the various natural or synthetic polymer-based delivery carriers that protect the DNA molecule from degradation, which offer specific targeting to the desired cells after systemic administration, have transfection efficiencies equivalent to virus-mediated gene delivery, and have long-term gene expression through sustained-release mechanisms. This review explores an updated overview of different nonviral polymeric delivery system for delivery of DNA-based therapeutics. These polymeric carriers have been evaluated in vitro and in vivo and are being utilized in various stages of clinical evaluation. Continued research and understanding of the principles of polymer-based gene delivery systems will enable us to develop new and efficient delivery systems for the delivery of DNA-based therapeutics to achieve the goal of efficacious and specific gene therapy for a vast array of clinical disorders as the therapeutic solutions of tomorrow.
Ultrafast dynamics of solvation and charge transfer in a DNA-based biomaterial.
Choudhury, Susobhan; Batabyal, Subrata; Mondol, Tanumoy; Sao, Dilip; Lemmens, Peter; Pal, Samir Kumar
2014-05-01
Charge migration along DNA molecules is a key factor for DNA-based devices in optoelectronics and biotechnology. The association of a significant amount of water molecules in DNA-based materials for the intactness of the DNA structure, and their dynamic role in the charge-transfer (CT) dynamics, is less documented in contemporary literature. In the present study, we have used a genomic DNA-cetyltrimethyl ammonium chloride (CTMA) complex, a technologically important biomaterial, and Hoechst 33258 (H258), a well-known DNA minor groove binder, as a fluorogenic probe for the dynamic solvation studies. The CT dynamics of CdSe/ZnS quantum dots (QDs; 5.2 nm) embedded in the as-prepared and swollen biomaterial have also been studied and correlated with the timescale of solvation. We have extended our studies to the temperature-dependent CT dynamics of QDs in a nanoenvironment of an anionic sodium bis(2-ethylhexyl)sulfosuccinate reverse micelle (AOT RMs), whereby the number of water molecules and their dynamics can be tuned in a controlled manner. A direct correlation of the dynamics of solvation and that of the CT in the nanoenvironments clearly suggests that the hydration barrier within the Arrhenius framework essentially dictates the charge-transfer dynamics.
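The Arrhenius framework mentioned above relates a barrier height to a rate via k = A·exp(−Ea/kBT). A minimal numeric sketch, assuming a hypothetical hydration barrier and prefactor (the 0.2 eV and 10¹² s⁻¹ values are illustrative, not taken from the study):

```python
# Arrhenius rate law: k = A * exp(-Ea / (kB * T)).
# Barrier (0.2 eV) and prefactor (1e12 /s) are hypothetical values.
import math

KB = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_rate(prefactor_per_s, barrier_ev, temperature_k):
    """Thermally activated rate over a barrier at a given temperature."""
    return prefactor_per_s * math.exp(-barrier_ev / (KB * temperature_k))

k_300 = arrhenius_rate(1e12, 0.2, 300.0)
k_280 = arrhenius_rate(1e12, 0.2, 280.0)
# A 20 K cooling slows the transfer rate by roughly 1.7x for this barrier.
print(round(k_300 / k_280, 2))
```

This illustrates why even a modest hydration barrier makes the charge-transfer rate strongly temperature dependent.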
Globalization of DNA-based prenatal diagnosis for recessive dystrophic epidermolysis bullosa.
Wessagowit, V; Chunharas, A; Wattanasirichaigoon, D; McGrath, J A
2007-11-01
Globalization of economies and improvements in international telecommunications have led to increased demand for better access to the latest developments in healthcare, wherever they may be available. In this report, we describe the first case from Thailand of DNA-based prenatal testing of a mother at risk for recurrence of severe recessive dystrophic epidermolysis bullosa (RDEB), whose affected child had died in early childhood. In the absence of previous access to prenatal diagnostic tests, the mother had undergone several terminations for fear of having another affected child. To prevent this happening again, DNA from the mother and her consanguineous partner was sent from Bangkok to a specialist laboratory at St John's Institute of Dermatology in London and screened for pathogenic mutations in the COL7A1 gene: both individuals were shown to be heterozygous carriers of a splice-site mutation, c.2440G --> C. In a subsequent pregnancy, amniocentesis was performed at 18 weeks' gestation in Bangkok, and fetal DNA was extracted and sent to London for analysis. Restriction endonuclease digestion of the amplified fetal DNA revealed the wild-type COL7A1 sequence only, and 5 months later, a clinically unaffected boy was born. This case represents the first example of DNA-based prenatal diagnosis for RDEB in Thailand and illustrates the benefits for patients in establishing international links with diagnostic centres with technological expertise that is not widely available in certain countries.
Molecular Design of Ionization-Induced Proton Switching Element Based on Fluorinated DNA Base Pair.
Tachikawa, Hiroto; Kawabata, Hiroshi
2016-03-10
To theoretically design a high-performance proton-switching element based on a DNA base pair, the effects of fluorine substitution on the rate of proton transfer (PT) in a model DNA base pair have been investigated by means of the direct ab initio molecular dynamics (AIMD) method. The 2-aminopyridine dimer, (AP)2, was used as the model of the DNA base pair. One of the hydrogen atoms of an AP molecule in the dimer was substituted by a fluorine (F) atom, and the structures of the resulting dimer, denoted F-(AP)2, were fully optimized at the MP2/6-311++G(d,p) level. The direct AIMD calculations showed that the proton is transferred within the base pair after vertical ionization. The rates of PT in F-(AP)2(+) were calculated and compared with that of (AP)2(+) without an F atom. It was found that the PT rate is accelerated by F-substitution. Also, the direction of PT between the F-AP and AP molecules can be clearly controlled by the position of F-substitution in the (AP)2 dimer.
Augmentation of French grunt diet description using combined visual and DNA-based analyses
Hargrove, John S.; Parkyn, Daryl C.; Murie, Debra J.; Demopoulos, Amanda W.J.; Austin, James D.
2012-01-01
Trophic linkages within a coral-reef ecosystem may be difficult to discern in fish species that reside on, but do not forage on, coral reefs. Furthermore, dietary analysis of fish can be difficult in situations where prey is thoroughly macerated, resulting in many visually unrecognisable food items. The present study examined whether the inclusion of a DNA-based method could improve the identification of prey consumed by French grunt, Haemulon flavolineatum, a reef fish that possesses pharyngeal teeth and forages on soft-bodied prey items. Visual analysis indicated that crustaceans were most abundant numerically (38.9%), followed by sipunculans (31.0%) and polychaete worms (5.2%), with a substantial number of unidentified prey (12.7%). For the subset of prey with both visual and molecular data, there was a marked reduction in the number of unidentified sipunculans (visual – 31.1%, combined – 4.4%), unidentified crustaceans (visual – 15.6%, combined – 6.7%), and unidentified taxa (visual – 11.1%, combined – 0.0%). Utilising results from both methodologies resulted in an increased number of prey placed at the family level (visual – 6, combined – 33) and species level (visual – 0, combined – 4). Although more costly than visual analysis alone, our study demonstrated the feasibility of DNA-based identification of visually unidentifiable prey in the stomach contents of fish.
Development and use of an efficient DNA-based viral gene silencing vector for soybean.
Zhang, Chunquan; Yang, Chunling; Whitham, Steven A; Hill, John H
2009-02-01
Virus-induced gene silencing (VIGS) is increasingly being used as a reverse genetics tool to study functions of specific plant genes. It is especially useful for plants, such as soybean, that are recalcitrant to transformation. Previously, Bean pod mottle virus (BPMV) was shown to be an effective VIGS vector for soybean. However, the reported BPMV vector requires in vitro RNA transcription and inoculation, which is not reliable or amenable to high-throughput applications. To increase the efficiency of the BPMV vector for soybean functional genomics, a DNA-based version was developed. Reported here is the construction of a Cauliflower mosaic virus 35S promoter-driven BPMV vector that is efficient for the study of soybean gene function. The selection of a mild rather than a severe BPMV strain greatly reduced the symptom interference caused by virus infection. The DNA-based BPMV vector was used to silence soybean homologues of genes involved in plant defense, translation, and the cytoskeleton in shoots and in roots. VIGS of the Actin gene resulted in reduced numbers of Soybean mosaic virus infection foci. The results demonstrate the utility of this new vector as an efficient tool for a wide range of applications for soybean functional genomics.
Comparison of ITS, RAPD and ISSR from DNA-based genetic diversity techniques.
Poyraz, Ismail
2016-01-01
ITS, RAPD-PCR and ISSR-PCR are among the most popular DNA-based techniques extensively applied to determine the genetic diversity of species among populations. However, especially for organisms with high genetic polymorphism, the phylogenetic trees drawn from the results of these techniques may differ. To find a meaningful phylogenetic tree, the trees obtained from these different techniques should be compared with the geographic locations of the populations. Lichens have high genetic polymorphism and tolerance to different environmental conditions. In this study, these three DNA-based genetic diversity techniques were compared using different populations of a lichen species (Xanthoria parietina). X. parietina was chosen specifically because of its high genetic diversity within narrow zones. Lichen samples were collected from ten different locations in a narrow transitional climate zone, Bilecik (Turkey). Statistical analyses of all results were calculated using UPGMA analysis. Phylogenetic trees for each technique were drawn and transferred to the Bilecik map for comparative analysis. The results of the three techniques allowed us to verify that populations of X. parietina have high genetic variety within a narrow zone. But the phylogenetic trees obtained from these results were found to be very different. Our comparative analysis demonstrated that the results of these techniques are not similar and have critical differences. We observed that the ITS method provides clearer data and is more successful in genetic diversity analyses of more distant populations, in contrast to the ISSR-PCR and RAPD-PCR methods.
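The UPGMA analysis mentioned above (average-linkage hierarchical clustering of pairwise distances) can be sketched in pure Python. The populations and genetic distances below are invented for illustration; this is a sketch of the clustering step, not the authors' pipeline:

```python
# UPGMA: repeatedly merge the two closest clusters, updating distances
# as the size-weighted average of the merged members' distances.
def upgma(labels, d):
    """labels: list of names; d: dict frozenset({a, b}) -> distance.
    Returns the tree as nested tuples."""
    size = {name: 1 for name in labels}     # cluster label -> number of leaves
    tree = {name: name for name in labels}  # cluster label -> subtree
    while len(size) > 1:
        # pick the closest pair of current clusters
        a, b = min(
            ((x, y) for x in size for y in size if x < y),
            key=lambda p: d[frozenset(p)],
        )
        merged = f"({a},{b})"
        # size-weighted average distance from the merged cluster to the rest
        for c in size:
            if c not in (a, b):
                d[frozenset((merged, c))] = (
                    size[a] * d[frozenset((a, c))] + size[b] * d[frozenset((b, c))]
                ) / (size[a] + size[b])
        size[merged] = size.pop(a) + size.pop(b)
        tree[merged] = (tree.pop(a), tree.pop(b))
    return next(iter(tree.values()))

# Hypothetical pairwise genetic distances among four populations
dist = {
    frozenset(p): v
    for p, v in [
        (("P1", "P2"), 0.10), (("P1", "P3"), 0.40), (("P1", "P4"), 0.45),
        (("P2", "P3"), 0.38), (("P2", "P4"), 0.42), (("P3", "P4"), 0.12),
    ]
}
print(upgma(["P1", "P2", "P3", "P4"], dist))  # (('P1', 'P2'), ('P3', 'P4'))
```

With these distances, P1/P2 and P3/P4 each pair up first, mirroring how populations from nearby locations would be expected to cluster before joining across the zone.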
Depression and social problem solving.
Marx, E M; Williams, J M; Claridge, G C
1992-02-01
Twenty depressed patients with major depressive disorder, 20 nondepressed matched control subjects, and 17 patients with anxiety disorders were compared in different measures of social problem solving. Problem solving was assessed with the Means-Ends Problem-Solving Test (Study 1), the solution of personal problems, and a problem-solving questionnaire (Study 2). Results showed that, as predicted, depressed subjects suffered from a deficit in problem solving in all three measures. The majority of these deficits were also displayed by the clinical control group rather than being specific to a diagnosis of depression. However, depressed subjects produced less effective solutions than did normal and clinical control subjects. The results suggest that depressed and anxious patients may have difficulties at different stages of the problem-solving process.
Schoenfeld's problem solving theory in a student controlled learning environment
Harskamp, E.; Suhre, C.
2007-01-01
This paper evaluates the effectiveness of a student controlled computer program for high school mathematics based on instruction principles derived from Schoenfeld's theory of problem solving. The computer program allows students to choose problems and to make use of hints during different episodes
Latest Trends in Problem Solving Assessment
Directory of Open Access Journals (Sweden)
Maria Karyotaki
2016-07-01
Full Text Available Problem solving is the skill that coordinates all the cognitive, metacognitive and behavioral processes taking place when individuals encounter a previously unprecedented situation or difficulty. Metacognitive processes seem to play the most important role in resolving a problematic situation, as individuals reflect on their acquired knowledge, skills and experiences and thus become aware of their capabilities and of how to regulate them. Therefore, metacognitive awareness is the competence that most assists individuals in their attempt to construct new knowledge and reach their goals. Furthermore, individuals' self-assessment and peer-assessment processes could reveal their level of metacognitive awareness and thereby their problem-solving competency. Consequently, ICTs could capture individuals' problem-solving skills by tracking and analyzing their cognitive and metacognitive processes as well as their behavioral patterns. Such a computer-based assessment could consist of a fuzzy expert system with domain knowledge from an automated task-based test with particular solution strategies, in combination with log data, for identifying and classifying one's level of problem-solving ability according to specific criteria.
Computer techniques for electromagnetics
Mittra, R
1973-01-01
Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers on the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techniques.
Institute of Scientific and Technical Information of China (English)
钮亮; 张宝友
2015-01-01
Logistics distribution is one of the core links of logistics, and the optimized selection of distribution routes determines delivery efficiency and transportation costs. To solve the shortest-path problem of logistics distribution in a large-scale urban road network, a solution method combining a MapReduce-based parallel algorithm with GIS simulation is proposed. The method constructs a MapReduce parallel process model and algorithm flow; digitizes the urban road network, managing roads and distribution points in layers with the help of MapInfo; topologizes the roads and distribution points to generate the MID files used by MapReduce; and finally computes the shortest path on the Hadoop platform and displays the results intuitively in MapX.
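The map/reduce decomposition of shortest-path search described above can be sketched in plain Python: each map step emits a node's current distance plus relaxations along its edges, and the reduce step keeps the minimum candidate per node. A toy illustration of the idea, not the authors' Hadoop/MapX implementation; the road network below is invented:

```python
# Iterative MapReduce-style single-source shortest path.
# Assumes non-negative edge weights (no negative cycles).
from collections import defaultdict

INF = float("inf")

def sssp_mapreduce(graph, source):
    """graph: {node: [(neighbor, weight), ...]}; returns {node: distance}."""
    dist = {v: INF for v in graph}
    dist[source] = 0.0
    changed = True
    while changed:
        # map phase: each node emits its own distance and edge relaxations
        emitted = defaultdict(list)
        for v, d in dist.items():
            emitted[v].append(d)
            if d < INF:
                for u, w in graph[v]:
                    emitted[u].append(d + w)
        # reduce phase: keep the minimum candidate distance per node
        new_dist = {v: min(cands) for v, cands in emitted.items()}
        changed = new_dist != dist
        dist = new_dist
    return dist

roads = {  # hypothetical mini road network
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("C", 1.0), ("D", 4.0)],
    "C": [("D", 1.0)],
    "D": [],
}
print(sssp_mapreduce(roads, "A"))  # {'A': 0.0, 'B': 2.0, 'C': 3.0, 'D': 4.0}
```

Each while-loop pass corresponds to one MapReduce job; on a real cluster the map and reduce phases run in parallel across partitions of the road network.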
Würtz, Rolf P
2008-01-01
Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.
Solving fault diagnosis problems linear synthesis techniques
Varga, Andreas
2017-01-01
This book addresses fault detection and isolation topics from a computational perspective. Unlike most existing literature, it bridges the gap between the existing well-developed theoretical results and the realm of reliable computational synthesis procedures. The model-based approach to fault detection and diagnosis has been the subject of ongoing research for the past few decades. While the theoretical aspects of fault diagnosis on the basis of linear models are well understood, most of the computational methods proposed for the synthesis of fault detection and isolation filters are not satisfactory from a numerical standpoint. Several features make this book unique in the fault detection literature: Solution of standard synthesis problems in the most general setting, for both continuous- and discrete-time systems, regardless of whether they are proper or not; consequently, the proposed synthesis procedures can solve a specific problem whenever a solution exists Emphasis on the best numerical algorithms to ...
Combinatorial reasoning to solve problems
Coenen, Tom; Hof, Frits; Verhoef, Nellie
2016-01-01
This study reports on combinatorial reasoning to solve problems. We observed the mathematical thinking of students aged 14-16 and studied the variation in the students' solution strategies in the context of emergent modelling. The results show that the students are tempted to begin the problem-solving process ...
Solving complex fisheries management problems
DEFF Research Database (Denmark)
Petter Johnsen, Jahn; Eliasen, Søren Qvist
2011-01-01
A crucial issue for the new EU common fisheries policy is how to solve the discard problem. Through a study of the institutional set up and the arrangements for solving the discard problem in Denmark, the Faroe Islands, Iceland and Norway, the article identifies the discard problem as related...
Crime Solving Techniques: Training Bulletin.
Sands, Jack M.
The document is a training bulletin for criminal investigators, explaining the use of probability, logic, lateral thinking, group problem solving, and psychological profiles as methods of solving crimes. One chapter of several pages is devoted to each of the five methods. The use of each method is explained; problems are presented for the user to…
DNA-based nanobiostructured devices: The role of quasiperiodicity and correlation effects
Energy Technology Data Exchange (ETDEWEB)
Albuquerque, E.L., E-mail: eudenilson@gmail.com [Departamento de Biofísica e Farmacologia, Universidade Federal do Rio Grande do Norte, 59072-970, Natal-RN (Brazil); Fulco, U.L. [Departamento de Biofísica e Farmacologia, Universidade Federal do Rio Grande do Norte, 59072-970, Natal-RN (Brazil); Freire, V.N. [Departamento de Física, Universidade Federal do Ceará, 60455-760, Fortaleza-CE (Brazil); Caetano, E.W.S. [Instituto Federal de Educação, Ciência e Tecnologia do Ceará, 60040-531, Fortaleza-CE (Brazil); Lyra, M.L.; Moura, F.A.B.F. de [Instituto de Física, Universidade Federal de Alagoas, 57072-970, Maceió-AL (Brazil)
2014-02-01
The purpose of this review is to present a comprehensive and up-to-date account of the main physical properties of DNA-based nanobiostructured devices, stressing the role played by their quasi-periodic arrangement and correlation effects. Although the DNA-like molecule is usually described as a short-range correlated random ladder, artificial segments can be grown following quasiperiodic sequences such as the Fibonacci and Rudin–Shapiro ones. They have interesting properties, like a complex fractal energy spectrum, which can be considered their indelible mark, and collective properties that are not shared by their constituents. These collective properties are due to the presence of long-range correlations, which are expected to be reflected somehow in their various spectra (electronic transmission, density of states, etc.), defining another description of disorder. Although long-range correlations are responsible for the effective electronic transport at specific resonant energies of finite DNA segments, much of the anomalous spread of an initially localized electron wave-packet can be accounted for by short-range pair correlations, suggesting that an approach based on the inclusion of further short-range correlations in the nucleotide distribution leads to an adequate description of the electronic properties of DNA segments. The introduction of defects may generate states within the gap and substantially improves the conductance, especially of finite branches. The states usually become exponentially localized for any amount of disorder, and have the property of tailoring the electronic transport properties of DNA-based nanoelectronic devices. In particular, symmetric and antisymmetric correlations have quite distinct influences on the nature of the electronic states, and a diluted distribution of defects leads to anomalous diffusion of the electronic wave-packet. Nonlinear contributions, arising from the coupling between electrons and the molecular vibrations ...
Program Transformation by Solving Equations
Institute of Scientific and Technical Information of China (English)
朱鸿
1991-01-01
Based on the theory of orthogonal program expansion [8-10], the paper proposes a method to transform programs by solving program equations. By this method, transformation goals are expressed as program equations and achieved by solving those equations. Although such equations are usually too complicated to be solved directly, the orthogonal expansion of programs makes it possible to reduce them to systems of equations containing only simple program constructors. The solutions of such equations can then be derived by a system of solving and simplifying rules together with the algebraic laws of programs. The paper discusses methods to simplify and solve the equations and gives some examples.
Practical Problems and Solutions in the Implementation of a Hospital Cloud Computing Project
Institute of Scientific and Technical Information of China (English)
王胜; 高峰; 曹卫锋
2013-01-01
As a county-level hospital in an economically developed province of eastern China, Jingjiang People's Hospital was an early explorer of cloud computing technology. In practice, however, we found that under a cloud computing environment not all system technical indicators exceed those of the traditional PC model; in some respects there are still limitations. From the perspective of bringing the medical cloud to ground in the healthcare field, this paper enumerates several typical technical problems encountered as different applications transition to cloud computing and gives practical solutions, in the hope of serving as a reference for other hospitals implementing cloud computing.
The role of DNA base excision repair in brain homeostasis and disease
DEFF Research Database (Denmark)
Akbari, Mansour; Morevati, Marya; Croteau, Deborah;
2015-01-01
Chemical modification and spontaneous loss of nucleotide bases from DNA are estimated to occur at the rate of thousands per human cell per day. DNA base excision repair (BER) is a critical mechanism for repairing such lesions in nuclear and mitochondrial DNA. Defective expression or function of proteins required for BER, or of proteins that regulate BER, has been consistently associated with neurological dysfunction and disease in humans. Recent studies suggest that DNA lesions in the nuclear and mitochondrial compartments, and the cellular response to those lesions, have a profound effect on cellular energy homeostasis, mitochondrial function and cellular bioenergetics, with an especially strong influence on neurological function. Further studies in this area could lead to novel approaches to prevent and treat human neurodegenerative disease.
Electronic properties and assambly of DNA-based molecules on gold surfaces
DEFF Research Database (Denmark)
Salvatore, Princia
This thesis presents a multifaceted study of deoxyribonucleic acid (DNA), in the form of strands and individual components, attached/adsorbed on single-crystal Au(111) and Au(110) gold surfaces and on citrate-reduced gold nanoparticles. Strategically designed DNA moieties were addressed directly as self-assembled monolayers (SAMs), grafted to the substrate through the strong gold-sulphur bond. The voltammetric behaviour of such DNA-based systems was analysed in the presence of "smart" redox molecules: the intercalating aromatic anthraquinone monosulfonate (AQMS) and a covalently attached terpyridine (terpy) redox unit, addressing metal coordination of ONs with the terpyridine ligand in the highly flexible structure of the new synthetic unlocked nucleic acid (UNA). Composite voltammetric behaviour for each of the metal-functionalized ON-based monolayers was observed and supported by in situ scanning tunnelling microscopy.
Assessment of helminth biodiversity in wild rats using 18S rDNA based metagenomics.
Tanaka, Ryusei; Hino, Akina; Tsai, Isheng J; Palomares-Rius, Juan Emilio; Yoshida, Ayako; Ogura, Yoshitoshi; Hayashi, Tetsuya; Maruyama, Haruhiko; Kikuchi, Taisei
2014-01-01
Parasite diversity has important implications in several research fields including ecology, evolutionary biology and epidemiology. Wide-ranging analysis has been restricted because of the difficult, highly specialised and time-consuming processes involved in parasite identification. In this study, we assessed parasite diversity in wild rats using 18S rDNA-based metagenomics. 18S rDNA PCR products were sequenced using an Illumina MiSeq sequencer and the analysis of the sequences using the QIIME software successfully classified them into several parasite groups. The comparison of the results with those obtained using standard methods including microscopic observation of helminth parasites in the rat intestines and PCR amplification/sequencing of 18S rDNA from isolated single worms suggests that this new technique is reliable and useful to investigate parasite diversity.
Khadsai, Sudarat; Rutnakornpituk, Boonjira; Vilaivan, Tirayut; Nakkuntod, Maliwan; Rutnakornpituk, Metha
2016-09-01
Magnetite nanoparticles (MNPs) were surface-modified with anionic poly(N-acryloyl glycine) (PNAG) and streptavidin for specific interaction with biotin-conjugated pyrrolidinyl peptide nucleic acid (PNA). The hydrodynamic size (Dh) of the PNAG-grafted MNPs varied from 334 to 496 nm depending on the loading ratio of MNP to NAG in the reaction. UV-visible and fluorescence spectrophotometries were used to confirm the successful immobilization of streptavidin and PNA on the MNPs. About 291 pmol of PNA per mg of MNP was immobilized on the particle surface. The PNA-functionalized MNPs were effectively used as solid supports to differentiate between fully complementary and non-complementary/single-base-mismatch DNA using the PNA probe. These novel anionic MNPs are efficiently applicable as a magnetically guidable support for DNA base discrimination.
Inverse Temperature Dependence of Nuclear Quantum Effects in DNA Base Pairs
Fang, Wei; Rossi, Mariana; Feng, Yexin; Li, Xin-Zheng; Michaelides, Angelos
2016-01-01
Despite the inherently quantum mechanical nature of hydrogen bonding, it is unclear how nuclear quantum effects (NQEs) alter the strengths of hydrogen bonds. With this in mind, we use ab initio path integral molecular dynamics to determine the absolute contribution of NQEs to the binding in DNA base pair complexes, arguably the most important hydrogen-bonded systems of all. We find that depending on the temperature, NQEs can either strengthen or weaken the binding within the hydrogen-bonded complexes. As a somewhat counterintuitive consequence, NQEs can have a smaller impact on hydrogen bond strengths at cryogenic temperatures than at room temperature. We rationalize this in terms of a competition of NQEs between low-frequency and high-frequency vibrational modes. Extending this idea, we also propose a simple model to predict the temperature dependence of NQEs on hydrogen bond strengths in general.
Local compression properties of double-stranded DNA based on a dynamic simulation
Lei, Xiaoling; Fang, Haiping
2013-01-01
The local mechanical properties of DNA are believed to play an important role in its biological functions and in DNA-based nanomechanical devices. Using a simple sphere-tip compression system, the local radial mechanical properties of DNA are systematically studied by varying the tip size. The compression simulation results for the 16 nm diameter sphere tip agree well with experimental results. As the tip diameter decreases, the radial compressive elastic properties under external loads become sensitive to the tip size and the local DNA conformation. A sudden force break appears in the compression-force curve when the sphere diameter is less than or equal to 12 nm. Analysis of the hydrogen bonds and base-stacking interactions shows that a local unwinding process occurs. During the local unwinding process, the hydrogen bonds between complementary base pairs are broken first. As the compression proceeds, the local backbones in the compression center are unwound from ...
Angular distributions in the double ionization of DNA bases by electron impact
Khelladi, M. F.; Mansouri, A.; Dal Cappello, C.; Charpentier, I.; Hervieux, P. A.; Ruiz-Lopez, M. F.; Roy, A. C.
2016-11-01
Ab initio calculations of the five-fold differential cross sections for electron-impact double ionization of thymine, cytosine, adenine and guanine are performed in the first Born approximation for an incident energy close to 5500 eV. The wavefunctions of the DNA bases are constructed using the multi-center wave functions from the Gaussian 03 program. These multi-center wave functions are converted into single-center expansions of Slater-type functions. For the final state, the two ejected electrons are described by two Coulomb wave functions. The electron-electron repulsion between the two ejected electrons is also taken into account. Mechanisms of the double ionization are discussed for each case, and the best choices of the kinematical parameters are determined for future experiments.
Assessment of helminth biodiversity in wild rats using 18S rDNA based metagenomics.
Directory of Open Access Journals (Sweden)
Ryusei Tanaka
Full Text Available Parasite diversity has important implications in several research fields including ecology, evolutionary biology and epidemiology. Wide-ranging analysis has been restricted because of the difficult, highly specialised and time-consuming processes involved in parasite identification. In this study, we assessed parasite diversity in wild rats using 18S rDNA-based metagenomics. 18S rDNA PCR products were sequenced using an Illumina MiSeq sequencer and the analysis of the sequences using the QIIME software successfully classified them into several parasite groups. The comparison of the results with those obtained using standard methods including microscopic observation of helminth parasites in the rat intestines and PCR amplification/sequencing of 18S rDNA from isolated single worms suggests that this new technique is reliable and useful to investigate parasite diversity.
Recent advances in DNA-based electrochemical biosensors for heavy metal ion detection: A review.
Saidur, M R; Aziz, A R Abdul; Basirun, W J
2017-04-15
The presence of heavy metals in food chains due to rapid industrialization poses a serious threat to the environment. Therefore, the detection and monitoring of heavy metal contamination are gaining more attention nowadays. However, current analytical methods (based on spectroscopy) for the detection of heavy metal contamination are often expensive, tedious, and can only be handled by trained personnel. DNA biosensors based on electrochemical transduction offer a sensitive but inexpensive method of detection. The principles, sensitivity, selectivity and challenges of electrochemical biosensors are discussed in this review. This review also highlights the major advances of DNA-based electrochemical biosensors for the detection of heavy metal ions such as Hg(2+), Ag(+), Cu(2+) and Pb(2+).
Universal spectrum for DNA base C+G frequency distribution in Human chromosomes 1 to 24
Selvam, A M
2007-01-01
Power spectra of the human DNA base C+G frequency distribution in all available contiguous sections exhibit the universal inverse power law form of the statistical normal distribution for the 24 chromosomes. The inverse power law form for power spectra of space-time fluctuations is generic to dynamical systems in nature and indicates long-range space-time correlations. A recently developed general systems theory predicts the observed non-local connections as intrinsic to quantum-like chaos governing space-time fluctuations of dynamical systems. The model predicts the following. (1) The quasiperiodic Penrose tiling pattern for the nested coiled structure of the DNA molecule in the chromosome, resulting in maximum packing efficiency. (2) The DNA molecule functions as a unified whole fuzzy-logic network with ordered two-way signal transmission between the coding and non-coding regions. Recent studies indicate the influence of non-coding regions on functions of coding regions in the DNA molecule.
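The kind of spectral analysis this abstract describes can be sketched in a few lines: compute the C+G frequency in fixed windows along a sequence, then take the power spectrum of the fluctuations. The synthetic random sequence, the window size, and the use of a plain FFT below are illustrative assumptions, not the author's exact procedure.

```python
import numpy as np

def gc_power_spectrum(seq, window=100):
    """Power spectrum of the windowed C+G frequency of a DNA sequence."""
    gc = np.array([1.0 if b in "CG" else 0.0 for b in seq.upper()])
    # C+G frequency per non-overlapping window
    n = len(gc) // window
    freq = gc[: n * window].reshape(n, window).mean(axis=1)
    # Remove the mean so the spectrum reflects fluctuations only
    fluct = freq - freq.mean()
    spectrum = np.abs(np.fft.rfft(fluct)) ** 2
    return freq, spectrum

# Toy random sequence standing in for a chromosome section
rng = np.random.default_rng(0)
toy_seq = "".join(rng.choice(list("ACGT"), size=50_000))
freq, spec = gc_power_spectrum(toy_seq, window=100)
```

For a random sequence the spectrum is flat; the paper's claim is that real chromosome sections instead show an inverse power law.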
DNA-based identification of Helleborus niger by high-resolution melting analysis.
Schmiderer, Corinna; Mader, Eduard; Novak, Johannes
2010-11-01
Hellebori nigri rhizoma is a drug that is difficult to distinguish from other species of the genus Helleborus. In this communication we present a DNA-based identification by high-resolution melting analysis (HRM) that is able to differentiate between Helleborus niger and other species of the genus. HRM is a very specific, time- and labour-saving method for identifying DNA sequence variations and is ideally suited for routine PCR analysis. The HRM assay developed is specific for the genus Helleborus. This method not only detects the presence of the target species H. niger but also, to a certain extent, identifies other Helleborus species by their different melting curve shapes. Markers were developed based on the trnL-trnF intergenic spacer and on the matK sequence. For an unambiguous identification of Helleborus niger, melting curves of both markers should be used. © Georg Thieme Verlag KG Stuttgart · New York.
Directory of Open Access Journals (Sweden)
Kambiranda Devaiah
2011-01-01
Full Text Available Vidari is an Ayurvedic herbal drug used as an aphrodisiac and galactagogue, and is also used in the preparation of Chyavanaprash. Tubers of Ipomoea mauritiana Jacq. (Convolvulaceae), Pueraria tuberosa (Roxb. ex Willd.) DC (Fabaceae), Adenia hondala (Gaertn.) de Wilde (Passifloraceae) and pith of Cycas circinalis L. (Cycadaceae) are all traded under the name of Vidari, creating issues of botanical authenticity of the Ayurvedic raw drug. DNA-based markers have been developed to distinguish I. mauritiana from the other Vidari candidates. A putative 600-bp polymorphic sequence specific to I. mauritiana was identified using the randomly amplified polymorphic DNA (RAPD) technique. Furthermore, sequence characterized amplified region (SCAR) primers (IM1F and IM1R) were designed from the unique RAPD amplicon. The SCAR primers produced a specific 323-bp amplicon in authentic I. mauritiana and not in the allied species.
Refined Exercise testing can aid DNA-based Diagnosis in Muscle Channelopathies
Tan, S. Veronica; Matthews, Emma; Barber, Melissa; Burge, James A; Rajakulendran, Sanjeev; Fialho, Doreen; Sud, Richa; Haworth, Andrea; Koltzenburg, Martin; Hanna, Michael G
2010-01-01
Objective To improve the accuracy of genotype prediction and guide genetic testing in patients with muscle channelopathies, we applied and refined specialised electrophysiological exercise test parameters. Methods We studied 56 genetically confirmed patients and 65 controls using needle electromyography, the long exercise test, and short exercise tests at room temperature, after cooling, and after rewarming. Results Concordant amplitude-and-area decrements were more reliable than amplitude-only measurements when interpreting patterns of change during the short exercise tests. Concordant amplitude-and-area pattern I and pattern II decrements of >20% were 100% specific for PMC and MC, respectively. Decrements of >20% at room temperature and after cooling allow more reliable interpretation of the short exercise tests and aid accurate DNA-based diagnosis. In patients with negative exercise tests, specific clinical features are helpful in differentiating sodium from chloride channel myotonia. A modified algorithm is suggested. PMID:21387378
Lam, Ching-Wan; Li, Chi-Keung; Lai, Chi-Kong; Tong, Sui-Fan; Chan, Kwok-Yin; Ng, Grace Sui-Fun; Yuen, Yuet-Ping; Cheng, Anna Wai-Fun; Chan, Yan-Wo
2002-01-01
Isolated sulfite oxidase deficiency is a rare autosomal recessive disease, characterized by severe neurological abnormalities, seizures, mental retardation, and dislocation of the ocular lenses, that often leads to death in infancy. There is a special demand for prenatal diagnosis, since no effective treatment is available for isolated sulfite oxidase deficiency. Until now, the cDNA sequence of the sulfite oxidase (SUOX) gene has been available, but the genomic sequence of the SUOX gene has not been published. In this study, we have performed a DNA-based diagnosis of isolated sulfite oxidase deficiency in a Chinese patient. To do so, we designed oligonucleotide primers for amplification of the predicted exons and intron-exon boundaries of the SUOX gene obtained from the completed draft version of the human genome. Using overlapping PCR products, we confirmed the flanking intronic sequences of the coding exons and that the entire 466-residue mature peptide is encoded by the last exon of the gene. We then performed mutation detection using denaturing high-performance liquid chromatography (DHPLC). The DHPLC chromatogram of exon 2b showed the presence of heteroduplex peaks only after mixing of the mutant DNA with the wild-type DNA, indicating the presence of a homozygous mutation. Direct DNA sequencing showed a homozygous base substitution at codon 160, changing the codon from CGG to CAG, which changes the amino acid from arginine to glutamine, i.e., R160Q. The DNA-based diagnosis of isolated sulfite oxidase deficiency will enable us to make an accurate determination of carrier status and to perform prenatal diagnosis of this disease. The availability of the genomic sequences of human genes from the completed draft human genome sequence will simplify the development of molecular genetic diagnoses of human diseases from peripheral blood DNA.
Planning under uncertainty solving large-scale stochastic linear programs
Energy Technology Data Exchange (ETDEWEB)
Infanger, G. (Stanford Univ., CA (United States). Dept. of Operations Research Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft)
1992-12-01
For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to implement the methodology efficiently on a parallel multi-computer, and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
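As a toy illustration of attacking a stochastic program by sampling, the sketch below applies sample-average approximation to a two-stage newsvendor-style problem: commit to an order quantity now, pay recourse (holding or shortage) costs once demand is revealed. This is a minimal stand-in, not the report's decomposition-plus-importance-sampling methodology; the costs `c`, `h`, `p` and the uniform demand model are invented for illustration.

```python
import numpy as np

def saa_newsvendor(demands, c=1.0, h=0.1, p=2.0):
    """Sample-average approximation of a two-stage decision:
    choose order quantity x now; pay holding cost h per unit over
    and shortage cost p per unit under, averaged over scenarios."""
    demands = np.asarray(demands, dtype=float)

    def cost(x):
        over = np.maximum(x - demands, 0.0)
        short = np.maximum(demands - x, 0.0)
        return c * x + np.mean(h * over + p * short)

    # The sample-average cost is piecewise linear in x, so an optimal
    # x lies at one of the sampled demand values.
    candidates = np.unique(demands)
    costs = [cost(x) for x in candidates]
    best = int(np.argmin(costs))
    return candidates[best], costs[best]

rng = np.random.default_rng(1)
demands = rng.uniform(50, 150, size=1000)
x_star, f_star = saa_newsvendor(demands)
```

The optimum lands near the critical-fractile quantile (p - c)/(p + h) of the sampled demand, which is the classical closed-form check on the sampling approach.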
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
Scientific computation is widely used in multiple cross-disciplinary areas. Most of the problems arising in this area ultimately reduce to solving partial differential equations (PDEs). To solve a PDE, meshes are first generated over the domain on which the PDE is defined; then finite element (FE), finite difference (FD), or finite volume (FV) methods are applied on the meshes to solve the PDE.
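A minimal instance of the mesh-then-discretize workflow just described: a second-order finite-difference solve of a 1-D Poisson problem on a uniform mesh. The manufactured right-hand side and homogeneous boundary conditions are illustrative choices so the exact solution is known.

```python
import numpy as np

def solve_poisson_1d(n):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 on a uniform mesh
    of n interior points, using the central-difference stencil."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)   # manufactured RHS: exact u = sin(pi x)
    # Tridiagonal matrix from the FD stencil (-1, 2, -1) / h^2
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    return x, u

x, u = solve_poisson_1d(99)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
```

Halving the mesh spacing should reduce `err` by roughly a factor of four, the signature of a second-order scheme.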
2010-11-01
By discovering the first double star where a pulsating Cepheid variable and another star pass in front of one another, an international team of astronomers has solved a decades-old mystery. The rare alignment of the orbits of the two stars in the double star system has allowed a measurement of the Cepheid mass with unprecedented accuracy. Up to now astronomers had two incompatible theoretical predictions of Cepheid masses. The new result shows that the prediction from stellar pulsation theory is spot on, while the prediction from stellar evolution theory is at odds with the new observations. The new results, from a team led by Grzegorz Pietrzyński (Universidad de Concepción, Chile, Obserwatorium Astronomiczne Uniwersytetu Warszawskiego, Poland), appear in the 25 November 2010 edition of the journal Nature. Grzegorz Pietrzyński introduces this remarkable result: "By using the HARPS instrument on the 3.6-metre telescope at ESO's La Silla Observatory in Chile, along with other telescopes, we have measured the mass of a Cepheid with an accuracy far greater than any earlier estimates. This new result allows us to immediately see which of the two competing theories predicting the masses of Cepheids is correct." Classical Cepheid Variables, usually called just Cepheids, are unstable stars that are larger and much brighter than the Sun [1]. They expand and contract in a regular way, taking anything from a few days to months to complete the cycle. The time taken to brighten and grow fainter again is longer for stars that are more luminous and shorter for the dimmer ones. This remarkably precise relationship makes the study of Cepheids one of the most effective ways to measure the distances to nearby galaxies and from there to map out the scale of the whole Universe [2]. Unfortunately, despite their importance, Cepheids are not fully understood. Predictions of their masses derived from the theory of pulsating stars are 20-30% less than predictions from the theory of the
Solving Hitchcock's transportation problem by a genetic algorithm
Institute of Scientific and Technical Information of China (English)
CHEN Hai-feng; CHO Joong Rae; LEE Jeong-Tae
2004-01-01
Genetic algorithms (GAs) employ the evolutionary process of Darwin's theory of natural selection to find the solutions of optimization problems. In this paper, an implementation of a genetic algorithm is put forward to solve a classical transportation problem, namely Hitchcock's transportation problem (HTP), and the GA is improved to search for all optimal solutions and identify them automatically. The algorithm is coded in C++ and validated by numerical examples. The computational results show that the algorithm is efficient for solving Hitchcock's transportation problem.
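A stripped-down evolutionary sketch of the idea, on a small invented cost/supply/demand instance: candidate solutions are permutations of the supply-demand cells, decoded by greedy allocation, with truncation selection and swap mutation standing in for the paper's full GA operators (which also include crossover).

```python
import random

# Invented balanced instance: total supply = total demand = 100
SUPPLY = [30, 40, 30]
DEMAND = [20, 25, 25, 30]
COST = [[4, 6, 8, 8],
        [6, 8, 6, 7],
        [5, 7, 6, 8]]
CELLS = [(i, j) for i in range(len(SUPPLY)) for j in range(len(DEMAND))]

def decode(perm):
    """Greedily allocate shipments in the order given by the permutation."""
    s, d = SUPPLY[:], DEMAND[:]
    cost = 0
    for i, j in perm:
        q = min(s[i], d[j])
        s[i] -= q
        d[j] -= q
        cost += q * COST[i][j]
    return cost

def evolve(pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(CELLS, len(CELLS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=decode)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            a, b = rng.sample(range(len(child)), 2)  # swap mutation
            child[a], child[b] = child[b], child[a]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=decode)
    return best, decode(best)

best_perm, best_cost = evolve()
```

Because the instance is balanced, every decoded permutation ships all 100 units, so the cost is bounded between 100 times the cheapest and most expensive unit costs.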
Wavelet operational matrix method for solving the Riccati differential equation
Li, Yuanlu; Sun, Ning; Zheng, Bochao; Wang, Qi; Zhang, Yingchao
2014-03-01
A Haar wavelet operational matrix method (HWOMM) was derived to solve Riccati differential equations. The computation of the nonlinear term was simplified by using block pulse functions to expand the Haar wavelets. The proposed method can be used to solve not only the classical Riccati differential equations but also the fractional ones. The capability and simplicity of the proposed method were demonstrated by some examples and a comparison with other methods.
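The Haar operational-matrix machinery itself is involved; as a baseline for the same class of problems, the sketch below integrates a classical Riccati equation (y' = 1 - y², exact solution tanh t) with a plain fourth-order Runge-Kutta step. This is an illustrative reference method of the kind such papers compare against, not the HWOMM itself.

```python
import math

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta integration of y' = f(t, y)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

# Classical Riccati test equation: y' = 1 - y^2, y(0) = 0, exact y = tanh(t)
riccati = lambda t, y: 1.0 - y * y
y1 = rk4(riccati, 0.0, 0.0, 1.0, 100)
```

With 100 steps the fourth-order error is far below the exact value tanh(1), which makes such a solver a convenient accuracy yardstick.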
(Numerical algorithms for solving linear algebra problems). Final report
Energy Technology Data Exchange (ETDEWEB)
Golub, G.H.
1985-04-16
We have concentrated on developing and analyzing various numerical algorithms for solving problems arising in a linear algebra context. The papers and research fall into basically three categories: (1) iterative methods for solving linear equations arising from p.d.e.'s; (2) calculation of Gauss-type quadrature rules; and (3) solution of matrix and data problems arising in statistical computation. We summarize some of these results, highlighting those which are of most importance.
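Category (2) above, calculation of Gauss-type quadrature rules, admits a compact illustration via the Golub-Welsch construction: the nodes are the eigenvalues of the symmetric tridiagonal Jacobi matrix of the orthogonal-polynomial recurrence, and the weights come from the first components of its eigenvectors. The sketch below uses the standard Legendre recurrence coefficients; it is a textbook instance, not code from the report.

```python
import numpy as np

def gauss_legendre(n):
    """Golub-Welsch: Gauss-Legendre nodes and weights on [-1, 1] from
    the eigen-decomposition of the tridiagonal Jacobi matrix."""
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)       # Legendre off-diagonal terms
    J = np.diag(beta, 1) + np.diag(beta, -1)   # diagonal entries are zero
    nodes, vecs = np.linalg.eigh(J)
    weights = 2.0 * vecs[0, :] ** 2            # mu_0 = integral of dx = 2
    return nodes, weights

nodes, weights = gauss_legendre(5)
# A 5-point rule integrates polynomials up to degree 9 exactly
integral_x8 = float(np.sum(weights * nodes**8))   # exact value is 2/9
```

The same Jacobi-matrix route yields Gauss rules for other weight functions once their recurrence coefficients are substituted.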
Solving complex fisheries management problems
DEFF Research Database (Denmark)
Petter Johnsen, Jahn; Eliasen, Søren Qvist
2011-01-01
A crucial issue for the new EU common fisheries policy is how to solve the discard problem. Through a study of the institutional set up and the arrangements for solving the discard problem in Denmark, the Faroe Islands, Iceland and Norway, the article identifies the discard problem as related to ...... to both natural, other material and to cultural conditions. Hence, solving the discard problem requires not only technical and regulatory instruments, but also arenas and structures that allow and facilitate processes of cultural change....
BlockSolve v1.1: Scalable library software for the parallel solution of sparse linear systems
Energy Technology Data Exchange (ETDEWEB)
Jones, M.T.; Plassmann, P.E.
1993-03-01
BlockSolve is a software library for solving large, sparse systems of linear equations on massively parallel computers. The matrices must be symmetric, but may have an arbitrary sparsity structure. BlockSolve is a portable package that is compatible with several different message-passing paradigms. This report gives detailed instructions on the use of BlockSolve in applications programs.
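BlockSolve's actual API is not reproduced here, but the core task it addresses, iteratively solving a sparse symmetric system, can be sketched with a plain conjugate-gradient loop. Dense storage and the 1-D Laplacian test matrix below are simplifications for illustration only.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradients for a symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Symmetric positive-definite test system: 1-D Laplacian
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
```

In a real parallel library the matrix-vector products and dot products would be distributed; the algorithmic skeleton stays the same.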
Geroch, Robert
2009-01-01
Computation is the process of applying a procedure or algorithm to the solution of a mathematical problem. Mathematicians and physicists have been occupied for many decades pondering which problems can be solved by which procedures, and, for those that can be solved, how this can most efficiently be done. In recent years, quantum mechanics has augmented our understanding of the process of computation and of its limitations. Perspectives in Computation covers three broad topics: the computation process and its limitations, the search for computational efficiency, and the role of quantum mechanics.
Institute of Scientific and Technical Information of China (English)
Andrzej Skowron
2006-01-01
Solving complex problems by multi-agent systems in distributed environments requires new approximate reasoning methods based on new computing paradigms. One such recently emerging computing paradigm is Granular Computing (GC). We discuss the Rough-Granular Computing (RGC) approach to modeling of computations in complex adaptive systems and multi-agent systems, as well as for approximate reasoning about the behavior of such systems. The RGC methods have been successfully applied for solving complex problems in areas such as identification of objects or behavioral patterns by autonomous systems, web mining, and sensor fusion.
Welbourne, D
1965-01-01
Analogue Computing Methods presents the field of analogue computation and simulation in a compact and convenient form, providing an outline of models and analogues that have been produced to solve physical problems for the engineer, and of how to use and program the electronic analogue computer. This book consists of six chapters. The first chapter provides an introduction to analogue computation and discusses certain mathematical techniques. The electronic equipment of an analogue computer is covered in Chapter 2, while its use to solve simple problems, including the method of scaling, is elaborated ...
How to solve mathematical problems
Wickelgren, Wayne A
1995-01-01
Seven problem-solving techniques include inference, classification of action sequences, subgoals, contradiction, working backward, relations between problems, and mathematical representation. Also, problems from mathematics, science, and engineering with complete solutions.
Solving Optimal Timing Problems Elegantly
Todorova, Tamara
2013-01-01
Few textbooks in mathematical economics cover optimal timing problems. Those which cover them do so scantily or in a rather clumsy way, making it hard for students to understand and apply the concept of optimal time in new contexts. Discussing the plentiful illustrations of optimal timing problems, we present an elegant and simple method of solving them. Whether the present value function is exponential or logarithmic, a convenient way to solve it is to convert the base to the exponential numb...
Building problem solving environments with the arches framework
Energy Technology Data Exchange (ETDEWEB)
Debardeleben, Nathan [Los Alamos National Laboratory; Sass, Ron [U NORTH CAROLINA; Stanzione, Jr., Daniel [ASU; Ligon III, Walter [CLEMSON UNIV
2009-01-01
The computational problems that scientists face are rapidly escalating in size and scope. Moreover, the computer systems used to solve these problems are becoming significantly more complex than the familiar, well-understood sequential model on their desktops. While it is possible to re-train scientists to use emerging high-performance computing (HPC) models, it is much more effective to provide them with a higher-level programming environment that has been specialized to their particular domain. By fostering interaction between HPC specialists and the domain scientists, problem-solving environments (PSEs) provide a collaborative environment. A PSE allows scientists to focus on expressing their computational problem while the PSE and associated tools support mapping that domain-specific problem to a high-performance computing system. This article describes Arches, an object-oriented framework for building domain-specific PSEs. The framework was designed to support a wide range of problem domains and to be extensible to support very different high-performance computing targets. To demonstrate this flexibility, two PSEs have been developed from the Arches framework to solve problems in two different domains and target very different computing platforms. The Coven PSE supports parallel applications that require the large-scale parallelism found in cost-effective Beowulf clusters. In contrast, RCADE targets FPGA-based reconfigurable computing and was originally designed to aid NASA Earth scientists studying satellite instrument data.
Computer applications in bioprocessing.
Bungay, H R
2000-01-01
Biotechnologists have stayed at the forefront of practical applications of computing. As hardware and software for computing have evolved, the latest advances have found eager users in the area of bioprocessing. Accomplishments and their significance can be appreciated by tracing the history and the interplay between the computing tools and the problems that have been solved in bioprocessing.
Solving Nonlinear Euler Equations with Arbitrary Accuracy
Dyson, Rodger W.
2005-01-01
A computer program that efficiently solves the time-dependent, nonlinear Euler equations in two dimensions to an arbitrarily high order of accuracy has been developed. The program implements a modified form of a prior arbitrary-accuracy simulation algorithm that is a member of the class of algorithms known in the art as modified expansion solution approximation (MESA) schemes. Whereas millions of lines of code were needed to implement the prior MESA algorithm, the present MESA algorithm can be implemented in one or a few pages of Fortran code, the exact amount depending on the specific application. The ability to solve the Euler equations to arbitrarily high accuracy is especially beneficial in simulations of aeroacoustic effects in settings in which fully nonlinear behavior is expected - for example, at stagnation points of fan blades, where linearizing assumptions break down. At these locations, it is necessary to solve the full nonlinear Euler equations, and inasmuch as the acoustical energy is 4 to 5 orders of magnitude below that of the mean flow, it is necessary to achieve an overall fractional error of less than 10^-6 in order to faithfully simulate entropy, vortical, and acoustical waves.
Computer Assisted Parallel Program Generation
Kawata, Shigeo
2015-01-01
Parallel computation is widely employed in scientific research, engineering activities and product development. Writing a parallel program is not always a simple task, depending on the problem to be solved. Large-scale scientific computing, huge data analyses and precise visualizations, for example, would require parallel computation, and parallel computing needs parallelization techniques. In this chapter a parallel program generation support is discussed, and a computer-assisted parallel program generation system, P-NCAS, is introduced. Computer-assisted problem solving is one of the key methods to promote innovations in science and engineering, and contributes to enriching our society and our life toward a programming-free environment in computing science. Problem solving environment (PSE) research activities started to enhance the programming power in the 1970's. The P-NCAS is one of the PSEs; the PSE concept provides an integrated human-friendly computational software and hardware system to solve a target ...
Sokhansanj, Bahrad A; Wilson, David M
2006-05-01
Epidemiologic studies have revealed a complex association between human genetic variance and cancer risk. Quantitative biological modeling based on experimental data can play a critical role in interpreting the effect of genetic variation on biochemical pathways relevant to cancer development and progression. Defects in human DNA base excision repair (BER) proteins can reduce cellular tolerance to oxidative DNA base damage caused by endogenous and exogenous sources, such as exposure to toxins and ionizing radiation. If not repaired, DNA base damage leads to cell dysfunction and mutagenesis, consequently leading to cancer, disease, and aging. Population screens have identified numerous single-nucleotide polymorphism variants in many BER proteins and some have been purified and found to exhibit mild kinetic defects. Epidemiologic studies have led to conflicting conclusions on the association between single-nucleotide polymorphism variants in BER proteins and cancer risk. Using experimental data for cellular concentration and the kinetics of normal and variant BER proteins, we apply a previously developed and tested human BER pathway model to (i) estimate the effect of mild variants on BER of abasic sites and 8-oxoguanine, a prominent oxidative DNA base modification, (ii) identify ranges of variation associated with substantial BER capacity loss, and (iii) reveal nonintuitive consequences of multiple simultaneous variants. Our findings support previous work suggesting that mild BER variants have a minimal effect on pathway capacity whereas more severe defects and simultaneous variation in several BER proteins can lead to inefficient repair and potentially deleterious consequences of cellular damage.
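The pathway-capacity idea in this abstract can be caricatured with a two-step Michaelis-Menten chain (lesion -> intermediate -> repaired product) integrated by forward Euler: slowing one step's turnover mimics a kinetically defective variant. All rate constants below are invented for illustration and are not the measured BER parameters used in the authors' model.

```python
def repair_capacity(kcat1, kcat2, km1=1.0, km2=1.0, e1=0.1, e2=0.1,
                    s0=1.0, t_end=200.0, dt=0.01):
    """Toy two-step repair pathway S -> I -> P with Michaelis-Menten
    kinetics; returns the fraction of lesions repaired by t_end.
    Forward-Euler integration; parameters are illustrative only."""
    s, i, p = s0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        v1 = kcat1 * e1 * s / (km1 + s)   # step 1 flux (e.g. glycosylase)
        v2 = kcat2 * e2 * i / (km2 + i)   # step 2 flux (e.g. AP endonuclease)
        s += -v1 * dt
        i += (v1 - v2) * dt
        p += v2 * dt
    return p / s0

wild_type = repair_capacity(1.0, 1.0)
mild_variant = repair_capacity(1.0, 0.8)    # mild 20% kcat defect in step 2
severe_variant = repair_capacity(1.0, 0.1)  # severe 90% kcat defect
```

Even this caricature reproduces the abstract's qualitative point: a mild kinetic defect barely dents the repaired fraction, while a severe one visibly does.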
DEFF Research Database (Denmark)
Olsen, Rikke K J; Andresen, Brage S; Christensen, Ernst;
2005-01-01
, prenatal diagnosis of MADD has relied mostly on second-trimester biochemical analyses of amniotic fluid or cultured amniocytes. We report here on an alternative DNA-based approach for prenatal diagnosis in pregnancies at risk of MADD. METHODS: We used our knowledge of the mutational status in three...
DNA-based identification of mixed-organism samples offers the potential to greatly reduce the need for resource-intensive morphological identification, which would be of value both to biotic condition assessment and non-native species early-detection monitoring. However, the abi...
Forrest, Stephanie; Beauchemin, Catherine
2007-04-01
This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.
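One classic AIS mechanism such reviews cover, negative selection, can be sketched briefly: random binary detectors are kept only if they fail to match every "self" string under an r-contiguous-bits rule, after which they can flag non-self inputs. The string length, the value of r, and the self set below are arbitrary toy choices, not a model from the review.

```python
import random

def train_detectors(self_set, n_detectors, r, length, seed=0):
    """Negative selection: keep random candidate strings that do NOT
    match any 'self' string within r contiguous positions."""
    rng = random.Random(seed)

    def matches(a, b):
        # r-contiguous-bits rule: any shared run of length >= r
        run = best = 0
        for x, y in zip(a, b):
            run = run + 1 if x == y else 0
            best = max(best, run)
        return best >= r

    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if not any(matches(cand, s) for s in self_set):
            detectors.append(cand)   # censoring passed: keep the detector
    return detectors, matches

# Toy 'self' repertoire of five random 12-bit strings
self_set = [tuple(random.Random(i).randint(0, 1) for _ in range(12))
            for i in range(5)]
detectors, matches = train_detectors(self_set, 10, r=8, length=12)

# Usage: an input is flagged anomalous if any detector matches it
probe = tuple(1 - b for b in self_set[0])
flagged = any(matches(d, probe) for d in detectors)
```

By construction no detector ever fires on self, which is the property that maps onto self/non-self discrimination in the natural immune system.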
Directory of Open Access Journals (Sweden)
N. K. Khalid
2008-01-01
Full Text Available Problem statement: In DNA-based computation and DNA nanotechnology, the design of good DNA sequences has turned out to be an essential problem and one of the most practical and important research topics. Basically, the DNA sequence design problem is a multi-objective problem, and it can be evaluated using four objective functions, namely H-measure, similarity, continuity and hairpin. Approach: There are several ways to solve a multi-objective problem; however, in order to evaluate the correctness of the PSO algorithm in DNA sequence design, this problem is converted into a single-objective problem. Particle Swarm Optimization (PSO) is proposed to minimize the objective in the problem, subject to two constraints: melting temperature and GC content. A model is developed to present the DNA sequence design based on PSO computation. Results: Based on the experiments and research done, 20 particles are used in the implementation of the optimization process, where the average values and the standard deviation for 100 runs are shown, along with a comparison to other existing methods. Conclusion: The results verified that PSO can suitably solve the DNA sequence design problem using the proposed method and model, comparatively better than other approaches.
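A minimal PSO sketch in the spirit of this abstract, with two invented stand-in criteria (GC-content deviation and a homopolymer-run "continuity" penalty) replacing the paper's four objectives and its melting-temperature/GC constraints; the continuous-to-base decoding is likewise an illustrative assumption, not the paper's model.

```python
import random

BASES = "ACGT"

def decode(position):
    """Map continuous particle coordinates in [0, 4) to a DNA sequence."""
    return "".join(BASES[int(x) % 4] for x in position)

def objective(seq):
    """Toy stand-ins for two design criteria: deviation of GC content
    from 50%, plus a penalty for homopolymer runs longer than 3."""
    gc = sum(b in "GC" for b in seq) / len(seq)
    run = longest = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        longest = max(longest, run)
    return abs(gc - 0.5) + max(0, longest - 3)

def pso(dim=20, n_particles=20, iters=200, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(0, 4) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(decode(p)) for p in pos]
    g = pbest_f.index(min(pbest_f))
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5        # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = (pos[i][d] + vel[i][d]) % 4.0
            f = objective(decode(pos[i]))
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return decode(gbest), gbest_f

best_seq, best_f = pso()
```

The swarm update (personal-best and global-best attraction plus inertia) is the standard PSO recipe; only the encoding and objective are DNA-specific here.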
Krueger, Andrew T; Kool, Eric T
2008-03-26
We recently described the synthesis and helix assembly properties of expanded DNA (xDNA), which contains base pairs 2.4 Å larger than natural DNA pairs. This designed genetic set is under study with the goals of mimicking the functions of the natural DNA-based genetic system and of developing useful research tools. Here, we study the fluorescence properties of the four expanded bases of xDNA (xA, xC, xG, xT) and evaluate how their emission varies with changes in oligomer length, composition, and hybridization. Experiments were carried out with short oligomers of xDNA nucleosides conjugated to a DNA oligonucleotide, and we investigated the effects of hybridizing these fluorescent oligomers to short complementary DNAs with varied bases opposite the xDNA bases. As monomer nucleosides, the xDNA bases absorb light in two bands: one at approximately 260 nm (similar to DNA) and one at longer wavelength (approximately 330 nm). All are efficient violet-blue fluorophores with emission maxima at approximately 380-410 nm and quantum yields (Φfl) of 0.30-0.52. Short homo-oligomers of the xDNA bases (length 1-4 monomers) showed moderate self-quenching, except xC, which showed enhancement of Φfl with increasing length. Interestingly, multimers of xA emitted at longer wavelengths (520 nm) as an apparent excimer. Hybridization of an oligonucleotide to the DNA adjacent to the xDNA bases (with the xDNA portion overhanging) resulted in no change in fluorescence. However, addition of one, two, or more DNA bases in these duplexes opposite the xDNA portion resulted in a number of significant fluorescence responses, including wavelength shifts, enhancements, or quenching. The strongest responses were the enhancement of (xG)n emission by hybridization of one or more adenines opposite them, and the quenching of (xT)n and (xC)n emission by guanines opposite. The data suggest multiple ways in which the xDNA bases, both alone and in oligomers, may be useful as tools in biophysical analysis.
Institute of Scientific and Technical Information of China (English)
周祥; 何小荣; 陈丙珍
2004-01-01
Because of its powerful mapping ability, the back propagation neural network (BP-NN) has been employed in computer-aided product design (CAPD) to establish property prediction models. The backward problem in CAPD is to search for the appropriate structure or composition of a product with a desired property, which is an optimization problem. In this paper, a global optimization method using the αBB algorithm to solve the backward problem is presented. In particular, a convex lower bounding function is constructed for the objective function formulated with the BP-NN model, and the calculation of the key parameter α is implemented by resorting to the interval Hessian matrix of the objective function. Two case studies involving the design of dopamine β-hydroxylase (DβH) inhibitors and linear low density polyethylene (LLDPE) nanocomposites are investigated using the proposed method.
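For context, the convex lower bounding function mentioned in this abstract takes the standard αBB form shown below (the general construction; the paper's exact parameterization may differ):

```latex
\mathcal{L}(x) \;=\; f(x) \;+\; \sum_{i=1}^{n} \alpha_i \,\bigl(x_i^{L} - x_i\bigr)\bigl(x_i^{U} - x_i\bigr),
\qquad
\alpha_i \;\ge\; \max\Bigl\{\,0,\; -\tfrac{1}{2}\,\underline{\lambda}_{\min}\bigl(\nabla^2 f\bigr)\Bigr\}
```

where $\underline{\lambda}_{\min}(\nabla^2 f)$ is a lower bound on the smallest eigenvalue of the Hessian of $f$ over the box $[x^L, x^U]$, computable from the interval Hessian matrix. The added quadratic terms vanish at the variable bounds and, for sufficiently large $\alpha_i$, make $\mathcal{L}$ convex.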
Fast and highly specific DNA-based multiplex detection on a solid support.
Barišić, Ivan; Kamleithner, Verena; Schönthaler, Silvia; Wiesinger-Mayr, Herbert
2015-01-01
Highly specific and fast multiplex detection methods are essential for reliable DNA-based diagnostics and are especially important for characterising infectious diseases. More than 1000 genetic targets such as antibiotic resistance genes, virulence factors and phylogenetic markers have to be identified as fast as possible to facilitate the correct treatment of a patient. In the present work, we developed a novel ligation-based DNA probe concept that was combined with microarray technology and used it for the detection of bacterial pathogens. The novel linear chain (LNC) probes identified all tested species correctly within 1 h based on their 16S rRNA gene in a 25-plex reaction. Genomic DNA was used directly as template in the ligation reaction, identifying as few as 10^7 cells without any pre-amplification. The high specificity was further demonstrated by characterising a single nucleotide polymorphism (SNP), with no false positive fluorescence signals from the untargeted SNP variants. In comparison to conventional microarray probes, the sensitivity of the novel LNC3 probes was higher by a factor of 10 or more. In summary, we present a fast, simple, highly specific and sensitive multiplex detection method adaptable for a wide range of applications.
Early Steps in the DNA Base Excision Repair Pathway of a Fission Yeast Schizosaccharomyces pombe
Directory of Open Access Journals (Sweden)
Kyoichiro Kanamitsu
2010-01-01
Full Text Available DNA base excision repair (BER) maintains genomic integrity by removing damaged bases that are generated endogenously or induced by genotoxic agents. In this paper, we describe the roles of enzymes functioning in the early steps of BER in fission yeast. Although BER is an evolutionarily conserved process, some unique features of the yeast repair pathway were revealed by genetic and biochemical approaches. AP sites generated by monofunctional DNA glycosylases are incised mainly by the AP lyase activity of Nth1p, the sole bifunctional glycosylase in this yeast, to leave a blocked 3′ end. The major AP endonuclease Apn2p functions predominantly in removing the 3′ block. Finally, a DNA polymerase fills the gap, and a DNA ligase seals the nick (Nth1p-dependent or short patch BER). Apn1p backs up Apn2p. In long patch BER, Rad2p endonuclease removes flap DNA containing a lesion after DNA synthesis. The UV-specific endonuclease Uve1p engages in an alternative pathway by nicking DNA on the 5′ side of oxidative damage. Nucleotide excision repair and homologous recombination are involved in the repair of BER intermediates, including the AP site and the single-strand break with the 3′ block. Other enzymes working in 3′ end processing are also discussed.
Suzuki, Ryosuke; Ishikawa, Tomohiro; Konishi, Eiji; Matsuda, Mami; Watashi, Koichi; Aizaki, Hideki; Takasaki, Tomohiko; Wakita, Takaji
2014-01-01
A method for rapid production of single-round infectious particles (SRIPs) of flavivirus would be useful for viral mutagenesis studies. Here, we established a DNA-based production system for SRIPs of flavivirus. We constructed a Japanese encephalitis virus (JEV) subgenomic replicon plasmid, which lacked the C-prM-E (capsid-pre-membrane-envelope) coding region, under the control of the cytomegalovirus promoter. When the JEV replicon plasmid was transiently co-transfected with a JEV C-prM-E expression plasmid into 293T cells, SRIPs were produced, indicating successful trans-complementation with JEV structural proteins. Equivalent production levels were observed when C and prM-E proteins were provided separately. Furthermore, dengue types 1-4, West Nile, yellow fever or tick-borne encephalitis virus prM-E proteins could be utilized for production of chimaeric flavivirus SRIPs, although the production was less efficient for dengue and yellow fever viruses. These results indicated that our plasmid-based system is suitable for investigating the life cycles of flaviviruses, diagnostic applications and development of safer vaccine candidates.
You, Mingxu; Zhu, Guizhi; Chen, Tao; Donovan, Michael J; Tan, Weihong
2015-01-21
The specific inventory of molecules on diseased cell surfaces (e.g., cancer cells) provides clinicians an opportunity for accurate diagnosis and intervention. With the discovery of panels of cancer markers, carrying out analyses of multiple cell-surface markers is conceivable. As a trial to accomplish this, we have recently designed a DNA-based device that is capable of performing autonomous logic-based analysis of two or three cancer cell-surface markers. Combining the specific target-recognition properties of DNA aptamers with toehold-mediated strand displacement reactions, multicellular marker-based cancer analysis can be realized based on modular AND, OR, and NOT Boolean logic gates. Specifically, we report here a general approach for assembling these modular logic gates to execute programmable and higher-order profiling of multiple coexisting cell-surface markers, including several found on cancer cells, with the capacity to report a diagnostic signal and/or deliver targeted photodynamic therapy. The success of this strategy demonstrates the potential of DNA nanotechnology in facilitating targeted disease diagnosis and effective therapy.
In vitro measurement of DNA base excision repair in isolated mitochondria.
Page, Melissa M; Stuart, Jeffrey A
2009-01-01
Mitochondrial DNA (mtDNA) is in relatively close proximity to reactive oxygen species (ROS) arising from spontaneous superoxide formation during respiration. As a result, it sustains oxidative damage that may include base modifications, base loss, and strand breaks. mtDNA replication past sites of oxidative damage can result in the introduction of mutations. mtDNA mutations are associated with various human diseases and can manifest as loss of bioenergetic function. DNA repair processes exist in mitochondria from apparently all metazoans. A fully functional DNA base excision repair (BER) pathway is present in mitochondria of vertebrates. This pathway is catalyzed by a number of DNA glycosylases, an AP endonuclease, polymerase gamma, and a DNA ligase. This chapter outlines the step-by-step protocols for isolating mitochondrial fractions, from a number of different model organisms, of sufficient purity to allow mtDNA repair activities to be measured. It details in vitro assays for the measurement of BER enzyme activities in lysates prepared from isolated mitochondria.
Ahlinder, Jon; Öhrman, Caroline; Svensson, Kerstin; Lindgren, Petter; Johansson, Anders; Forsman, Mats; Larsson, Pär; Sjödin, Andreas
2012-09-25
Recent advances in sequencing technologies offer promising tools for generating large numbers of genomes, larger typing databases and improved mapping of environmental bacterial diversity. However, DNA-based methods for the detection of Francisella were developed with limited knowledge about genetic diversity. This, together with the high sequence identity between several Francisella species, means there is a high risk of false identification and detection of the highly virulent pathogen Francisella tularensis. Moreover, phylogenetic reconstructions using single or limited numbers of marker sequences often result in incorrect tree topologies and inferred evolutionary distances. The recent growth in publicly accessible whole-genome sequences now allows evaluation of published genetic markers to determine optimal combinations of markers that minimise both time and laboratory costs. In the present study, we evaluated 38 previously published DNA markers and the corresponding PCR primers against 42 genomes representing the currently known diversity of the genus Francisella. The results highlight that PCR assays for Francisella tularensis are often complicated by low specificity, resulting in a high probability of false positives. A method to select a set of one to seven markers for obtaining optimal phylogenetic resolution or diagnostic accuracy is presented. Current multiple-locus sequence-typing systems and detection assays of Francisella, could be improved by redesigning some of the primers and reselecting typing markers. The use of only a few optimally selected sequence-typing markers allows construction of phylogenetic topologies with almost the same accuracy as topologies based on whole-genome sequences.
High-affinity DNA base analogs as supramolecular, nanoscale promoters of macroscopic adhesion.
Anderson, Cyrus A; Jones, Amanda R; Briggs, Ellen M; Novitsky, Eric J; Kuykendall, Darrell W; Sottos, Nancy R; Zimmerman, Steven C
2013-05-15
Adhesion phenomena are essential to many biological processes and to synthetic adhesives and manufactured coatings and composites. Supramolecular interactions are often implicated in various adhesion mechanisms. Recently, supramolecular building blocks, such as synthetic DNA base-pair mimics, have drawn attention in the context of molecular recognition, self-assembly, and supramolecular polymers. These reversible, hydrogen-bonding interactions have been studied extensively for their adhesive capabilities at the nano- and microscale; however, much less is known about their utility for practical adhesion in macroscopic systems. Herein, we report the preparation and evaluation of supramolecular coupling agents based on high-affinity, high-fidelity quadruple hydrogen-bonding units (e.g., DAN·DeUG, Kassoc = 10^8 M^-1 in chloroform). Macroscopic adhesion between polystyrene films and glass surfaces modified with 2,7-diamidonaphthyridine (DAN) and ureido-7-deazaguanine (DeUG) units was evaluated by mechanical testing. Structure-property relationships indicate that the designed supramolecular interaction at the nanoscale plays a key role in the observed macroscopic adhesive response. Experiments probing reversible adhesion and self-healing properties of bulk samples indicate that significant recovery of initial strength can be realized after failure, but that the designed noncovalent interaction does not lead to healing during the process of adhesion loss.
PlantID – DNA-based identification of multiple medicinal plants in complex mixtures
Directory of Open Access Journals (Sweden)
Howard Caroline
2012-07-01
Full Text Available Abstract Background An efficient method for the identification of medicinal plant products is now a priority as global demand increases. This study aims to develop a DNA-based method for the identification and authentication of plant species that can be implemented in industry to aid compliance with regulations, based upon the economically important Hypericum perforatum L. (St John’s Wort or Guan ye Lian Qiao). Methods The ITS regions of several Hypericum species were analysed to identify the most divergent regions, and PCR primers were designed to anneal specifically to these regions in the different Hypericum species. Candidate primers were selected such that the amplicon produced by each species-specific reaction differed in size. The use of fluorescently labelled primers enabled these products to be resolved by capillary electrophoresis. Results Four closely related Hypericum species were detected simultaneously and independently in one reaction. Each species could be identified individually and in any combination. The introduction of three more closely related species to the test had no effect on the results. Highly processed commercial plant material was identified, despite the potential complications of DNA degradation in such samples. Conclusion This technique can detect the presence of an expected plant material and adulterant materials in one reaction. The method could be simply applied to other medicinal plants and their problem adulterants.
DNA-Based Synthesis and Assembly of Organized Iron Oxide Nanostructures
Khomutov, Gennady B.
Organized bio-inorganic and hybrid bio-organic-inorganic nanostructures consisting of iron oxide nanoparticles and DNA complexes have been formed using methods based on biomineralization, interfacial and bulk phase assembly, ligand exchange and substitution, Langmuir-Blodgett technique, DNA templating and scaffolding. Interfacially formed planar DNA complexes with water-insoluble amphiphilic polycation or intercalator Langmuir monolayers were prepared and deposited on solid substrates to form immobilized DNA complexes. Those complexes were then used for the synthesis of organized DNA-based iron oxide nanostructures. Planar net-like and circular nanostructures of magnetic Fe3O4 nanoparticles were obtained via interaction of cationic colloid magnetite nanoparticles with preformed immobilized DNA/amphiphilic polycation complexes of net-like and toroidal morphologies. The processes of the generation of iron oxide nanoparticles in immobilized DNA complexes via redox synthesis with various iron sources of biological (ferritin) and artificial (FeCl3) nature have been studied. Bulk-phase complexes of magnetite nanoparticles with biomolecular ligands (DNA, spermine) were formed and studied. Novel nano-scale organized bio-inorganic nanostructures - free-floating sheet-like spermine/magnetite nanoparticle complexes and DNA/spermine/magnetite nanoparticle complexes were synthesized in bulk aqueous phase and the effect of DNA molecules on the structure of complexes was discovered.
Dipstick test for DNA-based food authentication. Application to coffee authenticity assessment.
Trantakis, Ioannis A; Spaniolas, Stelios; Kalaitzis, Panagiotis; Ioannou, Penelope C; Tucker, Gregory A; Christopoulos, Theodore K
2012-01-25
This paper reports DNA-based food authenticity assays, in which species identification is accomplished by the naked eye without the need of specialized instruments. Strongly colored nanoparticles (gold nanoparticles) are employed as reporters that enable visual detection. Furthermore, detection is performed in a low-cost, disposable, dipstick-type device that incorporates the required reagents in dry form, thereby avoiding multiple pipetting and incubation steps. Due to its simplicity, the method does not require highly qualified personnel. The procedure comprises the following steps: (i) PCR amplification of the DNA segment that flanks the unique SNP (species marker); (ii) a 15 min extension reaction in which DNA polymerase extends an allele-specific primer only if it is perfectly complementary with the target sequence; (iii) detection of the products of the extension reaction within a few minutes by the naked eye employing the dipstick. No purification is required prior to application of the extension products to the dipstick. The method is general and requires only a unique DNA sequence for species discrimination. The only instrument needed is a conventional thermocycler for PCR, which is common equipment in every DNA laboratory. As a model, the method was applied to the discrimination of Coffea robusta and arabica species in coffee authenticity assessment. As low as 5% of Robusta coffee can be detected in the presence of Arabica coffee.
DNA-based digital tension probes reveal integrin forces during early cell adhesion
Zhang, Yun; Ge, Chenghao; Zhu, Cheng; Salaita, Khalid
2014-10-01
Mechanical stimuli profoundly alter cell fate, yet the mechanisms underlying mechanotransduction remain obscure because of a lack of methods for molecular force imaging. Here to address this need, we develop a new class of molecular tension probes that function as a switch to generate a 20- to 30-fold increase in fluorescence upon experiencing a threshold piconewton force. The probes employ immobilized DNA hairpins with tunable force response thresholds, ligands and fluorescence reporters. Quantitative imaging reveals that integrin tension is highly dynamic and increases with an increasing integrin density during adhesion formation. Mixtures of fluorophore-encoded probes show integrin mechanical preference for cyclized RGD over linear RGD peptides. Multiplexed probes with variable guanine-cytosine content within their hairpins reveal integrin preference for the more stable probes at the leading tip of growing adhesions near the cell edge. DNA-based tension probes are among the most sensitive optical force reporters to date, overcoming the force and spatial resolution limitations of traction force microscopy.
Proton tunneling in the A∙T Watson-Crick DNA base pair: myth or reality?
Brovarets', Ol'ha O; Hovorun, Dmytro M
2015-01-01
The results and conclusions reached by Godbeer et al. in their recent work, that proton tunneling in the A∙T Watson-Crick (WC) DNA base pair occurs according to the Löwdin (L) model, but with a small (~10^-9) probability, were critically analyzed. Here, it was shown that this finding overestimates the possibility of proton tunneling at the A∙T(WC)↔A*∙T*(L) tautomerization, because this process cannot be implemented as a chemical reaction. Furthermore, the biologically important nucleobase mispairs (A∙A*↔A*∙A, G∙G*↔G*∙G, T∙T*↔T*∙T, C∙C*↔C*∙C, H∙H*↔H*∙H (H - hypoxanthine)) - the players in the field of spontaneous point mutagenesis - in which proton tunneling is expected, and for which the application of the model proposed by Godbeer et al. can be productive, were outlined.
Semiquinone formation and DNA base damage by toxic quinones and inhibition by N-acetylcysteine (NAC)
Energy Technology Data Exchange (ETDEWEB)
Lewis, D.C.; Shibamoto, T.
1986-03-05
Toxic, mutagenic, carcinogenic, and teratogenic effects have been reported for some quinones as well as compounds metabolized to quinones. Semiquinone radical formation, thymidine degradation, and protection by NAC were studied in a hypoxanthine/xanthine oxidase (HX/XO) system. Quinone, benzo(a)pyrene-3,6-quinone, danthron, doxorubicin, emodin, juglone, menadione, and moniliformin were tested. Diethylstilbestrolquinone, N-acetylquinoneimine, and benzoquinonediimine, hypothesized toxic metabolites of diethylstilbestrol, acetaminophen and p-phenylenediamine, respectively, were synthesized and studied. Semiquinone radical formation was assessed in a HX/XO system monitoring cytochrome C reduction. Large differences in rates of semiquinone radical formation were noted for different quinones, with V/Vo values ranging from 1.2 to 10.6. DNA base degradation, thymine or thymidine glycol formation, and thiobarbituric acid reactive substance (TBARS) production were measured in a similar system containing thymine, thymidine, calf thymus DNA, or deoxyribose. TBARS formation was observed with deoxyribose, but thymidine degradation without TBARS formation was noted with thymidine. NAC (0.5 to 10 mM) caused dose-dependent inhibition of quinone-induced cytochrome C reduction.
Protocol Improvements for Low Concentration DNA-Based Bioaerosol Sampling and Analysis.
Directory of Open Access Journals (Sweden)
Irvan Luhung
Full Text Available As bioaerosol research attracts increasing attention, there is a need for additional efforts that focus on method development to deal with different environmental samples. Bioaerosol environmental samples typically have very low biomass concentrations in the air, which often leaves researchers with limited options in choosing the downstream analysis steps, especially when culture-independent methods are intended. This study investigates the impacts of three important factors that can influence the performance of culture-independent DNA-based analysis of bioaerosol environmental samples. The factors are: (1) enhanced high-temperature sonication during DNA extraction; (2) the effect of sampling duration on DNA recoverability; and (3) an alternative method for concentrating composite samples. In this study, DNA extracted from samples was analysed using the Qubit fluorometer (for direct total DNA measurement) and quantitative polymerase chain reaction (qPCR). The findings suggest that additional lysis from high-temperature sonication is crucial: DNA yields from both high and low biomass samples increased up to 600% when the protocol included 30-min sonication at 65°C. Long air sampling duration on a filter medium was shown to have a negative impact on DNA recoverability, with up to 98% of DNA lost over a 20-h sampling period. Pooling DNA from separate samples during extraction was proven to be feasible, with margins of error below 30%.
Dihydropyridines decrease X-ray-induced DNA base damage in mammalian cells
Energy Technology Data Exchange (ETDEWEB)
Wojewodzka, M., E-mail: marylaw@ichtj.waw.pl [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland); Gradzka, I.; Buraczewska, I.; Brzoska, K.; Sochanowicz, B. [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland); Goncharova, R.; Kuzhir, T. [Institute of Genetics and Cytology, Belarussian National Academy of Sciences, Minsk (Belarus); Szumiel, I. [Center of Radiobiology and Biological Dosimetry, Institute of Nuclear Chemistry and Technology, Warszawa (Poland)
2009-12-01
Compounds with the structural motif of 1,4-dihydropyridine display a broad spectrum of biological activities, often defined as bioprotective. Among them are L-type calcium channel blockers; however, derivatives which do not block calcium channels also exert various effects at the cellular and organismal levels. We examined the effect of sodium 3,5-bis-ethoxycarbonyl-2,6-dimethyl-1,4-dihydropyridine-4-carboxylate (denoted here as DHP and previously also as AV-153) on X-ray-induced DNA damage and mutation frequency at the HGPRT (hypoxanthine-guanine phosphoribosyl transferase) locus in Chinese hamster ovary CHO-K1 cells. Using the formamidopyrimidine glycosylase (FPG) comet assay, we found that 1-h DHP (10 nM) treatment before X-irradiation considerably reduced the initial level of FPG-recognized DNA base damage, which was consistent with decreased 8-oxo-7,8-dihydro-2'-deoxyguanosine content and a mutation frequency lowered by about 40%. No effect on single strand break rejoining or on cell survival was observed. A similar base damage-protective effect was observed for two calcium channel blockers: nifedipine (structurally similar to DHP) and verapamil (structurally unrelated). So far, the specificity of the DHP-caused reduction in DNA damage - practically limited to base damage - has no satisfactory explanation.
Poisonous or non-poisonous plants? DNA-based tools and applications for accurate identification.
Mezzasalma, Valerio; Ganopoulos, Ioannis; Galimberti, Andrea; Cornara, Laura; Ferri, Emanuele; Labra, Massimo
2017-01-01
Plant exposures are among the most frequently reported cases to poison control centres worldwide. This is a growing condition due to recent societal trends oriented towards the consumption of wild plants as food, cosmetics, or medicine. At least three general causes of plant poisoning can be identified: plant misidentification, introduction of new plant-based supplements and medicines with no controls about their safety, and the lack of regulation for the trading of herbal and phytochemical products. Moreover, an efficient screening for the occurrence of plants poisonous to humans is also desirable at the different stages of the food supply chain: from the raw material to the final transformed product. A rapid diagnosis of intoxication cases is necessary in order to provide the most reliable treatment. However, a precise taxonomic characterization of the ingested species is often challenging. In this review, we provide an overview of the emerging DNA-based tools and technologies to address the issue of poisonous plant identification. Specifically, classic DNA barcoding and its applications using High Resolution Melting (Bar-HRM) ensure high universality and rapid response respectively, whereas High Throughput Sequencing techniques (HTS) provide a complete characterization of plant residues in complex matrices. The pros and cons of each approach have been evaluated with the final aim of proposing a general user's guide to molecular identification directed to different stakeholder categories interested in the diagnostics of poisonous plants.
Directory of Open Access Journals (Sweden)
Aleksandra Delplanque
Full Text Available Lanthanide-doped nanoparticles are of considerable interest for biodetection and bioimaging techniques thanks to their unique chemical and optical properties. As a sensitive luminescent material, they can be used as (bio)probes in Förster Resonance Energy Transfer (FRET), where trivalent lanthanide ions (Ln3+) act as energy donors. In this paper we present an efficient method to transfer ultrasmall (ca. 8 nm) NaYF4 nanoparticles dispersed in organic solvent to an aqueous solution via oxidation of the oleic acid ligand. Nanoparticles were then functionalized with single-strand DNA oligomers (ssDNA) by inducing covalent bonds between surface carboxylic groups and a 5' amine-modified ssDNA. Hybridization with the 5' fluorophore (Cy5)-modified complementary ssDNA strand demonstrated the specificity of binding and allowed fine control over the distance between the Eu3+-doped nanoparticle and the fluorophore by varying the number of dsDNA base pairs. Our results confirmed nonradiative resonance energy transfer and demonstrate the dependence of its efficiency on the distance between the donor (Eu3+) and the acceptor (Cy5), with sensitivity at the nanometre scale.
Horses for courses: a DNA-based test for race distance aptitude in thoroughbred racehorses.
Hill, Emmeline W; Ryan, Donal P; MacHugh, David E
2012-12-01
Variation at the myostatin (MSTN) gene locus has been shown to influence racing phenotypes in Thoroughbred horses, and in particular, early skeletal muscle development and the aptitude for racing at short distances. Specifically, a single nucleotide polymorphism (SNP) in the first intron of MSTN (g.66493737C/T) is highly predictive of best race distance among Flat racing Thoroughbreds: homozygous C/C horses are best suited to short distance races, heterozygous C/T horses are best suited to middle distance races, and homozygous T/T horses are best suited to longer distance races. Patent applications for this gene marker association, and other linked markers, have been filed. The information contained within the patent applications is exclusively licensed to the commercial biotechnology company Equinome Ltd, which provides a DNA-based test to the international Thoroughbred horse racing and breeding industry. The application of this information in the industry enables informed decision making in breeding and racing and can be used to assist selection to accelerate the rate of change of genetic types among distinct populations (Case Study 1) and within individual breeding operations (Case Study 2).
Lobe, Elisabeth; Stollenwerk, Tobias; Tröltzsch, Anke
2015-01-01
In recent years, the field of adiabatic quantum computing has gained importance due to advances in the realisation of such machines, especially by the company D-Wave Systems. These machines are suited to solving discrete optimisation problems which are typically very hard to solve on a classical computer. Due to the quantum nature of the device, it is assumed that there is a substantial speedup compared to classical HPC facilities. We explain the basic principles of adiabatic ...
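For readers unfamiliar with the problem class these machines target: D-Wave-style annealers minimize objectives expressible in QUBO (quadratic unconstrained binary optimization) form, E(x) = Σᵢⱼ Qᵢⱼ xᵢ xⱼ over binary x. The sketch below is a purely classical brute-force illustration of that problem class on an invented 3-variable instance, not a model of the adiabatic algorithm itself:

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy of binary assignment x under QUBO matrix Q: E = sum_ij Q[i][j]*x[i]*x[j]."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Exhaustively search all 2^n binary vectors for a minimum-energy assignment."""
    n = len(Q)
    best = min(product((0, 1), repeat=n), key=lambda x: qubo_energy(Q, x))
    return best, qubo_energy(Q, best)

# Hypothetical 3-variable instance: diagonal entries act as linear biases,
# off-diagonal entries as pairwise couplings.
Q = [[-1.0,  2.0,  0.0],
     [ 0.0, -1.0,  2.0],
     [ 0.0,  0.0, -1.0]]

best, energy = brute_force_qubo(Q)
print(best, energy)  # (1, 0, 1) with energy -2.0
```

The exhaustive search scales as 2^n, which is exactly why hardware that explores the energy landscape by quantum annealing is of interest for larger instances.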
Problem Solving and Complex Systems
Guinand, Frédéric
2008-01-01
The observation and modeling of natural Complex Systems (CSs), like the human nervous system, evolution, or the weather, allows the definition of special abilities and models reusable to solve other problems. For instance, Genetic Algorithms and Ant Colony Optimization are inspired by natural CSs to solve optimization problems. This paper proposes the use of ant-based systems to solve various problems with a non-assessing approach. This means that solutions to a problem are not evaluated; they appear as resultant structures from the activity of the system. Problems are modeled with graphs, and such structures are observed directly on these graphs. Problems of Multiple Sequence Alignment and Natural Language Processing are addressed with this approach.
Aging and skilled problem solving.
Charness, N
1981-03-01
Information-processing models of problem solving too often are based on restrictive age ranges. On the other hand, gerontologists have investigated few problem-solving tasks and have rarely generated explicit models. As this article demonstrates, both fields can benefit by closer collaboration. One major issue in gerontology is whether aging is associated with irreversible decrement or developmental plasticity. If both processes occur, then an appropriate strategy for investigating aging is to equate age groups for molar problem-solving performance and search for differences in the underlying components. This strategy was adopted to examine the relation of age and skill to problem solving in chess. Chess players were selected to vary widely in age and skill such that these variables were uncorrelated. Problem-solving and memory tasks were administered. Skill level was the only significant predictor for accuracy in both a choose-a-move task and a speeded end-game evaluation task. Age (negatively) and skill (positively) jointly determined performance in an unexpected recall task. Efficient chunking in recall was positively related to skill, though negatively related to age. Recognition confidence, though not accuracy, was negatively related to age. Thus despite age-related declines in encoding and retrieval of information, older players match the problem-solving performance of equivalently skilled younger players. Apparently, they can search the problem space more efficiently, as evidenced by taking less time to select an equally good move. Models of chess skill that stress that role of encoding efficiency, as indexed by chunking in recall, need to be modified to account for performance over the life span.
2008+ solved problems in electromagnetics
Nasar, Syed
2007-01-01
SciTech Publishing is reissuing this extremely valuable learning resource, originally published in 1992 in the Schaum's Problem-Solving Series, for students of electromagnetics and those who wish to refresh and solidify their understanding of its challenging applications. Problem-solving drill helps develop confidence, but few textbooks offer the answers, never mind the complete solutions, to their chapter exercises. Here, noted author Professor Syed Nasar has divided the book's problems into topic areas similar to a textbook and presented a wide array of problems, followed immediately by their solutions.
DEFF Research Database (Denmark)
Foss, Kirsten; Foss, Nicolai
Two of Herbert Simon's best-known papers are "The Architecture of Complexity" and "The Structure of Ill-Structured Problems." We discuss the neglected links between these two papers, highlighting the role of decomposition in the context of problems on which constraints have been imposed, as a general approach to problem solving. We apply these Simonian ideas to organizational issues, specifically new organizational forms. Specifically, Simonian ideas allow us to develop a morphology of new organizational forms and to point to some design problems that characterize these forms. Keywords: Herbert Simon, problem-solving, new organizational forms. JEL Code: D23, D83
Polyomino Problems to Confuse Computers
Coffin, Stewart
2009-01-01
Computers are very good at solving certain types of combinatorial problems, such as fitting sets of polyomino pieces into square or rectangular trays of a given size. However, most puzzle-solving programs now in use assume orthogonal arrangements. When one departs from the usual square grid layout, complications arise. The author--using a computer,…
Computer mathematics for programmers
Abney, Darrell H; Sibrel, Donald W
1985-01-01
Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
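The round-off behaviour covered in Chapter 3 is easy to demonstrate; a minimal Python sketch (my own illustration, not from the book) shows how binary floating point misrepresents decimal fractions:

```python
# Binary floating point cannot represent 0.1 exactly, so round-off
# errors accumulate in repeated arithmetic.
total = 0.0
for _ in range(10):
    total += 0.1
print(total == 1.0)      # False
print(abs(total - 1.0))  # a residual on the order of 1e-16
```

The residual is the kind of error Chapter 3 teaches programmers to anticipate when comparing computed real numbers.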
A genetic algorithm for solving supply chain network design model
Firoozi, Z.; Ismail, N.; Ariafar, S. H.; Tang, S. H.; Ariffin, M. K. M. A.
2013-09-01
Network design is by nature costly, and optimization models play a significant role in reducing the unnecessary cost components of a distribution network. This study proposes a genetic algorithm to solve a distribution network design model. The structure of the chromosome in the proposed algorithm is defined in a novel way so that, in addition to producing feasible solutions, it also reduces the computational complexity of the algorithm. Computational results are presented to show the algorithm performance.
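The paper's chromosome encoding is not reproduced here; the sketch below is a generic genetic algorithm for a toy facility-location flavour of network design (the data, the binary encoding, and all parameters are invented for illustration):

```python
import random

# Toy facility-location data (hypothetical, for illustration only):
FIXED = [40, 30, 50]                      # cost of opening each warehouse
SHIP = [[4, 6, 8], [7, 3, 5], [5, 8, 2]]  # SHIP[w][c]: warehouse w -> customer c

def cost(chrom):
    """Total network cost for a binary chromosome (1 = warehouse open)."""
    open_ws = [w for w, bit in enumerate(chrom) if bit]
    if not open_ws:
        return float("inf")               # infeasible: no warehouse open
    fixed = sum(FIXED[w] for w in open_ws)
    ship = sum(min(SHIP[w][c] for w in open_ws) for c in range(len(SHIP[0])))
    return fixed + ship

def ga(pop_size=20, gens=50, p_mut=0.1):
    n = len(FIXED)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]      # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randint(1, n - 1)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:       # bit-flip mutation
                i = random.randrange(n)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
print(best, cost(best))
```

For this tiny instance, opening only the second warehouse is optimal (cost 45); the point of the sketch is the loop structure (selection, crossover, mutation), which is what a problem-specific chromosome design plugs into.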
Computational thinking as an emerging competence domain
Yadav, A.; Good, J.; Voogt, J.; Fisser, P.; Mulder, M.
2016-01-01
Computational thinking is a problem-solving skill set, which includes problem decomposition, algorithmic thinking, abstraction, and automation. Even though computational thinking draws upon concepts fundamental to computer science (CS), it has broad application to all disciplines. It has been
Recent Advances in Solving the Protein Threading Problem
Andonov, Rumen; Gibrat, Jean-François; Marin, Antoine; Poirriez, Vincent; Yanev, Nikola
2007-01-01
Fold recognition methods are promising tools for capturing the structure of a protein from its amino acid residue sequence, but their use is still restricted by the need for huge computational resources and suitably efficient algorithms. The recent version of the FROST (Fold Recognition Oriented Search Tool) package implements the most efficient algorithm for solving the Protein Threading Problem (PTP), thanks to the strong collaboration between the SYMBIOSE group in IRISA and MIG in Jouy-en-Josas. In this paper, we present the diverse components of FROST, emphasizing the recent advances in formulating and solving new versions of the PTP and the way a million instances can be solved on a computer cluster in a reasonable time.
Quantitative Reasoning in Problem Solving
Ramful, Ajay; Ho, Siew Yin
2015-01-01
In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.
On transfer during problem solving
Hamel, R.; Jakab, E.
2013-01-01
A puzzle is equally new for everyone who is presented with it for the first time. However, it is not new if we take one’s previous knowledge into account. Some knowledge may be utilised while working on the puzzle. If this is the case, problem solving as well as the development of knowledge about the pu
Common Core: Solve Math Problems
Strom, Erich
2012-01-01
The new common core standards for mathematics demand that students (and teachers!) exhibit deeper conceptual understanding. That's music to the ears of education professor John Tapper, who says teachers have overemphasized teaching procedures--and getting right answers. In his new book, "Solving for Why," he makes a powerful case for moving beyond…
Problem-Solving Test: Pyrosequencing
Szeberenyi, Jozsef
2013-01-01
Terms to be familiar with before you start to solve the test: Maxam-Gilbert sequencing, Sanger sequencing, gel electrophoresis, DNA synthesis reaction, polymerase chain reaction, template, primer, DNA polymerase, deoxyribonucleoside triphosphates, orthophosphate, pyrophosphate, nucleoside monophosphates, luminescence, acid anhydride bond,…
Students' Problem Solving and Justification
Glass, Barbara; Maher, Carolyn A.
2004-01-01
This paper reports on methods of students' justifications of their solution to a problem in the area of combinatorics. From the analysis of the problem solving of 150 students in a variety of settings from high-school to graduate study, four major forms of reasoning evolved: (1) Justification by Cases, (2) Inductive Argument, (3) Elimination…
Promote Problem-Solving Discourse
Bostic, Jonathan; Jacobbe, Tim
2010-01-01
Fourteen fifth-grade students gather at the front of the classroom as their summer school instructor introduces Jonathan Bostic as the mathematics teacher for the week. Before examining any math problems, Bostic sits at eye level with the students and informs them that they will solve problems over the next four days by working individually as…
Teaching Employees to Solve Problems.
Miller, Lauren E.; Feggestad, Kurt
1987-01-01
John Deere's systematic problem-solving training for its employees is applicable in the vocational classroom. The process includes stating the problem, writing its specifications, identifying distinctions, determining changes that occurred at the time, identifying possible causes, testing the possibilities, verifying the most probable cause, and…
Pizlo, Zygmunt
2007-01-01
This paper presents a bibliography of a little more than 100 references related to human problem solving, arranged by subject matter. The references were taken from PsycInfo and Compendex databases. Only journal papers, books and dissertations are included. The topics include human development, education, neuroscience, research in applied…
Solving Differential Equations in R: Package deSolve
Directory of Open Access Journals (Sweden)
Karline Soetaert
2010-02-01
In this paper we present the R package deSolve to solve initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1, and partial differential equations (PDE), the latter solved using the method of lines approach. The differential equations can be represented in R code or as compiled code. In the latter case, R is used as a tool to trigger the integration and post-process the results, which facilitates model development and application, whilst the compiled code significantly increases simulation speed. The methods implemented are efficient, robust, and well documented public-domain Fortran routines. They include four integrators from the ODEPACK package (LSODE, LSODES, LSODA, LSODAR), DVODE and DASPK2.0. In addition, a suite of Runge-Kutta integrators and special-purpose solvers to efficiently integrate 1-, 2- and 3-dimensional partial differential equations are available. The routines solve both stiff and non-stiff systems, and include many options, e.g., to deal in an efficient way with the sparsity of the Jacobian matrix, or finding the root of equations. In this article, our objectives are threefold: (1) to demonstrate the potential of using R for dynamic modeling, (2) to highlight typical uses of the different methods implemented and (3) to compare the performance of models specified in R code and in compiled code for a number of test cases. These comparisons demonstrate that, if the use of loops is avoided, R code can efficiently integrate problems comprising several thousands of state variables. Nevertheless, the same problem may be solved from 2 to more than 50 times faster by using compiled code compared to an implementation using only R code. Still, amongst the benefits of R are a more flexible and interactive implementation, better readability of the code, and access to R’s high-level procedures. deSolve is the successor of package odesolve which will be deprecated in
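deSolve's production solvers are adaptive Fortran codes, but the package also includes a fixed-step classical Runge-Kutta method; as a language-neutral illustration of what one such integrator does, here is a fixed-step RK4 sketch in Python (my own minimal version, not deSolve's implementation):

```python
def rk4(f, y, t0, t1, n):
    """Fixed-step classical Runge-Kutta for dy/dt = f(t, y).
    The same scheme as a basic 'rk4' method; production solvers such as
    LSODA add adaptive step control and stiff-system machinery on top."""
    h = (t1 - t0) / n
    t = y0 = None  # placeholder names removed below
    t = t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# dy/dt = -2y, y(0) = 1  ->  y(1) = exp(-2) ≈ 0.1353
print(rk4(lambda t, y: -2.0 * y, 1.0, 0.0, 1.0, 100))
```

With 100 steps the result agrees with exp(-2) to about nine decimal places, which is the accuracy-per-step behaviour that makes RK4 the standard non-stiff workhorse.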
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Numerical stability of descent methods for solving linear equations
Bollen, Jo A.M.
1984-01-01
In this paper we perform a round-off error analysis of descent methods for solving a linear system Ax=b, where A is supposed to be symmetric and positive definite. This leads to a general result on the attainable accuracy of the computed sequence {x_i} when the method is performed in floating point arit
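For a symmetric positive definite system, the simplest descent method of the kind analysed here is steepest descent with an exact line search; a minimal sketch (a generic textbook form, not necessarily Bollen's exact iteration):

```python
def steepest_descent(A, b, x, iters=200):
    """Steepest descent for A x = b with A symmetric positive definite:
    step along the residual r = b - A x with the exact line-search
    length alpha = (r.r) / (r.A r)."""
    n = len(b)
    for _ in range(iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        r = [b[i] - Ax[i] for i in range(n)]          # residual
        Ar = [sum(A[i][j] * r[j] for j in range(n)) for i in range(n)]
        rr = sum(ri * ri for ri in r)
        rAr = sum(r[i] * Ar[i] for i in range(n))
        if rAr == 0.0:                                 # converged exactly
            break
        alpha = rr / rAr
        x = [x[i] + alpha * r[i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
print(steepest_descent(A, b, [0.0, 0.0]))  # exact solution: [1/11, 7/11]
```

In exact arithmetic each step reduces the A-norm of the error by a factor depending on the condition number of A; the paper's concern is how floating-point round-off limits the accuracy this iteration can actually attain.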
Solving Abel integral equations of first kind via fractional calculus
Directory of Open Access Journals (Sweden)
Salman Jahanshahi
2015-04-01
We give a new method for numerically solving Abel integral equations of the first kind. An estimation for the error is obtained. The method is based on approximations of fractional integrals and Caputo derivatives. Using the trapezoidal rule and the Computer Algebra System Maple, the exact and approximate values of three Abel integral equations are found, illustrating the effectiveness of the proposed approach.
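The paper's own scheme (Caputo-derivative approximations in Maple) is not reproduced here; the sketch below illustrates the classical inversion of the Abel equation for the α = 1/2 case, using the substitution t = x − u² to remove the endpoint singularity before applying the trapezoidal rule (the test problem and tolerances are my own):

```python
import math

def rhs_integral(g, x, n=4000):
    """I(x) = ∫_0^x g(t)/sqrt(x-t) dt. The substitution t = x - u^2 turns the
    singular integrand into a smooth one, so the plain trapezoidal rule applies."""
    if x <= 0.0:
        return 0.0
    h = math.sqrt(x) / n
    us = [i * h for i in range(n + 1)]
    ys = [2.0 * g(max(x - u * u, 0.0)) for u in us]   # guard tiny negatives
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

def solve_abel(g, x, dx=1e-4):
    # Classical inversion for alpha = 1/2:
    #   f(x) = (1/pi) d/dx ∫_0^x g(t)/sqrt(x-t) dt
    return (rhs_integral(g, x + dx) - rhs_integral(g, x - dx)) / (2.0 * dx * math.pi)

# Test equation: g(x) = 2*sqrt(x) has the exact solution f(x) ≡ 1.
print(solve_abel(lambda t: 2.0 * math.sqrt(t), 0.5))
```

For this test case the right-hand integral equals πx exactly, so the recovered f is 1 up to quadrature error; more careful error control (the subject of the paper) is needed for general g.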
How Digital Scaffolds in Games Direct Problem-Solving Behaviors
Sun, Chuen-Tsai; Wang, Dai-Yi; Chan, Hui-Ling
2011-01-01
Digital systems offer computational power and instant feedback. Game designers are using these features to create scaffolding tools to reduce player frustration. However, researchers are finding some unexpected effects of scaffolding on strategy development and problem-solving behaviors. We used a digital Sudoku game named "Professor Sudoku" to…
Parallel SAT Solving using Bit-level Operations
Heule, M.J.H.; Van Maaren, H.
2008-01-01
We show how to exploit the 32/64 bit architecture of modern computers to accelerate some of the algorithms used in satisfiability solving by modifying assignments to variables in parallel on a single processor. Techniques such as random sampling demonstrate that while using bit vectors instead of Bo
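The truncated abstract describes packing many candidate assignments into one machine word so that clause evaluation proceeds in parallel; a minimal Python sketch of the idea (Python integers stand in for 32/64-bit registers; the encoding is my own, not the authors'):

```python
def eval_cnf_parallel(clauses, masks, width=64):
    """Evaluate a CNF formula on `width` assignments at once.
    masks[v] holds variable v's value across all assignments: bit i of the
    mask is v's value in assignment i. Clause literals are +v or -v."""
    all_ones = (1 << width) - 1
    sat = all_ones
    for clause in clauses:
        acc = 0
        for lit in clause:
            m = masks[abs(lit)]
            acc |= m if lit > 0 else (~m & all_ones)
        sat &= acc          # assignment i survives only if some literal is true
    return sat              # bit i set  =>  assignment i satisfies the formula

# (x1 OR x2) AND (NOT x1 OR x2), four assignments packed into the low bits:
clauses = [[1, 2], [-1, 2]]
masks = {1: 0b0101, 2: 0b0011}
print(bin(eval_cnf_parallel(clauses, masks, width=4)))  # 0b11
```

One pass over the clauses thus checks 64 assignments on a 64-bit machine, which is the speed-up the paper exploits for techniques such as random sampling.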
Solving the Water Jugs Problem by an Integer Sequence Approach
Man, Yiu-Kwong
2012-01-01
In this article, we present an integer sequence approach to solve the classic water jugs problem. The solution steps can be obtained easily by additions and subtractions only, which is suitable for manual calculation or programming by computer. This approach can be introduced to secondary and undergraduate students, and also to teachers and…
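A hedged sketch of the additive idea (my own rendering, not necessarily the article's exact scheme): repeatedly pouring a full a-litre jug into a b-litre jug, and emptying the latter whenever it fills, makes the amounts in jug b trace the integer sequence (k·a) mod b, so a reachable target appears using additions and subtractions alone:

```python
from math import gcd

def jug_sequence(a, b, target):
    """Amounts left in the b-litre jug when a full a-litre jug is repeatedly
    poured into it and it is emptied whenever it fills: (k*a) mod b."""
    if target in (a, b):
        return [target]                      # trivial: just fill that jug
    if target % gcd(a, b) != 0 or target > max(a, b):
        return None                          # unreachable target
    if target > b:                           # work modulo the larger jug
        a, b = b, a
    seq, x, k = [0], 0, 0
    while x != target:
        k += 1
        x = (k * a) % b
        seq.append(x)
    return seq

print(jug_sequence(3, 5, 4))  # [0, 3, 1, 4]
```

For the classic 3- and 5-litre jugs, the sequence 0, 3, 1, 4 reaches 4 litres in three pours; a target is reachable exactly when it divides evenly by gcd(a, b) and fits in a jug, which the guard checks up front.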
Genetics problem solving and worldview
Dale, Esther
The research goal was to determine whether worldview relates to traditional and real-world genetics problem solving. Traditionally, scientific literacy emphasized content knowledge alone because it was sufficient to solve traditional problems. The contemporary definition of scientific literacy is, "The knowledge and understanding of scientific concepts and processes required for personal decision-making, participation in civic and cultural affairs and economic productivity" (NRC, 1996). An expanded definition of scientific literacy is needed to solve socioscientific issues (SSI), complex social issues with conceptual, procedural, or technological associations with science. Teaching content knowledge alone assumes that students will find the scientific explanation of a phenomenon to be superior to a non-science explanation. Formal science and everyday ways of thinking about science are two different cultures (Palmer, 1999). Students address this rift with cognitive apartheid, the boxing away of science knowledge from other types of knowledge (Jegede & Aikenhead, 1999). By addressing worldview, cognitive apartheid may decrease and scientific literacy may increase. Introductory biology students at the University of Minnesota during fall semester 2005 completed a written questionnaire, including a genetics content-knowledge test, four genetic dilemmas, the Worldview Assessment Instrument (WAI) and some items about demographics and religiosity. Six students responded to the interview protocol. Based on statistical analysis and interview data, this study concluded the following: (1) Worldview, in the form of metaphysics, relates to solving traditional genetic dilemmas. (2) Worldview, in the form of agency, relates to solving traditional genetics problems. (3) Thus, worldview must be addressed in curriculum, instruction, and assessment.
A DNA-based registry for all animal species: the barcode index number (BIN system.
Directory of Open Access Journals (Sweden)
Sujeevan Ratnasingham
Because many animal species are undescribed, and because the identification of known species is often difficult, interim taxonomic nomenclature has often been used in biodiversity analysis. By assigning individuals to presumptive species, called operational taxonomic units (OTUs), these systems speed investigations into the patterning of biodiversity and enable studies that would otherwise be impossible. Although OTUs have conventionally been separated through their morphological divergence, DNA-based delineations are not only feasible, but have important advantages. OTU designation can be automated, data can be readily archived, and results can be easily compared among investigations. This study exploits these attributes to develop a persistent, species-level taxonomic registry for the animal kingdom based on the analysis of patterns of nucleotide variation in the barcode region of the cytochrome c oxidase I (COI) gene. It begins by examining the correspondence between groups of specimens identified to a species through prior taxonomic work and those inferred from the analysis of COI sequence variation using one new (RESL) and four established (ABGD, CROP, GMYC, jMOTU) algorithms. It subsequently describes the implementation and structural attributes of the Barcode Index Number (BIN) system. Aside from a pragmatic role in biodiversity assessments, BINs will aid revisionary taxonomy by flagging possible cases of synonymy, and by collating geographical information, descriptive metadata, and images for specimens that are likely to belong to the same species, even if it is undescribed. More than 274,000 BIN web pages are now available, creating a biodiversity resource that is positioned for rapid growth.
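The RESL algorithm itself is not reproduced here; the toy sketch below shows only the general shape of threshold-based OTU assignment (greedy single linkage on uncorrected p-distance; the sequences and the 10% toy threshold are invented, whereas real COI barcode clustering uses much longer sequences and a threshold around 2%):

```python
def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def assign_otus(seqs, threshold=0.10):
    """Greedy single-linkage binning: a sequence joins the first bin containing
    any member within `threshold`, else it founds a new bin. A toy stand-in
    for RESL/ABGD-style OTU delineation, not the published algorithms."""
    bins = []
    for s in seqs:
        for b in bins:
            if any(p_distance(s, m) <= threshold for m in b):
                b.append(s)
                break
        else:
            bins.append([s])
    return bins

seqs = ["ACGTACGTACGTACGTACGT",   # lineage A
        "ACGTACGTACGTACGTACGA",   # lineage A, one substitution (5%)
        "TTTTACGTACGTACGTACGT"]   # divergent lineage (15% from lineage A)
print([len(b) for b in assign_otus(seqs)])  # [2, 1]
```

The two near-identical sequences fall into one presumptive species and the divergent one founds its own, which is the basic barcode-gap logic the BIN system refines and makes persistent.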
Automated DNA-based plant identification for large-scale biodiversity assessment.
Papadopoulou, Anna; Chesters, Douglas; Coronado, Indiana; De la Cadena, Gissela; Cardoso, Anabela; Reyes, Jazmina C; Maes, Jean-Michel; Rueda, Ricardo M; Gómez-Zurita, Jesús
2015-01-01
The rapid degradation of tropical forests urges us to improve our efficiency in large-scale biodiversity assessment. DNA barcoding can assist greatly in this task, but commonly used phenetic approaches for DNA-based identifications rely on the existence of comprehensive reference databases, which are infeasible for hyperdiverse tropical ecosystems. Alternatively, phylogenetic methods are more robust to sparse taxon sampling but time-consuming, while multiple alignment of species-diagnostic, typically length-variable, markers can be problematic across divergent taxa. We advocate the combination of phylogenetic and phenetic methods for taxonomic assignment of DNA-barcode sequences against incomplete reference databases such as GenBank, and we developed a pipeline to implement this approach on large-scale plant diversity projects. The pipeline workflow includes several steps: database construction and curation, query sequence clustering, sequence retrieval, distance calculation, multiple alignment and phylogenetic inference. We describe the strategies used to establish these steps and the optimization of parameters to fit the selected psbA-trnH marker. We tested the pipeline using infertile plant samples and herbivore diet sequences from the highly threatened Nicaraguan seasonally dry forest and exploiting a valuable purpose-built resource: a partial local reference database of plant psbA-trnH. The selected methodology proved efficient and reliable for high-throughput taxonomic assignment, and our results corroborate the advantage of applying 'strict' tree-based criteria to avoid false positives. The pipeline tools are distributed as the scripts suite 'BAGpipe' (pipeline for Biodiversity Assessment using GenBank data), which can be readily adjusted to the purposes of other projects and applied to sequence-based identification for any marker or taxon.
Directory of Open Access Journals (Sweden)
Ahlinder Jon
2012-09-01
Background: Recent advances in sequencing technologies offer promising tools for generating large numbers of genomes, larger typing databases and improved mapping of environmental bacterial diversity. However, DNA-based methods for the detection of Francisella were developed with limited knowledge about genetic diversity. This, together with the high sequence identity between several Francisella species, means there is a high risk of false identification and detection of the highly virulent pathogen Francisella tularensis. Moreover, phylogenetic reconstructions using single or limited numbers of marker sequences often result in incorrect tree topologies and inferred evolutionary distances. The recent growth in publicly accessible whole-genome sequences now allows evaluation of published genetic markers to determine optimal combinations of markers that minimise both time and laboratory costs. Results: In the present study, we evaluated 38 previously published DNA markers and the corresponding PCR primers against 42 genomes representing the currently known diversity of the genus Francisella. The results highlight that PCR assays for Francisella tularensis are often complicated by low specificity, resulting in a high probability of false positives. A method to select a set of one to seven markers for obtaining optimal phylogenetic resolution or diagnostic accuracy is presented. Conclusions: Current multiple-locus sequence-typing systems and detection assays of Francisella could be improved by redesigning some of the primers and reselecting typing markers. The use of only a few optimally selected sequence-typing markers allows construction of phylogenetic topologies with almost the same accuracy as topologies based on whole-genome sequences.
Directory of Open Access Journals (Sweden)
Marcel Tutor Ale
2016-06-01
This work reveals new, important insights about the influence of broad spatial variations on the phylogenetic relationship and chemical characteristics of Ghanaian Hypnea musciformis, a carrageenan-containing red seaweed. DNA barcoding techniques alleviate the difficulty of accurate morphological identification. COI barcode sequences of the Ghanaian H. musciformis showed <0.7% intraspecies divergence, indicating no distinct phylogenetic variation and suggesting that they actually belong to the same species. Thus, the spatial distribution of the sampling sites along the coast of Ghana did not influence the phylogenetic characteristics of H. musciformis in the region. The data also showed that the Ghanaian Hypnea sp. examined in this work should be regarded as the same species as the H. musciformis collected in Brazilian Sao Paulo (KP725276), with only 0.8%–1.3% intraspecies divergence. However, the comparison of COI sequences of Ghanaian H. musciformis with the available COI sequences of H. musciformis from other countries showed intraspecies divergences of 0%–6.9%, indicating that the COI sequences for H. musciformis in GenBank may include different subspecies. Although the samples did not differ phylogenetically, the chemical characteristics of the H. musciformis differed significantly between different sampling locations in Ghana. The levels of the monosaccharides, notably galactose (20%–30% dw) and glucose (10%–18% dw), as well as the seawater inorganic salt concentration (21–32 mg/L) and ash content (19%–33% dw), varied between H. musciformis collected at different coastal locations in Ghana. The current work demonstrated that DNA-based identification allowed a detailed understanding of H. musciformis phylogenetic characteristics and revealed that chemical compositional differences of H. musciformis occur along the Ghanaian coast which are not coupled with genetic variations among those samples.
Directory of Open Access Journals (Sweden)
Barbara Bryant
In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
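A minimal sketch of the tape-and-rules picture (the rule set is invented; the paper's actual Hamiltonian-path construction is far richer): nucleosome marks form a string, and each "chromatin-modifying complex" is a rule that rewrites a pair of adjacent marks:

```python
def run_rules(tape, rules, max_steps=100):
    """Iteratively apply local read-write rules to adjacent nucleosome pairs.
    Each rule maps a pair of marks to a new pair, mimicking a complex that
    reads one nucleosome and writes its neighbour. Toy model only."""
    tape = list(tape)
    for _ in range(max_steps):
        changed = False
        for i in range(len(tape) - 1):
            pair = (tape[i], tape[i + 1])
            if pair in rules:
                tape[i], tape[i + 1] = rules[pair]
                changed = True
        if not changed:           # stable configuration reached
            break
    return "".join(tape)

# A single spreading rule: a methylated nucleosome (M) writes the same
# mark onto its unmodified (U) right-hand neighbour.
print(run_rules("MUUUUU", {("M", "U"): ("M", "M")}))  # MMMMMM
```

Even this one rule reproduces mark spreading along the fibre; universality in the paper comes from richer mark alphabets and rule sets playing the role of a Turing machine's transition table.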
Learning via problem solving in mathematics education
Directory of Open Access Journals (Sweden)
Piet Human
2009-09-01
Three forms of mathematics education at school level are distinguished: direct expository teaching with an emphasis on procedures, with the expectation that learners will at some later stage make logical and functional sense of what they have learnt and practised (the prevalent form); mathematically rigorous teaching in terms of fundamental mathematical concepts, as in the so-called “modern mathematics” programmes of the sixties; and teaching and learning in the context of engaging with meaningful problems, focused both on learning to become good problem solvers (teaching for problem solving) and on utilising problems as vehicles for the development of mathematical knowledge and proficiency by learners (problem-centred learning), in conjunction with substantial teacher-led social interaction and mathematical discourse in classrooms. Direct expository teaching of mathematical procedures dominated in school systems after World War II, and was augmented by the “modern mathematics” movement in the period 1960-1970. The latter was experienced as a major failure, and was soon abandoned. Persistent poor outcomes of direct expository procedural teaching of mathematics for the majority of learners, as are still being experienced in South Africa, triggered a world-wide movement promoting teaching mathematics for and via problem solving in the seventies and eighties of the previous century. This movement took the form of a variety of curriculum experiments in which problem solving was the dominant classroom activity, mainly in the USA, Netherlands, France and South Africa. While initially focusing on basic arithmetic (computation with whole numbers) and elementary calculus, the problem-solving movement started to address other mathematical topics (for example, elementary statistics, algebra, differential equations) around the turn of the century. The movement also spread rapidly to other countries, including Japan, Singapore and Australia. Parallel with the
DNA-based vaccines activate innate and adaptive antitumor immunity by engaging the NKG2D receptor.
Zhou, He; Luo, Yunping; Lo, Jeng-fan; Kaplan, Charles D; Mizutani, Masato; Mizutani, Noriko; Lee, Jiing-Dwan; Primus, F James; Becker, Jürgen C; Xiang, Rong; Reisfeld, Ralph A
2005-08-02
The interaction of NKG2D, a stimulatory receptor expressed on natural killer (NK) cells and activated CD8(+) T cells, and its ligands mediates stimulatory and costimulatory signals to these cells. Here, we demonstrate that DNA-based vaccines, encoding syngeneic or allogeneic NKG2D ligands together with tumor antigens such as survivin or carcinoembryonic antigen, markedly activate both innate and adaptive antitumor immunity. Such vaccines result in highly effective, NK- and CD8(+) T cell-mediated protection against either breast or colon carcinoma cells in prophylactic and therapeutic settings. Notably, this protection was irrespective of the NKG2D ligand expression level of the tumor cells. Hence, this strategy has the potential to lead to widely applicable and possibly clinically useful DNA-based cancer vaccines.
Solving Einstein's Equations With Dual Coordinate Frames
Kidder, Lawrence E.; Lindblom, Lee; Pfeiffer, Harald P.; Rinne, Oliver; Scheel, Mark A.; Teukolsky, Saul A.
2006-01-01
A method is introduced for solving Einstein's equations using two distinct coordinate systems. The coordinate basis vectors associated with one system are used to project out components of the metric and other fields, in analogy with the way fields are projected onto an orthonormal tetrad basis. These field components are then determined as functions of a second independent coordinate system. The transformation to the second coordinate system can be thought of as a mapping from the original ``inertial'' coordinate system to the computational domain. This dual-coordinate method is used to perform stable numerical evolutions of a black-hole spacetime using the generalized harmonic form of Einstein's equations in coordinates that rotate with respect to the inertial frame at infinity; such evolutions are found to be generically unstable using a single rotating coordinate frame. The dual-coordinate method is also used here to evolve binary black-hole spacetimes for several orbits. The great flexibility of this met...
Solving Partial Differential Equations on Overlapping Grids
Energy Technology Data Exchange (ETDEWEB)
Henshaw, W D
2008-09-22
We discuss the solution of partial differential equations (PDEs) on overlapping grids. This is a powerful technique for efficiently solving problems in complex, possibly moving, geometry. An overlapping grid consists of a set of structured grids that overlap and cover the computational domain. By allowing the grids to overlap, grids for complex geometries can be more easily constructed. The overlapping grid approach can also be used to remove coordinate singularities by, for example, covering a sphere with two or more patches. We describe the application of the overlapping grid approach to a variety of different problems. These include the solution of incompressible fluid flows with moving and deforming geometry, the solution of high-speed compressible reactive flow with rigid bodies using adaptive mesh refinement (AMR), and the solution of the time-domain Maxwell's equations of electromagnetism.
Hahn, Alexandra
2009-01-01
DNA-based methodologies have become an integral part of food and feed analysis and are commonly applied in different fields such as the analysis of genetically modified organisms (GMO) and the detection of allergens. The method development performed in this study was focussed on novel strategies meeting different demands of GMO and allergen analysis. The new ligation-dependent probe amplification (LPA) technique and the established real-time PCR technology were used to develop and to validate...
Methods of solving nonstandard problems
Grigorieva, Ellina
2015-01-01
This book, written by an accomplished female mathematician, is the second to explore nonstandard mathematical problems – those that are not directly solved by standard mathematical methods but instead rely on insight and the synthesis of a variety of mathematical ideas. It promotes mental activity as well as greater mathematical skills, and is an ideal resource for successful preparation for the mathematics Olympiad. Numerous strategies and techniques are presented that can be used to solve intriguing and challenging problems of the type often found in competitions. The author uses a friendly, non-intimidating approach to emphasize connections between different fields of mathematics and often proposes several different ways to attack the same problem. Topics covered include functions and their properties, polynomials, trigonometric and transcendental equations and inequalities, optimization, differential equations, nonlinear systems, and word problems. Over 360 problems are included with hints, ...
Solving higher curvature gravity theories
Energy Technology Data Exchange (ETDEWEB)
Chakraborty, Sumanta [IUCAA, Pune (India); SenGupta, Soumitra [Indian Association for the Cultivation of Science, Theoretical Physics Department, Kolkata (India)
2016-10-15
Solving field equations in the context of higher curvature gravity theories is a formidable task. However, in many situations, e.g., in the context of f(R) theories, the higher curvature gravity action can be written as an Einstein-Hilbert action plus a scalar field action. We show that not only the action but the field equations derived from the action are also equivalent, provided the spacetime is regular. We also demonstrate that such an equivalence continues to hold even when the gravitational field equations are projected on a lower-dimensional hypersurface. We have further addressed explicit examples in which the solutions for Einstein-Hilbert and a scalar field system lead to solutions of the equivalent higher curvature theory. The same, but on the lower-dimensional hypersurface, has been illustrated in the reverse order as well. We conclude with a brief discussion on this technique of solving higher curvature field equations. (orig.)
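The scalar-tensor rewriting invoked in the abstract is standard in the f(R) literature; in sketch form (notation mine, not the paper's), one introduces an auxiliary field χ:

```latex
S=\frac{1}{2\kappa^{2}}\int d^{4}x\,\sqrt{-g}\,f(R)
\;\longleftrightarrow\;
S=\frac{1}{2\kappa^{2}}\int d^{4}x\,\sqrt{-g}\,\bigl[f(\chi)+f'(\chi)\,(R-\chi)\bigr].
% Varying with respect to \chi gives f''(\chi)(R-\chi)=0, so \chi=R wherever f''\neq 0,
% recovering the original action. Defining \phi \equiv f'(\chi) and the potential
% V(\phi) \equiv \chi(\phi)\,\phi - f(\chi(\phi)) recasts the theory as
S=\frac{1}{2\kappa^{2}}\int d^{4}x\,\sqrt{-g}\,\bigl[\phi R-V(\phi)\bigr],
```

i.e., Einstein-Hilbert-like gravity nonminimally coupled to a scalar field, which is the equivalence the authors extend to the field equations and to lower-dimensional hypersurfaces.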
Solving Limited Memory Influence Diagrams
Mauá, Denis Deratani; Zaffalon, Marco
2011-01-01
We present a new algorithm for exactly solving decision making problems represented as influence diagrams. We do not require the usual assumptions of no forgetting and regularity; this allows us to solve problems with simultaneous decisions and limited information. The algorithm is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems of up to 150 variables and $10^{64}$ solutions. We show that the problem is NP-hard even if the underlying graph structure of the problem has small treewidth and the variables take on a bounded number of states, but that a fully polynomial time approximation scheme exists for these cases. Moreover, we show that the bound on the number of states is a necessary condition for any efficient approximation scheme.
Reasoning, Problem Solving, and Intelligence.
1980-04-01
…first half of the analogy or because of vocabulary demand in the second half of the analogy. For example, FELINE is to CANINE as CAT is to ? … the ecological validity of the task as a representative case of real-world problem solving. Investigation of the task’s ecological validity … analogies measure intellectual functioning of such a basic kind that ecological validity is less important. But if one’s goal is to study the ability to
Journey toward Teaching Mathematics through Problem Solving
Sakshaug, Lynae E.; Wohlhuter, Kay A.
2010-01-01
Teaching mathematics through problem solving is a challenge for teachers who learned mathematics by doing exercises. How do teachers develop their own problem solving abilities as well as their abilities to teach mathematics through problem solving? A group of teachers began the journey of learning to teach through problem solving while taking a…
Anticipating Student Responses to Improve Problem Solving
Wallace, Ann H.
2007-01-01
This article illustrates how problem solving can be enhanced through careful planning and problem presentation. Often, students shut down or are turned off when presented with a problem to solve. The author describes how to motivate students to embrace a problem to be solved and provides helpful prompts to further the problem-solving process.…
Assessing Algebraic Solving Ability: A Theoretical Framework
Lian, Lim Hooi; Yew, Wun Thiam
2012-01-01
Algebraic solving ability has been discussed by many educators and researchers. There is no definitive definition of algebraic solving ability, as it can be viewed from different perspectives. In this paper, the nature of algebraic solving ability, in terms of the algebraic processes that demonstrate the ability to solve algebraic problems, is…
Watanabe, Kohei; Koga, Hajime; Nakamura, Kodai; Fujita, Akiko; Hattori, Akimasa; Matsuda, Masaru; Koga, Akihiko
2014-04-01
DNA-based transposable elements are ubiquitous constituents of eukaryotic genomes. Vertebrates are, however, exceptional in that most of their DNA-based elements appear to be inactivated. The Tol1 element of the medaka fish, Oryzias latipes, is one of the few elements for which copies containing an undamaged gene have been found. Spontaneous transposition of this element in somatic cells has previously been demonstrated, but there is only indirect evidence for its germline transposition. Here, we show direct evidence of spontaneous excision in the germline. Tyrosinase is the key enzyme in melanin biosynthesis. In an albino laboratory strain of medaka fish, which is homozygous for a mutant tyrosinase gene in which a Tol1 copy is inserted, we identified de novo reversion mutations related to melanin pigmentation. The gamete-based reversion rate was as high as 0.4%. The revertant fish carried the tyrosinase gene from which the Tol1 copy had been excised. We previously reported the germline transposition of Tol2, another DNA-based element that is thought to be a recent invader of the medaka fish genome. Tol1 is an ancient resident of the genome. Our results indicate that even an old element can contribute to genetic variation in the host genome as a natural mutator.
Cho, Youngsuk; Lee, Junyeong; Lim, June Yeong; Yu, Sanghyuck; Yi, Yeonjin; Im, Seongil
2017-02-01
DNA-based small molecules of guanine, cytosine, thymine and adenine are adopted for the charge injection layer between the Au electrodes and organic semiconductor, heptazole (C₂₆H₁₆N₂). The heptazole-channel organic field effect transistors (OFETs) with a DNA-based small molecule charge injection layer showed higher hole mobility (maximum 0.12 cm² V⁻¹ s⁻¹) than that of a pristine device (0.09 cm² V⁻¹ s⁻¹). We characterized the contact resistance of each device by a transfer length method (TLM) and found that the guanine layer among all DNA-based materials performs best as a hole injection layer leading to the lowest contact resistance. Since the guanine layer is also known to be a proper channel passivation layer coupled with a thin conformal Al2O3 layer protecting the channel from bias stress and ambient molecules, we could realize ultra-stable OFETs utilizing guanine/Au contact and guanine/Al2O3 bilayer on the organic channel.
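The transfer length method mentioned above amounts to a linear fit: total device resistance is measured against channel length, and the zero-length intercept gives twice the contact resistance. A sketch with synthetic numbers (not the paper's data):

```python
# Hypothetical TLM extraction: R_total(L) = 2*Rc + r*L, fit to a line.
# Synthetic data generated with Rc = 50 ohm and r = 10 ohm/um.
lengths = [5.0, 10.0, 20.0, 40.0]                 # channel lengths (um)
R_total = [2 * 50.0 + 10.0 * L for L in lengths]  # total resistance (ohm)

# Ordinary least-squares line fit in pure Python.
n = len(lengths)
mx = sum(lengths) / n
my = sum(R_total) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(lengths, R_total))
         / sum((x - mx) ** 2 for x in lengths))
intercept = my - slope * mx
Rc = intercept / 2.0   # contact resistance recovered from the L = 0 intercept
```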
Mathematical problem solving by analogy.
Novick, L R; Holyoak, K J
1991-05-01
We report the results of 2 experiments and a verbal protocol study examining the component processes of solving mathematical word problems by analogy. College students first studied a problem and its solution, which provided a potential source for analogical transfer. Then they attempted to solve several analogous problems. For some problems, subjects received one of a variety of hints designed to reduce or eliminate the difficulty of some of the major processes hypothesized to be involved in analogical transfer. Our studies yielded 4 major findings. First, the process of mapping the features of the source and target problems and the process of adapting the source solution procedure for use in solving the target problem were clearly distinguished: (a) Successful mapping was found to be insufficient for successful transfer and (b) adaptation was found to be a major source of transfer difficulty. Second, we obtained direct evidence that schema induction is a natural consequence of analogical transfer. The schema was found to co-exist with the problems from which it was induced, and both the schema and the individual problems facilitated later transfer. Third, for our multiple-solution problems, the relation between analogical transfer and solution accuracy was mediated by the degree of time pressure exerted for the test problems. Finally, mathematical expertise was a significant predictor of analogical transfer, but general analogical reasoning ability was not. The implications of the results for models of analogical transfer and for instruction were considered.
Solving the Curriculum Sequencing Problem with DNA Computing Approach
Debbah, Amina; Ben Ali, Yamina Mohamed
2014-01-01
In the e-learning systems, a learning path is known as a sequence of learning materials linked to each others to help learners achieving their learning goals. As it is impossible to have the same learning path that suits different learners, the Curriculum Sequencing problem (CS) consists of the generation of a personalized learning path for each…
Solving project scheduling problems by minimum cut computations
Möhring, R.H.; Schulz, A.S.; Stork, F.; Uetz, M.J.
2003-01-01
In project scheduling, a set of precedence-constrained jobs has to be scheduled so as to minimize a given objective. In resource-constrained project scheduling, the jobs additionally compete for scarce resources. Due to its universality, the latter problem has a variety of applications in manufacturing…
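A minimal sketch of the underlying primitive: a maximum-flow computation, whose value equals the minimum cut capacity by the max-flow/min-cut theorem. The network and capacities below are made up for illustration; this is plain Edmonds-Karp, not the authors' scheduling reduction.

```python
from collections import defaultdict, deque

def max_flow(cap, s, t):
    # Edmonds-Karp: repeatedly augment along shortest residual paths.
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:       # BFS in the residual graph
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:                # no augmenting path: flow is maximal
            return flow                    # and equals the minimum cut capacity
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)    # bottleneck capacity
        for u, v in path:
            cap[u][v] -= b
            cap[v][u] = cap[v].get(u, 0) + b   # residual (reverse) edge
        flow += b

# Hypothetical tiny network.
cap = defaultdict(dict)
for u, v, c in [("s", "a", 3), ("s", "b", 2),
                ("a", "t", 2), ("b", "t", 3), ("a", "b", 1)]:
    cap[u][v] = c
mf = max_flow(cap, "s", "t")
```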
Human and machine diagnosis of scientific problem-solving abilities
Good, Ron; Kromhout, Robert; Bandler, Wyllis
Diagnosis of the problem-solving state of a novice student in science, by an accomplished teacher, is studied in order to build a computer system that will simulate the process. Although such expert systems have been successfully developed in medicine (MYCIN, INTERNIST/CADUCEUS), very little has been accomplished in science education, even though there is a reasonably close parallel between expert medical diagnosis of patients with physiological problems and expert instructional diagnosis of students with learning problems. The system described in this paper, DIPS: Diagnosis for Instruction in Problem Solving, involves a new line of research for science educators interested in interdisciplinary efforts and ways in which computer technology might be used to better understand how to improve science learning. The basic architecture of the DIPS system is outlined and explained in terms of instruction and research implications, and the role of such intelligent computer systems in science education of the future is considered.
Using graph theory for automated electric circuit solving
Toscano, L.; Stella, S.; Milotti, E.
2015-05-01
Graph theory plays many important roles in modern physics and in many different contexts, spanning diverse topics such as the description of scale-free networks and the structure of the universe as a complex directed graph in causal set theory. Graph theory is also ideally suited to describe many concepts in computer science. Therefore it is increasingly important for physics students to master the basic concepts of graph theory. Here we describe a student project where we develop a computational approach to electric circuit solving which is based on graph theoretic concepts. This highly multidisciplinary approach combines abstract mathematics, linear algebra, the physics of circuits, and computer programming to reach the ambitious goal of implementing automated circuit solving.
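A small example of the linear-algebra step such a circuit solver ultimately reaches: nodal analysis writes Kirchhoff's current law at each unknown node and solves the resulting linear system. The component values are invented for illustration.

```python
def gauss_solve(A, b):
    # Gaussian elimination with partial pivoting (works on copies).
    n = len(A)
    A = [row[:] for row in A]
    b = b[:]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))  # pivot row
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, n))) / A[i][i]
    return x

# Nodal analysis of a 10 V source driving R1=100, R2=200, R3=300 ohm in series;
# unknowns: node voltages v1 (between R1, R2) and v2 (between R2, R3).
G = [[1/100 + 1/200, -1/200],
     [-1/200, 1/200 + 1/300]]   # conductance matrix (Kirchhoff's current law)
I = [10/100, 0.0]               # current injected via R1 from the source
v1, v2 = gauss_solve(G, I)
```

For this series chain the answer can be checked by hand: the loop current is 10/600 A, so v1 = 25/3 V and v2 = 5 V.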
A new DNA algorithm to solve graph coloring problem
Institute of Scientific and Technical Information of China (English)
Jiang Xingpeng; Li Yin; Meng Ya; Meng Dazhi
2007-01-01
Using a small quantity of DNA molecules and little experimental time to solve complex problems successfully is a goal of DNA computing. Some NP-hard problems have been solved by DNA computing with lower time complexity than conventional computing. However, this advantage often brings higher space complexity and needs a large number of DNA encoding molecules. One example is the graph coloring problem. Current DNA algorithms need exponentially increasing DNA encoding strands with the growing of problem size. Here we propose a new DNA algorithm for the graph coloring problem based on the proof of the four-color theorem. This algorithm has good properties of needing a relatively small number of operations in polynomial time and needing a small number of DNA encoding molecules (we need only 6R DNA encoding molecules if the number of regions in a graph is R).
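For contrast with the molecular approach, the same problem in its conventional backtracking formulation (a sketch only; the DNA algorithm of the abstract works very differently):

```python
def color_graph(adj, k):
    # Backtracking search for a proper k-coloring; returns None if none exists.
    nodes = sorted(adj)
    colors = {}
    def backtrack(i):
        if i == len(nodes):
            return True
        u = nodes[i]
        for c in range(k):
            # assign color c only if no already-colored neighbor uses it
            if all(colors.get(v) != c for v in adj[u]):
                colors[u] = c
                if backtrack(i + 1):
                    return True
                del colors[u]
        return False
    return dict(colors) if backtrack(0) else None

# K4, the complete graph on 4 vertices, is planar and needs exactly 4 colors.
K4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
coloring = color_graph(K4, 4)
```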
On the variational data assimilation problem solving and sensitivity analysis
Arcucci, Rossella; D'Amore, Luisa; Pistoia, Jenny; Toumi, Ralf; Murli, Almerico
2017-04-01
We consider the Variational Data Assimilation (VarDA) problem in an operational framework, namely, as it arises in the analysis of temperature and salinity variations from data collected in closed and semi-closed seas. We present a computing approach to solve the main computational kernel at the heart of the VarDA problem, which outperforms the technique currently employed by operational oceanographic software. The new approach is obtained by means of Tikhonov regularization. We provide the sensitivity analysis of this approach and also study its performance in terms of the accuracy gain on the computed solution. We provide validations on two realistic oceanographic data sets.
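A minimal sketch of Tikhonov regularization in the spirit used here: the regularized normal equations (AᵀA + λI)x = Aᵀb damp ill-conditioned components of a least-squares solution. The matrix and data below are a made-up toy example, not the VarDA operator.

```python
def tikhonov_2x2(A, b, lam):
    # Solve (A^T A + lam*I) x = A^T b by Cramer's rule (2 unknowns).
    m = len(A)
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    Atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(2)]
    det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
    x0 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
    x1 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
    return [x0, x1]

A = [[1.0, 0.0], [0.0, 0.01]]        # toy ill-conditioned operator
b = [1.0, 0.02]                       # toy "observations"
x_plain = tikhonov_2x2(A, b, 0.0)     # unregularized solution: [1, 2]
x_reg = tikhonov_2x2(A, b, 1e-4)      # lambda damps the ill-conditioned component
```

The small singular direction of A is the one the regularization pulls back toward zero, which is the mechanism behind the accuracy gain discussed in the abstract.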