WorldWideScience

Sample records for computing groebner bases

  1. Computation of Groebner bases for two-loop propagator type integrals

    International Nuclear Information System (INIS)

    Tarasov, O.V.

    2004-01-01

The Groebner basis technique for calculating Feynman diagrams proposed in (Acta Phys. Pol. B 29(1998) 2655) is applied to the two-loop propagator type integrals with arbitrary masses and momentum. We describe the derivation of Groebner bases for all integrals with 1PI topologies and present explicit content of the Groebner bases.

  2. Computation of Groebner bases for two-loop propagator type integrals

    Energy Technology Data Exchange (ETDEWEB)

    Tarasov, O.V. [DESY Zeuthen, Theory Group, Deutsches Elektronen Synchrotron, DESY, Platanenallee 6, D-15738 Zeuthen (Germany)]. E-mail: tarasov@ifh.de

    2004-11-21

    The Groebner basis technique for calculating Feynman diagrams proposed in (Acta Phys. Pol. B 29(1998) 2655) is applied to the two-loop propagator type integrals with arbitrary masses and momentum. We describe the derivation of Groebner bases for all integrals with 1PI topologies and present explicit content of the Groebner bases.

  3. On computation of Groebner bases for linear difference systems

    Energy Technology Data Exchange (ETDEWEB)

    Gerdt, Vladimir P. [Laboratory of Information Technologies, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation)]. E-mail: gerdt@jinr.ru

    2006-04-01

    In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions.

  4. On computation of Groebner bases for linear difference systems

    International Nuclear Information System (INIS)

    Gerdt, Vladimir P.

    2006-01-01

In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions.

  5. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2014-01-01

This work presents an application of polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. This method is based on Groebner bases theory and sequential properties checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using Groebner bases for concurrent SVA checking. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.

  6. Groebner bases for finite-temperature quantum computing and their complexity

    International Nuclear Information System (INIS)

    Crompton, P. R.

    2011-01-01

    Following the recent approach of using order domains to construct Groebner bases from general projective varieties, we examine the parity and time-reversal arguments relating to the Wightman axioms of quantum field theory and propose that the definition of associativity in these axioms should be introduced a posteriori to the cluster property in order to generalize the anyon conjecture for quantum computing to indefinite metrics. We then show that this modification, which we define via ideal quotients, does not admit a faithful representation of the Braid group, because the generalized twisted inner automorphisms that we use to reintroduce associativity are only parity invariant for the prime spectra of the exterior algebra. We then use a coordinate prescription for the quantum deformations of toric varieties to show how a faithful representation of the Braid group can be reconstructed and argue that for a degree reverse lexicographic (monomial) ordered Groebner basis, the complexity class of this problem is bounded quantum polynomial.

  7. A Maple package for computing Groebner bases for linear recurrence relations

    International Nuclear Information System (INIS)

    Gerdt, Vladimir P.; Robertz, Daniel

    2006-01-01

A Maple package for computing Groebner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.

  8. A Maple package for computing Groebner bases for linear recurrence relations

    Energy Technology Data Exchange (ETDEWEB)

    Gerdt, Vladimir P. [Laboratory of Information Technologies, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation)]. E-mail: gerdt@jinr.ru; Robertz, Daniel [Lehrstuhl B fuer Mathematik, RWTH Aachen, Templergraben 64, D-52062 Aachen (Germany)]. E-mail: daniel@momo.math.rwth-aachen.de

    2006-04-01

    A Maple package for computing Groebner bases of linear difference ideals is described. The underlying algorithm is based on Janet and Janet-like monomial divisions associated with finite difference operators. The package can be used, for example, for automatic generation of difference schemes for linear partial differential equations and for reduction of multiloop Feynman integrals. These two possible applications are illustrated by simple examples of the Laplace equation and a one-loop scalar integral of propagator type.

  9. Groebner bases in perturbative calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gerdt, Vladimir P. [Laboratory of Information Technologies, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation)

    2004-10-01

    In this paper we outline the most general and universal algorithmic approach to reduction of loop integrals to basic integrals. The approach is based on computation of Groebner bases for recurrence relations derived from the integration by parts method. In doing so we consider generic recurrence relations when propagators have arbitrary integer powers treated as symbolic variables (indices) for the relations.

  10. Groebner bases in perturbative calculations

    International Nuclear Information System (INIS)

    Gerdt, Vladimir P.

    2004-01-01

In this paper we outline the most general and universal algorithmic approach to reduction of loop integrals to basic integrals. The approach is based on computation of Groebner bases for recurrence relations derived from the integration by parts method. In doing so we consider generic recurrence relations when propagators have arbitrary integer powers treated as symbolic variables (indices) for the relations.

  11. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

This work presents an efficient solution using a computer algebra system to perform linear temporal property verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations using Groebner bases. Computational experience in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to the existing verification methods based on simulation.
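As an illustration of the kind of check such methods perform, here is a hedged SymPy sketch (my choice of tool, not the paper's): a half-adder is modeled by polynomials over F2, and an assertion is verified as ideal membership against a Groebner basis.

```python
import sympy as sp

x, y, s, c = sp.symbols('x y s c')

# Polynomial model of a half-adder over F2 (a hypothetical circuit chosen
# only for illustration; the paper targets SystemVerilog-level designs):
#   s = x XOR y,  c = x AND y.
circuit = [s + x + y, c + x*y]
field = [v**2 + v for v in (x, y, s, c)]   # Boolean field equations

G = sp.groebner(circuit + field, s, c, x, y, order='lex', modulus=2)

# The assertion "s and c are never simultaneously 1" holds iff the
# polynomial s*c lies in the circuit ideal.
print(G.contains(s*c))
```

Here `G.contains(s*c)` returning True certifies that the sum and carry bits are never simultaneously 1 on any input, while a false assertion such as "s is always 1" corresponds to a polynomial (`s + 1`) that does not lie in the ideal.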

  12. Truncated Groebner fans and lattice ideals

    OpenAIRE

    Lauritzen, Niels

    2005-01-01

    We outline a generalization of the Groebner fan of a homogeneous ideal with maximal cells parametrizing truncated Groebner bases. This "truncated" Groebner fan is usually much smaller than the full Groebner fan and offers the natural framework for conversion between truncated Groebner bases. The generic Groebner walk generalizes naturally to this setting by using the Buchberger algorithm with truncation on facets. We specialize to the setting of lattice ideals. Here facets along the generic w...

  13. M4GB : Efficient Groebner Basis algorithm

    NARCIS (Netherlands)

    R.H. Makarim (Rusydi); M.M.J. Stevens (Marc)

    2017-01-01

We introduce a new efficient algorithm for computing Groebner bases named M4GB. Like Faugere's algorithm F4, it is an extension of Buchberger's algorithm: it stores already computed (tail-)reduced multiples of basis polynomials to prevent redundant work in the reduction.
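Since M4GB, like F4, builds on Buchberger's algorithm, a minimal sketch of the classical Buchberger loop may help orient the reader. This is plain Python over the rationals with the lex order, and it shows only the textbook algorithm, not the tail-reduction bookkeeping that M4GB adds.

```python
from fractions import Fraction
from itertools import combinations

# A polynomial is a dict {exponent_tuple: Fraction}; lex order = tuple order.

def lt(p):
    """Leading (monomial, coefficient) under lex order."""
    m = max(p)
    return m, p[m]

def add(p, q):
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, Fraction(0)) + c
        if r[m] == 0:
            del r[m]
    return r

def mul_term(p, m, c):
    """Multiply p by the term c * x^m."""
    return {tuple(a + b for a, b in zip(mm, m)): cc * c for mm, cc in p.items()}

def s_poly(f, g):
    mf, cf = lt(f); mg, cg = lt(g)
    l = tuple(max(a, b) for a, b in zip(mf, mg))  # lcm of leading monomials
    return add(mul_term(f, tuple(a - b for a, b in zip(l, mf)), Fraction(1) / cf),
               mul_term(g, tuple(a - b for a, b in zip(l, mg)), Fraction(-1) / cg))

def reduce_poly(p, G):
    """Remainder of p on division by the set G."""
    r = {}
    p = dict(p)
    while p:
        m, c = lt(p)
        for g in G:
            mg, cg = lt(g)
            if all(a >= b for a, b in zip(m, mg)):  # lt(g) divides lt(p)
                p = add(p, mul_term(g, tuple(a - b for a, b in zip(m, mg)), -c / cg))
                break
        else:
            r[m] = c
            del p[m]
    return r

def buchberger(F):
    """Groebner basis by repeatedly reducing S-polynomials."""
    G = [f for f in F if f]
    pairs = list(combinations(range(len(G)), 2))
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(s_poly(G[i], G[j]), G)
        if r:
            pairs += [(k, len(G)) for k in range(len(G))]
            G.append(r)
    return G

# Example: the ideal (x^2 - 1, x*y - 1) in Q[x, y], lex order x > y.
f1 = {(2, 0): Fraction(1), (0, 0): Fraction(-1)}   # x^2 - 1
f2 = {(1, 1): Fraction(1), (0, 0): Fraction(-1)}   # x*y - 1
G = buchberger([f1, f2])
# Membership test: x^2 - y^2 reduces to zero modulo the basis.
print(reduce_poly({(2, 0): Fraction(1), (0, 2): Fraction(-1)}, G))
```

The key cost in this loop is the repeated reduction of S-polynomial multiples, which is exactly the step that F4-family algorithms such as M4GB reorganize.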

  14. Applying Groebner bases to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.; Smirnov, Vladimir A.

    2006-01-01

We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  15. Applying Groebner bases to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexander V. [Mechanical and Mathematical Department and Scientific Research Computer Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, Vladimir A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-01-15

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  16. A non-regular Groebner fan

    OpenAIRE

    Jensen, Anders N.

    2005-01-01

The Groebner fan of an ideal $I\subset k[x_1,...,x_n]$, defined by Mora and Robbiano, is a complex of polyhedral cones in $R^n$. The maximal cones of the fan are in bijection with the distinct monomial initial ideals of $I$ as the term order varies. If $I$ is homogeneous the Groebner fan is complete and is the normal fan of the state polytope of $I$. In general the Groebner fan is not complete and therefore not the normal fan of a polytope. We may ask if the restricted Groebner fan, a subdivi...

  17. Groebner Finite Path Algebras

    OpenAIRE

    Leamer, Micah J.

    2004-01-01

Let K be a field and Q a finite directed multi-graph. In this paper I classify all path algebras KQ and admissible orders with the property that all of their finitely generated ideals have finite Groebner bases.

  18. An algorithm to construct Groebner bases for solving integration by parts relations

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.

    2006-01-01

This paper is a detailed description of an algorithm based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. The algorithm is used to calculate Feynman integrals and has proved to be efficient in several complicated cases.

  19. Parallel Algorithms for Groebner-Basis Reduction

    Science.gov (United States)

    1987-09-25

Productivity Engineering in the UNIX Environment. Parallel Algorithms for Groebner-Basis Reduction, Technical Report.

  20. Computing multiple periodic solutions of nonlinear vibration problems using the harmonic balance method and Groebner bases

    Science.gov (United States)

    Grolet, Aurelien; Thouverez, Fabrice

    2015-02-01

This paper is devoted to the study of vibration of mechanical systems with geometric nonlinearities. The harmonic balance method is used to derive systems of polynomial equations whose solutions give the frequency components of the possible steady states. Groebner basis methods are used for computing all solutions of the polynomial systems. This approach makes it possible to reduce the complete system to a unique polynomial equation in one variable driving all solutions of the problem. In addition, in order to decrease the number of variables, we propose to first work on the undamped system, and to recover solutions of the damped system using a continuation on the damping parameter. The search for multiple solutions is illustrated on a simple system, where the influence of the number of retained harmonics is studied. Finally, the procedure is applied to a simple cyclic system and we give a representation of the multiple states versus frequency.
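The reduction of a polynomial steady-state system to a single driving polynomial in one variable can be sketched with a lex Groebner basis. The toy system below is a stand-in for harmonic-balance equations, and SymPy is my illustrative tool here, not the authors'.

```python
import sympy as sp

x, y = sp.symbols('x y')

# Toy polynomial "steady-state" system (hypothetical, not from the paper):
eqs = [x**2 + y**2 - 5, x*y - 2]

# A lex Groebner basis triangularizes the system; its last element is a
# polynomial in y alone that drives all solutions.
G = sp.groebner(eqs, x, y, order='lex')
univ = G.exprs[-1]
roots = sp.solve(univ, y)

# Back-substitute each root into the first (x-linear) basis element.
solutions = [(sp.solve(G.exprs[0].subs(y, r), x)[0], r) for r in roots]
print(univ)
print(solutions)
```

The single univariate polynomial plays the role of the "driving" equation described in the abstract: every steady state is obtained from one of its roots by back-substitution.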

  1. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    Science.gov (United States)

    2013-03-01

An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis. Thesis (AFIT-ENP-13-M-02) by Anum Barki, BS. Approved: Dr. Ronald F. Tuttle (Chairman), Dr. Kimberly Kendricks.

  2. On computing Gröbner bases in rings of differential operators

    Science.gov (United States)

    Ma, Xiaodong; Sun, Yao; Wang, Dingkang

    2011-05-01

Insa and Pauer presented a basic theory of Groebner bases for differential operators with coefficients in a commutative ring in 1998, and a criterion was proposed to determine whether a set of differential operators is a Groebner basis. In this paper, we give a new criterion such that Insa and Pauer's criterion is included as a special case, and one can compute Groebner bases more efficiently using this new criterion.

  3. Groebner Basis Solutions to Satellite Trajectory Control by Pole Placement

    Science.gov (United States)

    Kukelova, Z.; Krsek, P.; Smutny, V.; Pajdla, T.

    2013-09-01

Satellites play an important role, e.g., in telecommunication, navigation and weather monitoring. Controlling their trajectories is an important problem. In [1], an approach to the pole placement for the synthesis of a linear controller has been presented. It leads to solving five polynomial equations in nine unknown elements of the state space matrices of a compensator. This is an underconstrained system, and therefore four of the unknown elements need to be considered as free parameters and set to some prior values to obtain a system of five equations in five unknowns. In [1], this system was solved for one chosen set of free parameters with the help of Dixon resultants. In this work, we study and present Groebner basis solutions to this problem of computing a dynamic compensator for the satellite for different combinations of input free parameters. We show that the Groebner basis method for solving systems of polynomial equations leads to very simple solutions for all combinations of free parameters. These solutions require only Gauss-Jordan elimination of a small matrix and the computation of the roots of a single univariate polynomial. The maximum degree of this polynomial is not greater than six in general, but for most combinations of the input free parameters its degree is even lower. [1] B. Palancz. Application of Dixon resultant to satellite trajectory control by pole placement. Journal of Symbolic Computation, Volume 50, March 2013, Pages 79-99, Elsevier.

  4. F5C: A variant of Faugère’s F5 algorithm with reduced Gröbner bases

    OpenAIRE

    Eder, Christian; Perry, John

    2010-01-01

Faugere's F5 algorithm computes a Groebner basis incrementally, by computing a sequence of (non-reduced) Groebner bases. We describe a variant of F5, called F5C, that replaces each intermediate Groebner basis with its reduced Groebner basis. As a result, F5C considers fewer polynomials and performs substantially fewer polynomial reductions, so that it terminates more quickly. We also provide a generalization of Faugere's characterization theorem for Groebner bases.

  5. Groebner basis, resultants and the generalized Mandelbrot set

    Energy Technology Data Exchange (ETDEWEB)

    Geum, Young Hee [Centre of Research for Computational Sciences and Informatics in Biology, Bioindustry, Environment, Agriculture and Healthcare, University of Malaya, 50603 Kuala Lumpur (Malaysia)], E-mail: conpana@empal.com; Hare, Kevin G. [Department of Pure Mathematics, University of Waterloo, Waterloo, Ont., N2L 3G1 (Canada)], E-mail: kghare@math.uwaterloo.ca

    2009-10-30

    This paper demonstrates how the Groebner basis algorithm can be used for finding the bifurcation points in the generalized Mandelbrot set. It also shows how resultants can be used to find components of the generalized Mandelbrot set.

  6. Groebner basis, resultants and the generalized Mandelbrot set

    International Nuclear Information System (INIS)

    Geum, Young Hee; Hare, Kevin G.

    2009-01-01

    This paper demonstrates how the Groebner basis algorithm can be used for finding the bifurcation points in the generalized Mandelbrot set. It also shows how resultants can be used to find components of the generalized Mandelbrot set.

  7. Discrimination of Neutral Postures in Computer Based Work

    Science.gov (United States)

    2013-03-01

upper extremity during a gait cycle using Groebner bases to solve the inverse kinematics problem. The inverse kinematics problem states that if the...position and orientation of the end point is known, then through back substitution and the Groebner basis, the angles of the joints can be found. In Kendrick's...too complicated to evaluate directly; but, by using the Groebner basis through the software Magma, simpler equations are produced which can then be

  8. Classical versus Computer Algebra Methods in Elementary Geometry

    Science.gov (United States)

    Pech, Pavel

    2005-01-01

Computer algebra methods based on results of commutative algebra, like Groebner bases of ideals and elimination of variables, make it possible to solve complex, elementary and non-elementary problems of geometry which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…

  9. Closed form solution for a double quantum well using Groebner basis

    Energy Technology Data Exchange (ETDEWEB)

    Acus, A [Institute of Theoretical Physics and Astronomy, Vilnius University, A Gostauto 12, LT-01108 Vilnius (Lithuania); Dargys, A, E-mail: dargys@pfi.lt [Center for Physical Sciences and Technology, Semiconductor Physics Institute, A Gostauto 11, LT-01108 Vilnius (Lithuania)

    2011-07-01

    Analytical expressions for the spectrum, eigenfunctions and dipole matrix elements of a square double quantum well (DQW) are presented for a general case when the potential in different regions of the DQW has different heights and the effective masses are different. This was achieved by using a Groebner basis algorithm that allowed us to disentangle the resulting coupled polynomials without explicitly solving the transcendental eigenvalue equation.

  10. Determining the global minimum of Higgs potentials via Groebner bases - applied to the NMSSM

    International Nuclear Information System (INIS)

    Maniatis, M.; Manteuffel, A. von; Nachtmann, O.

    2007-01-01

    Determining the global minimum of Higgs potentials with several Higgs fields like the next-to-minimal supersymmetric extension of the standard model (NMSSM) is a non-trivial task already at the tree level. The global minimum of a Higgs potential can be found from the set of all its stationary points defined by a multivariate polynomial system of equations. We introduce here the algebraic Groebner basis approach to solve this system of equations. We apply the method to the NMSSM with CP-conserving as well as CP-violating parameters. The results reveal an interesting stationary-point structure of the potential. Requiring the global minimum to give the electroweak symmetry breaking observed in Nature excludes large parts of the parameter space. (orig.)
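The stationary-point strategy can be sketched on a toy two-field potential (an assumed stand-in; the actual NMSSM potential has many more fields and parameters): compute the gradient ideal, find all stationary points, and compare the potential values to identify the global minimum.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

# Toy two-field potential (a hypothetical stand-in for a multi-field Higgs
# potential; chosen so all stationary points are exactly computable):
V = x**4 + y**4 - 2*x**2 - y**2

# Stationary points are the common zeros of the gradient ideal; a lex
# Groebner basis puts the system into triangular (solvable) form.
grad = [sp.diff(V, v) for v in (x, y)]
G = sp.groebner(grad, x, y, order='lex')
print(G.exprs)

points = sp.solve(grad, [x, y], dict=True)
values = [(V.subs(p), p) for p in points]
vmin = min(v for v, _ in values)
minima = [p for v, p in values if v == vmin]
print(len(points), vmin, len(minima))
```

For this toy potential there are nine stationary points, and the global minimum value -5/4 is attained at four degenerate minima; the paper's method applies the same logic to the NMSSM's much larger stationary system.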

  11. Determining the global minimum of Higgs potentials via Groebner bases - applied to the NMSSM

    Energy Technology Data Exchange (ETDEWEB)

    Maniatis, M.; Manteuffel, A. von; Nachtmann, O. [Institut fuer Theoretische Physik, Heidelberg (Germany)

    2007-03-15

    Determining the global minimum of Higgs potentials with several Higgs fields like the next-to-minimal supersymmetric extension of the standard model (NMSSM) is a non-trivial task already at the tree level. The global minimum of a Higgs potential can be found from the set of all its stationary points defined by a multivariate polynomial system of equations. We introduce here the algebraic Groebner basis approach to solve this system of equations. We apply the method to the NMSSM with CP-conserving as well as CP-violating parameters. The results reveal an interesting stationary-point structure of the potential. Requiring the global minimum to give the electroweak symmetry breaking observed in Nature excludes large parts of the parameter space. (orig.)

  12. Transactions of the Army Conference on Applied Mathematics and Computing (6th) Held in Boulder, Colorado on 31 May - 3 June 1988

    Science.gov (United States)

    1989-02-01

Groebner Bases, Moss Sweedler, Mathematical Sciences Institute, Cornell University, Ithaca NY 14853. Abstract: Groebner bases are remarkable sets of polynomials which permit effective manipulation of multivariate

  13. ASYS: a computer algebra package for analysis of nonlinear algebraic equations systems

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Khutornoj, N.V.

    1992-01-01

A program package ASYS for the analysis of nonlinear algebraic equations based on the Groebner basis technique is described. The package is written in the REDUCE computer algebra language. It has special facilities to treat polynomial ideals of positive dimension, corresponding to algebraic systems with infinitely many solutions. Such systems can be transformed into an equivalent set of subsystems with a reduced number of variables in a completely automatic way. This often allows one to construct the explicit form of a solution set in many problems of practical importance. Some examples and results of comparison with the standard REDUCE package GROEBNER and the special-purpose systems FELIX and A1PI are given. 21 refs.; 2 tabs

  14. Noncommutative Gröbner bases and filtered-graded transfer

    CERN Document Server

    Li, Huishi

    2002-01-01

    This self-contained monograph is the first to feature the intersection of the structure theory of noncommutative associative algebras and the algorithmic aspect of Groebner basis theory. A double filtered-graded transfer of data in using noncommutative Groebner bases leads to effective exploitation of the solutions to several structural-computational problems, e.g., an algorithmic recognition of quadric solvable polynomial algebras, computation of GK-dimension and multiplicity for modules, and elimination of variables in noncommutative setting. All topics included deal with algebras of (q-)differential operators as well as some other operator algebras, enveloping algebras of Lie algebras, typical quantum algebras, and many of their deformations.

  15. A Center for Excellence in Mathematical Sciences Final Progress Report

    Science.gov (United States)

    1997-02-18

concentration are a Groebner Basis Project and a Symbolic Methods in AI and Computer Science project, with simultaneous development of other needed areas. The...Groebner construction algorithm. Develop an algebraic theory of piecewise polynomial approximation based on the Bezier-Bernstein algebra. Address...questions surrounding polytopes, splines, and complexity of Groebner basis computations. In topology, determine the homotopy type of the subdivision lattice of a

  16. Transactions of the Conference on Applied Mathematics and Computing (9th) Held in Minneapolis, Minnesota on 18-21 June 1991

    Science.gov (United States)

    1992-03-01

Combinatorial Aspects of the Hilbert Scheme, Alyson A. Reeves, Cornell University, Ithaca, New York. Using Groebner Bases to Determine the Nature of Field Extensions, Moss E. Sweedler, ACSyAM, MSI, Cornell University, Ithaca NY 14853. Abstract: Suppose the field

  17. Nonlinear evolution equations and solving algebraic systems: the importance of computer algebra

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Kostov, N.A.

    1989-01-01

In the present paper we study the application of computer algebra to solving the nonlinear polynomial systems which arise in the investigation of nonlinear evolution equations. We consider several systems which are obtained in the classification of integrable nonlinear evolution equations with uniform rank. Other polynomial systems are related to finding the algebraic curves for finite-gap elliptic potentials of Lame type and generalizations. All systems under consideration are solved using the method based on construction of the Groebner basis for the corresponding polynomial ideals. The computations have been carried out using computer algebra systems. 20 refs

  18. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

Computational algebraic geometry is applied to the analysis of various epidemic models for Schistosomiasis and Dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials. These computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, the Groebner basis, the Hilbert dimension, and the Hilbert polynomials. It is hoped that the results obtained in this paper will be of importance for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.

  19. An algorithmic approach to solving polynomial equations associated with quantum circuits

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Zinin, M.V.

    2009-01-01

In this paper we present two algorithms for reducing systems of multivariate polynomial equations over the finite field F2 to the canonical triangular form called a lexicographical Groebner basis. This triangular form is the most appropriate for finding solutions of the system. On the other hand, a system of polynomials over F2 whose variables also take values in F2 (Boolean polynomials) completely describes the unitary matrix generated by a quantum circuit. In particular, the matrix itself can be computed by counting the number of solutions (roots) of the associated polynomial system. Thereby, efficient construction of the lexicographical Groebner bases over F2 associated with quantum circuits gives a method for computing their circuit matrices that is alternative to the direct numerical method based on linear algebra. We compare our implementation of both algorithms with some other software packages available for computing Groebner bases over F2.
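A hedged SymPy sketch of the basic objects (the paper describes its own algorithms, not SymPy): a small Boolean system together with the field equations v^2 + v is brought to lexicographical triangular form over F2, and its solutions are counted.

```python
import sympy as sp
from itertools import product

x, y, z = sp.symbols('x y z')

# Small Boolean system: z = x AND y, together with x XOR y = 1
# (a hypothetical system, chosen only to illustrate the machinery).
system = [x*y + z, x + y + 1]
field = [v**2 + v for v in (x, y, z)]   # forces each variable into {0, 1}

# Lexicographical Groebner basis over F2: the triangular form for solving.
G = sp.groebner(system + field, x, y, z, order='lex', modulus=2)
print(G.exprs)

# Count the solutions (roots) of the system, the quantity the abstract
# relates to entries of the circuit matrix.
count = sum(
    all(g.subs({x: a, y: b, z: c}) % 2 == 0 for g in G.exprs)
    for a, b, c in product((0, 1), repeat=2 + 1)
)
print(count)
```

Here the brute-force count over the Boolean cube is only a check; for real circuits the point is that the triangular Groebner form makes the count accessible without enumeration.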

  20. Algebraic model checking for Boolean gene regulatory networks.

    Science.gov (United States)

    Tran, Quoc-Nam

    2011-01-01

    We present a computational method in which modular and Groebner bases (GB) computation in Boolean rings are used for solving problems in Boolean gene regulatory networks (BN). In contrast to other known algebraic approaches, the degree of intermediate polynomials during the calculation of Groebner bases using our method will never grow resulting in a significant improvement in running time and memory space consumption. We also show how calculation in temporal logic for model checking can be done by means of our direct and efficient Groebner basis computation in Boolean rings. We present our experimental results in finding attractors and control strategies of Boolean networks to illustrate our theoretical arguments. The results are promising. Our algebraic approach is more efficient than the state-of-the-art model checker NuSMV on BNs. More importantly, our approach finds all solutions for the BN problems.

  1. Groebner Basis Methods for Stationary Solutions of a Low-Dimensional Model for a Shear Flow

    Science.gov (United States)

    Pausch, Marina; Grossmann, Florian; Eckhardt, Bruno; Romanovski, Valery G.

    2014-10-01

    We use Groebner basis methods to extract all stationary solutions for the nine-mode shear flow model described in Moehlis et al. (New J Phys 6:56, 2004). Using rational approximations to irrational wave numbers and algebraic manipulation techniques we reduce the problem of determining all stationary states to finding roots of a polynomial of order 30. The coefficients differ by 30 powers of 10, so that algorithms for extended precision are needed to extract the roots reliably. We find that there are eight stationary solutions consisting of two distinct states, each of which appears in four symmetry-related phases. We discuss extensions of these results for other flows.

  2. Explicitly computing geodetic coordinates from Cartesian coordinates

    Science.gov (United States)

    Zeng, Huaien

    2013-04-01

This paper presents a new form of quartic equation based on Lagrange's extremum law and a Groebner basis, under the constraint that the geodetic height is the shortest distance between a given point and the reference ellipsoid. A very explicit and concise formula for the solution of the quartic equation is found by Ferrari's method, which avoids the need for a good starting guess required by iterative methods. A new explicit algorithm is then proposed to compute geodetic coordinates from Cartesian coordinates. The convergence region of the algorithm is investigated and the corresponding correct solution is given. Lastly, the algorithm is validated with numerical experiments.

  3. Mutually unbiased bases and semi-definite programming

    Energy Technology Data Exchange (ETDEWEB)

    Brierley, Stephen; Weigert, Stefan, E-mail: steve.brierley@ulb.ac.be, E-mail: stefan.weigert@york.ac.uk

    2010-11-01

    A complex Hilbert space of dimension six supports at least three but not more than seven mutually unbiased bases. Two computer-aided analytical methods to tighten these bounds are reviewed, based on a discretization of parameter space and on Groebner bases. A third algorithmic approach is presented: the non-existence of more than three mutually unbiased bases in composite dimensions can be decided by a global optimization method known as semidefinite programming. The method is used to confirm that the spectral matrix cannot be part of a complete set of seven mutually unbiased bases in dimension six.

  4. Mutually unbiased bases and semi-definite programming

    International Nuclear Information System (INIS)

    Brierley, Stephen; Weigert, Stefan

    2010-01-01

    A complex Hilbert space of dimension six supports at least three but not more than seven mutually unbiased bases. Two computer-aided analytical methods to tighten these bounds are reviewed, based on a discretization of parameter space and on Groebner bases. A third algorithmic approach is presented: the non-existence of more than three mutually unbiased bases in composite dimensions can be decided by a global optimization method known as semidefinite programming. The method is used to confirm that the spectral matrix cannot be part of a complete set of seven mutually unbiased bases in dimension six.

  5. ASYS2: a new version of computer algebra package ASYS for analysis and simplification of polynomial systems

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Khutornoj, N.V.

    1993-01-01

    In this paper a new version of the package ASYS for the analysis of nonlinear algebraic equations based on the Groebner basis technique is described. Compared with the first version of the package, ASYS1, the current one has a number of new facilities which provide higher efficiency. Some examples and results of a comparison between ASYS2, ASYS1 and two other REDUCE packages, GROEBNER and CALI, included in REDUCE 3.5, are given. 16 refs., 4 tabs

  6. A RUTCOR Project on Discrete Applied Mathematics

    Science.gov (United States)

    1989-01-30

    the more important results of this work is the possibility that Groebner basis methods of computational commutative algebra might lead to effective... Billera, L.J., "Groebner Basis Methods for Multivariate Splines," prepared for the Proceedings of the Oslo Conference on Computer-aided Geometric Design

  7. Extraction of human gait signatures: an inverse kinematic approach using Groebner basis theory applied to gait cycle analysis

    Science.gov (United States)

    Barki, Anum; Kendricks, Kimberly; Tuttle, Ronald F.; Bunker, David J.; Borel, Christoph C.

    2013-05-01

    This research highlights the results obtained from applying the method of inverse kinematics, using Groebner basis theory, to the human gait cycle to extract and identify lower-extremity gait signatures. The increased threat from suicide bombers and the force protection issues of today have motivated a team at the Air Force Institute of Technology (AFIT) to research pattern recognition in the human gait cycle. The purpose of this research is to identify gait signatures of human subjects and distinguish subjects carrying a load from those without a load. These signatures were investigated via a model of the lower extremities based on motion capture observations, in particular, foot placement and the joint angles for subjects affected by carrying extra load on the body. The human gait cycle was captured and analyzed using a developed toolkit consisting of an inverse kinematic motion model of the lower extremity and a graphical user interface. Hip, knee, and ankle angles were analyzed to identify gait angle variance and range of motion. Female subjects exhibited the most knee angle variance and produced a proportional correlation between knee flexion and load carriage.
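
    The AFIT toolkit itself is not reproduced here, but the core inverse-kinematics step it performs can be sketched with the standard closed-form solution for a planar two-link chain (thigh and shank, with made-up segment lengths); this recovers hip and knee angles from a given ankle placement, the same kind of quantity analyzed above:

```python
from math import acos, atan2, cos, sin

def two_link_ik(x, y, l1, l2):
    """Closed-form planar inverse kinematics for a thigh (l1) / shank (l2)
    chain: return (hip, knee) joint angles placing the ankle at (x, y).
    Raises ValueError if the target is out of reach."""
    c2 = (x*x + y*y - l1*l1 - l2*l2) / (2.0 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    knee = acos(c2)  # knee flexion in [0, pi]
    hip = atan2(y, x) - atan2(l2 * sin(knee), l1 + l2 * cos(knee))
    return hip, knee

def forward(hip, knee, l1, l2):
    """Forward kinematics, used here to check the IK solution."""
    return (l1*cos(hip) + l2*cos(hip + knee),
            l1*sin(hip) + l2*sin(hip + knee))

# Round trip with made-up segment lengths and ankle target (metres):
hip, knee = two_link_ik(0.9, 0.3, 0.5, 0.6)
print(forward(hip, knee, 0.5, 0.6))  # recovers (0.9, 0.3) to rounding error
```

    The Groebner-basis formulation in the paper solves the same equations algebraically (with cosines and sines as polynomial unknowns constrained by c**2 + s**2 = 1), which generalizes beyond cases with a textbook closed form.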

  8. Institute for Defense Analysis. Annual Report 1995.

    Science.gov (United States)

    1995-01-01

    staff have been involved in the community-wide development of MPI as well as in its application to specific NSA problems. 35 Parallel Groebner... Basis Code — Symbolic Computing on Parallel Machines. The Groebner basis method is a set of algorithms for reformulating very complex algebraic expressions...

  9. Weierstrass Elliptic Function Solutions to Nonlinear Evolution Equations

    International Nuclear Information System (INIS)

    Yu Jianping; Sun Yongli

    2008-01-01

    This paper is based on the relations between projective Riccati equations and the Weierstrass elliptic equation, combined with Groebner bases in symbolic computation. A novel method for constructing Weierstrass elliptic function solutions to nonlinear evolution equations is then given by using the above relations

  10. Automatic Deduction in Dynamic Geometry using Sage

    Directory of Open Access Journals (Sweden)

    Francisco Botana

    2012-02-01

    Full Text Available We present a symbolic tool that provides robust algebraic methods to handle automatic deduction tasks for a dynamic geometry construction. The main prototype has been developed as two different worksheets for the open source computer algebra system Sage, corresponding to two different ways of coding a geometric construction. In one worksheet, diagrams constructed with the open source dynamic geometry system GeoGebra are accepted. In this worksheet, Groebner bases are used to either compute the equation of a geometric locus in the case of a locus construction or to determine the truth of a general geometric statement included in the GeoGebra construction as a boolean variable. In the second worksheet, locus constructions coded using the common file format for dynamic geometry developed by the Intergeo project are accepted for computation. The prototype and several examples are provided for testing. Moreover, a third Sage worksheet is presented in which a novel algorithm to eliminate extraneous parts in symbolically computed loci has been implemented. The algorithm, based on a recent work on the Groebner cover of parametric systems, identifies degenerate components and extraneous adherence points in loci, both natural byproducts of general polynomial algebraic methods. Detailed examples are discussed.
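
    A minimal SymPy sketch of the locus computation described above (the construction is invented for illustration, not taken from the paper): the locus of midpoints of segments joining the origin to points of the unit circle is obtained by eliminating the moving point's coordinates from the constraint ideal:

```python
from sympy import symbols, groebner

x, y, u, v = symbols('x y u v')

# Moving point (x, y) on the unit circle; the traced point (u, v) is the
# midpoint of the segment from the origin to (x, y).
constraints = [x**2 + y**2 - 1,  # (x, y) lies on the circle
               2*u - x,          # u = x / 2
               2*v - y]          # v = y / 2

# With lex order and x, y listed first, the basis elements free of x, y
# generate the elimination ideal: the equation of the locus in (u, v).
G = groebner(constraints, x, y, u, v, order='lex')
locus = [g for g in G.exprs if not ({x, y} & g.free_symbols)]
print(locus)  # a circle of radius 1/2: 4*u**2 + 4*v**2 - 1, up to scaling
```

    The Groebner cover machinery mentioned above refines exactly this kind of output, detecting degenerate components and extraneous adherence points that plain elimination keeps.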

  11. A Center of Excellence in the Mathematical Sciences - at Cornell University

    Science.gov (United States)

    1992-03-01

    S.L. PHOENIX 62 PAGES 87. GROEBNER BASES: THE ANCIENT SECRET MYSTIC POWER OF THE ALGU COMPUBRAICUS, A REVELATION WHOSE SIMPLICITY WILL MAKE LADIES... Equations, October 1988; Groebner Basis, October 1988; Theoretical Aspects of Multiphase Flow, October 1988; Mathematical Theory of Queuing Systems

  12. S-bases as a tool to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, A.V.; Smirnov, V.A.

    2006-01-01

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined

  13. S-bases as a tool to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A.V. [Scientific Research Computing Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-10-15

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined.

  14. Relations between Some Characteristic Lengths in a Triangle

    Science.gov (United States)

    Koepf, Wolfram; Brede, Markus

    2005-01-01

    The paper's aim is to note a remarkable (and apparently unknown) relation for right triangles, its generalisation to arbitrary triangles and the possibility to derive these and some related relations by elimination using Groebner basis computations with a modern computer algebra system. (Contains 9 figures.)
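
    As an illustration of deriving triangle relations by elimination (the example is the classic altitude relation for right triangles, not necessarily the paper's relation), a Groebner basis computation in SymPy eliminates the hypotenuse c and leaves a polynomial relation among the legs a, b and the altitude h:

```python
from sympy import symbols, groebner

a, b, c, h = symbols('a b c h')

# Right triangle with legs a, b, hypotenuse c and altitude h onto the
# hypotenuse.  Two defining relations:
relations = [a**2 + b**2 - c**2,  # Pythagoras
             a*b - c*h]           # twice the area, computed two ways

# Eliminating c (listed first in lex order) leaves relations among a, b, h.
G = groebner(relations, c, a, b, h, order='lex')
derived = [g for g in G.exprs if c not in g.free_symbols]
print(derived)  # contains a**2*b**2 - a**2*h**2 - b**2*h**2 (up to sign),
                # i.e. 1/h**2 = 1/a**2 + 1/b**2 after dividing by (a*b*h)**2
```

    This is the same elimination workflow the paper carries out with a modern computer algebra system, just on a smaller set of characteristic lengths.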

  15. Algebraic and computational aspects of real tensor ranks

    CERN Document Server

    Sakata, Toshio; Miyazaki, Mitsuhiro

    2016-01-01

    This book provides comprehensive summaries of theoretical (algebraic) and computational aspects of tensor ranks, maximal ranks, and typical ranks, over the real number field. Although tensor ranks have often been studied over the complex number field, it should be emphasized that this book treats real tensor ranks, which have direct applications in statistics. The book provides several interesting ideas, including determinant polynomials, determinantal ideals, absolutely nonsingular tensors, absolutely full column rank tensors, and their connection to bilinear maps and Hurwitz-Radon numbers. In addition to reviews of methods to determine real tensor ranks in detail, global theories such as the Jacobian method are also reviewed in detail. The book also includes an accessible and comprehensive introduction to the mathematical background, with basics of positive polynomials and calculations using the Groebner basis. Furthermore, this book provides insights into numerical methods of finding tensor ranks through...

  16. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  17. Evaluation of the Protective Efficacy of Recombinant Vesicular Stomatitis Virus Vectors Against Marburg Hemorrhagic Fever in Nonhuman Primate Models

    Science.gov (United States)

    2007-01-19

    et al. 1996; Lee, Groebner et al. 2006; Jones, Feldmann et al. 2005). However, NHPs display disease characteristics such as clinical disease, and... 2003; Jones, Feldmann et al. 2005; Wang, Schmaljohn et al. 2006; Lee, Groebner et al. 2006). Among the successful vaccine platforms evaluated in... primary influenza infection and helps to prevent reinfection." J Immunol 175(9): 5827-38. 152 Lee, J. S., J. L. Groebner, et al. (2006). "Multiagent

  18. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  19. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which are usually based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  20. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  1. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de]

  2. Role of Cracks in the Creep of Structural Polycrystalline Ceramics.

    Science.gov (United States)

    1988-01-15

    [garbled equation (11) omitted] A solution for the integral in eq. 11 can be obtained from the tables compiled by Groebner and... 1964) 1679 34. T. G. Langdon and F. A. Mohamed, J. Mat. Sci. 13 (1978) 473 35. R. A. Sack, Proc. Phys. Soc. London 58A (1946) 729 36. W. Groebner, N... evaluated with the aid of solutions given by Groebner and Hofreiter [56]. The right-hand integral can be converted to a finite series representation

  3. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with a computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed

  4. A program for constructing finitely presented Lie algebras and superalgebras

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Kornyak, V.V.

    1997-01-01

    The purpose of this paper is to describe a C program FPLSA for investigating finitely presented Lie algebras and superalgebras. The underlying algorithm is based on constructing the complete set of relations, also called a standard basis or Groebner basis, of the ideal of the free Lie (super)algebra generated by the input set of relations. The program may be used, in particular, to compute the basis elements of the Lie (super)algebra and its structure constants, to classify the finitely presented algebras depending on the values of parameters in the relations, and to construct the Hilbert series. These problems are illustrated by examples. (orig.)

  5. A RUTCOR Project in Discrete Applied Mathematics

    Science.gov (United States)

    1990-02-20

    representations of smooth piecewise polynomial functions over triangulated regions have led in particular to the conclusion that Groebner basis methods of... Reversing Number of a Digraph," in preparation. 4. Billera, L.J., and Rose, L.L., "Groebner Basis Methods for Multivariate Splines," RRR 1-89, January

  6. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that captures network relationships among the policy concepts, to support inferencing based on information represented in the generic policy description

  7. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale service sharing among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with its great scalability, makes the model well suited to massive-scale clouds.

  8. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device level to the system level, including magnetic memory cells, device modeling, hybrid circuit structures, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers; little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  9. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  10. Some recent results on evaluating Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-07-15

    Some recent results on evaluating Feynman integrals are reviewed. The status of the method based on Mellin-Barnes representation as a powerful tool to evaluate individual Feynman integrals is characterized. A new method based on Groebner bases to solve integration by parts relations in an automatic way is described.

  11. Some recent results on evaluating Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, V.A.

    2006-01-01

    Some recent results on evaluating Feynman integrals are reviewed. The status of the method based on Mellin-Barnes representation as a powerful tool to evaluate individual Feynman integrals is characterized. A new method based on Groebner bases to solve integration by parts relations in an automatic way is described

  12. Maximal lattice free bodies, test sets and the Frobenius problem

    DEFF Research Database (Denmark)

    Jensen, Anders Nedergaard; Lauritzen, Niels; Roune, Bjarke Hammersholt

    Maximal lattice free bodies are maximal polytopes without interior integral points. Scarf initiated the study of maximal lattice free bodies relative to the facet normals in a fixed matrix. In this paper we give an efficient algorithm for computing the maximal lattice free bodies of an integral m...... method is inspired by the novel algorithm by Einstein, Lichtblau, Strzebonski and Wagon and the Groebner basis approach by Roune....

  13. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  14. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
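
    A toy model (pure Python, parameters invented; not the paper's IBM Quantum Experience experiments) shows why identity circuits make good benchmarks: the ideal outcome is known in advance, and the success rate is sensitive to per-gate errors and to circuit depth:

```python
import random

def identity_benchmark(depth, p_error, trials=10000, seed=1):
    """Toy benchmark: run `depth` pairs of X gates (net identity) on |0>,
    where each gate suffers a bit flip with probability p_error.
    Returns the fraction of runs that end in the correct state |0>."""
    rng = random.Random(seed)
    success = 0
    for _ in range(trials):
        state = 0                        # X gates on |0> stay classical
        for _ in range(2 * depth):
            state ^= 1                   # ideal X gate
            if rng.random() < p_error:   # simplistic gate-error model
                state ^= 1
        success += (state == 0)
    return success / trials

print(identity_benchmark(1, 0.0))  # no noise: always 1.0
# With noise, the success rate decays toward 0.5 as the circuit deepens,
# which is what makes identity circuits sensitive depth-scalable probes.
```

    A real device benchmark replaces this classical error model with actual hardware runs, but the figure of merit (probability of recovering the known initial state) is the same.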

  15. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give the agent the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it is efficient at handling simple emotions.

  16. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital.

  17. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  18. Evaluation of computer-based ultrasonic inservice inspection systems

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems

  19. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

    Full Text Available In remote attestation under the Trusted Computer (TC) computing mode TCCP, the trusted computer bears an excessive burden, and the anonymity and platform-configuration-information security of computing nodes cannot be guaranteed. To overcome these defects, based on research on and analysis of current schemes, we propose an anonymous proof protocol based on property certificates. The platform configuration information is converted by a matrix algorithm into a property certificate, and remote attestation is implemented by a trusted ring signature scheme based on the Strong RSA Assumption. By the trusted ring signature scheme based on property certificates, we achieve the anonymity of computing nodes and prevent the leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also expand the protocol and obtain anonymous attestation based on ECC. By scenario comparison, we find that the trusted ring signature scheme based on RSA has advantages as the number of ring members grows.

  20. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL ARTICLE. Computer Based ... universities, and later did system analysis, ... sonal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  1. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted-for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we need to worry about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation, where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation, where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices, and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  2. Algorithms in invariant theory

    CERN Document Server

    Sturmfels, Bernd

    2008-01-01

    J. Kung and G.-C. Rota, in their 1984 paper, write: "Like the Arabian phoenix rising out of its ashes, the theory of invariants, pronounced dead at the turn of the century, is once again at the forefront of mathematics". The book of Sturmfels is both an easy-to-read textbook for invariant theory and a challenging research monograph that introduces a new approach to the algorithmic side of invariant theory. The Groebner bases method is the main tool by which the central problems in invariant theory become amenable to algorithmic solutions. Students will find the book an easy introduction to this "classical and new" area of mathematics. Researchers in mathematics, symbolic computation, and computer science will get access to a wealth of research ideas, hints for applications, outlines and details of algorithms, worked out examples, and research problems.

  3. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  4. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory, and supports processor-to-processor data access based on an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions are introduced in the most detail. A programming environment for ABC95 array computer assembly language is designed, and a VC++-based programming environment is presented; it includes functions to load ABC95 array computer programs and data, to store results, to run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies support effective program development for the ABC95 array computer.

  5. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational ..... [4] Mao C, Sun W, Shen Z and Seeman N C 1999. A nanomechanical device based on the B-Z transition of DNA; Nature 397 144–146.

  6. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
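
    The queue-management idea described in the abstract can be sketched minimally as a relational table of small work units that volunteer nodes claim, compute, and report back. The schema, the "subbasin" chunks, and the function names below are hypothetical illustrations, not the platform's actual API.

```python
# Minimal sketch of a relational task queue for volunteer nodes.
# All names (tasks table, claim_task, complete_task) are illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY, chunk TEXT, status TEXT DEFAULT 'pending',
    result REAL)""")
db.executemany("INSERT INTO tasks (chunk) VALUES (?)",
               [(f"subbasin-{i}",) for i in range(6)])

def claim_task(conn):
    """Hand one pending chunk to a volunteer node and mark it running."""
    row = conn.execute(
        "SELECT id, chunk FROM tasks WHERE status='pending' LIMIT 1").fetchone()
    if row:
        conn.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
    return row

def complete_task(conn, task_id, result):
    """Record a node's reported result and mark the chunk done."""
    conn.execute("UPDATE tasks SET status='done', result=? WHERE id=?",
                 (result, task_id))

# Simulate volunteer nodes: each claims a chunk and runs a stand-in "model".
while (task := claim_task(db)) is not None:
    complete_task(db, task[0], float(len(task[1])))  # toy computation

done = db.execute("SELECT COUNT(*) FROM tasks WHERE status='done'").fetchone()[0]
print(done)  # -> 6
```

    In a real deployment the claim would be wrapped in a transaction so that two nodes cannot grab the same chunk, and stale 'running' rows would be requeued when a volunteer disappears.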

  7. Computer-Based Learning in Chemistry Classes

    Science.gov (United States)

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  8. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students’ music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teachers have not found a reasonable countermeasure to them. Against this background, the introduction of computer music software to music learning is a new trial that can not only cultivate the students’ initiatives of music learning, but also enhance their abilities to learn music. Therefore, it is concluded that the computer software based music learning is of great significance to improving the current music learning modes and means.

  9. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  10. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics-based on finitely correlated or projected entangled pair states-to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  11. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  12. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    As the foundation of the proposed next-generation Web - the Semantic Web - semantic computing has been drawing more and more attention in both academia and industry. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting resources - for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.

  13. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  14. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  15. Ammonia-based quantum computer

    International Nuclear Information System (INIS)

    Ferguson, Andrew J.; Cain, Paul A.; Williams, David A.; Briggs, G. Andrew D.

    2002-01-01

    We propose a scheme for quantum computation using two eigenstates of ammonia or similar molecules. Individual ammonia molecules are confined inside fullerenes and used as two-level qubit systems. Interaction between these ammonia qubits takes place via the electric dipole moments, and in particular we show how a controlled-NOT gate could be implemented. After computation the qubit is measured with a single-electron electrometer sensitive enough to differentiate between the dipole moments of different states. We also discuss a possible implementation based on a quantum cellular automaton

  16. Computer-based feedback in formative assessment

    NARCIS (Netherlands)

    van der Kleij, Fabienne

    2013-01-01

    Formative assessment concerns any assessment that provides feedback that is intended to support learning and can be used by teachers and/or students. Computers could offer a solution to overcoming obstacles encountered in implementing formative assessment. For example, computer-based assessments

  17. Computational aeroelasticity using a pressure-based solver

    Science.gov (United States)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well-validated k-ε turbulence model with wall-function treatment for the near-wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economic for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved. This is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experiment and previous numerical results. The computational methodology demonstrated the capability to predict both qualitative and quantitative features of aeroelasticity.
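
    The Newmark time-marching scheme mentioned in the abstract can be illustrated on a single-degree-of-freedom structure m·x'' + c·x' + k·x = 0. The 1-DOF reduction and all parameter values below are illustrative only; the paper's actual structural model is a Bernoulli-Euler beam.

```python
# Average-acceleration Newmark integration (beta = 1/4, gamma = 1/2),
# an unconditionally stable, fully implicit time-marching scheme.
import math

def newmark_free_vibration(m, c, k, x0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Integrate m*x'' + c*x' + k*x = 0 from (x0, v0) for `steps` steps."""
    x, v = x0, v0
    a = (-c * v - k * x) / m                 # initial acceleration
    for _ in range(steps):
        # Predictors from the Newmark Taylor expansions of x and v.
        x_star = x + dt * v + (0.5 - beta) * dt**2 * a
        v_star = v + (1.0 - gamma) * dt * a
        # Implicit solve for the new acceleration (scalar system here).
        a_new = -(c * v_star + k * x_star) / (m + gamma * dt * c + beta * dt**2 * k)
        x = x_star + beta * dt**2 * a_new
        v = v_star + gamma * dt * a_new
        a = a_new
    return x, v

# Undamped oscillator with natural frequency 1 rad/s, integrated over one
# period: the displacement should return close to its initial value.
dt, T = 0.01, 2 * math.pi
x_end, _ = newmark_free_vibration(m=1.0, c=0.0, k=1.0, x0=1.0, v0=0.0,
                                  dt=dt, steps=round(T / dt))
print(abs(x_end - 1.0) < 1e-3)  # -> True
```

    The average-acceleration variant conserves energy for undamped systems, which is why the displacement amplitude does not drift over the period.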

  18. Computer Based Expert Systems.

    Science.gov (United States)

    Parry, James D.; Ferrara, Joseph M.

    1985-01-01

    Claims knowledge-based expert computer systems can meet needs of rural schools for affordable expert advice and support and will play an important role in the future of rural education. Describes potential applications in prediction, interpretation, diagnosis, remediation, planning, monitoring, and instruction. (NEC)

  19. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to actual need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated, covering the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  20. Primary decomposition of zero-dimensional ideals over finite fields

    Science.gov (United States)

    Gao, Shuhong; Wan, Daqing; Wang, Mingsheng

    2009-03-01

    A new algorithm is presented for computing primary decomposition of zero-dimensional ideals over finite fields. Like Berlekamp's algorithm for univariate polynomials, the new method is based on the invariant subspace of the Frobenius map acting on the quotient algebra. The dimension of the invariant subspace equals the number of primary components, and a basis of the invariant subspace yields a complete decomposition. Unlike previous approaches for decomposing multivariate polynomial systems, the new method does not need primality testing nor any generic projection, instead it reduces the general decomposition problem directly to root finding of univariate polynomials over the ground field. Also, it is shown how Groebner basis structure can be used to get partial primary decomposition without any root finding.
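
    The invariant-subspace idea is easiest to see in the univariate special case the abstract cites (Berlekamp's algorithm): the dimension of the invariant subspace of the Frobenius map a → a^p on F_p[x]/(f) equals the number of distinct irreducible factors of a squarefree f. The sketch below implements only that dimension count, not the paper's multivariate primary-decomposition algorithm; all function names are illustrative.

```python
# Count distinct irreducible factors of a squarefree polynomial over F_p
# as the nullity of (Q - I), where Q is the matrix of the Frobenius map.
# Polynomials are coefficient lists, lowest degree first.

def polymod(a, f, p):
    """Reduce polynomial a modulo f over F_p."""
    a = [c % p for c in a]
    df = len(f) - 1
    inv_lead = pow(f[-1], p - 2, p)          # p is prime, so this inverts
    while len(a) - 1 >= df:
        c = a[-1] * inv_lead % p
        shift = len(a) - 1 - df
        for i, fc in enumerate(f):
            a[shift + i] = (a[shift + i] - c * fc) % p
        while len(a) > 1 and a[-1] == 0:
            a.pop()
    return a

def polymulmod(a, b, f, p):
    """Multiply two polynomials and reduce modulo f over F_p."""
    prod = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            prod[i + j] = (prod[i + j] + ai * bj) % p
    return polymod(prod, f, p)

def frobenius_nullity(f, p):
    """Number of distinct irreducible factors of a monic squarefree f over F_p."""
    n = len(f) - 1
    xp = polymod([0] * p + [1], f, p)        # x^p mod f
    rows, cur = [], [1]                      # cur runs over x^{ip} mod f
    for i in range(n):
        rows.append(cur + [0] * (n - len(cur)))
        cur = polymulmod(cur, xp, f, p)
    # Fixed space of Frobenius = nullspace of (Q - I); nullity = n - rank.
    m = [[(rows[i][j] - (1 if i == j else 0)) % p for j in range(n)]
         for i in range(n)]
    rank = 0
    for col in range(n):                     # Gaussian elimination over F_p
        piv = next((r for r in range(rank, n) if m[r][col]), None)
        if piv is None:
            continue
        m[rank], m[piv] = m[piv], m[rank]
        inv = pow(m[rank][col], p - 2, p)
        m[rank] = [v * inv % p for v in m[rank]]
        for r in range(n):
            if r != rank and m[r][col]:
                c = m[r][col]
                m[r] = [(v - c * w) % p for v, w in zip(m[r], m[rank])]
        rank += 1
    return n - rank

# x^3 - x = x(x-1)(x+1) over F_5: three distinct linear factors.
print(frobenius_nullity([0, 4, 0, 1], 5))   # -> 3
# x^2 + 1 is irreducible over F_3 (since -1 is not a square mod 3).
print(frobenius_nullity([1, 0, 1], 3))      # -> 1
```

    A basis of this nullspace then yields the actual splitting, exactly as in the multivariate setting described in the abstract.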

  1. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  2. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teach...

  3. Computer-based literature search in medical institutions in India

    Directory of Open Access Journals (Sweden)

    Kalita Jayantee

    2007-01-01

    Aim: To study the use of computer-based literature search and its application in clinical training and patient care as a surrogate marker of evidence-based medicine. Materials and Methods: A questionnaire comprising questions on purpose (presentation, patient management, research), realm (site accessed), nature and frequency of search, effect, infrastructure, formal training in computer-based literature search, and suggestions for further improvement was sent to residents and faculty of a Postgraduate Medical Institute (PGI) and a Medical College. The responses were compared amongst different subgroups of respondents. Results: Out of 300 subjects approached, 194 responded, of whom 103 were from the PGI and 91 from the Medical College. There were 97 specialty residents, 58 super-specialty residents and 39 faculty members. Computer-based literature search was done at least once a month by 89%, though there was marked variability in frequency and extent. The motivation for computer-based literature search was presentation in 90%, research in 65% and patient management in 60.3%. The benefit of the search was acknowledged in learning and teaching by 80%, in research by 65% and in patient care by 64.4% of respondents. Formal training in computer-based literature search was received by 41%, of whom 80% were residents. Residents from the PGI did more frequent and more extensive computer-based literature search, which was attributed to better infrastructure and training. Conclusion: Training and infrastructure are both crucial for computer-based literature search, which may translate into evidence-based medicine.

  4. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions these systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchic systems using control computers with reserves already becomes clear from the control systems applied in the Canadian nuclear power plants, which were among the first equipped with process computers. The control system now under development for the large Soviet reactors of the WWER type will also be based on the use of control computers. The part of the system concerned with controlling the reactor assembly is described in detail

  5. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, different phosphazene bases, etc – were computed with CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Direct computations and computations with reference pKa values were used. The latter were of two types: (1) reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base or (2) reliable experimental pKa value in acetonitrile of the investigated base itself. ...
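
    The reference-pKa approach described above reduces to a one-line thermodynamic relation: the unknown base's aqueous pKa is anchored to a structurally similar reference base via the computed difference in deprotonation free energies. The sketch below shows only this arithmetic; the free-energy numbers are entirely hypothetical and are not the paper's CPCM/SMD/COSMO-RS values.

```python
# Relative (reference-anchored) pKa from computed deprotonation free energies.
import math

R = 1.987204e-3                  # gas constant, kcal/(mol*K)
T = 298.15                       # K
RT_LN10 = R * T * math.log(10)   # ~1.364 kcal/mol at 298 K

def pka_from_reference(pka_ref, dG_ref, dG_x):
    """pKa(X) = pKa(ref) + (dG_deprot(X) - dG_deprot(ref)) / (RT ln 10).

    dG values are computed aqueous deprotonation free energies (kcal/mol);
    systematic solvation-model errors largely cancel in the difference.
    """
    return pka_ref + (dG_x - dG_ref) / RT_LN10

# Hypothetical numbers: reference base with pKa 12.0; the investigated base's
# conjugate acid is 2.0 kcal/mol harder to deprotonate -> stronger base.
print(round(pka_from_reference(12.0, 300.0, 302.0), 2))  # -> 13.47
```

    Cancellation of systematic errors is the reason the reference scheme is usually more reliable than direct absolute computation.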

  6. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    Science.gov (United States)

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than traditional "encrypt-then-sign" or "sign-then-encrypt" strategy. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and is gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required for the end users in signcryption and designcryption is linear with the complexity of signing and encryption access policy. Moreover, only a single authority that is responsible for attribute management and key generation exists in the previous proposed ABSC schemes, whereas in reality, mostly, different authorities monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.

  7. Computer-based learning for the enhancement of breastfeeding ...

    African Journals Online (AJOL)

    In this study, computer-based learning (CBL) was explored in the context of breastfeeding training for undergraduate Dietetic students. Aim: To adapt and validate an Indian computer-based undergraduate breastfeeding training module for use by South African undergraduate Dietetic students. Methods and materials: The ...

  8. Silicon CMOS architecture for a spin-based quantum computer.

    Science.gov (United States)

    Veldhorst, M; Eenink, H G J; Yang, C H; Dzurak, A S

    2017-12-15

    Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits. However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system. The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout. We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing.

  9. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

    We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases' - one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical, or quantum in the underlying spin-lattice model.

  10. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  11. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  12. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking, and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of all three levels of correlation, and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
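
    As a toy illustration of the concordance level described above, the sketch below computes Kendall's tau-b between a computationally-light metric (degree) and a computationally-heavy one (closeness, which requires all-pairs shortest paths) on a small hypothetical graph; the graph and function names are illustrative and not taken from the paper's 50-network study.

```python
# Kendall's tau-b (concordance with tie correction) between a light and a
# heavy centrality metric on a toy undirected graph.
from collections import deque
from math import sqrt

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}  # toy graph

def degree(g):
    """Computationally-light metric: vertex degree."""
    return {v: len(nb) for v, nb in g.items()}

def closeness(g):
    """Computationally-heavy metric: BFS-based closeness centrality."""
    out = {}
    for s in g:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in g[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        out[s] = (len(g) - 1) / sum(dist[v] for v in g if v != s)
    return out

def kendall_tau_b(x, y):
    """Concordance-based correlation over all vertex pairs, tie-corrected."""
    n, conc, disc, tx, ty = len(x), 0, 0, 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            a, b = x[i] - x[j], y[i] - y[j]
            if a == 0: tx += 1              # pair tied on metric x
            if b == 0: ty += 1              # pair tied on metric y
            if a * b > 0: conc += 1         # same relative ordering
            elif a * b < 0: disc += 1       # opposite relative ordering
    n0 = n * (n - 1) // 2
    return (conc - disc) / sqrt((n0 - tx) * (n0 - ty))

deg, clo = degree(adj), closeness(adj)
nodes = sorted(adj)
tau = kendall_tau_b([deg[v] for v in nodes], [clo[v] for v in nodes])
print(round(tau, 3))  # -> 0.882
```

    A high tau means the light metric's pair-wise orderings can stand in for the heavy metric's, which is exactly the substitution the abstract evaluates.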

  13. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students’ knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games in addition to traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students’ outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5 % better

  14. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  15. Women and Computer Based Technologies: A Feminist Perspective.

    Science.gov (United States)

    Morritt, Hope

    The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…

  16. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural languages (such as English, Japanese, or German, in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  17. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  18. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

As computer-based design features such as computer-based procedures (CBPs), soft controls (SCs), and integrated information systems are being adopted in the main control rooms (MCRs) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially the CBP and the SCs. In the case of the computer-based procedure, as opposed to the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  19. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  20. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  1. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  2. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. The computer vision based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used to identify the objects in the room, and different color schemes are used to assign colors to these objects. The proposed approach is applied to simple as well as complex images from online sources. It not only accelerates the process of interior design but also makes it more efficient by offering multiple alternatives.

  3. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adopting…

  4. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  5. 26 CFR 1.809-10 - Computation of equity base.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Computation of equity base. 1.809-10 Section 1... (CONTINUED) INCOME TAXES Gain and Loss from Operations § 1.809-10 Computation of equity base. (a) In general. For purposes of section 809, the equity base of a life insurance company includes the amount of any...

  6. Enhancing Lecture Presentations in Introductory Biology with Computer-Based Multimedia.

    Science.gov (United States)

    Fifield, Steve; Peifer, Rick

    1994-01-01

    Uses illustrations and text to discuss convenient ways to organize and present computer-based multimedia to students in lecture classes. Includes the following topics: (1) Effects of illustrations on learning; (2) Using computer-based illustrations in lecture; (3) MacPresents-Multimedia Presentation Software; (4) Advantages of computer-based…

  7. Remote media vision-based computer input device

    Science.gov (United States)

    Arabnia, Hamid R.; Chen, Ching-Yi

    1991-11-01

In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.

  8. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
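The adjoint idea described above can be shown on a toy linear simulation: one forward sweep plus one reverse sweep with the transposed operator yields the gradient of the objective with respect to all initial parameters at once. This is a schematic single-machine sketch, not the authors' code; the linear model `M`, the objective, and all values are assumptions for illustration:

```python
import numpy as np

def forward(u0, M, steps):
    """Toy linear simulation: repeatedly apply the model operator M."""
    u = u0
    for _ in range(steps):
        u = M @ u
    return u

def adjoint_gradient(u0, M, d, steps):
    """Gradient of J = ||u_n - d||^2 w.r.t. u0 via the adjoint method:
    run the forward model, then propagate the residual back with M.T."""
    u = forward(u0, M, steps)
    lam = 2.0 * (u - d)          # dJ/du_n
    for _ in range(steps):
        lam = M.T @ lam          # adjoint (reverse) sweep
    return lam

# Verify against central finite differences on a tiny 3-state model
rng = np.random.default_rng(0)
M = 0.3 * rng.standard_normal((3, 3))
u0, d = rng.standard_normal(3), rng.standard_normal(3)
g = adjoint_gradient(u0, M, d, steps=4)
eps = 1e-6
fd = np.array([
    (np.sum((forward(u0 + eps * e, M, 4) - d) ** 2)
     - np.sum((forward(u0 - eps * e, M, 4) - d) ** 2)) / (2 * eps)
    for e in np.eye(3)])
print(np.allclose(g, fd, atol=1e-5))
```

The point mirrored from the abstract: the cost of the reverse sweep is comparable to one forward run, independent of the number of parameters, whereas finite differences would need one forward run per parameter.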

  9. A quantum computer based on recombination processes in microelectronic devices

    International Nuclear Information System (INIS)

    Theodoropoulos, K; Ntalaperas, D; Petras, I; Konofaos, N

    2005-01-01

In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is shown by applying the proposed computer to a well-known physical system involving traps in semiconductor devices.

  10. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

Full Text Available In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images into graphs and then compute Minimum Spanning Trees (MSTs) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
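
The pipeline the abstract describes, turning points of interest into a weighted graph and computing an MST over it, can be sketched with Kruskal's algorithm. This is a generic illustration, not the authors' implementation; the point coordinates and Euclidean edge weights are assumptions:

```python
from itertools import combinations
from math import dist

def kruskal_mst(points):
    """MST of the complete Euclidean graph over 2-D points.

    Returns a list of ((i, j), weight) edges, built with Kruskal's
    algorithm and a union-find structure with path compression."""
    parent = list(range(len(points)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    edges = sorted(
        (dist(points[i], points[j]), i, j)
        for i, j in combinations(range(len(points)), 2))
    mst = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:              # edge joins two components: keep it
            parent[ri] = rj
            mst.append(((i, j), w))
    return mst

# Example: four points of interest extracted from a site photograph
tree = kruskal_mst([(0, 0), (0, 3), (4, 0), (4, 3)])
print(len(tree), sum(w for _, w in tree))  # n-1 edges and total length
```

For n points the MST always has n-1 edges; its total length is the minimum cable/walkway length connecting all points, which is what makes it a natural construction-plan primitive.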

  11. Standardized computer-based organized reporting of EEG:SCORE

    DEFF Research Database (Denmark)

    Beniczky, Sandor; H, Aurlien,; JC, Brøgger,

    2013-01-01

…process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement, and it was tested in clinical practice. … in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings)… SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make possible the build-up of a multinational database, and it will help in training young neurophysiologists.

  12. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

Full Text Available Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction; therefore, practical experiences obtained while using computer software tools for road accident reconstruction in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper also addresses software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  13. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE… In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE…

  14. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database…

  15. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to NDE community, namely heat diffusion and elastic wave propagation. The implementations are for two-dimensions. Performance improvement of the GPU implementation against serial CPU implementation is then discussed.
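
The heat-diffusion test case mentioned above corresponds to an explicit finite-difference stencil. A serial NumPy reference version is sketched below; this is an illustration of the scheme, not the paper's CUDA code, and the grid size, diffusion number, and boundary treatment are assumptions:

```python
import numpy as np

def diffuse(u, alpha=0.1, steps=100):
    """Explicit 2-D heat-equation update with the 5-point Laplacian.

    u is a 2-D temperature array; alpha is the dimensionless diffusion
    number dt*k/dx^2, which must be <= 0.25 for stability. Boundary
    cells are held fixed (Dirichlet conditions)."""
    u = u.copy()
    for _ in range(steps):
        lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
               - 4.0 * u[1:-1, 1:-1])
        u[1:-1, 1:-1] += alpha * lap
    return u

grid = np.zeros((64, 64))
grid[32, 32] = 100.0          # a single hot spot in the centre
out = diffuse(grid)
print(out.max() < 100.0)      # heat spreads, so the peak decays
```

Each interior cell depends only on its four neighbours from the previous step, which is exactly why the scheme parallelizes so naturally on a GPU: every cell update is an independent thread.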

  16. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.

  17. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

Full Text Available Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g., mode of test delivery, familiarity with computers, etc.), the question is whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. In addition, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between the high- and low-computer-familiarity groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.

  18. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

Full Text Available Soil erosion estimation is an important part of a land consolidation process. The Universal Soil Loss Equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R (rainfall factor), K (soil erodibility), L (slope length factor), S (slope gradient factor), C (cropping management factor), and P (erosion control management factor). The L and S factors are usually combined into one LS factor, the topographic factor. The individual factors are determined from several sources, such as a DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all the above-mentioned factors must be determined. The result (G, annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable degradation of precision; too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the sources' precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e., with a smaller grid cell size. For such cases we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify spots of higher erosion. The second step then uses a smaller cell size but performs the computation only on the area identified in the previous step. This decomposition allows a…
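
The grid-based USLE computation described above reduces to a cell-wise product of the factor layers, G = R · K · LS · C · P. A minimal sketch follows; the factor grids and their magnitudes are synthetic, purely for illustration:

```python
import numpy as np

def usle_grid(R, K, LS, C, P):
    """Annual soil loss G per grid cell: G = R * K * LS * C * P.

    All inputs are 2-D arrays of identical shape, one factor value
    per cell (e.g. rasterized from a DTM and thematic maps)."""
    return R * K * LS * C * P

# Synthetic 2x2 factor layers (illustrative values, not calibrated)
R  = np.full((2, 2), 40.0)                # rainfall erosivity
K  = np.full((2, 2), 0.3)                 # soil erodibility
LS = np.array([[0.5, 1.2], [2.0, 3.5]])   # topographic factor
C  = np.full((2, 2), 0.25)                # cropping management
P  = np.full((2, 2), 1.0)                 # no erosion-control measures

G = usle_grid(R, K, LS, C, P)
print(G)
```

Because the product is element-wise, halving the cell size quadruples the number of cells but changes nothing in the formula, which is why resolution choice dominates the cost of the computation, as the abstract notes.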

  19. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

In the era of information explosion, the super-large scale, discrete, and semi-/un-structured nature of big data has gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze and mine massive data, which can effectively solve the problem that traditional data mining methods cannot cope with massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs a mining algorithm for association rules based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
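
The core of MapReduce-style association-rule mining is counting candidate itemsets with a map phase (emit (itemset, 1) pairs) and a reduce phase (sum the counts and keep frequent sets). A toy single-machine sketch of that split, not the paper's implementation and with made-up transactions, looks like:

```python
from collections import Counter
from itertools import combinations

def map_phase(transaction):
    """Emit (item-pair, 1) for every 2-itemset in one transaction."""
    return [(pair, 1) for pair in combinations(sorted(set(transaction)), 2)]

def reduce_phase(emitted, min_support):
    """Sum counts per pair and keep only the frequent ones."""
    counts = Counter()
    for pair, one in emitted:
        counts[pair] += one
    return {p: c for p, c in counts.items() if c >= min_support}

transactions = [
    ["bread", "milk"], ["bread", "milk", "eggs"],
    ["milk", "eggs"], ["bread", "milk"],
]
emitted = [kv for t in transactions for kv in map_phase(t)]
frequent = reduce_phase(emitted, min_support=3)
print(frequent)   # {('bread', 'milk'): 3}
```

On a real cluster the mapper runs on each data split in parallel and the framework groups pairs by key before the reducers run; the per-record independence of `map_phase` is what makes the speedup claimed in the abstract possible.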

  20. Impedance computations and beam-based measurements: A problem of discrepancy

    Science.gov (United States)

    Smaluk, Victor

    2018-04-01

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  1. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is not only based on the technical factors such as capabilities of the programming languages but also the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  2. Computer based training: Technology and trends

    International Nuclear Information System (INIS)

    O'Neal, A.F.

    1986-01-01

    Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT

  3. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.

  4. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

…Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 paper [19], which documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project, documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has…

  5. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  6. Enhancing school-based asthma education efforts using computer-based education for children.

    Science.gov (United States)

    Nabors, Laura A; Kockritz, Jennifer L; Ludke, Robert L; Bernstein, Jonathan A

    2012-03-01

    Schools are an important site for delivery of asthma education programs. Computer-based educational programs are a critical component of asthma education programs and may be a particularly important education method in busy school environments. The objective of this brief report is to review and critique computer-based education efforts in schools. The results of our literature review indicated that school-based computer education efforts are related to improved knowledge about asthma and its management. In some studies, improvements in clinical outcomes also occur. Data collection programs need to be built into games that improve knowledge. Many projects do not appear to last for periods greater than 1 year and little information is available about cultural relevance of these programs. Educational games and other programs are effective methods of delivering knowledge about asthma management and control. Research about the long-term effects of this increased knowledge, in regard to behavior change, is needed. Additionally, developing sustainable projects, which are culturally relevant, is a goal for future research.

  7. Computer Based Test Untuk Seleksi Masuk Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

Full Text Available Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation, and testing. This study produces a CBT application in which the questions drawn from the question bank are randomized using the Fisher-Yates Shuffle method, so that the same question never appears twice. To secure the question data while it is transmitted over the network, the RSA cryptographic algorithm is used, so that each question passes through encryption and decryption before it is displayed. The software was designed using the waterfall model, the database using an entity-relationship diagram, and the interface using hypertext markup language (HTML), Cascading Style Sheets (CSS), and jQuery; the system was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
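
The Fisher-Yates shuffle used above for question randomization can be sketched as follows; this is a generic implementation for illustration, not the application's PHP code:

```python
import random

def fisher_yates(items, rng=random):
    """Return a uniformly shuffled copy of items.

    Walk the list from the last index down; swap each position with a
    uniformly chosen position at or before it. Every permutation of
    the input is equally likely, and no element is ever duplicated."""
    a = list(items)
    for i in range(len(a) - 1, 0, -1):
        j = rng.randrange(i + 1)   # 0 <= j <= i
        a[i], a[j] = a[j], a[i]
    return a

# Draw one exam's question order from a bank of question IDs
bank = list(range(1, 11))
order = fisher_yates(bank, random.Random(42))
print(sorted(order) == bank)   # a permutation: no repeats, none missing
```

Because the output is a permutation of the bank rather than a sequence of independent draws, the "no repeated question" property claimed in the abstract holds by construction.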

  8. CAMAC based computer--computer communications via microprocessor data links

    International Nuclear Information System (INIS)

    Potter, J.M.; Machen, D.R.; Naivar, F.J.; Elkins, E.P.; Simmonds, D.D.

    1976-01-01

    Communication between the central control computer and remote, satellite data acquisition/control stations at the Clinton P. Anderson Meson Physics Facility (LAMPF) is presently accomplished through the use of CAMAC-based Data Link Modules. With the advent of the microprocessor, a new philosophy for digital data communications has evolved. Data Link modules containing microprocessor controllers provide link management and communication network protocol through algorithms executed in the Data Link microprocessor

  9. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  10. Medical imaging in clinical applications algorithmic and computer-based approaches

    CERN Document Server

    Bhateja, Vikrant; Hassanien, Aboul

    2016-01-01

    This volume comprises 21 selected chapters, including two overview chapters devoted to abdominal imaging in clinical applications supported by computer-aided diagnosis approaches, as well as different techniques for solving the pectoral muscle extraction problem in the preprocessing part of CAD systems for detecting breast cancer in its early stage using digital mammograms. The aim of this book is to stimulate further research in algorithmic and computer-based approaches to medical imaging applications and to utilize them in real-world clinical applications. The book is divided into four parts. Part I: Clinical Applications of Medical Imaging; Part II: Classification and Clustering; Part III: Computer Aided Diagnosis (CAD) Tools and Case Studies; and Part IV: Bio-inspired Computer Aided Diagnosis Techniques.

  11. Spin-based quantum computation in multielectron quantum dots

    OpenAIRE

    Hu, Xuedong; Sarma, S. Das

    2001-01-01

    In a quantum computer the hardware and software are intrinsically connected because the quantum Hamiltonian (or more precisely its time development) is the code that runs the computer. We demonstrate this subtle and crucial relationship by considering the example of electron-spin-based solid state quantum computer in semiconductor quantum dots. We show that multielectron quantum dots with one valence electron in the outermost shell do not behave simply as an effective single spin system unles...

  12. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
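
    The abstract does not reproduce the COD algorithm itself; the sketch below is only a generic illustration of the finite improvement property it exploits, in a congestion-style offloading game where each sensor repeatedly plays a best response (all cost parameters and names are invented for illustration):

```python
def best_response_offloading(local_cost, offload_base, congestion, max_rounds=1000):
    """Decentralized offloading decisions via iterated best response.

    Each sensor i chooses to offload (True) or compute locally (False).
    Offloading cost grows with the number of sensors sharing the access
    point, so this is a congestion game, hence a potential game: asynchronous
    best responses converge to a Nash equilibrium in finitely many steps.
    """
    n = len(local_cost)
    offload = [False] * n
    for _ in range(max_rounds):
        changed = False
        for i in range(n):
            others = sum(offload) - offload[i]          # offloaders besides i
            cost_off = offload_base[i] + congestion * (others + 1)
            choice = cost_off < local_cost[i]
            if choice != offload[i]:
                offload[i] = choice
                changed = True
        if not changed:  # no sensor can improve unilaterally: Nash equilibrium
            return offload
    return offload

# Three sensors: cheap local computation keeps the third sensor local.
decisions = best_response_offloading(
    local_cost=[10.0, 9.0, 2.0], offload_base=[1.0, 1.0, 1.0], congestion=2.0)
```

    In this toy run the first two sensors offload while the third stays local, and the loop terminates as soon as a full pass produces no change, which is exactly the finite-improvement termination argument.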

  13. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    OpenAIRE

    Mohammad Mohammadi; Masoud Barzgaran

    2010-01-01

    Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based te...

  14. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  15. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  16. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    Full Text Available There are performance bottlenecks and scalability problems when traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data mining system, this platform is highly scalable, has massive data processing capacities, is service-oriented, and has low hardware cost. This platform can support the design and applications of a wide range of distributed data-mining systems.

  17. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaborating in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in the smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners’ experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about the engagement and behavior aspects and their contribution to learning

  18. Evaluating Computer-Based Assessment in a Risk-Based Model

    Science.gov (United States)

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision making process, evaluation for understanding to further enhance enlightenment and evaluation for control to ensure compliance to standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA)…

  19. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning as a learning system has become very popular. But current e-learning systems cannot instruct students effectively, since they do not consider the emotional state in the context of instruction. The emergence of the theory of "affective computing" can address this problem: it means the computer's intelligence need no longer be purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with consideration of a teaching style based on his or her personality traits. A "man-to-man" learning environment is built to simulate the traditional classroom's pedagogy in the system.

  20. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  1. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
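
    Two of the simpler quantitative-CT measures named in this abstract, the mean CT value over the lungs and the density mask, can be sketched in a few lines of NumPy; the HU threshold and the synthetic slice below are purely illustrative, not values from the cited studies:

```python
import numpy as np

def lung_density_metrics(hu, lung_mask, threshold=-750):
    """Simple quantitative-CT metrics over a segmented lung region.

    hu        : array of CT attenuation values in Hounsfield units (HU)
    lung_mask : boolean array marking lung voxels
    threshold : HU cutoff for the density mask (illustrative value)

    Returns the mean lung attenuation and the fraction of lung voxels above
    the threshold (a crude "density mask" for denser, fibrotic-like tissue).
    """
    lung = hu[lung_mask]
    mean_hu = float(lung.mean())
    dense_fraction = float((lung > threshold).mean())
    return mean_hu, dense_fraction

# Synthetic 2-D slice: air (-1000 HU) with a "lung" patch around -800 HU
# containing a small dense (fibrotic-like) region at -300 HU.
hu = np.full((64, 64), -1000.0)
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True
hu[mask] = -800.0
hu[30:34, 30:34] = -300.0
mean_hu, dense_fraction = lung_density_metrics(hu, mask)
```

    Real pipelines differ mainly in how `lung_mask` is obtained (segmentation) and in the classifier applied on top; the aggregation step itself is this simple.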

  2. Applications of computer based safety systems in Korea nuclear power plants

    International Nuclear Information System (INIS)

    Won Young Yun

    1998-01-01

    With the progress of computer technology, the applications of computer based safety systems in Korea nuclear power plants have increased rapidly in recent decades. The main purpose of this movement is to take advantage of modern computer technology so as to improve the operability and maintainability of the plants. However, in fact there have been a lot of controversies on computer based systems' safety between the regulatory body and nuclear utility in Korea. The Korea Institute of Nuclear Safety (KINS), technical support organization for nuclear plant licensing, is currently confronted with the pressure to set up well defined domestic regulatory requirements from this aspect. This paper presents the current status and the regulatory activities related to the applications of computer based safety systems in Korea. (author)

  3. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  4. A CAMAC-based laboratory computer system

    International Nuclear Information System (INIS)

    Westphal, G.P.

    1975-01-01

    A CAMAC-based laboratory computer network is described. By sharing a common mass memory, it offers distinct advantages over slow and core-consuming single-processor installations. A fast compiler-BASIC, with extensions for CAMAC and real-time, provides a convenient means for interactive experiment control

  5. Direct method of solving finite difference nonlinear equations for multicomponent diffusion in a gas centrifuge

    International Nuclear Information System (INIS)

    Potemki, Valeri G.; Borisevich, Valentine D.; Yupatov, Sergei V.

    1996-01-01

    This paper describes the next evolution step in the development of the direct method for solving systems of Nonlinear Algebraic Equations (SNAE). These equations arise from the finite difference approximation of the original nonlinear partial differential equations (PDE). The method has been extended to SNAE with three variables. The solution of the SNAE is based on the Reiterating General Singular Value Decomposition of rectangular matrix pencils (RGSVD algorithm). In contrast to the computer algebra algorithm in integer arithmetic based on reduction to a Groebner basis, this algorithm works in floating point arithmetic and realizes the reduction to the Kronecker form. The possibilities of the method are illustrated by solving the one-dimensional diffusion equation for a 3-component model isotope mixture in a gas centrifuge. The implicit scheme for the finite difference equations is realized without simplifying the nonlinear properties of the original equations. The technique offered provides convergence to the solution in a single run. The SNAE Toolbox is developed in the framework of the high-performance numeric computation and visualization software MATLAB. It includes more than 30 modules in the MATLAB language for solving SNAE with two and three variables. (author)
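
    For comparison with the symbolic route this abstract contrasts against: a lexicographic Groebner basis "triangularizes" a polynomial system, much as the RGSVD reduction does numerically. A minimal SymPy sketch on an invented two-variable example:

```python
from sympy import groebner, symbols

# A lexicographic Groebner basis makes the last basis element univariate,
# so the system can be solved by back-substitution.
# Example system (invented for illustration): x^2 + y^2 = 5, x - y = 1.
x, y = symbols('x y')
gb = groebner([x**2 + y**2 - 5, x - y - 1], x, y, order='lex')
# The reduced basis is [x - y - 1, y**2 + y - 2]: solving y**2 + y - 2 = 0
# gives y = 1 or y = -2, and x = y + 1 yields the roots (2, 1) and (-1, -2).
```

    The trade-off the abstract points at is visible even here: the symbolic reduction is exact (integer/rational arithmetic), while the RGSVD route accepts floating-point rounding in exchange for speed on larger systems.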

  6. NURBS-based 3-d anthropomorphic computational phantoms for radiation dosimetry applications

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lodwick, Daniel; Lee, Choonik; Bolch, Wesley E.

    2007-01-01

    Computational anthropomorphic phantoms are computer models used in the evaluation of absorbed dose distributions within the human body. Currently, two classes of computational phantoms have been developed and widely utilised for dosimetry calculation: (1) stylized (equation-based) and (2) voxel (image-based) phantoms, describing human anatomy through the use of mathematical surface equations and 3-D voxel matrices, respectively. However, stylized phantoms have limitations in defining realistic organ contours and positioning as compared to voxel phantoms, which are themselves based on medical images of human subjects. In turn, voxel phantoms that have been developed through medical image segmentation have limitations in describing organs that are presented in low contrast within either magnetic resonance or computed tomography images. The present paper reviews the advantages and disadvantages of these existing classes of computational phantoms and introduces a hybrid approach to computational phantom construction based on non-uniform rational B-spline (NURBS) surface animation technology that takes advantage of the most desirable features of the former two phantom types. (authors)

  7. A Comparative Study of Paper-based and Computer-based Contextualization in Vocabulary Learning of EFL Students

    Directory of Open Access Journals (Sweden)

    Mousa Ahmadian

    2015-04-01

    Full Text Available Vocabulary acquisition is one of the largest and most important tasks in language classes. New technologies, such as computers, have helped a lot in this regard. The importance of the issue led the researchers to carry out the present study, which concerns the comparison of contextualized vocabulary learning on paper and through Computer Assisted Language Learning (CALL). To this end, 52 pre-university EFL learners were randomly assigned to two groups: a paper-based (PB) group and a computer-based (CB) group, each with 26 learners. The PB group received paper-based contextualization of the vocabulary items, while the CB group received computer-based contextualization of the vocabulary items through PowerPoint (PP) software. A pretest and a posttest, along with an immediate and a delayed posttest, were given to the learners. A paired samples t-test of the pretest and posttest and an independent samples t-test of the delayed and immediate posttests were executed with SPSS software. The results revealed that computer-based contextualization had more effect on the vocabulary learning of Iranian EFL learners than paper-based contextualization of the words. Keywords: Computer-based contextualization, Paper-based contextualization, Vocabulary learning, CALL

  8. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, the agent performs negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service by creating flooding attacks on other involved agents; and (c) misuse of some of the exceptions in the agent interaction protocol, such as the Not-Understood and Cancel_Meta protocols, which may lead to terminating the connection of all the other agents participating in the negotiated services. This paper also proposes algorithms to solve these issues, to ensure that there will be no intervention of any malicious activities during the agent interaction.

  9. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…

  10. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  11. Advanced methods for scattering amplitudes in gauge theories

    International Nuclear Information System (INIS)

    Peraro, Tiziano

    2014-01-01

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.

  12. Advanced methods for scattering amplitudes in gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Peraro, Tiziano

    2014-09-24

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.

  13. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial

    Science.gov (United States)

    2007-01-01

    Background At postgraduate level evidence based medicine (EBM) is currently taught through tutor based lectures. Computer based sessions fit around doctors' workloads, and standardise the quality of educational provision. There have been no randomized controlled trials comparing computer based sessions with traditional lectures at postgraduate level within medicine. Methods This was a randomised controlled trial involving six postgraduate education centres in the West Midlands, U.K. Fifty five newly qualified foundation year one doctors (U.S internship equivalent) were randomised to either computer based sessions or an equivalent lecture in EBM and systematic reviews. The change from pre to post-intervention score was measured using a validated questionnaire assessing knowledge (primary outcome) and attitudes (secondary outcome). Results Both groups were similar at baseline. Participants' improvement in knowledge in the computer based group was equivalent to the lecture based group (gain in score: 2.1 [S.D = 2.0] versus 1.9 [S.D = 2.4]; ANCOVA p = 0.078). Attitudinal gains were similar in both groups. Conclusion On the basis of our findings we feel computer based teaching and learning is as effective as typical lecture based teaching sessions for educating postgraduates in EBM and systematic reviews. PMID:17659076

  14. An endohedral fullerene-based nuclear spin quantum computer

    International Nuclear Information System (INIS)

    Ju Chenyong; Suter, Dieter; Du Jiangfeng

    2011-01-01

    We propose a new scalable quantum computer architecture based on endohedral fullerene molecules. Qubits are encoded in the nuclear spins of the endohedral atoms, which possess even longer coherence times than the electron spins used as the qubits in previous proposals. To address the individual qubits, we use the hyperfine interaction, which distinguishes two modes (active and passive) of the nuclear spin. Two-qubit quantum gates are effectively implemented by employing the electronic dipolar interaction between adjacent molecules. The electron spins also assist in qubit initialization and readout. Our architecture should be significantly easier to implement than earlier proposals for spin-based quantum computers, such as the concept of Kane [B.E. Kane, Nature 393 (1998) 133]. - Research highlights: → We propose an endohedral fullerene-based scalable quantum computer architecture. → Qubits are encoded on nuclear spins, while electron spins serve as auxiliaries. → Nuclear spins are individually addressed using the hyperfine interaction. → Two-qubit gates are implemented through the medium of electron spins.

  15. Organization of the secure distributed computing based on multi-agent system

    Science.gov (United States)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, the development of methods for distributed computing receives much attention. One such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can experience security threats arising from the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve a large task by creating a distributed computation. Agents deployed across a computer network can: configure the distributed computing system; distribute the computational load among the agent-operated computers; and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers connected to the network can be increased by connecting new computers to the system, which leads to an increase in overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computation. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic change in the number of computers on the network). The developed multi-agent system detects cases of falsification of results in the distributed system, which might otherwise lead to wrong decisions. In addition, the system checks and corrects wrong results.

  16. Linear homotopy solution of nonlinear systems of equations in geodesy

    Science.gov (United States)

    Paláncz, Béla; Awange, Joseph L.; Zaletnyik, Piroska; Lewis, Robert H.

    2010-01-01

    A fundamental task in geodesy is solving systems of equations. Many geodetic problems are represented as systems of multivariate polynomials. A common problem in solving such systems is improper initial starting values for iterative methods, leading to convergence to solutions with no physical meaning, or to convergence that requires global methods. Though symbolic methods such as Groebner bases or resultants have been shown to be very efficient, i.e., providing solutions for determined systems such as the 3-point problem of 3D affine transformation, the symbolic algebra can be very time consuming, even with special Computer Algebra Systems (CAS). This study proposes the Linear Homotopy method, which can be implemented easily in high-level computer languages like C++ and Fortran that are faster than CAS by at least two orders of magnitude. Using Mathematica, the power of Homotopy is demonstrated in solving three nonlinear geodetic problems: resection, GPS positioning, and affine transformation. The method, which enlarges the domain of convergence, is found to be efficient, less sensitive to rounding of numbers, and of lower complexity compared to other local methods like Newton-Raphson.
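
    As a toy illustration of the homotopy idea (not the authors' implementation), the sketch below uses a Newton-type linear homotopy in NumPy: the start system is defined by the residual at the initial guess, and the root is tracked as t moves from 0 to 1. The example polynomial system and all parameters are invented for illustration:

```python
import numpy as np

def newton_homotopy_solve(F, J, x0, steps=20, newton_iters=5):
    """Solve F(x) = 0 by a linear (Newton-type) homotopy.

    H(x, t) = F(x) - (1 - t) * F(x0) has the known root x0 at t = 0 and the
    desired root at t = 1.  The root is tracked in `steps` increments of t,
    with a few Newton corrections at each increment; this widens the domain
    of convergence compared with plain Newton iteration from a poor x0.
    """
    x = np.asarray(x0, dtype=float)
    c = F(x)  # residual at the start point defines the start system
    for k in range(1, steps + 1):
        t = k / steps
        for _ in range(newton_iters):
            x = x - np.linalg.solve(J(x), F(x) - (1.0 - t) * c)
    return x

# Toy polynomial system: x^2 + y^2 = 5 and x - y = 1,
# with real roots (2, 1) and (-1, -2).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 5.0, v[0] - v[1] - 1.0])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root = newton_homotopy_solve(F, J, x0=[3.0, 0.5])  # tracks to (2, 1)
```

    Plain Newton iteration from a poor starting value can jump to a physically meaningless root; stepping t in small increments keeps each Newton solve close to the tracked path, which is the convergence-domain enlargement the abstract describes.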

  17. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  18. Computer-based Programs in Speech Therapy of Dyslalia and Dyslexia- Dysgraphia

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2010-04-01

    Full Text Available During the last years, researchers and therapists in speech therapy have been more and more concerned with the elaboration and use of computer programs in speech disorder therapy. The main objective of this study was to evaluate the therapeutic effectiveness of computer-based programs for the Romanian language in speech therapy. In this study, we present experimental research assessing the effectiveness of computer programs in the therapy of the speech disorders dyslalia, dyslexia and dysgraphia. Methodologically, the use of the computer in the therapeutic phases was carried out with the help of computer-based programs (Logomon, Dislex-Test, etc.) that we elaborated and experimented with during several years of therapeutic activity. The sample used in our experiments was composed of 120 subjects; two groups of 60 children with speech disorders were selected for both speech disorders: 30 for the experimental ('computer-based') group and 30 for the control ('classical method') group. The study hypotheses verified whether the results obtained by the subjects within the experimental group improved significantly after using the computer-based program, compared to the subjects within the control group, who did not use this program but received classical therapy. The hypotheses were confirmed for the speech disorders included in this research; the conclusions of the study confirm the advantages of using computer-based programs within speech therapy to correct these disorders, as well as the positive influence these programs have on the development of children’s personality.

  19. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time there are many network information problems that demand attention. This paper analyzes the information security of computer networks based on "big data" analysis, and puts forward some solutions.

  20. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). 
Computer screen-based simulation appears to be effective in preparing learners to

  1. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  2. A Novel UDT-Based Transfer Speed-Up Protocol for Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhijie Han

    2018-01-01

    Full Text Available Fog computing is a distributed computing model that forms the middle layer between the cloud data center and the IoT device/sensor. It provides computing, network, and storage devices so that cloud-based services can be closer to IoT devices and sensors. Cloud computing requires a lot of bandwidth, and the bandwidth of the wireless network is limited. In contrast, the amount of bandwidth required for fog computing is much less. In this paper, we propose an improved protocol, the Peer-Assisted UDT-Based Data Transfer Protocol (PaUDT), applied to IoT-cloud computing. Furthermore, we compared the efficiency of the congestion control algorithm of UDT with that of Adobe's Secure Real-Time Media Flow Protocol (RTMFP), which is implemented entirely on UDP at the transport layer. Finally, we built an evaluation model of UDT performance in terms of RTT and bit error ratio. The theoretical analysis and experimental results show that UDT performs well in IoT-cloud computing.
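The abstract's RTT/bit-error evaluation model is not reproduced here, but a generic steady-state throughput bound (the Mathis formula for AIMD-style congestion control, with packet loss derived from the bit error ratio) illustrates the kind of dependence such a model captures; all numbers below are illustrative, not the paper's.

```python
from math import sqrt

def loss_from_ber(ber, packet_bits=12000):
    """Packet loss probability, assuming any single bit error corrupts the packet."""
    return 1.0 - (1.0 - ber) ** packet_bits

def mathis_throughput(mss_bytes, rtt_s, loss):
    """Mathis et al. steady-state bound in bytes/s: T <= MSS / (RTT * sqrt(2p/3)).
    Generic for AIMD-style congestion control; shown only to illustrate
    RTT/BER sensitivity, not UDT's actual evaluation model."""
    return mss_bytes / (rtt_s * sqrt(2.0 * loss / 3.0))

p = loss_from_ber(1e-7)                  # loss for 1500-byte packets at BER 1e-7
t1 = mathis_throughput(1500, 0.05, p)    # 50 ms RTT
t2 = mathis_throughput(1500, 0.10, p)    # doubling RTT halves the bound
```

Under this bound, throughput scales as 1/RTT and as 1/sqrt(loss), which is why a wireless link's bit error ratio matters as much as its raw bandwidth.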

  3. Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies.

    Science.gov (United States)

    Khalil, Mohammed K; Mansour, Mahmoud M; Wilhite, Dewey R

    2010-01-01

    Strategies of presenting instructional information affect the type of cognitive load imposed on the learner's working memory. Effective instruction reduces extraneous (ineffective) cognitive load and promotes germane (effective) cognitive load. Eighty first-year students from two veterinary schools completed a two-section questionnaire that evaluated their perspectives on the educational value of a computer-based instructional program. They compared the difference between cognitive loads imposed by paper-based and computer-based instructional strategies used to teach the anatomy of the canine skeleton. Section I included 17 closed-ended items, rated on a five-point Likert scale, that assessed the use of graphics, content, and the learning process. Section II included a nine-point mental effort rating scale to measure the level of difficulty of instruction; students were asked to indicate the amount of mental effort invested in the learning task using both paper-based and computer-based presentation formats. The closed-ended data were expressed as means and standard deviations. A paired t test with an alpha level of 0.05 was used to determine the overall mean difference between the two presentation formats. Students positively evaluated their experience with the computer-based instructional program with a mean score of 4.69 (SD=0.53) for use of graphics, 4.70 (SD=0.56) for instructional content, and 4.45 (SD=0.67) for the learning process. The mean difference of mental effort (1.50) between the two presentation formats was significant, t=8.26, p≤.0001, df=76, for two-tailed distribution. Consistent with cognitive load theory, innovative computer-based instructional strategies decrease extraneous cognitive load compared with traditional paper-based instructional strategies.
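The paired analysis reported above can be reproduced in a few lines. The sketch below uses synthetic 9-point mental-effort ratings (the study's raw data are not given) with 77 paired observations, matching the reported df = 76; the means and spreads are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 77  # df = 76 in the reported t test implies 77 paired observations

# Synthetic mental-effort ratings (illustrative, not the study's data):
paper = rng.normal(5.5, 1.0, n)                  # paper-based format
computer = paper - 1.5 + rng.normal(0, 1.0, n)   # computer-based rated ~1.5 points easier

# Paired t test computed directly from the per-subject differences
d = paper - computer
mean_diff = d.mean()
t_stat = mean_diff / (d.std(ddof=1) / np.sqrt(n))
```

With real ratings in `paper` and `computer`, the same two lines give the statistic, which is then compared against a t distribution with n-1 degrees of freedom.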

  4. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
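The specific optical setup is hardware, but the reservoir computing principle itself can be sketched in software: a fixed random dynamical system is driven by the input, and only a linear readout is trained. The echo-state-network sketch below (all sizes and parameters are illustrative, not the paper's) learns a one-step memory task.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, n_in = 100, 1

# Random fixed reservoir; only the linear readout W_out is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for fading memory

def run_reservoir(u):
    """Drive the reservoir with input sequence u, return all states."""
    x = np.zeros(n_res)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(ut))
        states.append(x.copy())
    return np.array(states)

# Toy task: reconstruct u(t-1) from the reservoir state (short-term memory).
u = rng.uniform(-1, 1, 500)
X = run_reservoir(u)[50:]       # drop the initial transient
y = u[49:-1]                    # one-step-delayed input as target
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)
mse = np.mean((X @ W_out - y) ** 2)
```

The point the abstract makes carries over: the reservoir itself can be any sufficiently rich driven dynamical system, passive optics included, since training touches only the readout.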

  5. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  6. Computer-Game-Based Tutoring of Mathematics

    Science.gov (United States)

    Ke, Fengfeng

    2013-01-01

    This in-situ, descriptive case study examined the potential of implementing computer mathematics games as an anchor for tutoring of mathematics. Data were collected from middle school students at a rural pueblo school and an urban Hispanic-serving school, through in-field observation, content analysis of game-based tutoring-learning interactions,…

  7. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  8. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load-balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio in time to solution of MN to PN decreases.
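The abstract contrasts entropy-based MN closures with standard spectral PN closures. As a minimal illustration of the PN side (the baseline the paper compares against; the test flux and orders here are assumptions), the sketch below computes Legendre moments of an angular flux in slab geometry and reconstructs it spectrally:

```python
import numpy as np
from numpy.polynomial import legendre

# Angular flux psi(mu) on mu in [-1, 1]; compute its Legendre moments
# phi_l = integral of psi(mu) * P_l(mu) dmu via Gauss-Legendre quadrature.
mu, w = legendre.leggauss(32)
psi = np.exp(2.0 * mu)          # a smooth, forward-peaked test flux (assumed)

N = 7
moments = np.array(
    [np.sum(w * psi * legendre.Legendre.basis(l)(mu)) for l in range(N + 1)]
)

# P_N (spectral) reconstruction: psi_N(mu) = sum_l (2l+1)/2 * phi_l * P_l(mu)
psi_N = sum(
    (2 * l + 1) / 2.0 * moments[l] * legendre.Legendre.basis(l)(mu)
    for l in range(N + 1)
)
err = np.max(np.abs(psi_N - psi)) / np.max(psi)
```

The entropy-based MN closure instead reconstructs the flux by minimizing an entropy functional subject to the same moment constraints, which is where its much higher per-cell cost, and the optimization work the paper describes, comes from.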

  9. Memristor-Based Synapse Design and Training Scheme for Neuromorphic Computing Architecture

    Science.gov (United States)

    2012-06-01

    The report develops a memristor-based synapse design and training scheme for a neuromorphic computing architecture (contract FA8750-11-2-0046, program element 62788F). Existing neuromorphic systems are realized at the system level, built upon the conventional Von Neumann computer architecture [2][3]; this work instead targets the creation of a memristor-based neuromorphic computing architecture at chip level. Rather than the existing crossbar-based neuron network designs, the focus is on memristor-based synapse design.

  10. A Nuclear Safety System based on Industrial Computer

    International Nuclear Information System (INIS)

    Kim, Ji Hyeon; Oh, Do Young; Lee, Nam Hoon; Kim, Chang Ho; Kim, Jae Hack

    2011-01-01

    The Plant Protection System (PPS), a nuclear safety Instrumentation and Control (I and C) system for Nuclear Power Plants (NPPs), generates a reactor trip on abnormal reactor conditions. The Core Protection Calculator System (CPCS) is a safety system that generates and transmits the channel trip signal to the PPS on an abnormal condition. Currently, these systems are designed as Programmable Logic Controller (PLC) based systems, and it is necessary to consider a new system platform to adopt a simpler system configuration and an improved software development process. The CPCS was the first implementation using a micro computer in a nuclear power plant safety protection system, in 1980, and has been deployed in Ulchin units 3,4,5,6 and Younggwang units 3,4,5,6. The CPCS software was developed on the Concurrent Micro5 minicomputer using assembly language and embedded into the Concurrent 3205 computer. Following the micro computer based CPCS, the PLC based Common-Q platform has been used for the ShinKori/ShinWolsong units 1,2 PPS and CPCS, and the POSAFE-Q PLC platform is used for the ShinUlchin units 1,2 PPS and CPCS. In developing the next generation safety system platform, several factors (e.g., hardware/software reliability, flexibility, licensability and industrial support) can be considered. This paper suggests an Industrial Computer (IC) based protection system that can be developed with improved flexibility without losing system reliability. The IC based system has the advantage of a simple system configuration with optimized processor boards because of improved processor performance and unlimited interoperability between the target system and the development system that uses commercial CASE tools. This paper presents the background to selecting the IC based system with a case study design of the CPCS. Eventually, this kind of platform can be used for nuclear power plant safety systems like the PPS, CPCS, Qualified Indication and Alarm-Pami (QIAS-P), and Engineering Safety

  11. A Nuclear Safety System based on Industrial Computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Hyeon; Oh, Do Young; Lee, Nam Hoon; Kim, Chang Ho; Kim, Jae Hack [Korea Electric Power Corporation Engineering and Construction, Daejeon (Korea, Republic of)

    2011-05-15

    The Plant Protection System (PPS), a nuclear safety Instrumentation and Control (I and C) system for Nuclear Power Plants (NPPs), generates a reactor trip on abnormal reactor conditions. The Core Protection Calculator System (CPCS) is a safety system that generates and transmits the channel trip signal to the PPS on an abnormal condition. Currently, these systems are designed as Programmable Logic Controller (PLC) based systems, and it is necessary to consider a new system platform to adopt a simpler system configuration and an improved software development process. The CPCS was the first implementation using a micro computer in a nuclear power plant safety protection system, in 1980, and has been deployed in Ulchin units 3,4,5,6 and Younggwang units 3,4,5,6. The CPCS software was developed on the Concurrent Micro5 minicomputer using assembly language and embedded into the Concurrent 3205 computer. Following the micro computer based CPCS, the PLC based Common-Q platform has been used for the ShinKori/ShinWolsong units 1,2 PPS and CPCS, and the POSAFE-Q PLC platform is used for the ShinUlchin units 1,2 PPS and CPCS. In developing the next generation safety system platform, several factors (e.g., hardware/software reliability, flexibility, licensability and industrial support) can be considered. This paper suggests an Industrial Computer (IC) based protection system that can be developed with improved flexibility without losing system reliability. The IC based system has the advantage of a simple system configuration with optimized processor boards because of improved processor performance and unlimited interoperability between the target system and the development system that uses commercial CASE tools. This paper presents the background to selecting the IC based system with a case study design of the CPCS. Eventually, this kind of platform can be used for nuclear power plant safety systems like the PPS, CPCS, Qualified Indication and Alarm-Pami (QIAS-P), and Engineering Safety

  12. Reciprocal Questioning and Computer-based Instruction in Introductory Auditing: Student Perceptions.

    Science.gov (United States)

    Watters, Mike

    2000-01-01

    An auditing course used reciprocal questioning (Socratic method) and computer-based instruction. Separate evaluations by 67 students revealed a strong aversion to the Socratic method; students expected professors to lecture. They showed a strong preference for the computer-based assignment. (SK)

  13. Nanophotonic quantum computer based on atomic quantum transistor

    International Nuclear Information System (INIS)

    Andrianov, S N; Moiseev, S A

    2015-01-01

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  14. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
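Free-Form Deformation itself is compact enough to sketch. Below is a hedged 2-D illustration (the paper's setting is 3-D CFD shape design): points embedded in a unit square are deformed by displacing a Bézier control lattice, with weights given by Bernstein polynomials.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t) ** (n - i)

def ffd_2d(points, control):
    """Deform 2-D points with parametric coordinates (s, t) in the unit square
    by a displaced control lattice of shape (l+1, m+1, 2)."""
    l, m = control.shape[0] - 1, control.shape[1] - 1
    out = np.zeros_like(points)
    for k, (s, t) in enumerate(points):
        for i in range(l + 1):
            for j in range(m + 1):
                out[k] += bernstein(l, i, s) * bernstein(m, j, t) * control[i, j]
    return out

# Identity lattice: control points at their rest positions leave points unchanged.
l, m = 2, 2
control = np.array([[(i / l, j / m) for j in range(m + 1)] for i in range(l + 1)])
pts = np.array([[0.5, 0.5], [0.25, 0.75]])
assert np.allclose(ffd_2d(pts, control), pts)

# Lift the top row of control points: embedded points move smoothly upward.
control[:, m, 1] += 0.2
moved = ffd_2d(pts, control)
```

This is why FFD suits evolutionary shape optimization: the design variables are a handful of lattice displacements, and both the shape and the surrounding computational grid deform smoothly together.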

  15. Nanophotonic quantum computer based on atomic quantum transistor

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, S N [Institute of Advanced Research, Academy of Sciences of the Republic of Tatarstan, Kazan (Russian Federation); Moiseev, S A [Kazan E. K. Zavoisky Physical-Technical Institute, Kazan Scientific Center, Russian Academy of Sciences, Kazan (Russian Federation)

    2015-10-31

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  16. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended

  17. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  18. On the Computation of Comprehensive Boolean Gröbner Bases

    Science.gov (United States)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal I in a Boolean polynomial ring B (bar A, bar X) with main variables bar X and parameters bar A can be obtained by simply computing a usual Boolean Gröbner basis of I, regarding both bar X and bar A as variables under a certain block term order such that bar X ≫ bar A. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field mathbb{GF}_2, enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over mathbb{GF}_2. Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
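The reduction to Gröbner bases over GF(2) can be tried in any general computer algebra system, not only Risa/Asir. A minimal SymPy sketch (the variable names and the toy ideal are invented for illustration): adjoin the Boolean field equations v² - v for every variable and compute a lex basis with the main variables ordered before the parameter.

```python
from sympy import symbols, groebner

x, y, a = symbols('x y a')   # main variables x, y; parameter a

# Boolean ring: every element is idempotent, so adjoin v**2 - v for each variable.
field_eqs = [v**2 - v for v in (x, y, a)]

# A toy ideal; under lex with generators ordered x > y > a, the block
# condition "main variables >> parameter" holds.
G = groebner([x*y + a*x, y + a] + field_eqs, x, y, a, order='lex', modulus=2)

# Ideal membership by reduction: a zero remainder means the polynomial
# lies in the ideal, uniformly in the parameter a.
_, rem = G.reduce(x*y + a*x)
```

Working modulo 2 makes the coefficient arithmetic trivial, which is the computational payoff the abstract points to.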

  19. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main goal of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions and interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (the particle method and the cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers the problems of load balancing, collision detection, process synchronization and distributed control of the animation.
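The mechanics the abstract describes — particles, elastic forces, gravity, collisions — can be sketched in a few dozen lines. The minimal example below (all constants are assumed; the liquid/cellular-automata part is omitted) integrates a chain of particles joined by springs and lets it fall onto a floor.

```python
import numpy as np

# A horizontal chain of particles connected by springs (illustrative constants).
n = 5
pos = np.stack([np.linspace(0.0, 1.0, n), np.full(n, 1.0)], axis=1)
vel = np.zeros_like(pos)
rest = np.linalg.norm(pos[1] - pos[0])      # spring rest length
k, mass, g, dt = 200.0, 0.1, 9.81, 1e-3

for _ in range(2000):
    force = np.zeros_like(pos)
    force[:, 1] -= mass * g                  # gravity on every particle
    for i in range(n - 1):                   # spring force between neighbours
        d = pos[i + 1] - pos[i]
        L = np.linalg.norm(d)
        f = k * (L - rest) * d / L
        force[i] += f
        force[i + 1] -= f
    vel += dt * force / mass                 # explicit Euler integration
    vel *= 0.995                             # crude velocity damping
    pos += dt * vel
    hit = pos[:, 1] < 0.0                    # floor collision: reflect and clamp
    vel[hit, 1] *= -0.5
    pos[hit, 1] = 0.0
```

Each time step touches every particle independently except for neighbour springs, which is exactly the structure that makes the method amenable to the workstation-network parallelization the paper describes.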

  20. Using computer-based training to facilitate radiation protection review

    International Nuclear Information System (INIS)

    Abercrombie, J.S.; Copenhaver, E.D.

    1989-01-01

    In a national laboratory setting, it is necessary to provide radiation protection overview and training to diverse parts of the laboratory population. This includes employees at research reactors, accelerators, waste facilities, radiochemical isotope processing, and analytical laboratories, among others. In addition, our own radiation protection and monitoring staffs must be trained. To assist in the implementation of this full range of training, ORNL has purchased prepackaged computer-based training in health physics and technical mathematics with training modules that can be selected from many topics. By selection of specific modules, appropriate radiation protection review packages can be determined to meet many individual program needs. Because our radiation protection personnel must have some previous radiation protection experience or the equivalent of an associate's degree in radiation protection for entry level, the computer-based training will serve primarily as review of major principles. Others may need very specific prior training to make the computer-based training effective in their work situations. 4 refs

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  2. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  3. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes more than half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with memory optimization, is presented in this paper. Unlike the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can reuse these results. The correlation between projection lines and pixels can be used to optimize the computation. (authors)
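A pixel-driven projector of the kind described can be sketched directly. The example below (parallel-beam geometry with linear interpolation onto detector bins — these details are assumptions, not the authors' exact scheme) shows how each pixel yields its projection coefficients:

```python
import numpy as np

def pixel_driven_projection(image, angle, n_det):
    """Pixel-driven forward projection for one parallel-beam view: each pixel
    centre is projected onto the detector axis and its value is split between
    the two nearest detector bins. The interpolation weights are the per-pixel
    coefficients a_ij that SART reuses in its correction step."""
    ny, nx = image.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    # pixel centres in a coordinate system centred on the image
    x = xs - (nx - 1) / 2.0
    y = ys - (ny - 1) / 2.0
    # continuous detector coordinate of each pixel centre
    t = x * np.cos(angle) + y * np.sin(angle) + (n_det - 1) / 2.0
    proj = np.zeros(n_det)
    lo = np.floor(t).astype(int)
    w = t - lo                               # linear-interpolation weight
    for b, wt, v in zip(lo.ravel(), w.ravel(), image.ravel()):
        if 0 <= b < n_det:
            proj[b] += (1 - wt) * v
        if 0 <= b + 1 < n_det:
            proj[b + 1] += wt * v
    return proj

img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0                      # a square phantom
p0 = pixel_driven_projection(img, 0.0, 40)   # view at angle 0
```

Because `t`, `lo` and `w` depend only on geometry, they can be computed once per view and reused across SART iterations, which is the saving the abstract points to.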

  4. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  5. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  6. Degeneracy of time series models: The best model is not always the correct model

    International Nuclear Information System (INIS)

    Judd, Kevin; Nakamura, Tomomichi

    2006-01-01

    There are a number of good techniques for finding, in some sense, the best model of a deterministic system given a time series of observations. We examine a problem called model degeneracy, which has the consequence that even when a perfect model of a system exists, one does not find it using the best techniques currently available. The problem is illustrated using global polynomial models and the theory of Groebner bases
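The global polynomial models mentioned can be illustrated concretely: fit x_{t+1} = c0 + c1 x_t + c2 x_t² to a logistic-map series by least squares (the map and basis are chosen here for illustration). With noise-free data the true coefficients are recovered; degeneracy appears as soon as the basis contains functions that coincide on the data, so that many different models reproduce the series equally well.

```python
import numpy as np

# Logistic-map time series (parameters chosen for illustration)
r = 3.9
x = np.empty(400)
x[0] = 0.3
for t in range(399):
    x[t + 1] = r * x[t] * (1.0 - x[t])

# Least-squares fit of the global polynomial model x_{t+1} = c0 + c1*x + c2*x^2;
# this recovers the true coefficients (0, r, -r).
X = np.column_stack([np.ones(399), x[:-1], x[:-1] ** 2])
coef, *_ = np.linalg.lstsq(X, x[1:], rcond=None)

# Degeneracy in miniature: add a redundant basis function x*(1-x).
# The design matrix becomes rank-deficient, so infinitely many coefficient
# vectors now reproduce the data exactly -- the "best" model is not unique.
X2 = np.column_stack([X, x[:-1] * (1.0 - x[:-1])])
rank = np.linalg.matrix_rank(X2)
```

The paper's point is sharper than this toy: when the orbit itself lies on an algebraic set, the redundancy is a property of the data rather than of a badly chosen basis, and Gröbner-basis theory characterizes exactly which polynomial models are indistinguishable on it.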

  7. Computational neuroanatomy: ontology-based representation of neural components and connectivity.

    Science.gov (United States)

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-02-05

    A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
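How an explicit connectivity representation supports automated reasoning can be sketched with a toy graph (the structure names and projections below are invented placeholders, not the paper's ontology): normal connectivity is a set of directed projections, a lesion removes one, and a reachability query answers whether the pathway is intact.

```python
from collections import deque

# Toy connectivity for part of a motor pathway (illustrative names only):
projects_to = {
    "motor_cortex": ["internal_capsule"],
    "internal_capsule": ["spinal_cord"],
    "spinal_cord": ["muscle"],
    "muscle": [],
}

def reaches(graph, src, dst):
    """Breadth-first search: can activity propagate from src to dst?"""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

assert reaches(projects_to, "motor_cortex", "muscle")

# Model abnormal connectivity (a lesion) by removing a projection:
lesioned = dict(projects_to, internal_capsule=[])
assert not reaches(lesioned, "motor_cortex", "muscle")
```

An ontology adds typed relations and logical axioms on top of this bare graph, but the computational payoff is the same: queries about normal and disrupted pathways become mechanical.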

  8. ISAT promises fail-safe computer-based reactor protection

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    AEA Technology's ISAT system is a multiplexed microprocessor-based reactor protection system which has very extensive self-monitoring capabilities and is inherently fail safe. It provides a way of addressing software reliability problems that have tended to hamper widespread introduction of computer-based reactor protection. (author)

  9. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    Science.gov (United States)

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing rests on the notion that conceptual representations in the human brain are "in situ": they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated, because that would disrupt their connection structure and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power-limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing is illustrated with a recently developed bAbI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing, because in situ concept representations are derived in the kinds of scenarios needed for reasoning tasks. Neurorobotics would also benefit from power-limited, in situ concept computing.

  10. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through analysis of the existing pallet pooling information platform (PPIP), the paper points out that existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which impose software, hardware, resource-utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gives a PPIP architecture of two parts based on cloud computing: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. The method of how to deploy PPIP based on cloud computing is proposed finally.

  11. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet the specific needs of distance students, namely: students’ inability to sit for the scheduled test, conflicting test schedules, and students’ flexibility to take examinations to improve their grades. In 2004, UT initiated a pilot project to develop a system and program for the computer-based testing method. Then in 2005 and 2006, tryouts of computer-based testing methods were conducted in 7 Regional Offices that were considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by Regional Office staff. The development of computer-based testing began with conducting tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. Test construction involves the generation and selection of test items from the item bank of the UT Examination Center, so that the combination of the selected items comprises the test specification. Currently UT offers 250 courses involving the use of computer-based testing. Students expect more courses to be offered with computer-based testing in Regional Offices within easy access.

  12. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  13. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  14. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  15. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  16. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  17. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
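The elastic allocation idea in this record (forecast demand with an exponential moving average of past observations, then provision resources accordingly) can be sketched in a few lines. This is a hypothetical illustration; the `nodes_needed` sizing rule and all numbers are assumptions, not the paper's implementation:

```python
import math

# Forecast the next interval's load as an exponential moving average (EMA)
# of past request counts, then provision nodes proportionally.
# Illustrative sketch only; constants are invented.

def ema_forecast(history, alpha=0.5):
    """EMA of past observations, used as the forecast for the next interval."""
    forecast = history[0]
    for x in history[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast

def nodes_needed(history, requests_per_node=100, alpha=0.5):
    """Provision enough nodes to serve the forecast load."""
    return max(1, math.ceil(ema_forecast(history, alpha) / requests_per_node))

requests = [120, 150, 90, 200, 180]   # requests observed per interval
print(nodes_needed(requests))          # forecast ~168 requests -> 2 nodes
```

With `alpha = 0.5` the forecast weights recent intervals more heavily; the paper's Poisson-based allocation would replace the naive per-node sizing rule used here.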

  18. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    Full Text Available This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  19. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.

    2013-01-01

    This paper presents a novel and computationally simple tri-training based semi-supervised steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve-Bayes classifier (NBC). The system uses nine characters...
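The autocorrelation-based features mentioned above exploit the periodicity of the SSVEP response: a signal driven at frequency f correlates strongly with itself at a lag of one stimulus period. A minimal sketch, with hypothetical sampling rate and frequency (not the cited BCI's code):

```python
import math

# Sketch of an autocorrelation-based SSVEP feature: compare the signal with
# a copy of itself shifted by one stimulus period. The feature definition,
# sampling rate, and frequency are illustrative assumptions.

def autocorr_feature(signal, fs, freq):
    """Normalized autocorrelation at the lag of one stimulus period."""
    lag = round(fs / freq)
    n = len(signal) - lag
    num = sum(signal[i] * signal[i + lag] for i in range(n))
    den = sum(s * s for s in signal)
    return num / den

fs = 256   # sampling rate in Hz (assumed)
f = 8      # stimulus frequency in Hz (assumed)
sig = [math.sin(2 * math.pi * f * i / fs) for i in range(512)]
print(autocorr_feature(sig, fs, f) > 0.9)   # a pure 8 Hz tone scores high
```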

  20. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need of extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will result in similar results as paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar as paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  1. Security personnel training using a computer-based game

    International Nuclear Information System (INIS)

    Ralph, J.; Bickner, L.

    1987-01-01

    Security personnel training is an integral part of a total physical security program, and is essential in enabling security personnel to perform their function effectively. Several training tools are currently available for use by security supervisors, including: textbook study, classroom instruction, and live simulations. However, due to shortcomings inherent in each of these tools, a need exists for the development of low-cost alternative training methods. This paper discusses one such alternative: a computer-based, game-type security training system. This system would be based on a personal computer with high-resolution graphics. Key features of this system include: a high degree of realism; flexibility in use and maintenance; high trainee motivation; and low cost

  2. Fog computing job scheduling optimization based on bees swarm

    Science.gov (United States)

    Bitam, Salim; Zeadally, Sherali; Mellouk, Abdelhamid

    2018-04-01

    Fog computing is a new computing architecture, composed of a set of near-user edge devices called fog nodes, which collaborate in order to perform computational services such as running applications, storing large amounts of data, and transmitting messages. Fog computing extends cloud computing by deploying digital resources at the premises of mobile users. In this new paradigm, management and operating functions, such as job scheduling, aim at providing high-performance, cost-effective services requested by mobile users and executed by fog nodes. We propose a new bio-inspired optimization approach called Bees Life Algorithm (BLA) aimed at addressing the job scheduling problem in the fog computing environment. Our proposed approach is based on the optimized distribution of a set of tasks among all the fog computing nodes. The objective is to find an optimal tradeoff between CPU execution time and allocated memory required by fog computing services established by mobile users. Our empirical performance evaluation results demonstrate that the proposal outperforms traditional particle swarm optimization and genetic algorithms in terms of CPU execution time and allocated memory.
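The scheduling objective described here, a tradeoff between CPU execution time and allocated memory across fog nodes, can be illustrated with a toy fitness function. The search below is a naive random search standing in for the metaheuristic, not the Bees Life Algorithm itself, and all costs, weights, and names are invented:

```python
import random

# Score candidate task-to-node assignments by a weighted sum of total CPU
# time and total allocated memory (lower is better), then search randomly.
# Purely illustrative; not the paper's BLA.

def fitness(assignment, task_cost, node_speed, node_mem_per_task,
            w_cpu=0.7, w_mem=0.3):
    cpu = sum(task_cost[t] / node_speed[n] for t, n in enumerate(assignment))
    mem = sum(node_mem_per_task[n] for n in assignment)
    return w_cpu * cpu + w_mem * mem

def random_search(n_tasks, n_nodes, task_cost, node_speed, node_mem_per_task,
                  iters=1000, seed=0):
    rng = random.Random(seed)
    best, best_f = None, float("inf")
    for _ in range(iters):
        cand = [rng.randrange(n_nodes) for _ in range(n_tasks)]
        cand_f = fitness(cand, task_cost, node_speed, node_mem_per_task)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best, best_f

task_cost = [4.0, 2.0, 6.0]      # abstract work units per task (assumed)
node_speed = [1.0, 2.0]          # work units per second per node (assumed)
node_mem_per_task = [512, 256]   # MB allocated per task on each node (assumed)
best, f = random_search(3, 2, task_cost, node_speed, node_mem_per_task)
print(best, f)                   # the fast, low-memory node wins all tasks
```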

  3. Learners’ views about cloud computing-based group activities

    Directory of Open Access Journals (Sweden)

    Yildirim Serkan

    2017-01-01

    Full Text Available Because they can be used independently of time and place during software development, and because mobile technologies make information easier to access, cloud-based environments have attracted the attention of the education world, and this technology has started to be used in various activities. In this study, for programming education, the effects on learners of extracurricular group assignments in cloud-based environments were evaluated in terms of group-work satisfaction, ease of use, and user satisfaction. Within the scope of a computer programming course lasting eight weeks, a total of 100 students participated in the study, including 34 men and 66 women. Participants were divided into groups of at least three people, in view of the advantages of cooperative learning in programming education. In this study, carried out in both conventional and cloud-based environments, a between-groups factorial design was used. The data, collected through questionnaires on opinions of group work, were examined with quantitative analysis methods. According to the results, extracurricular learning activities as group activities created satisfaction, but perceptions of the environment's ease of use and user satisfaction were only partly positive. Male participants perceived cloud computing-based environments as easier to use. Variables such as class level and computer and internet usage time had no effect on satisfaction or perceived ease of use. Evening-class students stated that they found cloud-based learning environments easy to use and were more satisfied with them, besides being happier with group work, than daytime students.

  4. Commentary on: "Toward Computer-Based Support of Metacognitive Skills: A Computational Framework to Coach Self Explanation"

    Science.gov (United States)

    Conati, Cristina

    2016-01-01

    This paper is a commentary on "Toward Computer-Based Support of Meta-Cognitive Skills: a Computational Framework to Coach Self-Explanation", by Cristina Conati and Kurt Vanlehn, published in the "IJAED" in 2000 (Conati and VanLehn 2010). This work was one of the first examples of Intelligent Learning Environments (ILE) that…

  5. A rule-based computer control system for PBX-M neutral beams

    International Nuclear Information System (INIS)

    Frank, K.T.; Kozub, T.A.; Kugel, H.W.

    1987-01-01

    The Princeton Beta Experiment (PBX) neutral beams have been routinely operated under automatic computer control. A major upgrade of the computer configuration was undertaken to coincide with the PBX machine modification. The primary tasks included in the computer control system are data acquisition, waveform reduction, automatic control and data storage. The portion of the system which will remain intact is the rule-based approach to automatic control. Increased computational and storage capability will allow the expansion of the knowledge base previously used. The hardware configuration supported by the PBX Neutral Beam (XNB) software includes a dedicated Microvax with five CAMAC crates and four process controllers. The control algorithms are rule-based and goal-driven. The automatic control system raises ion source electrical parameters to selected energy goals and maintains these levels until new goals are requested or faults are detected

  6. Computation of Difference Grobner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    Full Text Available This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials we adapt to difference polynomial rings the involutive algorithm based on Janet-like division. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra), and we describe the main features of the package. Its applications are illustrated by the generation of finite difference approximations to linear partial differential equations and by the reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, then it constructs a Gröbner basis of the ideal.
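For readers unfamiliar with the underlying machinery, a minimal textbook Buchberger algorithm over the rationals with lexicographic order conveys what "constructing a Gröbner basis" means. This sketch handles ordinary (not difference) polynomials and is unrelated to the LDA package's Janet-like division:

```python
from fractions import Fraction
from itertools import combinations

# Polynomials are dicts mapping exponent tuples (e_x, e_y) to Fraction
# coefficients; lex order with x > y is just tuple comparison.

def lead(p):
    return max(p)

def divides(a, b):
    return all(x <= y for x, y in zip(a, b))

def sub_exp(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add_exp(a, b):
    return tuple(x + y for x, y in zip(a, b))

def p_add(p, q):
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, Fraction(0)) + c
        if r[m] == 0:
            del r[m]
    return r

def p_scale_shift(p, coef, shift):
    return {add_exp(m, shift): c * coef for m, c in p.items()}

def reduce_poly(p, basis):
    """Repeatedly cancel the leading term of p against the basis."""
    p = dict(p)
    changed = True
    while p and changed:
        changed = False
        for g in basis:
            lm = lead(p)
            if divides(lead(g), lm):
                factor = -p[lm] / g[lead(g)]
                p = p_add(p, p_scale_shift(g, factor, sub_exp(lm, lead(g))))
                changed = True
                break
    return p

def s_poly(f, g):
    lf, lg = lead(f), lead(g)
    lcm = tuple(max(a, b) for a, b in zip(lf, lg))
    return p_add(p_scale_shift(f, Fraction(1) / f[lf], sub_exp(lcm, lf)),
                 p_scale_shift(g, Fraction(-1) / g[lg], sub_exp(lcm, lg)))

def buchberger(polys):
    basis = [p for p in polys if p]
    pairs = list(combinations(range(len(basis)), 2))
    while pairs:
        i, j = pairs.pop()
        r = reduce_poly(s_poly(basis[i], basis[j]), basis)
        if r:
            pairs += [(k, len(basis)) for k in range(len(basis))]
            basis.append(r)
    return basis

# Ideal <x^2 + y^2 - 1, x - y>: the basis gains 2y^2 - 1.
f1 = {(2, 0): Fraction(1), (0, 2): Fraction(1), (0, 0): Fraction(-1)}
f2 = {(1, 0): Fraction(1), (0, 1): Fraction(-1)}
G = buchberger([f1, f2])
print(reduce_poly(f1, G) == {})   # f1 reduces to zero modulo G
```

Membership testing (reduction to zero) is exactly the operation the abstract applies to Feynman-integral reduction, only there it happens in a difference polynomial ring.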

  7. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with the test-pattern generation and fault coverage determination in the core based design. The basic core-test strategy that one has to apply in the core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of

  8. Computer-based interventions for drug use disorders: A systematic review

    Science.gov (United States)

    Moore, Brent A.; Fazzino, Tera; Garnet, Brian; Cutter, Christopher J.; Barry, Declan T.

    2011-01-01

    A range of innovative computer-based interventions for psychiatric disorders have been developed, and are promising for drug use disorders, due to reduced cost and greater availability compared to traditional treatment. Electronic searches were conducted from 1966 to November 19, 2009 using MEDLINE, Psychlit, and EMBASE. 468 non-duplicate records were identified. Two reviewers classified abstracts for study inclusion, resulting in 12 studies of moderate quality. Eleven studies were pilot or full-scale trials compared to a control condition. Interventions showed high acceptability despite substantial variation in type and amount of treatment. Compared to treatment-as-usual, computer-based interventions led to less substance use as well as higher motivation to change, better retention, and greater knowledge of presented information. Computer-based interventions for drug use disorders have the potential to dramatically expand and alter the landscape of treatment. Evaluation of internet and phone-based delivery that allow for treatment-on-demand in patients’ own environment is needed. PMID:21185683

  9. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  10. Computer- and Suggestion-based Cognitive Rehabilitation following Acquired Brain Injury

    DEFF Research Database (Denmark)

    Lindeløv, Jonas Kristoffer

    . That is, training does not cause cognitive transfer and thus does not constitute “brain training” or “brain exercise” of any clinical relevance. A larger study found more promising results for a suggestion-based treatment in a hypnotic procedure. Patients improved to above population average in a matter of 4-8 hours, making this by far the most effective treatment compared to computer-based training, physical exercise, pharmaceuticals, meditation, and attention process training. The contrast between computer-based methods and the hypnotic suggestion treatment may reflect a more general discrepancy...

  11. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need of extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will result in similar results as paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar as paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632

  12. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  13. OpenCL-based vicinity computation for 3D multiresolution mesh compression

    Science.gov (United States)

    Hachicha, Soumaya; Elkefi, Akram; Ben Amar, Chokri

    2017-03-01

    3D multiresolution mesh compression systems are still widely addressed in many domains. These systems increasingly require volumetric data to be processed in real time, so performance is constrained by hardware resource usage and the need for an overall reduction in computation time. In this paper, our contribution lies entirely in computing, in real time, the triangle neighborhood of 3D progressive meshes for a robust compression algorithm based on the scan-based wavelet transform (WT) technique. The originality of the latter algorithm is that it computes the WT with minimum memory usage by processing data as they are acquired. However, with large data, this technique is considered poor in terms of computational complexity. To address this, our work exploits the GPU to accelerate the computation, using OpenCL as a heterogeneous programming language. Experiments demonstrate that, aside from the portability across various platforms and the flexibility guaranteed by the OpenCL-based implementation, this method can improve performance by a speedup factor of 5 compared to the sequential CPU implementation.
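The vicinity (triangle-neighborhood) computation that the paper offloads to the GPU can be stated simply: two triangles are neighbors when they share an edge. A sequential reference version, with an illustrative mesh layout that is not the authors' data structure:

```python
from collections import defaultdict

# Build triangle adjacency by hashing each undirected edge to the triangles
# that contain it; triangles sharing an edge are neighbors. A GPU version
# would parallelize over triangles, but the mapping is the same.

def triangle_neighbors(triangles):
    """Map each triangle index to the set of triangles sharing an edge."""
    edge_to_tris = defaultdict(list)
    for t, (a, b, c) in enumerate(triangles):
        for edge in ((a, b), (b, c), (a, c)):
            edge_to_tris[tuple(sorted(edge))].append(t)
    neighbors = defaultdict(set)
    for tris in edge_to_tris.values():
        for t in tris:
            neighbors[t].update(u for u in tris if u != t)
    return dict(neighbors)

# Two triangles sharing edge (1, 2):
tris = [(0, 1, 2), (1, 2, 3)]
print(triangle_neighbors(tris))   # {0: {1}, 1: {0}}
```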

  14. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with the efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  15. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    Science.gov (United States)

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  16. A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology

    Science.gov (United States)

    Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra

    2008-01-01

    Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…

  17. Providing Feedback on Computer-Based Algebra Homework in Middle-School Classrooms

    Science.gov (United States)

    Fyfe, Emily R.

    2016-01-01

    Homework is transforming at a rapid rate with continuous advances in educational technology. Computer-based homework, in particular, is gaining popularity across a range of schools, with little empirical evidence on how to optimize student learning. The current aim was to test the effects of different types of feedback on computer-based homework.…

  18. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)

  19. Computing derivative-based global sensitivity measures using polynomial chaos expansions

    International Nuclear Information System (INIS)

    Sudret, B.; Mai, C.V.

    2015-01-01

    In the field of computer experiments, sensitivity analysis aims at quantifying the relative importance of each input parameter (or combinations thereof) of a computational model with respect to the model output uncertainty. Variance decomposition methods leading to the well-known Sobol' indices are recognized as accurate techniques, albeit at a rather high computational cost. The use of polynomial chaos expansions (PCE) to compute Sobol' indices has made it possible to alleviate this computational burden. However, when dealing with large-dimensional input vectors, it is good practice to first use screening methods in order to discard unimportant variables. The derivative-based global sensitivity measures (DGSMs) have been developed recently in this respect. In this paper we show how polynomial chaos expansions may be used to compute DGSMs analytically as a mere post-processing step. This requires the analytical derivation of the derivatives of the orthonormal polynomials which enter PC expansions. Closed-form expressions for Hermite, Legendre and Laguerre polynomial expansions are given. The efficiency of the approach is illustrated on two well-known benchmark problems in sensitivity analysis. - Highlights: • Derivative-based global sensitivity measures (DGSM) have been developed for screening purposes. • Polynomial chaos expansions (PC) are used as a surrogate model of the original computational model. • From a PC expansion, the DGSM can be computed analytically. • The paper provides the derivatives of Hermite, Legendre and Laguerre polynomials for this purpose
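The key analytic ingredient, derivatives of the orthonormal polynomials entering the PC expansion, is available in closed form; for probabilists' Hermite polynomials, He_n'(x) = n·He_{n-1}(x). A small check with hypothetical PC coefficients (not one of the paper's benchmarks):

```python
import numpy as np
from numpy.polynomial import hermite_e as He

# Differentiate a toy 1D Hermite-chaos expansion analytically via NumPy's
# probabilists' Hermite module and verify against a central finite
# difference. Coefficients are invented for illustration.

coeffs = np.array([1.0, 0.5, -0.25, 0.1])   # hypothetical PC coefficients
dcoeffs = He.hermeder(coeffs)                # analytic derivative coefficients

x = 0.7
analytic = He.hermeval(x, dcoeffs)
h = 1e-6
numeric = (He.hermeval(x + h, coeffs) - He.hermeval(x - h, coeffs)) / (2 * h)
print(abs(analytic - numeric) < 1e-5)
```

In the paper's setting, integrating such squared derivatives of the expansion against the input density yields the DGSM directly from the PC coefficients.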

  20. Computer-based systems for nuclear power stations

    International Nuclear Information System (INIS)

    Humble, P.J.; Welbourne, D.; Belcher, G.

    1995-01-01

    The published intentions of vendors are for extensive touch-screen control and computer-based protection. The software features needed for acceptance in the UK are indicated. The defence in depth needed is analyzed. Current practice in aircraft flight control systems and the software methods available are discussed. Software partitioning and mathematically formal methods are appropriate for the structures and simple logic needed for nuclear power applications. The potential for claims of diversity and independence between two computer-based subsystems of a protection system is discussed. Features needed to meet a single failure criterion applied to software are discussed. Conclusions are given on the main factors which a design should allow for. The work reported was done for the Health and Safety Executive of the UK (HSE), and acknowledgement is given to them, to NNC Ltd and to GEC-Marconi Avionics Ltd for permission to publish. The opinions and recommendations expressed are those of the authors and do not necessarily reflect those of HSE. (Author)

  1. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains in chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  2. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  3. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-08-27

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at a high magnetic field (∼10 T) and at a low temperature of ∼1 K.

  4. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-01-01

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at a high magnetic field (∼10 T) and at a low temperature of ∼1 K.

  5. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  6. Computational chemistry and metal-based radiopharmaceuticals

    International Nuclear Information System (INIS)

    Neves, M.; Fausto, R.

    1998-01-01

    Computer-assisted techniques have found extensive use in the design of organic pharmaceuticals but have not been widely applied on metal complexes, particularly on radiopharmaceuticals. Some examples of computer generated structures of complexes of In, Ga and Tc with N, S, O and P donor ligands are referred. Besides parameters directly related with molecular geometries, molecular properties of the predicted structures, as ionic charges or dipole moments, are considered to be related with biodistribution studies. The structure of a series of oxo neutral Tc-biguanide complexes are predicted by molecular mechanics calculations, and their interactions with water molecules or peptide chains correlated with experimental data of partition coefficients and percentage of human protein binding. The results stress the interest of using molecular modelling to predict molecular properties of metal-based radiopharmaceuticals, which can be successfully correlated with results of in vitro studies. (author)

  7. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) was conducted using a paper-based test (PBT). The paper-based test model has several weaknesses: it wastes paper, test questions may leak to the public, and test results may be manipulated. This research aimed to create a computer-based test (CBT) model by using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, attention was paid to protecting the test questions before they are shown, through an encryption and decryption process based on the RSA cryptography algorithm. The questions drawn from the question banks were then randomized using the Fisher-Yates shuffle. The network architecture used in the computer-based test application was a client-server model over a Local Area Network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
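
    The Fisher-Yates shuffle used for the question banks can be sketched in a few lines; the function name and question-bank framing below are illustrative, not taken from the paper (Python's own random.shuffle implements the same algorithm):

```python
import random

def fisher_yates_shuffle(items, rng=random):
    """Uniform in-place shuffle of a question bank (Fisher-Yates / Knuth)."""
    for i in range(len(items) - 1, 0, -1):
        j = rng.randint(0, i)              # choose from the not-yet-fixed prefix
        items[i], items[j] = items[j], items[i]
    return items

bank = [f"Q{k}" for k in range(1, 11)]
paper = fisher_yates_shuffle(bank[:])      # each candidate gets a permutation
assert sorted(paper) == sorted(bank)       # same questions, new order
```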

  8. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.

  9. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
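
    The classification step of SRC can be illustrated with a least-squares simplification: represent the test sample in each class's training dictionary and assign the class with the smallest reconstruction residual. The paper's actual SRC uses sparsity-promoting l1 minimization; everything below, including the function name and the toy data, is a sketch of the residual-based decision rule only:

```python
import numpy as np

def residual_classify(test, class_dicts):
    """Assign `test` to the class whose dictionary reconstructs it best.
    A least-squares stand-in for sparse-representation classification."""
    residuals = []
    for D in class_dicts:                      # D: (features x atoms)
        coef, *_ = np.linalg.lstsq(D, test, rcond=None)
        residuals.append(np.linalg.norm(test - D @ coef))
    return int(np.argmin(residuals))

rng = np.random.default_rng(0)
D0 = rng.normal(size=(8, 3))                   # class-0 training atoms
D1 = rng.normal(size=(8, 3))                   # class-1 training atoms
test = D1 @ np.array([0.5, -1.0, 2.0])         # lies in class 1's subspace
assert residual_classify(test, [D0, D1]) == 1
```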

  10. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  11. The use of gold nanoparticle aggregation for DNA computing and logic-based biomolecular detection

    International Nuclear Information System (INIS)

    Lee, In-Hee; Yang, Kyung-Ae; Zhang, Byoung-Tak; Lee, Ji-Hoon; Park, Ji-Yoon; Chai, Young Gyu; Lee, Jae-Hoon

    2008-01-01

    The use of DNA molecules as a physical computational material has attracted much interest, especially in the area of DNA computing. DNAs are also useful for logical control and analysis of biological systems if efficient visualization methods are available. Here we present a quick and simple visualization technique that displays the results of the DNA computing process based on a colorimetric change induced by gold nanoparticle aggregation, and we apply it to the logic-based detection of biomolecules. Our results demonstrate its effectiveness in both DNA-based logical computation and logic-based biomolecular detection

  12. Concept of development of integrated computer - based control system for 'Ukryttia' object

    International Nuclear Information System (INIS)

    Buyal'skij, V.M.; Maslov, V.P.

    2003-01-01

    The structural concept for developing an integrated computer-based control system for the Chernobyl NPP 'Ukryttia' Object is presented on the basis of the general design-process concept for integrated computer-based control systems (CCS) for organizational and technical management subjects. The concept is aimed at applying state-of-the-art architectural design techniques and allows the use of modern computer-aided facilities for developing the functional model, the information (logical and physical) models, and the system object model under design

  13. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.
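
    The CO2-bicarbonate buffer system covered by the first learning module is governed by the Henderson-Hasselbalch equation, pH = 6.1 + log10([HCO3-] / (0.03 · pCO2)). A small sketch (the function and parameter names are illustrative):

```python
import math

def blood_ph(hco3_meq_l, pco2_mmhg, pka=6.1, co2_solubility=0.03):
    """Henderson-Hasselbalch equation for the CO2-bicarbonate buffer system."""
    return pka + math.log10(hco3_meq_l / (co2_solubility * pco2_mmhg))

# Normal arterial values, [HCO3-] = 24 mEq/L and pCO2 = 40 mmHg, give pH ≈ 7.40;
# a raised pCO2 (respiratory acidosis) lowers the computed pH.
assert abs(blood_ph(24, 40) - 7.40) < 0.01
assert blood_ph(24, 60) < blood_ph(24, 40)
```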

  14. Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support.

    Science.gov (United States)

    van den Berg, Yvonne H M; Gommans, Rob

    2017-09-01

    New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.

  15. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    Science.gov (United States)

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.

  16. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF), which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  17. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Throneburg, E. B.; Jones, J. M. [AREVA NP Inc., 7207 IBM Drive, Charlotte, NC 28262 (United States)

    2006-07-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  18. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    International Nuclear Information System (INIS)

    Throneburg, E. B.; Jones, J. M.

    2006-01-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  19. Students' Motivation toward Computer-Based Language Learning

    Science.gov (United States)

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  20. Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults

    Science.gov (United States)

    Wang, Feihong; Lockee, Barbara B.; Burton, John K.

    2012-01-01

    The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…

  1. GPU-based cone beam computed tomography.

    Science.gov (United States)

    Noël, Peter B; Walczak, Alan M; Xu, Jinhui; Corso, Jason J; Hoffmann, Kenneth R; Schafer, Sebastian

    2010-06-01

    The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is the filtered backprojection, which for a volume of size 256³ takes up to 25 min on a standard system. Recent developments in the area of Graphic Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on a NVIDIA GeForce GTX 280. Our implementation results in improved reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe if differences occur between CPU and GPU-based reconstructions. By using our approach, the computation time for a 256³ volume is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512³ volumes is 8.5 s. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
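
    The backprojection step that dominates such reconstructions updates every voxel independently per view, which is exactly the parallelism a CUDA kernel exploits (one thread per voxel). A toy 2-D parallel-beam, unfiltered backprojection in NumPy, purely illustrative of that structure rather than the paper's filtered cone-beam kernel:

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Unfiltered 2-D parallel-beam backprojection.  Each pixel accumulates
    independently for every view, mirroring the per-voxel GPU parallelism."""
    recon = np.zeros((size, size))
    centre = (size - 1) / 2.0
    ys, xs = np.mgrid[0:size, 0:size]
    for view, theta in zip(sinogram, angles):
        # detector coordinate of each pixel under projection angle theta
        t = (xs - centre) * np.cos(theta) + (ys - centre) * np.sin(theta) + centre
        recon += view[np.clip(np.rint(t).astype(int), 0, size - 1)]
    return recon / len(angles)

angles = np.linspace(0.0, np.pi, 18, endpoint=False)
sino = np.zeros((18, 11))
sino[:, 5] = 1.0                               # a point object at the centre
recon = backproject(sino, angles, 11)
assert np.unravel_index(np.argmax(recon), recon.shape) == (5, 5)
```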

  2. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  3. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  4. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  5. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Full Text Available Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D model trees because the tree model itself has a very complicated structure, and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating a 3D tree and its motion, few of them are used in computer games due to the high demand for real-time performance. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees blown by wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real time in computer games.

  6. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential; the banks, as clients, learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...... debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...

  7. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above mentioned issues require some intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to the energy usage and the optimal low-cost system design in high performance ``green computing’’ systems. The recent evolutionary and general metaheuristic-based solutions ...

  8. A Fixpoint-Based Calculus for Graph-Shaped Computational Fields

    DEFF Research Database (Denmark)

    Lluch Lafuente, Alberto; Loreti, Michele; Montanari, Ugo

    2015-01-01

    topology is represented by a graph-shaped field, namely a network with attributes on both nodes and arcs, where arcs represent interaction capabilities between nodes. We propose a calculus where computation is strictly synchronous and corresponds to sequential computations of fixpoints in the graph......-shaped field. Under some conditions, those fixpoints can be computed by synchronised iterations, where in each iteration the attributes of a node are updated based on the attributes of the neighbours in the previous iteration. Basic constructs are reminiscent of the semiring μ-calculus, a semiring......-valued generalisation of the modal μ-calculus, which provides a flexible mechanism to specify the neighbourhood range (according to path formulae) and the way attributes should be combined (through semiring operators). Additional control-flow constructs allow one to conveniently structure the fixpoint computations. We...
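
    The synchronised fixpoint iteration described above can be sketched in the (min, +) semiring, where combining neighbour attributes yields a shortest-distance field. The graph encoding and names below are illustrative, not taken from the calculus itself:

```python
import math

def min_plus_fixpoint(in_arcs, source):
    """Synchronous fixpoint on a graph-shaped field: each round, every node
    recombines its in-neighbours' attributes from the previous round using
    the (min, +) semiring (a Bellman-Ford-style shortest-distance field)."""
    dist = {v: 0.0 if v == source else math.inf for v in in_arcs}
    for _ in range(len(in_arcs)):                  # fixpoint within |V| rounds
        nxt = {v: min([dist[v]] + [dist[u] + w for u, w in in_arcs[v]])
               for v in in_arcs}
        if nxt == dist:                            # fixpoint reached
            break
        dist = nxt
    return dist

# in_arcs[v] lists (in-neighbour, arc-weight) pairs, i.e. attributes on arcs
in_arcs = {"a": [], "b": [("a", 1.0)], "c": [("a", 4.0), ("b", 2.0)]}
assert min_plus_fixpoint(in_arcs, "a") == {"a": 0.0, "b": 1.0, "c": 3.0}
```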

  9. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work under this environment is completed

  10. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  11. Could one make a diamond-based quantum computer?

    International Nuclear Information System (INIS)

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-01-01

    We assess routes to a diamond-based quantum computer, looking specifically towards scalable devices with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  12. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  13. Efficacy of computer technology-based HIV prevention interventions: a meta-analysis.

    Science.gov (United States)

    Noar, Seth M; Black, Hulda G; Pierce, Larson B

    2009-01-02

    To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P partners, and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have similar efficacy to more traditional human-delivered interventions. Given their low cost to deliver, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.

  14. Spintronic Circuits: The Building Blocks of Spin-Based Computation

    Directory of Open Access Journals (Sweden)

    Roshan Warman

    2016-10-01

    Full Text Available In the most general situation, binary computation is implemented by means of microscopic logical gates known as transistors. According to Moore’s Law, the size of transistors will halve every two years, and as these transistors reach their fundamental size limit, the quantum effects of the electrons passing through the transistors will be observed. Due to the inherent randomness of these quantum fluctuations, the basic binary logic will become uncontrollable. This project describes the basic principle governing quantum spin-based computing devices, which may provide an alternative to the conventional solid-state computing devices and circumvent the technological limitations of the current implementation of binary logic.

  15. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    Science.gov (United States)

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than the word processing and trouble-shoot software/hardware was significantly higher after the course (Pcomputer (P=0.008), the inclusion of computer skills course in medical education, downloading lecture handouts, and computer-based exams (Pcomputers limited the inclusion of computer in medical education (Pcomputer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (Pcomputer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical

  16. Computer-based teaching module design: principles derived from learning theories.

    Science.gov (United States)

    Lau, K H Vincent

    2014-03-01

    The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. Further research will enable educators to …

  17. Use of declarative statements in creating and maintaining computer-interpretable knowledge bases for guideline-based care.

    Science.gov (United States)

    Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A

    2006-01-01

    Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for the sharing of such knowledge bases.

  18. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  19. Fail-safe computer-based plant protection systems

    International Nuclear Information System (INIS)

    Keats, A.B.

    1983-01-01

    A fail-safe mode of operation for computers used in nuclear reactor protection systems was first evolved in the UK for application to a sodium cooled fast reactor. The fail-safe properties of both the hardware and the software were achieved by permanently connecting test signals to some of the multiplexed inputs. This results in an unambiguous data pattern, each time the inputs are sequentially scanned by the multiplexer. The ''test inputs'' simulate transient excursions beyond defined safe limits. The alternating response of the trip algorithms to the ''out-of-limits'' test signals and the normal plant measurements is recognised by hardwired pattern recognition logic external to the computer system. For more general application to plant protection systems, a ''Test Signal Generator'' (TSG) is used to compute and generate test signals derived from prevailing operational conditions. The TSG, from its knowledge of the sensitivity of the trip algorithm to each of the input variables, generates a ''test disturbance'' which is superimposed upon each variable in turn, to simulate a transient excursion beyond the safe limits. The ''tripped'' status yielded by the trip algorithm when using data from a ''disturbed'' input forms part of a pattern determined by the order in which the disturbances are applied to the multiplexer inputs. The data pattern formed by the interleaved test disturbances is again recognised by logic external to the protection system's computers. This fail-safe mode of operation of computer-based protection systems provides a powerful defence against common-mode failure. It also reduces the importance of software verification in the licensing procedure. (author)
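    The interleaved test-signal scheme described above can be sketched in miniature (a hypothetical illustration of the principle, not the UK fast-reactor implementation; all names and limit values are invented):

```python
# Sketch of the fail-safe interleaved test-signal idea: out-of-limits
# test inputs are multiplexed with real plant measurements, so a healthy
# scan yields a known alternating trip pattern that simple hardwired
# logic outside the computer can verify. All values are illustrative.
TRIP_LIMIT = 100.0          # hypothetical safe limit
TEST_SIGNAL = 150.0         # always beyond the safe limit

def trip(value):
    """Trip algorithm: True when the input is out of limits."""
    return value > TRIP_LIMIT

def scan(plant_inputs):
    """One multiplexer scan: plant inputs interleaved with test inputs."""
    pattern = []
    for v in plant_inputs:
        pattern.append(trip(v))            # normal measurement
        pattern.append(trip(TEST_SIGNAL))  # injected test disturbance
    return pattern

# In-limits plant data must produce strict alternation False, True, ...;
# a stuck or systematically failed computer cannot keep reproducing it.
healthy = scan([42.0, 57.0, 63.0])
print(healthy)  # [False, True, False, True, False, True]
```

    External pattern-recognition logic then only needs to check for the expected alternation, which is what makes the arrangement a defence against common-mode failure: a freeze or systematic fault breaks the pattern.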

  20. PENGEMBANGAN MODEL COMPUTER-BASED E-LEARNING UNTUK MENINGKATKAN KEMAMPUAN HIGH ORDER MATHEMATICAL THINKING SISWA SMA

    OpenAIRE

    Jarnawi Afgani Dahlan; Yaya Sukjaya Kusumah; Mr Heri Sutarno

    2011-01-01

    The focus of this research is the development of mathematics teaching and learning activity based on the application of computer software. The aims of the research are as follows: 1) to identify mathematics topics that are feasible to present through computer-based e-learning, 2) to design, develop, and implement computer-based e-learning for mathematics, and 3) to analyze the impact of computer-based e-learning on the enhancement of SMA students’ high-order mathematical thinking. All activ...

  1. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
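    The regression-based detection idea can be illustrated with a minimal sketch (assumptions: a 1-D three-point stencil and an invented residual threshold; this is not the SORREL implementation):

```python
import numpy as np

# Minimal sketch of regression-based soft-error detection for a 1-D
# 3-point stencil (illustrative only; not the SORREL library).
rng = np.random.default_rng(0)

def stencil_step(u):
    # simple 3-point averaging stencil
    v = u.copy()
    v[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
    return v

# 1. Training: fit a linear model predicting each updated cell from its
#    three input neighbours, using fault-free runs.
u = rng.random(1000)
v = stencil_step(u)
X = np.stack([u[:-2], u[1:-1], u[2:]], axis=1)
coef, *_ = np.linalg.lstsq(X, v[1:-1], rcond=None)

# 2. Detection: given intact inputs, a cell whose prediction residual
#    exceeds a (here invented) threshold is flagged as a possible error.
w = stencil_step(u)
w[500] += 1.0                      # inject a bit-flip-like corruption
resid = np.abs(X @ coef - w[1:-1])
flagged = np.flatnonzero(resid > 1e-6) + 1
print(flagged)                     # → [500]
```

    The learned model is computationally inexpensive to evaluate, which is the point of the approach: the detector adds only one extra prediction per cell on top of the stencil update itself.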

  2. Computer-based assistive technology device for use by children with physical disabilities: a cross-sectional study.

    Science.gov (United States)

    Lidström, Helene; Almqvist, Lena; Hemmingsson, Helena

    2012-07-01

    To investigate the prevalence of children with physical disabilities who use a computer-based assistive technology device (ATD), and to examine differences in the characteristics of children and youths who do or do not use computer-based ATDs, as well as differences that might influence the satisfaction of these two groups when computers are used for in-school and outside-school activities. A cross-sectional survey about computer-based activities in and outside school (n = 287) and group comparisons. The prevalence of computer-based ATD use was about 44% (n = 127) of the children in this sample. These children were less satisfied with their computer use in education and in outside-school activities than the children who did not use an ATD. Improved coordination of the use of computer-based ATDs in school and at home, including service and support, could increase the opportunities for children with physical disabilities who use computer-based ATDs to perform the computer activities they want, need and are expected to do in and outside school.

  3. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g. communication, critical thinking) or attitudes. However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  4. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems in which both computers and humans control safety-critical functions. A new systems accident model is developed, based upon modern systems theory and human cognitive processes, to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  5.   Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

      Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  6. Computer based training for nuclear operations personnel: From concept to reality

    International Nuclear Information System (INIS)

    Widen, W.C.; Klemm, R.W.

    1986-01-01

    Computer Based Training (CBT) can be subdivided into two categories: Computer Aided Instruction (CAI), or the actual presentation of learning material; and Computer Managed Instruction (CMI), the tracking, recording, and documenting of instruction and student progress. Both CAI and CMI can be attractive to the student and to the training department. A brief overview of CAI and CMI benefits is given in this paper

  7. Computer-based personality judgments are more accurate than those made by humans.

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  8. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  9. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with test-pattern generation and fault-coverage determination in core-based design. The basic core-test strategy that must be applied in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project.

  10. Activity-based computing for medical work in hospitals

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2009-01-01

    principles, the Java-based implementation of the ABC Framework, and an experimental evaluation together with a group of hospital clinicians. The article contributes to the growing research on support for human activities, mobility, collaboration, and context-aware computing. The ABC Framework presents...

  11. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE … In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  12. The soft computing-based approach to investigate allergic diseases: a systematic review.

    Science.gov (United States)

    Tartarisco, Gennaro; Tonacci, Alessandro; Minciullo, Paola Lucia; Billeci, Lucia; Pioggia, Giovanni; Incorvaia, Cristoforo; Gangemi, Sebastiano

    2017-01-01

    Early recognition of inflammatory markers and their relation to asthma, adverse drug reactions, allergic rhinitis, atopic dermatitis and other allergic diseases is an important goal in allergy. The vast majority of studies in the literature are based on classic statistical methods; however, developments in computational techniques such as soft computing-based approaches hold new promise in this field. The aim of this manuscript is to systematically review the main soft computing-based techniques, such as artificial neural networks, support vector machines, Bayesian networks and fuzzy logic, to investigate their performance in the field of allergic diseases. The review was conducted following PRISMA guidelines and the protocol was registered within the PROSPERO database (CRD42016038894). The research was performed on PubMed and ScienceDirect, covering the period starting from September 1, 1990 through April 19, 2016. The review included 27 studies related to allergic diseases and soft computing performances. We observed promising results with an overall accuracy of 86.5%, mainly focused on asthmatic disease. The review reveals that soft computing-based approaches are suitable for big data analysis and can be very powerful, especially when dealing with uncertainty and poorly characterized parameters. Furthermore, they can provide valuable support in case of lack of data and entangled cause-effect relationships, which make it difficult to assess the evolution of disease. Although most works deal with asthma, we believe the soft computing approach could be a real breakthrough and foster new insights into other allergic diseases as well.

  13. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, the reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs), i.e. semiconductor lasers whose cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time-series prediction and a nonlinear channel equalization classification task. Each directional mode processes one individual task to mitigate possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained, even with noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
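    The neuro-inspired computational scheme can be sketched in software with a generic echo-state reservoir (a numerical stand-in only: the paper's reservoir is a physical semiconductor ring laser, and the sizes and scalings below are illustrative):

```python
import numpy as np

# Generic software reservoir computer: a fixed random recurrent network
# ("reservoir") is driven by the input; only the linear readout is trained.
rng = np.random.default_rng(1)
N = 200                                   # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, N)          # fixed input weights
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # echo-state scaling

def run_reservoir(u):
    """Collect the transient reservoir states driven by input series u."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for i, s in enumerate(u):
        x = np.tanh(W @ x + W_in * s)     # nonlinear node dynamics
        states[i] = x
    return states

# Task: one-step-ahead prediction of a noisy sine, via ridge regression.
t = np.arange(3000)
u = np.sin(0.1 * t) + 0.01 * rng.normal(size=t.size)
S = run_reservoir(u)
X, y = S[200:-1], u[201:]                 # discard the initial transient
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = S[-1] @ W_out                      # predicted input at the next step
print(abs(pred - np.sin(0.1 * 3000)))     # small prediction error
```

    In the paper's hardware setting, the role of the random recurrent matrix is played by the laser's transient dynamics, and running two tasks in parallel amounts to attaching one trained readout to each counterpropagating directional mode.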

  14. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  15. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    WOS:000407517600044 (Nº de Acesso Web of Science) The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  16. Computer-Based Wireless Advertising Communication System

    Directory of Open Access Journals (Sweden)

    Anwar Al-Mofleh

    2009-10-01

    Full Text Available In this paper we developed a computer-based wireless advertising communication system (CBWACS) that enables the user to advertise whatever he wants from his own office to the screen in front of the customer via a wireless communication system. This system consists of two PIC microcontrollers, a transmitter, a receiver, an LCD, a serial cable and an antenna. The main advantages of the system are: the wireless structure, and that the system is less susceptible to noise and other interference because it uses digital communication techniques.

  17. Computerbasiert prüfen [Computer-based Assessment]

    Directory of Open Access Journals (Sweden)

    Frey, Peter

    2006-08-01

    Full Text Available [english] Computer-based testing in medical education offers new perspectives. Advantages are sequential or adaptive testing, the integration of video or sound, rapid feedback to candidates, and the management of web-based question banks. Computer-based testing can also be implemented in an OSCE examination. In e-learning environments, formative self-assessments are often implemented and give helpful feedback to learners. Disadvantages in high-stakes exams are the high requirements both for the quality of testing (e.g. standard setting) and for the information technology, especially security. [german] Computer-based examinations in medical education open up new possibilities. Advantages of such examinations lie in sequential or adaptive testing, the integration of moving images or sound, rapid scoring, and central administration of the examination questions via the Internet. One area of application with reasonable effort is examinations with several stations, such as the OSCE. Computer-based formative self-tests are frequently offered in e-learning; they help learners to assess their own level of knowledge or to compare their performance with that of others. Summative examinations face limits with respect to the examination venue, since cheating is possible at home. Higher clinical competencies such as examination technique or communication are hardly suitable for computer-based testing.

  18. Fast decoder for local quantum codes using Groebner basis

    Science.gov (United States)

    Haah, Jeongwan

    2013-03-01

    Based on arXiv:1204.1063. A local translation-invariant quantum code has a description in terms of Laurent polynomials. As an application of this observation, we present a fast decoding algorithm for translation-invariant local quantum codes in any spatial dimension, using the straightforward division algorithm for multivariate polynomials. The running time is O(n log n) on average, or O(n^2 log n) in the worst case, where n is the number of physical qubits. The algorithm improves a subroutine of the renormalization-group decoder by Bravyi and Haah (arXiv:1112.3252) in the translation-invariant case. This work is supported in part by the Institute for Quantum Information and Matter, an NSF Physics Frontier Center, and the Korea Foundation for Advanced Studies.
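    The decoder's key subroutine is plain multivariate polynomial division. As a hedged sketch only (the paper works with Laurent polynomials; the dictionary representation, lexicographic term order, and float coefficients below are illustrative assumptions, not the paper's implementation), the division algorithm can be written as:

```python
from typing import Dict, List, Tuple

Mono = Tuple[int, ...]    # exponent vector, e.g. (2, 1) means x^2 * y
Poly = Dict[Mono, float]  # map monomial -> coefficient

def lead(p: Poly) -> Mono:
    """Leading monomial under the lexicographic term order."""
    return max(m for m, c in p.items() if c != 0)

def divides(a: Mono, b: Mono) -> bool:
    """True if monomial a divides monomial b."""
    return all(x <= y for x, y in zip(a, b))

def divide(f: Poly, divisors: List[Poly]):
    """Multivariate division: returns (quotients, remainder) with
    f = sum(q_i * g_i) + r, no term of r divisible by any lead(g_i)."""
    quotients: List[Poly] = [{} for _ in divisors]
    remainder: Poly = {}
    p = {m: c for m, c in f.items() if c != 0}
    while p:
        lm = lead(p)
        lc = p[lm]
        for q, g in zip(quotients, divisors):
            gm = lead(g)
            if divides(gm, lm):
                t = tuple(x - y for x, y in zip(lm, gm))
                coef = lc / g[gm]
                q[t] = q.get(t, 0.0) + coef
                # subtract coef * x^t * g from the working polynomial
                for m, c in g.items():
                    mm = tuple(x + y for x, y in zip(t, m))
                    p[mm] = p.get(mm, 0.0) - coef * c
                    if abs(p[mm]) < 1e-12:
                        del p[mm]
                break
        else:
            # no leading term of any divisor divides lm: move it to r
            remainder[lm] = remainder.get(lm, 0.0) + lc
            del p[lm]
    return quotients, remainder
```

    For example, dividing x^2 + xy by x gives quotient x + y and remainder 0; when the divisors form a Groebner basis, the remainder is a canonical normal form.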

  19. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system's spectrum resource allocation strategies are dynamically updated. By applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, system security is further strengthened by reducing the communication burden on the network.

  20. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  1. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    Full Text Available The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  2. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computing mode: cloud computing. Resource scheduling strategy is the key technology in cloud computing. Based on a study of the cloud computing system structure and its mode of operation, the key research addresses the work-scheduling process and resource allocation problems in cloud computing based on the ant colony algorithm, with detailed analysis and design of the...
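    The abstract is truncated, so the following is only a generic sketch of ant colony optimization applied to task-to-VM scheduling; the pheromone update rule, the heuristic `eta`, and all parameter names are illustrative assumptions, not the paper's algorithm:

```python
import random

def aco_schedule(task_len, vm_speed, n_ants=20, n_iter=50,
                 alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Assign tasks to VMs, minimizing makespan, with a basic ACO."""
    rng = random.Random(seed)
    n_tasks, n_vms = len(task_len), len(vm_speed)
    # pheromone[t][v]: learned desirability of running task t on VM v
    pher = [[1.0] * n_vms for _ in range(n_tasks)]
    best, best_makespan = None, float("inf")
    for _ in range(n_iter):
        for _ in range(n_ants):
            load = [0.0] * n_vms
            assign = []
            for t in range(n_tasks):
                # heuristic: prefer the VM that would finish task t soonest
                eta = [1.0 / (load[v] + task_len[t] / vm_speed[v])
                       for v in range(n_vms)]
                w = [pher[t][v] ** alpha * eta[v] ** beta
                     for v in range(n_vms)]
                v = rng.choices(range(n_vms), weights=w)[0]
                assign.append(v)
                load[v] += task_len[t] / vm_speed[v]
            makespan = max(load)
            if makespan < best_makespan:
                best, best_makespan = assign, makespan
        # evaporate, then reinforce along the best schedule found so far
        for t in range(n_tasks):
            for v in range(n_vms):
                pher[t][v] *= (1 - rho)
            pher[t][best[t]] += 1.0 / best_makespan
    return best, best_makespan
```

    With four equal tasks and two identical VMs the sketch converges on the balanced two-tasks-per-VM schedule.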

  3. Computational studies of physical properties of Nb-Si based alloys

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Lizhi [Middle Tennessee State Univ., Murfreesboro, TN (United States)

    2015-04-16

    The overall goal is to provide physical property data supplementing experiments for thermodynamic modeling and other simulations, such as phase field simulations for microstructure and continuum simulations for mechanical properties. These predictive computational modeling and simulation efforts may yield insights that can be used to guide materials design, processing, and manufacture. Ultimately, they may lead to a usable Nb-Si based alloy, which could play an important role in the current push towards greener energy. The main objectives of the proposed projects are: (1) developing a first-principles-based supercell approach for calculating thermodynamic and mechanical properties of ordered crystals and disordered lattices, including solid solutions; (2) application of the supercell approach to Nb-Si based alloys to compute physical property data that can be used for thermodynamic modeling and other simulations to guide the optimal design of Nb-Si based alloys.

  4. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of introducing a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed the costs of the documentation process and administration. The results show that a reduction in costs is likely. The job satisfaction of the personnel increased, and more time could be spent caring for the residents. We suggest further research to reach conclusive results.

  5. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the ability to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required

  6. Computer Based Asset Management System For Commercial Banks

    Directory of Open Access Journals (Sweden)

    Amanze

    2015-08-01

    Full Text Available ABSTRACT The Computer-based Asset Management System is a web-based system that allows commercial banks to keep track of their assets. Its main advantages are the effective management of assets through record keeping and easy retrieval of information. In this research, information was gathered to define the requirements of the new application and to examine how commercial banks manage their assets.

  7. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    Science.gov (United States)

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.

  8. Synchronized Pair Configuration in Virtualization-Based Lab for Learning Computer Networks

    Science.gov (United States)

    Kongcharoen, Chaknarin; Hwang, Wu-Yuin; Ghinea, Gheorghita

    2017-01-01

    More studies are concentrating on using virtualization-based labs to facilitate computer or network learning concepts. Some benefits are lower hardware costs and greater flexibility in reconfiguring computer and network environments. However, few studies have investigated effective mechanisms for using virtualization fully for collaboration.…

  9. Effect of Computer-Based Video Games on Children: An Experimental Study

    Science.gov (United States)

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  10. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders; participants were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
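    The reported predictive values follow from Bayes' rule applied to the stated sensitivity, specificity, and prevalence. A minimal sketch (the function name is ours, not the study's):

```python
def predictive_values(sens, spec, prev):
    """Positive/negative predictive values from Bayes' rule.

    PPV = P(disease | test+), NPV = P(no disease | test-)."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# sens=0.83, spec=0.96, prev=0.10 reproduces the reported PPV of ~0.70
ppv, npv = predictive_values(0.83, 0.96, 0.10)
```

    Note how strongly both values depend on the assumed 10% prevalence: at lower prevalence the PPV drops quickly even for the same test.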

  11. Evaluation of E-Rat, a Computer-based Rat Dissection in Terms of Student Learning Outcomes.

    Science.gov (United States)

    Predavec, Martin

    2001-01-01

    Presents a study that used computer-based rat anatomy to compare student learning outcomes from computer-based instruction with a conventional dissection. Indicates that there was a significant relationship between the time spent on both classes and the marks gained. Shows that computer-based instruction can be a viable alternative to the use of…

  12. Incorporating electronic-based and computer-based strategies: graduate nursing courses in administration.

    Science.gov (United States)

    Graveley, E; Fullerton, J T

    1998-04-01

    The use of electronic technology allows faculty to improve their course offerings. Four graduate courses in nursing administration were contemporized to incorporate fundamental computer-based skills that would be expected of graduates in the work setting. Principles of adult learning offered a philosophical foundation that guided course development and revision. Course delivery strategies included computer-assisted instructional modules, e-mail interactive discussion groups, and use of the electronic classroom. Classroom seminar discussions and two-way interactive video conferencing focused on group resolution of problems derived from employment settings and assigned readings. Using these electronic technologies, a variety of courses can be revised to accommodate the learners' needs.

  13. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide application in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate the applicability of auto-scaling with respect to cloud utilization and response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.
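    The scale-out/scale-in decision that auto-scaling automates can be sketched as a simple threshold policy. This is a hypothetical illustration, not OpenStack's actual mechanism (which is driven by telemetry alarms and configured scaling policies):

```python
def autoscale(current_vms, cpu_utilization, min_vms=1, max_vms=8,
              scale_out_at=0.75, scale_in_at=0.25):
    """Return the new VM count under a threshold-based policy.

    Scale out by one VM when average CPU utilization exceeds the upper
    threshold; scale in by one when it falls below the lower threshold."""
    if cpu_utilization > scale_out_at and current_vms < max_vms:
        return current_vms + 1
    if cpu_utilization < scale_in_at and current_vms > min_vms:
        return current_vms - 1
    return current_vms
```

    The gap between the two thresholds prevents oscillation (rapid alternating scale-out/scale-in) when utilization hovers near a single cutoff.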

  14. The ENSDF radioactivity data base for IBM-PC and computer network access

    International Nuclear Information System (INIS)

    Ekstroem, P.; Spanier, L.

    1989-08-01

    A database system for radioactivity gamma rays is described. A database with approximately 15000 gamma rays from 2777 decays is available for installation on the hard disk of a PC, and a complete system with approximately 73000 gamma rays is available for on-line access via the NORDic University computer NETwork (NORDUNET) and the Swedish University computer NETwork (SUNET)

  15. Computer-based, Jeopardy™-like game in general chemistry for engineering majors

    Science.gov (United States)

    Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.

    2013-03-01

    We report on the design of a Jeopardy™-like computer game for enhancing the learning of general chemistry by engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is addressing the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, and a computer-based game quiz, constituting the treatment group. Computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. The overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this ``gamification'' of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) levels of general science and chemistry preparation. We outline a plan to extend this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi

  16. Application of the Decomposition Method to the Design Complexity of Computer-based Display

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Ju; Lee, Seung Woo; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    The importance of the design of human machine interfaces (HMIs) for human performance and safety has long been recognized in the process industries. In the case of nuclear power plants (NPPs), HMIs have significant implications for safety, since poor implementation of HMIs can impair the operators' information-searching ability, which is considered one of the important aspects of human behavior. To support and increase the efficiency of the operators' information-searching behavior, advanced HMIs based on computer technology are provided. Operators in an advanced main control room (MCR) acquire information through the video display units (VDUs) and large display panel (LDP) required for the operation of NPPs. These computer-based displays contain a very large quantity of information and present it in a greater variety of formats than a conventional MCR. For example, these displays contain more elements such as abbreviations, labels, icons, symbols, coding, and highlighting than conventional ones. As computer-based displays contain more information, the complexity of the elements becomes greater due to the reduced distinctiveness of each element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. According to Gestalt theory, people tend to group elements that are similar in attributes such as shape, color, or pattern (the principle of similarity). Therefore, it is necessary to consider not only the human operator's perception but also the number of elements making up a computer-based display

  17. Application of the Decomposition Method to the Design Complexity of Computer-based Display

    International Nuclear Information System (INIS)

    Kim, Hyoung Ju; Lee, Seung Woo; Seong, Poong Hyun; Park, Jin Kyun

    2012-01-01

    The importance of the design of human machine interfaces (HMIs) for human performance and safety has long been recognized in the process industries. In the case of nuclear power plants (NPPs), HMIs have significant implications for safety, since poor implementation of HMIs can impair the operators' information-searching ability, which is considered one of the important aspects of human behavior. To support and increase the efficiency of the operators' information-searching behavior, advanced HMIs based on computer technology are provided. Operators in an advanced main control room (MCR) acquire information through the video display units (VDUs) and large display panel (LDP) required for the operation of NPPs. These computer-based displays contain a very large quantity of information and present it in a greater variety of formats than a conventional MCR. For example, these displays contain more elements such as abbreviations, labels, icons, symbols, coding, and highlighting than conventional ones. As computer-based displays contain more information, the complexity of the elements becomes greater due to the reduced distinctiveness of each element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. According to Gestalt theory, people tend to group elements that are similar in attributes such as shape, color, or pattern (the principle of similarity). Therefore, it is necessary to consider not only the human operator's perception but also the number of elements making up a computer-based display

  18. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  19. Convincing Conversations : Using a Computer-Based Dialogue System to Promote a Plant-Based Diet

    NARCIS (Netherlands)

    Zaal, Emma; Mills, Gregory; Hagen, Afke; Huisman, Carlijn; Hoeks, Jacobus

    2017-01-01

    In this study, we tested the effectiveness of a computer-based persuasive dialogue system designed to promote a plant-based diet. The production and consumption of meat and dairy has been shown to be a major cause of climate change and a threat to public health, bio-diversity, animal rights and

  20. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  1. Computer-based personality judgments are more accurate than those made by humans

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507
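    The accuracy measure compared here (r = 0.56 vs. r = 0.49) is the Pearson correlation between predicted and criterion trait scores. A minimal sketch of that statistic:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    In the study's setting, `xs` would be a judge's (human or model) predicted trait scores across participants and `ys` the participants' self-rated scores; a higher r means a more accurate judge.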

  2. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network that requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked, because they are accessed through a wireless network and have an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and positively stimulate users to cooperate in application tasks.
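    The abstract does not give the underlying payoff matrix, so the following is only a hedged sketch of how a stable strategy can be found with discrete-time replicator dynamics; the two-strategy cooperate/attack game and its payoffs are invented for illustration (the penalty imposed by the credible third party makes attacking strictly dominated, so cooperation fixates):

```python
def replicator_trajectory(payoff, x0, steps=2000, dt=0.01):
    """Discrete-time replicator dynamics for a 2-strategy game.

    payoff[i][j]: payoff to strategy i against strategy j.
    x0: initial population share of strategy 0 ('cooperate').
    Returns the cooperator share after the simulated evolution."""
    x = x0
    for _ in range(steps):
        f0 = payoff[0][0] * x + payoff[0][1] * (1 - x)  # cooperator fitness
        f1 = payoff[1][0] * x + payoff[1][1] * (1 - x)  # attacker fitness
        fbar = x * f0 + (1 - x) * f1                    # mean fitness
        x += dt * x * (f0 - fbar)                       # replicator step
        x = min(max(x, 0.0), 1.0)
    return x

# hypothetical payoffs after the third party's penalty on attackers:
payoff = [[3.0, 1.0],   # cooperate vs (cooperate, attack)
          [2.0, 0.0]]   # attack    vs (cooperate, attack)
```

    With these payoffs cooperation is strictly dominant, so any interior starting point evolves toward the all-cooperate state, i.e. the stable evolutionary outcome the mechanism aims for.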

  3. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network that requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked, because they are accessed through a wireless network and have an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and positively stimulate users to cooperate in application tasks.

  4. Why advanced computing? The key to space-based operations

    Science.gov (United States)

    Phister, Paul W., Jr.; Plonisch, Igor; Mineo, Jack

    2000-11-01

    The 'what is the requirement?' aspect of advanced computing and how it relates to and supports Air Force space-based operations is a key issue. In support of the Air Force Space Command's five major mission areas (space control, force enhancement, force applications, space support and mission support), two-fifths of the requirements have associated stringent computing/size implications. The Air Force Research Laboratory's 'migration to space' concept will eventually shift Science and Technology (S&T) dollars from predominantly airborne systems to airborne-and-space related S&T areas. One challenging 'space' area is in the development of sophisticated on-board computing processes for the next generation smaller, cheaper satellite systems. These new space systems (called microsats or nanosats) could be as small as a softball, yet perform functions that are currently being done by large, vulnerable ground-based assets. The Joint Battlespace Infosphere (JBI) concept will be used to manage the overall process of space applications coupled with advancements in computing. The JBI can be defined as a globally interoperable information 'space' which aggregates, integrates, fuses, and intelligently disseminates all relevant battlespace knowledge to support effective decision-making at all echelons of a Joint Task Force (JTF). This paper explores a single theme -- on-board processing is the best avenue to take advantage of advancements in high-performance computing, high-density memories, communications, and re-programmable architecture technologies. The goal is to break away from 'no changes after launch' design to a more flexible design environment that can take advantage of changing space requirements and needs while the space vehicle is 'on orbit.'

  5. A computer-based teaching programme (CBTP) developed for ...

    African Journals Online (AJOL)

    The nursing profession, like other professions, is focused on preparing students for practice, and particular attention must be paid to the ability of student nurses to extend their knowledge and to solve nursing care problems effectively. A computer-based teaching programme (CBTP) for clinical practice to achieve these ...

  6. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  7. Primary Health Care Software-A Computer Based Data Management System

    Directory of Open Access Journals (Sweden)

    Tuli K

    1990-01-01

    Full Text Available Realising the duplication and time consumption inherent in the usual manual system of data collection necessitated experimentation with a computer-based management system for primary health care in the primary health centres. The population details available in the existing manual system were used to computerize the data. Software was designed for data entry and analysis, written in the dBase III Plus language. It was designed so that a person with no knowledge of computers could use it. A cost analysis was done, and the computer system was found to be more cost-effective than the usual manual system.

  8. Glider-based computing in reaction-diffusion hexagonal cellular automata

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Wuensche, Andrew; De Lacy Costello, Benjamin

    2006-01-01

    A three-state hexagonal cellular automaton, discovered in [Wuensche A. Glider dynamics in 3-value hexagonal cellular automata: the beehive rule. Int J Unconvention Comput, in press], presents a conceptual discrete model of a reaction-diffusion system with inhibitor and activator reagents. The automaton model of reaction-diffusion exhibits mobile localized patterns (gliders) in its space-time dynamics. We show how to implement the basic computational operations with these mobile localizations, and thus demonstrate collision-based logical universality of the hexagonal reaction-diffusion cellular automaton

  9. Computational Model-Based Design of Leadership Support Based on Situational Leadership Theory

    NARCIS (Netherlands)

    Bosse, T.; Duell, R.; Memon, Z.A.; Treur, J.; van der Wal, C.N.

    2017-01-01

    This paper introduces the design of an agent-based leadership support system exploiting a computational model for development of individuals or groups. It is to be used, for example, as a basis for systems to support a group leader in the development of individual group members or a group as a

  10. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

A computer program that performs obstetric calculations using data from ultrasonography was developed in the Clipper language for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, with convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomalies, and production of equations and growth curves for pregnant women

  11. Usability test of the ImPRO, computer-based procedure system

    International Nuclear Information System (INIS)

    Jung, Y.; Lee, J.

    2006-01-01

ImPRO is a computer-based procedure system presenting procedures in both flowchart and success-logic-tree form. It was evaluated against computer-based procedure guidelines and satisfies most requirements, such as presentation and functionality. In addition, an SGTR scenario was performed with ImPRO to evaluate reading comprehension and situation awareness. ImPRO is a software engine that interprets a procedure script language, so it is reliable by nature and has been verified with formal methods. One bug, however, remained hidden for a year after release before it was fixed. Finally, backup paper procedures can be prepared in the same format as the VDU display in case of ImPRO failure. (authors)

  12. Microscope self-calibration based on micro laser line imaging and soft computing algorithms

    Science.gov (United States)

    Apolinar Muñoz Rodríguez, J.

    2018-06-01

    A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.
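The Bezier approximation networks mentioned in the abstract above build on ordinary Bezier curve evaluation. A minimal sketch of that building block, using de Casteljau's algorithm (the control points here are illustrative, not taken from the paper):

```python
import numpy as np

def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using de
    Casteljau's algorithm, the building block of Bezier approximation."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        # Repeated linear interpolation between consecutive control points
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Quadratic curve: endpoints (0,0) and (2,0), apex control point (1,2)
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
print(bezier(ctrl, 0.5))  # midpoint of the curve: [1. 1.]
```

An approximation network fits the control points to measured data; the evaluation step above stays the same.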

  13. Blind topological measurement-based quantum computation.

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10^-3, which is comparable to that (7.5 × 10^-3) of non-blind topological quantum computation. As an error rate per gate of the order of 10^-3 has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  14. Feasibility of Computer-Based Videogame Therapy for Children with Cerebral Palsy.

    Science.gov (United States)

    Radtka, Sandra; Hone, Robert; Brown, Charles; Mastick, Judy; Melnick, Marsha E; Dowling, Glenna A

    2013-08-01

Standing and gait balance problems are common in children with cerebral palsy (CP), resulting in falls and injuries. Task-oriented exercises to strengthen and stretch muscles that shift the center of mass and change the base of support are effective in improving balance. Gaming environments can be challenging and fun, encouraging children to engage in exercises at home. The aims of this project were to demonstrate the technical feasibility, ease of use, appeal, and safety of a computer-based videogame program designed to improve balance in children with CP. This study represents a close collaboration between computer design and clinical team members. The first two phases were performed in the laboratory, and the final phase was done in subjects' homes. The prototype balance game was developed using computer-based real-time three-dimensional programming that enabled the team to capture engineering data necessary to tune the system. Videogame modifications, including identifying compensatory movements, were made in an iterative fashion based on feedback from subjects and observations of clinical and software team members. Subjects (n=14) scored the game 21.5 out of 30 for ease of use and appeal, 4.0 out of 5 for enjoyment, and 3.5 on comprehension. There were no safety issues, and the games performed without technical flaws in final testing. A computer-based videogame incorporating therapeutic movements to improve gait and balance in children with CP was appealing and feasible for home use. A follow-up study examining its effectiveness in improving balance in children with CP is recommended.

  15. Automatic calibration system of the temperature instrument display based on computer vision measuring

    Science.gov (United States)

    Li, Zhihong; Li, Jinze; Bao, Changchun; Hou, Guifeng; Liu, Chunxia; Cheng, Fang; Xiao, Nianxin

    2010-07-01

With the development of computers and techniques for image processing and optical measurement, various measuring techniques based on optical image processing are gradually maturing and entering practical use. Building on many years of experience in temperature measurement and computer vision measurement, and responding to practical needs, we propose a fully automatic calibration method for temperature instrument displays that integrates computer vision measurement techniques. The system synchronizes acquisition of the displayed values with the reference temperature values, improving calibration efficiency. Based on the least-squares fitting principle, and integrating data processing with optimization theory, it rapidly and accurately automates the acquisition and calibration of temperature readings.
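The least-squares fitting step described in the abstract above can be illustrated with a small sketch; the readings below are hypothetical, not data from the system described:

```python
import numpy as np

# Hypothetical calibration pairs: values read off the instrument display
# by the vision system vs. reference temperature values.
display = np.array([10.2, 20.1, 30.4, 40.2, 50.5])
reference = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Least-squares fit of a linear correction: reference ≈ a*display + b
a, b = np.polyfit(display, reference, deg=1)

def calibrate(reading):
    """Map a raw display reading to a calibrated temperature."""
    return a * reading + b

print(round(calibrate(30.4), 2))  # close to 30.0
```

A higher polynomial degree, or a robust fit, can be substituted without changing the calibration interface.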

  16. Two polynomial representations of experimental design

    OpenAIRE

    Notari, Roberto; Riccomagno, Eva; Rogantin, Maria-Piera

    2007-01-01

    In the context of algebraic statistics an experimental design is described by a set of polynomials called the design ideal. This, in turn, is generated by finite sets of polynomials. Two types of generating sets are mostly used in the literature: Groebner bases and indicator functions. We briefly describe them both, how they are used in the analysis and planning of a design and how to switch between them. Examples include fractions of full factorial designs and designs for mixture experiments.
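As a toy illustration of a design ideal and its Groebner basis, consider the 2×2 full factorial design with levels ±1 and a half-fraction of it. This sketch uses SymPy, which is an assumption of this example rather than a tool named in the abstract:

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Design ideal of the 2x2 full factorial design with levels -1 and +1:
# each factor satisfies x**2 - 1 = 0.
G = groebner([x**2 - 1, y**2 - 1], x, y, order='lex')
print(G.exprs)  # [x**2 - 1, y**2 - 1] -- already a Groebner basis

# The half-fraction defined by x*y = 1 (the points (1,1) and (-1,-1)):
G2 = groebner([x**2 - 1, y**2 - 1, x*y - 1], x, y, order='lex')
print(G2.exprs)  # reduces to [x - y, y**2 - 1]
```

The second basis shows how adding the defining relation of a fraction simplifies the ideal: on the half-fraction, x and y are aliased (x = y), which is exactly what the Groebner basis exposes.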

  17. Reconfigurable computing the theory and practice of FPGA-based computation

    CERN Document Server

    Hauck, Scott

    2010-01-01

    Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design- the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardwa

  18. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Science.gov (United States)

    2010-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average of...

  19. The Effect of Computer Game-Based Learning on FL Vocabulary Transferability

    Science.gov (United States)

    Franciosi, Stephan J.

    2017-01-01

    In theory, computer game-based learning can support several vocabulary learning affordances that have been identified in the foreign language learning research. In the observable evidence, learning with computer games has been shown to improve performance on vocabulary recall tests. However, while simple recall can be a sign of learning,…

  20. Effects of mobile phone-based app learning compared to computer-based web learning on nursing students: pilot randomized controlled trial.

    Science.gov (United States)

    Lee, Myung Kyung

    2015-04-01

This study aimed to determine the effect of mobile-based discussion versus computer-based discussion on self-directed learning readiness, academic motivation, learner-interface interaction, and flow state. This randomized controlled trial was conducted at one university. Eighty-six nursing students who were able to use a computer, had home Internet access, and used a mobile phone were recruited. Participants were randomly assigned to either the mobile phone app-based discussion group (n = 45) or a computer web-based discussion group (n = 41). The effect was measured before and after an online discussion via self-reported surveys that addressed academic motivation, self-directed learning readiness, time distortion, learner-learner interaction, learner-interface interaction, and flow state. The changes in extrinsic motivation on identified regulation in academic motivation (p = 0.011), as well as in independence and ability to use basic study (p = 0.047) and positive orientation to the future (p = 0.021) in self-directed learning readiness, from pre-intervention to post-intervention were significantly more positive in the mobile phone app-based group than in the computer web-based discussion group. Interaction between learner and interface (p = 0.002), having clear goals (p = 0.012), and giving and receiving unambiguous feedback (p = 0.049) in flow state were significantly higher in the mobile phone app-based discussion group than in the computer web-based discussion group at post-test. The mobile phone might offer more valuable learning opportunities for discussion-based teaching and learning methods in terms of self-directed learning readiness, academic motivation, learner-interface interaction, and the flow state of the learning process compared with the computer.

  1. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

Full Text Available Computing speed is a significant issue for large-scale flood simulations requiring real-time response for disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using the OpenACC application programming interface was adopted to parallelize the shallow water model. An unstructured data management method was presented to control data transport between the GPU and the CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, exploiting the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.

  2. An Interactive Computer-Based Circulation System: Design and Development

    Directory of Open Access Journals (Sweden)

    James S. Aagaard

    1972-03-01

    Full Text Available An on-line computer-based circulation control system has been installed at the Northwestern University library. Features of the system include self-service book charge, remote terminal inquiry and update, and automatic production of notices for call-ins and books available. Fine notices are also prepared daily and overdue notices weekly. Important considerations in the design of the system were to minimize costs of operation and to include technical services functions eventually. The system operates on a relatively small computer in a multiprogrammed mode.

  3. Computational neural network regression model for Host based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Gautam

    2016-09-01

Full Text Available The current scenario of gathering and storing information in a secure system is a challenging task due to increasing cyber-attacks. Computational neural network techniques exist for intrusion detection systems, providing security both to single machines and to the machines of an entire network. In this paper, we have used two types of computational neural network models, namely the Generalized Regression Neural Network (GRNN) model and the Multilayer Perceptron Neural Network (MPNN) model, for a host-based intrusion detection system using log files generated by a single personal computer. The simulation results show the correctly classified percentages of the normal and abnormal (intrusion) classes using a confusion matrix. On the basis of the results and discussion, we found that the Host-based Intrusion System Model (HISM) significantly improved detection accuracy while retaining a minimum false-alarm rate.
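The detection accuracy and false-alarm rate derived from a confusion matrix, as reported in the abstract above, can be computed with a short sketch (the labels below are made up for illustration):

```python
import numpy as np

def confusion_metrics(y_true, y_pred):
    """Detection accuracy and false-alarm rate from binary labels
    (1 = intrusion, 0 = normal), as used to evaluate host-based IDS."""
    tp = np.sum((y_true == 1) & (y_pred == 1))  # intrusions caught
    tn = np.sum((y_true == 0) & (y_pred == 0))  # normal passed through
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false alarms
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed intrusions
    accuracy = (tp + tn) / len(y_true)
    false_alarm = fp / max(fp + tn, 1)  # fraction of normal events flagged
    return float(accuracy), float(false_alarm)

y_true = np.array([1, 1, 0, 0, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 0, 0])
print(confusion_metrics(y_true, y_pred))  # (0.75, 0.2)
```

Any classifier (GRNN, MPNN, or otherwise) plugs into the same evaluation once its predictions are binarized.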

  4. Rehabilitation of patients with motor disabilities using computer vision based techniques

    Directory of Open Access Journals (Sweden)

    Alejandro Reyes-Amaro

    2012-05-01

    Full Text Available In this paper we present details about the implementation of computer vision based applications for the rehabilitation of patients with motor disabilities. The applications are conceived as serious games, where the computer-patient interaction during playing contributes to the development of different motor skills. The use of computer vision methods allows the automatic guidance of the patient’s movements making constant specialized supervision unnecessary. The hardware requirements are limited to low-cost devices like usual webcams and Netbooks.

  5. A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Woo Seok; Kim, Soo Mee; Park, Min Jae; Lee, Dong Soo; Lee, Jae Sung [Seoul National University, Seoul (Korea, Republic of)

    2009-10-15

The maximum likelihood-expectation maximization (ML-EM) algorithm is a statistical reconstruction method derived from a probabilistic model of the emission and detection processes. Although ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique on a GPU (graphics processing unit) for the ML-EM algorithm. Using a GeForce 9800 GTX+ graphics card and CUDA (compute unified device architecture), the projection and backprojection in the ML-EM algorithm were parallelized with NVIDIA's technology. The time delays for the projection computation, the error computation between measured and estimated data, and the backprojection in each iteration were measured. Total time included the latency of data transmission between RAM and GPU memory. The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively; in this case, the computing speed was improved about 15-fold on the GPU. When the number of iterations increased to 1024, the CPU- and GPU-based computations took 18 min and 8 sec in total, respectively. The improvement was about 135-fold and was caused by delays in the CPU-based computation after a certain number of iterations. On the other hand, the GPU-based computation showed very little variation in time delay per iteration, owing to the use of shared memory. The GPU-based parallel computation significantly improved the computing speed and stability of ML-EM. The developed GPU-based ML-EM algorithm could easily be modified for other imaging geometries

  6. A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems

    International Nuclear Information System (INIS)

    Ha, Woo Seok; Kim, Soo Mee; Park, Min Jae; Lee, Dong Soo; Lee, Jae Sung

    2009-01-01

The maximum likelihood-expectation maximization (ML-EM) algorithm is a statistical reconstruction method derived from a probabilistic model of the emission and detection processes. Although ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique on a GPU (graphics processing unit) for the ML-EM algorithm. Using a GeForce 9800 GTX+ graphics card and CUDA (compute unified device architecture), the projection and backprojection in the ML-EM algorithm were parallelized with NVIDIA's technology. The time delays for the projection computation, the error computation between measured and estimated data, and the backprojection in each iteration were measured. Total time included the latency of data transmission between RAM and GPU memory. The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively; in this case, the computing speed was improved about 15-fold on the GPU. When the number of iterations increased to 1024, the CPU- and GPU-based computations took 18 min and 8 sec in total, respectively. The improvement was about 135-fold and was caused by delays in the CPU-based computation after a certain number of iterations. On the other hand, the GPU-based computation showed very little variation in time delay per iteration, owing to the use of shared memory. The GPU-based parallel computation significantly improved the computing speed and stability of ML-EM. The developed GPU-based ML-EM algorithm could easily be modified for other imaging geometries
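The ML-EM update parallelized in the study above has a compact serial form: each iteration forward-projects the current image, compares it with the measured counts, and back-projects the ratio. A minimal CPU sketch (toy system matrix, not the study's geometry):

```python
import numpy as np

def mlem(A, y, n_iter=32):
    """ML-EM reconstruction: A is the system matrix (detector bins x
    image voxels), y the measured counts; returns the estimated image."""
    x = np.ones(A.shape[1])              # uniform initial image
    sens = A.sum(axis=0)                 # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x                     # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# Toy check: recover a 2-voxel image from three noise-free measurements
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
truth = np.array([2.0, 5.0])
y = A @ truth
print(mlem(A, y, n_iter=200))  # converges toward [2. 5.]
```

The GPU version in the study parallelizes the `A @ x` and `A.T @ ratio` products, which dominate the cost at realistic problem sizes.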

  7. Issues in Text Design and Layout for Computer Based Communications.

    Science.gov (United States)

    Andresen, Lee W.

    1991-01-01

    Discussion of computer-based communications (CBC) focuses on issues involved with screen design and layout for electronic text, based on experiences with electronic messaging, conferencing, and publishing within the Australian Open Learning Information Network (AOLIN). Recommendations for research on design and layout for printed text are also…

  8. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  9. Industrial application of a graphics computer-based training system

    International Nuclear Information System (INIS)

    Klemm, R.W.

    1985-01-01

Graphics Computer Based Training (GCBT) roles include drilling, tutoring, simulation, and problem solving. Of these, Commonwealth Edison uses mainly tutoring, simulation, and problem solving. These roles are not separate in any particular program; they are integrated to provide tutoring and part-task simulation, part-task simulation and problem solving, or problem-solving tutoring. Commonwealth's Graphics Computer Based Training program was the result of over a year's worth of research and planning. The keys to the program are its flexibility and control. Flexibility is maintained through stand-alone units capable of program authoring and modification for plant/site-specific users. Yet the system has the capability to support up to 31 terminals with a 40 MB hard disk drive. Control of the GCBT program is accomplished through the establishment of development priorities and a central development facility (Commonwealth Edison's Production Training Center)

  10. Computer-based liquid radioactive waste control with plant emergency and generator temperature monitoring

    International Nuclear Information System (INIS)

    Plotnick, R.J.; Schneider, M.I.; Shaffer, C.E.

    1986-01-01

    At the start of the design of the liquid radwaste control system for a nuclear generating station under construction, several serious problems were detected. The solution incorporated a new approach utilizing a computer and a blend of standard and custom software to replace the existing conventionally instrumented benchboard. The computer-based system, in addition to solving the problems associated with the benchboard design, also provided other enhancements which significantly improved the operability and reliability of the radwaste system. The functionality of the computer-based radwaste control system also enabled additional applications to be added to an expanded multitask version of the radwaste computer: 1) a Nuclear Regulatory Commission (NRC) requirement that all nuclear power plants have an emergency response facility status monitoring system; and 2) the sophisticated temperature monitoring and trending requested by the electric generator manufacturer to continue its warranty commitments. The addition of these tasks to the radwaste computer saved the cost of one or more computers that would be dedicated to these work requirements

  11. Computer holography: 3D digital art based on high-definition CGH

    International Nuclear Information System (INIS)

    Matsushima, K; Arima, Y; Nishi, H; Yamashita, H; Yoshizaki, Y; Ogawa, K; Nakahara, S

    2013-01-01

    Our recent works of high-definition computer-generated holograms (CGH) and the techniques used for the creation, such as the polygon-based method, silhouette method and digitized holography, are summarized and reviewed in this paper. The concept of computer holography is proposed in terms of integrating and crystalizing the techniques into novel digital art.

  12. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

Full Text Available Cloud computing has emerged as today's most exciting computing paradigm for providing services using a shared framework, opening a new door to solving the problems of the explosive growth of digital resource demands and their corresponding convenience. With the exponential growth in the number of data types and in data size in so-called big data work, the backbone network is under great pressure because its transmission capacity grows more slowly than the data size, which would seriously hinder the development of the network without an effective approach to this problem. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which can effectively reduce the amount of data in the network and play a basic supporting role in the development of cloud computing, is first put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme can effectively route data with a high aggregation ratio to the data center through the same routing path, so as to effectively reduce the amount of data that the network transmits. The theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network with a more balanced network load.

  13. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of the core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed and evaluated on mood data related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives. In particular, improvements are linked to the audio-based genre representation for genre-adaptive music mood analysis.

  14. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields; it helps avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose for Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  15. Standardized Computer-based Organized Reporting of EEG: SCORE

    Science.gov (United States)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C; Fuglsang-Frederiksen, Anders; Martins-da-Silva, António; Trinka, Eugen; Visser, Gerhard; Rubboli, Guido; Hjalgrim, Helle; Stefan, Hermann; Rosén, Ingmar; Zarubova, Jana; Dobesberger, Judith; Alving, Jørgen; Andersen, Kjeld V; Fabricius, Martin; Atkins, Mary D; Neufeld, Miri; Plouin, Perrine; Marusic, Petr; Pressler, Ronit; Mameniskiene, Ruta; Hopfengärtner, Rüdiger; Emde Boas, Walter; Wolf, Peter

    2013-01-01

    The electroencephalography (EEG) signal has a high complexity, and the process of extracting clinically relevant features is achieved by visual analysis of the recordings. The interobserver agreement in EEG interpretation is only moderate. This is partly due to the method of reporting the findings in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings). A working group of EEG experts took part in consensus workshops in Dianalund, Denmark, in 2010 and 2011. The faculty was approved by the Commission on European Affairs of the International League Against Epilepsy (ILAE). The working group produced a consensus proposal that went through a pan-European review process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in the clinical practice. The main elements of SCORE are the following: personal data of the patient, referral data, recording conditions, modulators, background activity, drowsiness and sleep, interictal findings, “episodes” (clinical or subclinical events), physiologic patterns, patterns of uncertain significance, artifacts, polygraphic channels, and diagnostic significance. The following specific aspects of the neonatal EEGs are scored: alertness, temporal organization, and spatial organization. For each EEG finding, relevant features are scored using predefined terms. Definitions are provided for all EEG terms and features. SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make

  16. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  17. [Problem list in computer-based patient records].

    Science.gov (United States)

    Ludwig, C A

    1997-01-14

    Computer-based clinical information systems are capable of effectively processing even large amounts of patient-related data. However, physicians depend on rapid access to summarized, clearly laid out data on the computer screen to inform themselves about a patient's current clinical situation. In introducing a clinical workplace system, we therefore transformed the problem list-which for decades has been successfully used in clinical information management-into an electronic equivalent and integrated it into the medical record. The table contains a concise overview of diagnoses and problems as well as related findings. Graphical information can also be integrated into the table, and an additional space is provided for a summary of planned examinations or interventions. The digital form of the problem list makes it possible to use the entire list or selected text elements for generating medical documents. Diagnostic terms for medical reports are transferred automatically to corresponding documents. Computer technology has an immense potential for the further development of problem list concepts. With multimedia applications, sound and images will be included in the problem list. Through hyperlinks, the problem list could become a central information board and table of contents of the medical record, thus serving as the starting point for database searches and supporting the user in navigating through the medical record.

  18. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, then the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This may result in inefficient resource utilization and an increase in the turnaround time for the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (or live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.
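
    The steering idea described above (inspecting and adjusting a live simulation between steps instead of resubmitting a batch job) can be sketched without VisIt itself. The toy loop below drains a control queue once per step; the command name and the avalanche-gain stand-in are invented for illustration and are not from the paper or the VisIt API.

```python
import queue

def run_steerable_simulation(n_steps, control_queue, voltage=300.0):
    """Toy steerable simulation loop: between steps, drain a control queue
    so a live viewer/steering client can adjust parameters mid-run."""
    history = []
    for step in range(n_steps):
        # Apply any pending steering commands (e.g. from a VisIt-like client).
        while True:
            try:
                cmd, value = control_queue.get_nowait()
            except queue.Empty:
                break
            if cmd == "set_voltage":
                voltage = value
        gain = 10 ** (voltage / 100.0)  # stand-in for an avalanche-gain model
        history.append((step, voltage, gain))
    return history

ctrl = queue.Queue()
ctrl.put(("set_voltage", 350.0))  # steering command issued "mid-run"
hist = run_steerable_simulation(3, ctrl)
```

In a real deployment the queue would be fed over a socket by the visualization client, so a mis-set detector voltage can be corrected without losing the job's place in the batch queue.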

  19. NOSTOS: a paper-based ubiquitous computing healthcare environment to support data capture and collaboration.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2003-01-01

    In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.

  20. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  1. Feasibility of Computer-Based Videogame Therapy for Children with Cerebral Palsy

    Science.gov (United States)

    Radtka, Sandra; Hone, Robert; Brown, Charles; Mastick, Judy; Melnick, Marsha E.

    2013-01-01

    Abstract Objectives Standing and gait balance problems are common in children with cerebral palsy (CP), resulting in falls and injuries. Task-oriented exercises to strengthen and stretch muscles that shift the center of mass and change the base of support are effective in improving balance. Gaming environments can be challenging and fun, encouraging children to engage in exercises at home. The aims of this project were to demonstrate the technical feasibility, ease of use, appeal, and safety of a computer-based videogame program designed to improve balance in children with CP. Materials and Methods This study represents a close collaboration between computer design and clinical team members. The first two phases were performed in the laboratory, and the final phase was done in subjects' homes. The prototype balance game was developed using computer-based real-time three-dimensional programming that enabled the team to capture engineering data necessary to tune the system. Videogame modifications, including identifying compensatory movements, were made in an iterative fashion based on feedback from subjects and observations of clinical and software team members. Results Subjects (n=14) scored the game 21.5 out of 30 for ease of use and appeal, 4.0 out of 5 for enjoyment, and 3.5 on comprehension. There were no safety issues, and the games performed without technical flaws in final testing. Conclusions A computer-based videogame incorporating therapeutic movements to improve gait and balance in children with CP was appealing and feasible for home use. A follow-up study examining its effectiveness in improving balance in children with CP is recommended. PMID:24761324

  2. Computer-based training at Sellafield

    International Nuclear Information System (INIS)

    Cartmell, A.; Evans, M.C.

    1986-01-01

    British Nuclear Fuel Limited (BNFL) operate the United Kingdom's spent-fuel receipt, storage, and reprocessing complex at Sellafield. Spent fuel from graphite-moderated CO2-cooled Magnox reactors has been reprocessed at Sellafield for 22 yr. Spent fuel from light water and advanced gas reactors is stored pending reprocessing in the Thermal Oxide Reprocessing Plant currently being constructed. The range of knowledge and skills needed for plant operation, construction, and commissioning represents a formidable training requirement. In addition, employees need to be acquainted with company practices and procedures. Computer-based training (CBT) is expected to play a significant role in this process. In this paper, current applications of CBT to the field of nuclear criticality safety are described and plans for the immediate future are outlined

  3. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  4. Can Dictionary-based Computational Models Outperform the Best Linear Ones?

    Czech Academy of Sciences Publication Activity Database

    Gnecco, G.; Kůrková, Věra; Sanguineti, M.

    2011-01-01

    Roč. 24, č. 8 (2011), s. 881-887 ISSN 0893-6080 R&D Projects: GA MŠk OC10047 Grant - others:CNR - AV ČR project 2010-2012(XE) Complexity of Neural-Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords: dictionary-based approximation * linear approximation * rates of approximation * worst-case error * Kolmogorov width * perceptron networks Subject RIV: IN - Informatics, Computer Science Impact factor: 2.182, year: 2011

  5. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, and in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  6. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, and in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  7. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, and in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series. The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  8. Plancton: an opportunistic distributed computing project based on Docker containers

    Science.gov (United States)

    Concas, Matteo; Berzano, Dario; Bagnasco, Stefano; Lusso, Stefano; Masera, Massimo; Puccio, Maximiliano; Vallero, Sara

    2017-10-01

    The computing power of most modern commodity computers is far from being fully exploited by standard usage patterns. In this work we describe the development and setup of a virtual computing cluster based on Docker containers used as worker nodes. The facility is based on Plancton: a lightweight fire-and-forget background service. Plancton spawns and controls a local pool of Docker containers on a host with free resources, by constantly monitoring its CPU utilisation. It is designed to release the resources allocated opportunistically, whenever another demanding task is run by the host user, according to configurable policies. This is attained by killing a number of running containers. One of the advantages of a thin virtualization layer such as Linux containers is that they can be started almost instantly upon request. We will show how the fast start-up and disposal of containers eventually enables us to implement an opportunistic cluster based on Plancton daemons without a central control node, where the spawned Docker containers behave as job pilots. Finally, we will show how Plancton was configured to run up to 10 000 concurrent opportunistic jobs on the ALICE High-Level Trigger facility, giving a considerable advantage in terms of management compared to virtual machines.
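
    The spawn-or-kill behaviour described above (grow the container pool while the host is idle, release resources when the host gets busy) can be sketched as a tiny policy function. The thresholds, the pool cap, and the blunt "kill everything when busy" rule are assumptions for illustration, not Plancton's actual configurable policies.

```python
def plancton_policy(cpu_util, n_running, cpu_threshold=0.75, max_containers=8):
    """Decide how many worker containers to spawn (+n) or kill (-n),
    given the host's CPU utilisation and the current pool size."""
    if cpu_util >= cpu_threshold:
        # Host user is running a demanding task: release all
        # opportunistically allocated containers.
        return -n_running
    # Host is idle: grow the pool one container at a time, up to the cap.
    return 1 if n_running < max_containers else 0

# A daemon would call this in a loop, e.g. once per monitoring interval,
# and translate the result into `docker run` / `docker kill` operations.
```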

  9. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerators, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are being provided, but these instruments are universally-linkable to any appropriate machine variable

  10. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design secure data access for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme for cloud computing; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.
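
    The five phases listed in the abstract can be mocked up as a toy flow. Note this sketch uses ordinary HMAC/hashing, not real pairing-based identity-based encryption, and every function name and construction here is an illustrative assumption rather than the paper's scheme: its only purpose is to show how an identity-derived key and a biometric template check combine into one access decision.

```python
import hashlib
import hmac
import os

def setup():
    """Parameter setup: the authority generates a master secret."""
    return os.urandom(32)

def extract_key(master_secret, identity):
    """Key distribution: derive a user's private key from their identity
    (in real IBE this is done with bilinear pairings, not HMAC)."""
    return hmac.new(master_secret, identity.encode(), hashlib.sha256).digest()

def enroll_template(biometric_features):
    """Feature template creation: the cloud stores only a digest of the
    user's biometric feature vector, never the raw features."""
    return hashlib.sha256(repr(sorted(biometric_features)).encode()).digest()

def access_allowed(master_secret, identity, presented_key, template, features):
    """Secure data access control: require BOTH the identity-derived key
    and a matching biometric sample."""
    expected = extract_key(master_secret, identity)
    return (hmac.compare_digest(presented_key, expected)
            and enroll_template(features) == template)
```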

  11. Computational Methods to Assess the Production Potential of Bio-Based Chemicals.

    Science.gov (United States)

    Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M; Herrgård, Markus J

    2018-01-01

    Elevated costs and long implementation times of bio-based processes for producing chemicals represent a bottleneck for moving to a bio-based economy. A prospective analysis able to elucidate economically and technically feasible product targets at early research phases is mandatory. Computational tools can be implemented to explore the biological and technical spectrum of feasibility, while constraining the operational space for desired chemicals. In this chapter, two different computational tools for assessing potential for bio-based production of chemicals from different perspectives are described in detail. The first tool is GEM-Path: an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and guide the policy makers with decisions, as we progress toward planning a sustainable chemical industry.

  12. Computer-Based Job and Occupational Data Collection Methods: Feasibility Study

    National Research Council Canada - National Science Library

    Mitchell, Judith I

    1998-01-01

    The feasibility study was conducted to assess the operational and logistical problems involved with the development, implementation, and evaluation of computer-based job and occupational data collection methods...

  13. Development of an Evaluation Method for the Design Complexity of Computer-Based Displays

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Ju; Lee, Seung Woo; Kang, Hyun Gook; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The importance of the design of human machine interfaces (HMIs) for human performance and the safety of process industries has long been recognized. Especially, in the case of nuclear power plants (NPPs), HMIs have significant implications for the safety of the NPPs because poor HMIs can impair the decision making ability of human operators. In order to support and increase the decision making ability of human operators, advanced HMIs based on up-to-date computer technology are provided. Human operators in an advanced main control room (MCR) acquire information through video display units (VDUs) and a large display panel (LDP), which is required for the operation of NPPs. These computer-based displays contain a huge amount of information and present it with a variety of formats compared to those of a conventional MCR. For example, these displays contain more display elements such as abbreviations, labels, icons, symbols, coding, etc. As computer-based displays contain more information, the complexity of advanced displays becomes greater due to less distinctiveness of each display element. A greater understanding is emerging about the effectiveness of designs of computer-based displays, including how distinctively display elements should be designed. This study covers the early phase in the development of an evaluation method for the design complexity of computer-based displays. To this end, a series of existing studies was reviewed to suggest an appropriate concept suitable for addressing this problem

  14. Image communication scheme based on dynamic visual cryptography and computer generated holography

    Science.gov (United States)

    Palevicius, Paulius; Ragulskis, Minvydas

    2015-01-01

    Computer generated holograms are often exploited to implement optical encryption schemes. This paper proposes the integration of dynamic visual cryptography (an optical technique based on the interplay of visual cryptography and time-averaging geometric moiré) with the Gerchberg-Saxton algorithm. A stochastic moiré grating is used to embed the secret into a single cover image. The secret can be visually decoded by the naked eye only if the amplitude of harmonic oscillations corresponds to an accurately preselected value. The proposed visual image encryption scheme is based on computer generated holography, optical time-averaging moiré and principles of dynamic visual cryptography. Dynamic visual cryptography is used both for the initial encryption of the secret image and for the final decryption. Phase data of the encrypted image are computed by using the Gerchberg-Saxton algorithm. The optical image is decrypted using the computationally reconstructed field of amplitudes.
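
    The phase-retrieval step named in the abstract, the Gerchberg-Saxton algorithm, iterates between two planes, enforcing a known amplitude in each while keeping the phase. A minimal two-plane NumPy sketch is below; the grid size, iteration count, and target pattern are illustrative choices, not parameters from the paper.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, n_iter=200, seed=0):
    """Retrieve a phase mask whose far-field amplitude (via FFT)
    approximates target_amplitude, starting from a random phase."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    source_amplitude = np.ones_like(target_amplitude)  # uniform illumination
    for _ in range(n_iter):
        field = source_amplitude * np.exp(1j * phase)
        far = np.fft.fft2(field)                             # to image plane
        far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                             # back to hologram plane
        phase = np.angle(near)                               # keep only the phase
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * phase)))  # reconstructed amplitude
```

Each iteration swaps in the known amplitude at one plane while preserving the propagated phase, so the reconstruction error is non-increasing; the returned `phase` is the hologram data the encryption scheme would store.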

  15. Location-Based Services and Privacy Protection Under Mobile Cloud Computing

    OpenAIRE

    Yan, Yan; Xiaohong, Hao; Wanjun, Wang

    2015-01-01

    Location-based services can provide personalized services based on the location information of moving objects and have already been widely used in public safety services, transportation, entertainment and many other areas. With the rapid development of mobile communication technology and the popularization of intelligent terminals, there are great commercial prospects for providing location-based services in a mobile cloud computing environment. However, the high adhesion degree of mobile terminals...

  16. VMEbus based computer and real-time UNIX as infrastructure of DAQ

    International Nuclear Information System (INIS)

    Yasu, Y.; Fujii, H.; Nomachi, M.; Kodama, H.; Inoue, E.; Tajima, Y.; Takeuchi, Y.; Shimizu, Y.

    1994-01-01

    This paper describes what the authors have constructed as the infrastructure of a data acquisition system (DAQ). The paper reports recent developments concerning the HP VME board computer with LynxOS (HP742rt/HP-RT) and Alpha/OSF1 with a VMEbus adapter. The paper also reports the current status of developing a Benchmark Suite for Data Acquisition (DAQBENCH), for measuring not only the performance of VME/CAMAC access but also that of context switching, inter-process communications and so on, for various computers including Workstation-based systems and VME board computers

  17. Quantum computing based on space states without charge transfer

    International Nuclear Information System (INIS)

    Vyurkov, V.; Filippov, S.; Gorelik, L.

    2010-01-01

    An implementation of a quantum computer based on space states in double quantum dots is discussed. There is no charge transfer in qubits during a calculation, therefore, uncontrolled entanglement between qubits due to long-range Coulomb interaction is suppressed. Encoding and processing of quantum information is merely performed on symmetric and antisymmetric states of the electron in double quantum dots. Other plausible sources of decoherence caused by interaction with phonons and gates could be substantially suppressed in the structure as well. We also demonstrate how all necessary quantum logic operations, initialization, writing, and read-out could be carried out in the computer.

  18. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    Krammen, J.

    1985-01-01

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  19. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst and at ...

  20. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, based on an analysis of the characteristics and defects of the genetic algorithm and the support vector machine. In a cloud computing environment, the SVM parameters are first optimized by the parallel genetic algorithm, and then this optimized parallel SVM model is used to predict traffic flow. On the basis of the traffic flow data of Haizhu District in Guangzhou City, the proposed model was verified and compared with the serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
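
    The GA half of a GA-SVM model can be sketched with a tiny real-coded genetic algorithm searching over (log10 C, log10 gamma). To keep the sketch self-contained, a smooth surrogate function stands in for cross-validated SVM accuracy; the operators, rates, bounds, and surrogate are all assumptions for illustration, not the paper's settings.

```python
import random

def ga_optimize(fitness, bounds, pop_size=30, generations=60, mut_rate=0.2, seed=1):
    """Tiny real-coded GA: tournament selection, arithmetic crossover,
    Gaussian mutation. Returns the best parameter vector found."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                 # tournament of two
            p1 = a if fitness(a) >= fitness(b) else b
            a, b = rng.sample(pop, 2)
            p2 = a if fitness(a) >= fitness(b) else b
            child = [(x + y) / 2 for x, y in zip(p1, p2)]  # arithmetic crossover
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mut_rate:           # Gaussian mutation, clipped
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0, (hi - lo) * 0.1)))
            nxt.append(child)
        pop = nxt
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
    return best

# Stand-in for cross-validated SVM accuracy over (log10 C, log10 gamma);
# a real GA-SVM would train and score an SVM here (the expensive step the
# paper parallelizes across cloud nodes).
def surrogate_accuracy(p):
    log_c, log_gamma = p
    return 1.0 - 0.05 * ((log_c - 1.0) ** 2 + (log_gamma + 2.0) ** 2)

best = ga_optimize(surrogate_accuracy, bounds=[(-3, 3), (-5, 1)])
```

In the paper's setup, the fitness evaluations of one generation are independent, which is exactly what makes the GA easy to parallelize on cloud workers.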

  1. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    Science.gov (United States)

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs causes a range of distressing symptoms and functional problems that could include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with a therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used in standing, and thus graded balance demands using a sponge pad can be incorporated into the program. A pre- and postintervention case series study was conducted with nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).

  2. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.

  3. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.
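The Greenberger-Horne-Zeilinger paradox invoked above can be checked numerically. This sketch builds the three-qubit Pauli observables with plain-Python Kronecker products (no external libraries) and evaluates their expectation values on the GHZ state.

```python
# Plain-Python Kronecker products and expectation values.
def kron(A, B):
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def expval(M, v):
    w = matvec(M, v)
    return sum(v[i].conjugate() * w[i] for i in range(len(v))).real

X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]

def obs(p1, p2, p3):
    return kron(kron(p1, p2), p3)

# GHZ state (|000> + |111>) / sqrt(2)
ghz = [2 ** -0.5, 0, 0, 0, 0, 0, 0, 2 ** -0.5]

# Quantum mechanics gives <XXX> = +1 while <XYY> = <YXY> = <YYX> = -1.
# Any local hidden-variable assignment forces the product of the four
# outcomes to be +1, yet quantum mechanics yields -1: the GHZ paradox.
xxx = expval(obs(X, X, X), ghz)
xyy = expval(obs(X, Y, Y), ghz)
```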

  4. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  5. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  6. The impact of computer-based versus "traditional" textbook science instruction on selected student learning outcomes

    Science.gov (United States)

    Rothman, Alan H.

    This study reports the results of research designed to examine the impact of computer-based science instruction on elementary school students' science content achievement, their attitude toward science learning, their level of critical thinking-inquiry skills, and their level of cognitive and English language development. The study compared these learning outcomes under a computer-based approach with those under a traditional, textbook-based approach to science instruction. The computer-based approach was inherent in a curriculum titled The Voyage of the Mimi, published by The Bank Street College Project in Science and Mathematics (1984). The study sample included 209 fifth-grade students enrolled in three schools in a suburban school district. This sample was divided into three groups, each receiving one of the following instructional treatments: (a) Mixed: instruction primarily based on a hardcopy textbook in conjunction with computer-based instructional materials as one component of the science course; (b) Non-Traditional, Technology-Based: instruction fully utilizing computer-based material; and (c) Traditional, Textbook-Based: instruction utilizing only the textbook as the basis for instruction. Pre-test (pre-treatment) data related to each of the student learning outcomes were collected at the beginning of the school year, and post-test data were collected at the end of the school year. Statistical analyses of pre-test data were used as a covariate to account for possible pre-existing differences among the three student groups with regard to the variables examined. This study concluded that non-traditional, computer-based instruction in science significantly improved students' attitudes toward science learning and their level of English language development. Non-significant, positive trends were found for the following student learning outcomes: overall science achievement and development of critical thinking

  7. Rediscovering the Economics of Keynes in an Agent-Based Computational Setting

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    The aim of this paper is to use agent-based computational economics to explore the economic thinking of Keynes. Taking his starting point at the macroeconomic level, Keynes argued that economic systems are characterized by fundamental uncertainty - an uncertainty that makes rule-based behaviour...... and reliance on monetary magnitudes more optimal to the economic agent than profit and utility optimization in the traditional sense. Unfortunately, more systematic studies of the properties of such a system were not possible at the time of Keynes. The system envisioned by Keynes holds many properties...... in common with what we today call complex dynamic systems, and today we may apply the method of agent-based computational economics to the ideas of Keynes. The presented agent-based Keynesian model demonstrates, as argued by Keynes, that the economy can self-organize without relying on price movement...

  8. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing over 61 million Euros and more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl intended for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a custom heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measurement to account for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.

  9. Promoting healthy computer use among middle school students: a pilot school-based health promotion program.

    Science.gov (United States)

    Ciccarelli, Marina; Portsmouth, Linda; Harris, Courtenay; Jacobs, Karen

    2012-01-01

    Introduction of notebook computers in many schools has become integral to learning. This has increased students' screen-based exposure and the potential risks to physical and visual health. Unhealthy computing behaviours include frequent and long durations of exposure; awkward postures due to inappropriate furniture and workstation layout; and ignoring computer-related discomfort. This paper describes the framework for a planned school-based health promotion program to encourage healthy computing behaviours among middle school students. The planned program uses a community-based participatory research approach. Students in Year 7 in 2011 at a co-educational middle school, their parents, and teachers have been recruited. Baseline data were collected on students' knowledge of computer ergonomics, their current notebook exposure, and their attitudes towards healthy computing behaviours, as well as on teachers' self-perceived competence to promote healthy notebook use among students and the education they wanted. The health promotion program is being developed by an inter-professional team in collaboration with students, teachers and parents to embed concepts of ergonomics education in relevant school activities and school culture. End-of-year changes in reported and observed student computing behaviours will be used to determine the effectiveness of the program. Building a body of evidence regarding physical health benefits to students from this school-based ergonomics program can guide policy development on the healthy use of computers within children's educational environments.

  10. Metaheuristic Based Scheduling Meta-Tasks in Distributed Heterogeneous Computing Systems

    Directory of Open Access Journals (Sweden)

    Hesam Izakian

    2009-07-01

    Full Text Available Scheduling is a key problem in distributed heterogeneous computing systems: it is needed to benefit from the large computing capacity of such systems, and it is NP-complete. In this paper, we present a metaheuristic technique, namely the Particle Swarm Optimization (PSO) algorithm, for this problem. PSO is a population-based search algorithm based on the simulation of the social behavior of bird flocking and fish schooling. Particles fly through the problem search space to find optimal or near-optimal solutions. The scheduler aims at minimizing makespan, the completion time of the latest task. Experimental studies show that the proposed method is more efficient and surpasses previously reported PSO and GA approaches for this problem.
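A makespan-minimising PSO scheduler of this kind can be sketched on a toy instance. The expected-time-to-compute matrix `ETC` below is made up for illustration, and continuous particle positions are rounded to machine indices (one common discretization for meta-task scheduling, not necessarily the paper's exact encoding).

```python
import random

# ETC[i][j]: expected time to compute task i on machine j (made-up numbers).
ETC = [[4, 2, 8], [3, 7, 5], [6, 1, 2], [5, 4, 3], [2, 6, 4]]
N_TASKS, N_MACH = len(ETC), len(ETC[0])

def makespan(assign):
    loads = [0.0] * N_MACH
    for task, mach in enumerate(assign):
        loads[mach] += ETC[task][mach]
    return max(loads)

def decode(x):
    # Round each continuous coordinate to a valid machine index.
    return [min(N_MACH - 1, max(0, int(round(v)))) for v in x]

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    rng = random.Random(seed)
    X = [[rng.uniform(0, N_MACH - 1) for _ in range(N_TASKS)]
         for _ in range(n_particles)]
    V = [[0.0] * N_TASKS for _ in range(n_particles)]
    pbest = [list(x) for x in X]
    pcost = [makespan(decode(x)) for x in X]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = list(pbest[g]), pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(N_TASKS):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            cost = makespan(decode(X[i]))
            if cost < pcost[i]:
                pbest[i], pcost[i] = list(X[i]), cost
                if cost < gcost:
                    gbest, gcost = list(X[i]), cost
    return decode(gbest), gcost

schedule, best_makespan = pso()
```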

  11. MONOMIALS AND BASIN CYLINDERS FOR NETWORK DYNAMICS.

    Science.gov (United States)

    Austin, Daniel; Dinwoodie, Ian H

    We describe methods to identify cylinder sets inside a basin of attraction for Boolean dynamics of biological networks. Such sets are used for designing regulatory interventions that make the system evolve towards a chosen attractor, for example initiating apoptosis in a cancer cell. We describe two algebraic methods for identifying cylinders inside a basin of attraction, one based on the Groebner fan that finds monomials that define cylinders and the other on primary decomposition. Both methods are applied to current examples of gene networks.
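The notion of a cylinder set inside a basin of attraction can be illustrated on a toy Boolean network (the update rules below are hypothetical, not taken from the paper): fix some coordinates, enumerate the remaining free states, and check that they all evolve to the same attractor.

```python
from itertools import product

# Hypothetical 3-gene Boolean network with synchronous updates
# (illustrative rules, not from the paper).
def step(state):
    a, b, c = state
    return (a and not c, a or b, b)

def attractor_of(state):
    # Follow the trajectory until it revisits a state; return the cycle.
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return frozenset(seen[seen.index(state):])

def cylinder(fixed, n=3):
    # All states agreeing with the fixed coordinates (index -> value).
    return [s for s in product((False, True), repeat=n)
            if all(s[i] == v for i, v in fixed.items())]

# Is the cylinder {a=1, b=1} contained in a single basin of attraction?
target = attractor_of((True, True, False))
inside = all(attractor_of(s) == target for s in cylinder({0: True, 1: True}))
```

Here every state of the cylinder reaches the same fixed point, so an intervention clamping genes a and b would drive the system into that attractor; the paper's Groebner-fan and primary-decomposition methods find such cylinders algebraically rather than by enumeration.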

  12. Computer-based irrigation scheduling for cotton crop

    International Nuclear Information System (INIS)

    Laghari, K.Q.; Memon, H.M.

    2008-01-01

    In this study, a real-time irrigation schedule for the cotton crop was tested using the Mehran model, a computer-based DSS (Decision Support System). The irrigation schedule was set on a selected MAD (Management Allowable Depletion) and the current root depth position. A total of 451 mm of irrigation water was applied to the crop field. The seasonal computed crop ET (Evapotranspiration) was estimated at 421.32 mm, while the observed actual ET (ET/sub ca/) was 413 mm; the model over-estimated seasonal ET by only 1.94%. WUE (Water Use Efficiency) for seed-cotton reached 6.59 kg (ha mm)/sup -1/. The statistical analysis (R/sup 2/=0.96, ARE%=2.00, T=1.17 and F=550.57) showed good performance of the model in terms of simulated versus observed ET values. The Mehran model is quite versatile for irrigation scheduling and can be successfully used as an irrigation DSS tool for various crop types. (author)
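The MAD-based scheduling rule can be sketched as a simple trigger: irrigate once cumulative depletion exceeds the allowable fraction of total available water over the current root depth. The numbers below (TAW, MAD, and the daily crop ET series) are illustrative, not the study's values.

```python
# Irrigate when cumulative depletion exceeds MAD * TAW, where TAW is the
# total available water over the current root depth. TAW, MAD and the
# daily crop ET series are illustrative, not taken from the study.
def schedule_irrigation(daily_et_mm, taw_mm=100.0, mad=0.5):
    depletion, events = 0.0, []
    for day, et in enumerate(daily_et_mm, start=1):
        depletion += et
        if depletion > mad * taw_mm:           # allowable depletion exceeded
            events.append((day, depletion))    # irrigate back to field capacity
            depletion = 0.0
    return events

events = schedule_irrigation([6.0] * 20)  # constant 6 mm/day crop ET
```

With these numbers the trigger fires every ninth day, once roughly 54 mm has been depleted against the 50 mm allowable limit.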

  13. What Does Research on Computer-Based Instruction Have to Say to the Reading Teacher?

    Science.gov (United States)

    Balajthy, Ernest

    1987-01-01

    Examines questions typically asked about the effectiveness of computer-based reading instruction, suggesting that these questions must be refined to provide meaningful insight into the issues involved. Describes several critical problems with existing research and presents overviews of research on the effects of computer-based instruction on…

  14. Computer-based versus in-person interventions for preventing and reducing stress in workers.

    Science.gov (United States)

    Kuster, Anootnara Talkul; Dalsbø, Therese K; Luong Thanh, Bao Yen; Agarwal, Arnav; Durand-Moreau, Quentin V; Kirkehei, Ingvild

    2017-08-30

    Chronic exposure to stress has been linked to several negative physiological and psychological health outcomes. Among employees, stress and its associated effects can also result in productivity losses and higher healthcare costs. In-person (face-to-face) and computer-based (web- and mobile-based) stress management interventions have been shown to be effective in reducing stress in employees compared to no intervention. However, it is unclear if one form of intervention delivery is more effective than the other. It is conceivable that computer-based interventions are more accessible, convenient, and cost-effective. To compare the effects of computer-based interventions versus in-person interventions for preventing and reducing stress in workers. We searched CENTRAL, MEDLINE, PubMed, Embase, PsycINFO, NIOSHTIC, NIOSHTIC-2, HSELINE, CISDOC, and two trials registers up to February 2017. We included randomised controlled studies that compared the effectiveness of a computer-based stress management intervention (using any technique) with a face-to-face intervention that had the same content. We included studies that measured stress or burnout as an outcome, and used workers from any occupation as participants. Three authors independently screened and selected 75 unique studies for full-text review from 3431 unique reports identified from the search. We excluded 73 studies based on full-text assessment. We included two studies. Two review authors independently extracted stress outcome data from the two included studies. We contacted study authors to gather additional data. We used standardised mean differences (SMDs) with 95% confidence intervals (CIs) to report study results. We did not perform meta-analyses due to variability in the primary outcome and considerable statistical heterogeneity. We used the GRADE approach to rate the quality of the evidence. Two studies met the inclusion criteria, including a total of 159 participants in the included arms of the studies

  15. A Cost–Effective Computer-Based, Hybrid Motorised and Gravity ...

    African Journals Online (AJOL)

    A Cost–Effective Computer-Based, Hybrid Motorised and Gravity-Driven Material Handling System for the Mauritian Apparel Industry. ... Thus, many companies are investing significantly in a Research & Development department in order to design new techniques to improve worker's efficiency, and to decrease the amount ...

  16. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  17. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  18. The Accuracy of Cognitive Monitoring during Computer-Based Instruction.

    Science.gov (United States)

    Garhart, Casey; Hannafin, Michael J.

    This study was conducted to determine the accuracy of learners' comprehension monitoring during computer-based instruction and to assess the relationship between enroute monitoring and different levels of learning. Participants were 50 university undergraduate students enrolled in an introductory educational psychology class. All students received…

  19. Current advances on polynomial resultant formulations

    Science.gov (United States)

    Sulaiman, Surajo; Aris, Nor'aini; Ahmad, Shamsatun Nahar

    2017-08-01

    The availability of computer algebra systems (CAS) has led to the resurrection of the resultant method for eliminating one or more variables from a polynomial system. The resultant matrix method has advantages over the Groebner basis and Ritt-Wu methods, whose complexity and storage requirements are high. This paper focuses on current resultant matrix formulations and investigates their ability, or otherwise, to produce optimal resultant matrices. A determinantal formula that gives the exact resultant, or a formulation that minimizes the presence of extraneous factors, is often sought when the conditions for its existence can be determined. We present some applications of elimination theory via resultant formulations, and examples are given to explain each of the presented settings.
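The resultant matrices the paper surveys generalize the classical Sylvester matrix. As a minimal, self-contained sketch, the code below builds the Sylvester matrix of two univariate polynomials and evaluates its determinant exactly over the rationals; a nonzero resultant certifies that the polynomials share no common root.

```python
from fractions import Fraction

def sylvester(f, g):
    # f, g: coefficient lists, highest degree first.
    m, n = len(f) - 1, len(g) - 1
    size = m + n
    rows = []
    for i in range(n):                          # n shifted copies of f
        rows.append([Fraction(0)] * i + [Fraction(c) for c in f]
                    + [Fraction(0)] * (size - m - 1 - i))
    for i in range(m):                          # m shifted copies of g
        rows.append([Fraction(0)] * i + [Fraction(c) for c in g]
                    + [Fraction(0)] * (size - n - 1 - i))
    return rows

def det(matrix):
    # Exact Gaussian elimination over the rationals.
    mat = [row[:] for row in matrix]
    n, sign, d = len(mat), 1, Fraction(1)
    for col in range(n):
        pivot = next((r for r in range(col, n) if mat[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            mat[col], mat[pivot] = mat[pivot], mat[col]
            sign = -sign
        d *= mat[col][col]
        for r in range(col + 1, n):
            factor = mat[r][col] / mat[col][col]
            for c in range(col, n):
                mat[r][c] -= factor * mat[col][c]
    return sign * d

# Res_x(x^2 - 1, x - 2) is nonzero, so the polynomials share no root;
# replacing x - 2 by x - 1 yields 0, detecting the common root x = 1.
res = det(sylvester([1, 0, -1], [1, -2]))
```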

  20. A Multi-agent Supply Chain Information Coordination Mode Based on Cloud Computing

    OpenAIRE

    Wuxue Jiang; Jing Zhang; Junhuai Li

    2013-01-01

    In order to improve the efficiency and security of supply chain information coordination in a cloud computing environment, this paper proposes a supply chain information coordination mode based on cloud computing. This mode has two basic statuses: online and offline. In the online status, the cloud computing center is responsible for coordinating information across the whole supply chain. In the offline status, information exchange can be realized among different nodes by u...

  1. Solid-state nuclear-spin quantum computer based on magnetic resonance force microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Doolen, G. D.; Hammel, P. C.; Tsifrinovich, V. I.

    2000-01-01

    We propose a nuclear-spin quantum computer based on magnetic resonance force microscopy (MRFM). It is shown that an MRFM single-electron spin measurement provides three essential requirements for quantum computation in solids: (a) preparation of the ground state, (b) one- and two-qubit quantum logic gates, and (c) a measurement of the final state. The proposed quantum computer can operate at temperatures up to 1 K. (c) 2000 The American Physical Society

  2. Hanford general employee training: Computer-based training instructor's manual

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-01

    The Computer-Based Training portion of the Hanford General Employee Training course is designed to be used in a classroom setting with a live instructor. Future references to "this course" refer only to the computer-based portion of the whole. This course covers the basic Safety, Security, and Quality issues that pertain to all employees of Westinghouse Hanford Company. The topics covered were taken from the recommendations and requirements for General Employee Training as set forth by the Institute of Nuclear Power Operations (INPO) in INPO 87-004, Guidelines for General Employee Training, applicable US Department of Energy orders, and Westinghouse Hanford Company procedures and policy. Besides presenting fundamental concepts, this course also contains information on resources that are available to assist students. It does this using Interactive Videodisk technology, which combines computer-generated text and graphics with audio and video provided by a videodisk player.

  3. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  4. Many-core computing for space-based stereoscopic imaging

    Science.gov (United States)

    McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry

    The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor, created by Intel Labs as a platform for many-core software research, provided with a high-speed on-chip network for sharing information along with advanced power management technologies and support for message-passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC are presented, showing the potential benefits of utilizing the SCC in particular, and any many-core platforms in general for real-time processing of visual-based satellite proximity operations missions.
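The Energy-Delay-Product metric reported above is simply energy (average power times runtime) weighted again by runtime. A sketch with hypothetical power and runtime figures (not measurements from the paper) shows why a slower but low-power platform can still win on energy and EDP.

```python
# Energy = average power * runtime; EDP = energy * runtime.
# The power and runtime figures below are hypothetical, not
# measurements from the paper.
def metrics(power_w, runtime_s):
    energy_j = power_w * runtime_s
    return {"time_s": runtime_s, "energy_j": energy_j,
            "edp": energy_j * runtime_s}

scc = metrics(power_w=25.0, runtime_s=8.0)   # hypothetical 48-core SCC run
pc = metrics(power_w=95.0, runtime_s=6.0)    # hypothetical commercial PC run
# The PC finishes sooner, but the SCC wins on both energy and EDP.
```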

  5. An IPMI-based slow control system for the PANDA compute node

    Energy Technology Data Exchange (ETDEWEB)

    Galuska, Martin; Gessler, Thomas; Kuehn, Wolfgang; Lang, Johannes; Lange, Jens Soeren; Liang, Yutie; Liu, Ming; Spruck, Bjoern; Wang, Qiang [II. Physikalisches Institut, Justus-Liebig-Universitaet Giessen (Germany); Collaboration: PANDA-Collaboration

    2011-07-01

    Reaction rates of 10-20 MHz from antiproton-proton collisions are expected for the PANDA experiment at FAIR, leading to a raw data output rate of up to 200 GB/s. A sophisticated data acquisition system is needed in order to select physically relevant events online. A network of FPGA-based Compute Nodes will be used for this purpose. An AdvancedTCA shelf provides the infrastructure for up to 14 Compute Nodes. A Shelf Manager supervises system health and regulates power distribution and temperature. It relies on a local controller on each Compute Node to relay sensor readings, provide power requirements, etc. This makes remote management of the entire system possible. An IPM Controller based on an Atmel microcontroller was designed for this purpose, and a prototype was produced. The necessary firmware is being developed to allow interaction with the components of the Compute Node and the Shelf Manager in conformance with the AdvancedTCA specification. A set of basic mandatory functions was implemented that can be extended easily. An improved version of the controller is in development. An overview of the intended functions of the controller and a status report will be given.

  6. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses

    Science.gov (United States)

    Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-01

    Background Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. Objective The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Methods Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Results Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however
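The standardized mean differences pooled in these meta-analyses are, in their simplest form, mean differences divided by a pooled standard deviation. A minimal sketch follows (illustrative trial-arm numbers, no small-sample correction).

```python
import math

# Cohen's d with a pooled standard deviation (no small-sample
# correction); the trial-arm numbers below are illustrative only.
def smd(mean1, sd1, n1, mean2, sd2, n2):
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

d = smd(75.0, 10.0, 50, 70.0, 10.0, 50)  # hypothetical knowledge scores
```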

  7. Features of Computer-Based Decision Aids: Systematic Review, Thematic Synthesis, and Meta-Analyses.

    Science.gov (United States)

    Syrowatka, Ania; Krömker, Dörthe; Meguerditchian, Ari N; Tamblyn, Robyn

    2016-01-26

    Patient information and education, such as decision aids, are gradually moving toward online, computer-based environments. Considerable research has been conducted to guide content and presentation of decision aids. However, given the relatively new shift to computer-based support, little attention has been given to how multimedia and interactivity can improve upon paper-based decision aids. The first objective of this review was to summarize published literature into a proposed classification of features that have been integrated into computer-based decision aids. Building on this classification, the second objective was to assess whether integration of specific features was associated with higher-quality decision making. Relevant studies were located by searching MEDLINE, Embase, CINAHL, and CENTRAL databases. The review identified studies that evaluated computer-based decision aids for adults faced with preference-sensitive medical decisions and reported quality of decision-making outcomes. A thematic synthesis was conducted to develop the classification of features. Subsequently, meta-analyses were conducted based on standardized mean differences (SMD) from randomized controlled trials (RCTs) that reported knowledge or decisional conflict. Further subgroup analyses compared pooled SMDs for decision aids that incorporated a specific feature to other computer-based decision aids that did not incorporate the feature, to assess whether specific features improved quality of decision making. Of 3541 unique publications, 58 studies met the target criteria and were included in the thematic synthesis. The synthesis identified six features: content control, tailoring, patient narratives, explicit values clarification, feedback, and social support. A subset of 26 RCTs from the thematic synthesis was used to conduct the meta-analyses. 
As expected, computer-based decision aids performed better than usual care or alternative aids; however, some features performed better than

  8. Strengthen Cloud Computing Security with Federal Identity Management Using Hierarchical Identity-Based Cryptography

    Science.gov (United States)

    Yan, Liang; Rong, Chunming; Zhao, Gansen

    More and more companies have begun to provide various kinds of cloud computing services for Internet users; at the same time, these services also bring some security problems. Currently, the majority of cloud computing systems provide a digital identity for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private clouds and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, we show that by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.

  9. Molecular architectures based on π-conjugated block copolymers for global quantum computation

    International Nuclear Information System (INIS)

    Mujica Martinez, C A; Arce, J C; Reina, J H; Thorwart, M

    2009-01-01

    We propose a molecular setup for the physical implementation of a barrier global quantum computation scheme based on the electron-doped π-conjugated copolymer architecture of nine blocks PPP-PDA-PPP-PA-(CCH-acene)-PA-PPP-PDA-PPP (where each block is an oligomer). The physical carriers of information are electrons coupled through the Coulomb interaction, and the building block of the computing architecture is composed of three adjacent qubit systems in a quasi-linear arrangement, each of them allowing qubit storage, but with the central qubit exhibiting a third accessible state of electronic energy far from the qubits' transition energy. The third state is reached from one of the computational states by means of an on-resonance coherent laser field, and acts as a barrier mechanism for the direct control of qubit entanglement. Initial estimations of the spontaneous emission decay rates associated with the energy level structure allow us to compute a damping rate of order 10^-7 s, which suggests a relatively weak coupling to the environment. Our results offer an all-optical, scalable proposal for global quantum computing based on semiconducting π-conjugated polymers.

  10. Molecular architectures based on pi-conjugated block copolymers for global quantum computation

    Energy Technology Data Exchange (ETDEWEB)

    Mujica Martinez, C A; Arce, J C [Universidad del Valle, Departamento de QuImica, A. A. 25360, Cali (Colombia); Reina, J H [Universidad del Valle, Departamento de Fisica, A. A. 25360, Cali (Colombia); Thorwart, M, E-mail: camujica@univalle.edu.c, E-mail: j.reina-estupinan@physics.ox.ac.u, E-mail: jularce@univalle.edu.c [Institut fuer Theoretische Physik IV, Heinrich-Heine-Universitaet Duesseldorf, 40225 Duesseldorf (Germany)

    2009-05-01

    We propose a molecular setup for the physical implementation of a barrier global quantum computation scheme based on the electron-doped pi-conjugated copolymer architecture of nine blocks PPP-PDA-PPP-PA-(CCH-acene)-PA-PPP-PDA-PPP (where each block is an oligomer). The physical carriers of information are electrons coupled through the Coulomb interaction, and the building block of the computing architecture is composed of three adjacent qubit systems in a quasi-linear arrangement, each of them allowing qubit storage, but with the central qubit exhibiting a third accessible state of electronic energy far from the qubits' transition energy. The third state is reached from one of the computational states by means of an on-resonance coherent laser field, and acts as a barrier mechanism for the direct control of qubit entanglement. Initial estimations of the spontaneous emission decay rates associated with the energy level structure allow us to compute a damping rate of order 10^-7 s, which suggests a relatively weak coupling to the environment. Our results offer an all-optical, scalable proposal for global quantum computing based on semiconducting pi-conjugated polymers.

  11. Evidence-based ergonomics education: Promoting risk factor awareness among office computer workers.

    Science.gov (United States)

    Mani, Karthik; Provident, Ingrid; Eckel, Emily

    2016-01-01

    Work-related musculoskeletal disorders (WMSDs) related to computer work have become a serious public health concern. Literature revealed a positive association between computer use and WMSDs. The purpose of this evidence-based pilot project was to provide a series of evidence-based educational sessions on ergonomics to office computer workers to enhance the awareness of risk factors of WMSDs. Seventeen office computer workers who work for the National Board of Certification in Occupational Therapy volunteered for this project. Each participant completed a baseline and post-intervention ergonomics questionnaire and attended six educational sessions. The Rapid Office Strain Assessment and an ergonomics questionnaire were used for data collection. The post-intervention data revealed that 89% of participants were able to identify a greater number of risk factors and answer more questions correctly in knowledge tests of the ergonomics questionnaire. Pre- and post-intervention comparisons showed changes in work posture and behaviors (taking rest breaks, participating in exercise, adjusting workstation) of participants. The findings have implications for injury prevention in office settings and suggest that ergonomics education may yield positive knowledge and behavioral changes among computer workers.

  12. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  13. The use of computer based instructions to enhance Rwandan ...

    African Journals Online (AJOL)

    Annestar

    (2) To what extent do the newly acquired ICT skills impact teachers' competency? (3) How suitable is computer-based instruction for enhancing teachers' continuous professional development? Literature review. ICT competency for teachers. Regardless of the quantity and quality of technology available in classrooms, the key ...

  14. Evaluation of computer-based library services at Kenneth Dike ...

    African Journals Online (AJOL)

    This study evaluated computer-based library services/routines at Kenneth Dike Library, University of Ibadan. Four research questions were developed and answered. A survey research design was adopted; using questionnaire as the instrument for data collection. A total of 200 respondents randomly selected from 10 ...

  15. Task-and-role-based access-control model for computational grid

    Institute of Scientific and Technical Information of China (English)

    LONG Tao; HONG Fan; WU Chi; SUN Ling-li

    2007-01-01

    Access control in a grid environment is a challenging issue because the heterogeneous nature and independent administration of geographically dispersed resources in a grid require access control to use fine-grained policies. We established a task-and-role-based access-control model for the computational grid (the CG-TRBAC model), integrating the concepts of role-based access control (RBAC) and task-based access control (TBAC). In this model, condition restrictions are defined, and concepts specifically tailored to workflow management systems are simplified or omitted, so that role assignment and security administration fit the computational grid better than in traditional models; permissions vary with the task status and system variables and can be dynamically controlled. The CG-TRBAC model is shown to be flexible and extensible: it can implement different control policies, it embodies the security principle of least privilege, and it performs active dynamic authorization. A task attribute can be extended to satisfy different requirements in a real grid system.

  16. Availability-based computer management of a cold thermal storage system

    International Nuclear Information System (INIS)

    Wong, K.F.V.; Ferrano, F.J.

    1990-01-01

    This paper reports on work to develop an availability-based, on-line expert system to manage a thermal energy storage air-conditioning system. The management system is designed to be used by mechanical engineers in the field of air-conditioning control and maintenance. Specifically, the expert system permits the user to easily monitor the second law of thermodynamics operating efficiencies of the major components and the system as a whole in addition to the daily scheduled operating parameters of a cold thermal storage system. Through the use of computer-generated and continually updated screen display pages, the user is permitted interaction with the expert system. The knowledge-based system is developed with a commercially available expert system shell that is resident in a personal computer. In the case studied, 130 various analog and binary inputs/outputs are used. The knowledge base for the thermal energy storage expert system included nine different display pages that are continually updated, 25 rules, three tasks, and three loops

  17. GRAPH-BASED POST INCIDENT INTERNAL AUDIT METHOD OF COMPUTER EQUIPMENT

    Directory of Open Access Journals (Sweden)

    I. S. Pantiukhin

    2016-05-01

    Full Text Available A graph-based post-incident internal audit method for computer equipment is proposed. The essence of the proposed solution consists in establishing relationships among hard disk dumps (images), RAM, and network data. The method is intended to describe the properties of an information security incident during a post-incident internal audit of computer equipment. First, hard disk dumps are acquired and formed; they are then separated into sets of components. Each set of components includes a large set of attributes that forms the basis for building the graph. The separated data is recorded into a non-relational database management system (NoSQL) adapted for graph storage, fast access, and processing. A dump-linking method is applied at the final step. The presented method enables a human expert in information security or computer forensics to carry out a more precise and informative internal audit of computer equipment. The proposed method reduces the time spent on an internal audit of computer equipment while increasing its accuracy and informativeness. The method has development potential and can be applied along with other components in the tasks of user identification and computer forensics.
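
    The core linking idea, relating disk, RAM, and network artifacts through shared attribute values, can be sketched as a small Python example. The artifact names and attributes here are invented, and the paper targets a NoSQL graph store rather than in-memory sets:

```python
from itertools import combinations

def link_artifacts(artifacts):
    """Link forensic artifacts (disk image, RAM dump, network capture)
    that share at least one attribute value, yielding the edge set of an
    undirected evidence graph. A toy sketch of the linking step only.
    """
    edges = set()
    for (name_a, attrs_a), (name_b, attrs_b) in combinations(artifacts.items(), 2):
        if set(attrs_a.values()) & set(attrs_b.values()):
            edges.add(frozenset((name_a, name_b)))
    return edges

# Hypothetical attribute sets extracted from three artifact types:
artifacts = {
    "disk":    {"owner": "alice", "file_hash": "abc123"},
    "ram":     {"process_owner": "alice", "dll_hash": "def456"},
    "network": {"payload_hash": "zzz999"},
}
edges = link_artifacts(artifacts)  # disk and ram share the value "alice"
```

    In the proposed method the resulting graph would be persisted in the NoSQL store so an expert can query relationships during the audit.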

  18. Trend of computer-based console for nuclear power plants

    International Nuclear Information System (INIS)

    Wajima, Tsunetaka; Serizawa, Michiya

    1975-01-01

    The amount of information that operators must watch in the central operation room has increased with the capacity of nuclear power generation plants, and the necessity of computer-based consoles, in which the information is consolidated and the interface between operators and plants is rationalized by introducing CRT displays and process computers, came to be recognized. The integrated monitoring and control system is explained briefly, taking Dungeness B Nuclear Power Station in Britain as a typical example. This power station comprises two AGRs, and both plants can be controlled in one central control room, each by a single operator. Three computers, including one on standby, are installed. Each computer has a core memory of 16 K words (24 bits/word), and four magnetic drums of 256 K words are installed as external memory. The peripheral equipment comprises 12 CRT displays, 6 typewriters, and a high-speed tape reader and tape punch for each plant. The display and recording of plant data, the analysis, display and recording of alarms, the control of the plants including the reactors, and post-incident recording are assigned to the computers. In Hitachi Ltd. in Japan, the introduction of color CRTs and the development of operating consoles, a new data-accessing method, and consoles for maintenance management are in progress. (Kako, I.)

  19. Improving the learning of clinical reasoning through computer-based cognitive representation.

    Science.gov (United States)

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' reports of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  20. Fragment informatics and computational fragment-based drug design: an overview and update.

    Science.gov (United States)

    Sheng, Chunquan; Zhang, Wannian

    2013-05-01

    Fragment-based drug design (FBDD) is a promising approach for the discovery and optimization of lead compounds. Despite its successes, FBDD also faces some internal limitations and challenges. FBDD requires a high quality of target protein and good solubility of fragments. Biophysical techniques for fragment screening necessitate expensive detection equipment and the strategies for evolving fragment hits to leads remain to be improved. Regardless, FBDD is necessary for investigating larger chemical space and can be applied to challenging biological targets. In this scenario, cheminformatics and computational chemistry can be used as alternative approaches that can significantly improve the efficiency and success rate of lead discovery and optimization. Cheminformatics and computational tools assist FBDD in a very flexible manner. Computational FBDD can be used independently or in parallel with experimental FBDD for efficiently generating and optimizing leads. Computational FBDD can also be integrated into each step of experimental FBDD and help to play a synergistic role by maximizing its performance. This review will provide critical analysis of the complementarity between computational and experimental FBDD and highlight recent advances in new algorithms and successful examples of their applications. In particular, fragment-based cheminformatics tools, high-throughput fragment docking, and fragment-based de novo drug design will provide the focus of this review. We will also discuss the advantages and limitations of different methods and the trends in new developments that should inspire future research. © 2012 Wiley Periodicals, Inc.

  1. A cloud computing based 12-lead ECG telemedicine service.

    Science.gov (United States)

    Hsieh, Jui-Chien; Hsu, Meng-Wei

    2012-07-28

    Due to the great variability of 12-lead ECG instruments and of medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision-making support in emergency telecardiology. We created a new cloud- and pervasive-computing-based 12-lead electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. The developed service enables ECGs to be transmitted and interpreted via mobile phones; that is, tele-consultation can take place while the patient is in the ambulance, between the onsite clinicians and off-site senior cardiologists, or among hospitals. Most importantly, the developed service is convenient, efficient, and inexpensive. This cloud-computing-based ECG tele-consultation service expands traditional 12-lead ECG applications to the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. The service has been evaluated and proved useful by cardiologists in Taiwan.

  2. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    Science.gov (United States)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. Because blind quantum computation is applied, it is unnecessary to recover the original messages when verifying signatures, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in e-payment systems, e-government, e-business, etc.

  3. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Full Text Available Abstract Background There are several ways to conduct a job task analysis in medical work environments, including paper-and-pencil observations, interviews, and questionnaires. However, these methods entail bias problems such as high inter-individual deviations and risks of misjudgement. Computer-based observation helps to reduce these problems. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data for this instrument are presented. Methods The instrument was developed in consecutive steps. First, lists comprising tasks performed by physicians in different care settings were compiled and classified. Afterwards, the content validity of the task lists was tested. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated: two trained observers simultaneously recorded the tasks of the same physician. Results The content validity of the task lists was confirmed by observations and by experienced specialists in each medical area. The development process of the job task analysis instrument was completed successfully, and simultaneous records showed adequate inter-rater reliability. Conclusion Initial results support the validity and reliability of the developed method for assessing physicians' working routines as well as organizational context factors. Based on results obtained with this method, possible improvements in health professionals' work organisation can be identified.

  4. Computer-based programs on acquisition of reading skills in schoolchildren (review of contemporary foreign investigations

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.

    2015-03-01

    Full Text Available The article presents a description of 17 computer-based programs that were used over the last five years (2008–2013) in 15 studies of computer-assisted reading instruction and intervention for schoolchildren. The article describes the specific terminology used in these studies and the contents of the training sessions, and briefly analyzes the main characteristics of the computer-based techniques: language of instruction, age and basic characteristics of the students, duration and frequency of the training sessions, and the dependent variables of instruction. Special attention is paid to the efficiency of acquisition of different reading skills through computer-based programs in comparison with traditional school instruction.

  5. Multidimensional control using a mobile-phone based brain-muscle-computer interface.

    Science.gov (United States)

    Vernon, Scott; Joshi, Sanjay S

    2011-01-01

    Many well-known brain-computer interfaces measure signals at the brain and then rely on the brain's ability to learn via operant conditioning in order to control objects in the environment. In our lab, we have been developing brain-muscle-computer interfaces, which measure signals at a single muscle and then rely on the brain's ability to learn neuromuscular skills via operant conditioning. Here, we report a new mobile-phone-based brain-muscle-computer interface prototype for severely paralyzed persons, based on previous results from our group showing that humans can actively create specified power levels in two separate frequency bands of a single sEMG signal. Electromyographic activity on the surface of a single face muscle (Auricularis superior) is recorded with a standard electrode. This analog electrical signal is imported into an Android-based mobile phone. User-modulated power in two separate frequency bands serves as two separate and simultaneous control channels for machine control. After signal processing, the Android phone sends commands to external devices via Bluetooth. Users are trained to use the device via biofeedback, with simple cursor-to-target activities on the phone screen.
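
    The two-band control signal described above, power measured in two separate frequency bands of one sEMG recording, can be sketched with a simple FFT band-power estimate. The band edges and the synthetic signal below are illustrative, not the prototype's actual parameters:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of `signal` in [f_lo, f_hi) Hz from the FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return spectrum[mask].mean()

# Synthetic stand-in for one second of sEMG: strong 60 Hz and weak 150 Hz activity.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
semg = np.sin(2 * np.pi * 60 * t) + 0.1 * np.sin(2 * np.pi * 150 * t)

low = band_power(semg, fs, 40, 80)     # hypothetical control channel 1
high = band_power(semg, fs, 120, 200)  # hypothetical control channel 2
```

    In the prototype, the user would modulate these two power levels independently (via biofeedback training) to drive two simultaneous control channels.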

  6. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    Science.gov (United States)

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft tissue is best described by nonlinear continuum-mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit (GPU)-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of deformation analysis: it is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a GPU. We present and compare two different designs, based on the matrix-free and conventional preconditioned conjugate gradients algorithms, for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
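
    A minimal matrix-free conjugate gradient solver, the serial analogue of one of the two designs compared above, can be sketched as follows. The toy 2x2 system stands in for an FEM stiffness matrix; the paper's implementation runs on a GPU and, in one variant, with preconditioning:

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A, given only the
    action v -> A v (matrix-free): A is never assembled explicitly."""
    x = np.zeros_like(b)
    r = b - matvec(x)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)  # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # conjugate direction update
        rs = rs_new
    return x

# Toy SPD system standing in for a (tiny) stiffness matrix:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(lambda v: A @ v, b)
```

    The matrix-free design maps well to GPUs because the dominant cost is the element-wise matvec, which is data-parallel.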

  7. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
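
    A toy spreading-activation model over a semantic network, in the spirit of the DRM simulation described above, might look like this. The decay rule and miniature lexicon are illustrative, not the paper's exact model or data from WordNet/DBpedia:

```python
def spread_activation(graph, sources, decay=0.5, steps=2):
    """Propagate activation from source nodes through a semantic network.

    Activation fans out along association edges each step, attenuated by
    `decay`; a common textbook formulation, used here for illustration.
    """
    activation = {node: 1.0 for node in sources}
    frontier = dict(activation)
    for _ in range(steps):
        next_frontier = {}
        for node, act in frontier.items():
            for neigh in graph.get(node, []):
                next_frontier[neigh] = next_frontier.get(neigh, 0.0) + act * decay
        for node, act in next_frontier.items():
            activation[node] = activation.get(node, 0.0) + act
        frontier = next_frontier
    return activation

# DRM-style toy lexicon: studied words all associate with the unstudied lure "sleep".
lexicon = {"bed": ["sleep"], "rest": ["sleep"], "dream": ["sleep"], "sleep": ["bed"]}
acts = spread_activation(lexicon, ["bed", "rest", "dream"])
```

    The lure "sleep" accumulates converging activation from all studied words, which is the mechanism behind false recall in this task; in a large knowledge base, the paper notes, irrelevant associations add noise to exactly this computation.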

  8. A computer-based purchase management system

    International Nuclear Information System (INIS)

    Kuriakose, K.K.; Subramani, M.G.

    1989-01-01

    The details of a computer-based purchase management system developed to meet the specific requirements of the Madras Regional Purchase Unit (MRPU) are given. However, it can easily be modified to meet the requirements of any other purchase department. It covers the various operations of MRPU, from indent processing to the preparation of purchase orders and reminders. In order to enable timely management action and control, facilities are provided to generate the necessary management information reports. The scope for further work is also discussed. The system is completely menu driven and user friendly. Appendices A and B contain the implemented menus and sample outputs, respectively. (author)

  9. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  10. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  11. Wireless-Uplinks-Based Energy-Efficient Scheduling in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xing Liu

    2015-01-01

    Full Text Available Mobile cloud computing (MCC) combines cloud computing and the mobile Internet to improve the computational capabilities of resource-constrained mobile devices (MDs). In MCC, mobile users can not only improve the computational capability of their MDs but also save energy by offloading mobile applications to the cloud. However, MCC faces the problem of energy efficiency because of time-varying channels during offloading. In this paper, we address the issue of energy-efficient scheduling for the wireless uplink in MCC. By introducing Lyapunov optimization, we first propose a scheduling algorithm that can dynamically choose a channel on which to transmit data based on queue backlog and channel statistics. We then show that the proposed scheduling algorithm achieves a tradeoff between queue backlog and energy consumption in a channel-aware MCC system. Simulation results show that the proposed scheduling algorithm reduces the time-average energy consumption of offloading compared to an existing algorithm.
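
    The drift-plus-penalty intuition behind Lyapunov-based scheduling, weighing queue backlog against transmission energy when choosing a channel, can be sketched in a few lines. The linear rate/power values and the parameter V below are invented for illustration:

```python
def choose_channel(queue_backlog, channel_rates, channel_powers, V=1.0):
    """One-slot drift-plus-penalty choice: pick the channel maximizing
    Q * rate - V * power, where Q is the current queue backlog and V
    trades queue stability against energy. A sketch, not the paper's model."""
    def score(i):
        return queue_backlog * channel_rates[i] - V * channel_powers[i]
    return max(range(len(channel_rates)), key=score)

rates = [1.0, 5.0]    # hypothetical per-slot rates of two channels
powers = [1.0, 10.0]  # hypothetical per-slot transmit powers

# With a long queue, the fast but power-hungry channel wins;
# with a near-empty queue, saving energy wins.
fast = choose_channel(queue_backlog=50.0, channel_rates=rates, channel_powers=powers)
frugal = choose_channel(queue_backlog=0.1, channel_rates=rates, channel_powers=powers)
```

    Raising V biases every slot toward lower energy at the cost of larger average backlog, which is exactly the tradeoff the abstract describes.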

  12. Replacement of traditional lectures with computer-based tutorials: a case study

    Directory of Open Access Journals (Sweden)

    Derek Lavelle

    1996-12-01

    Full Text Available This paper reports on a pilot project with a group of 60 second-year undergraduates studying the use of standard forms of contract in the construction industry. The project entailed the replacement of two of a series of nine scheduled lectures with a computer-based tutorial. The two main aims of the project were to test the viability of converting existing lecture material into computer-based material on an in-house production basis, and to obtain feedback from the student cohort on their behavioural response to the change in media. The effect on student performance was not measured at this stage of development.

  13. The Computer Student Worksheet Based Mathematical Literacy for Statistics

    Science.gov (United States)

    Manoy, J. T.; Indarasati, N. A.

    2018-01-01

    The student worksheet is a teaching medium that can improve teaching activities in the classroom. Mathematical literacy indicators included in a student worksheet can help students apply concepts in daily life, and the use of computers in learning creates an environment-friendly learning setting. This research was developmental research following Thiagarajan's Four-D design, which has four stages: define, design, develop, and disseminate. However, this research was carried out only through the third stage, the develop stage. A student worksheet achieves good quality if it meets three criteria: validity, practicality, and effectiveness. The subjects of this research were grade-eleven students of the 5th Mathematics and Natural Sciences class at the 1st State Senior High School of Driyorejo, Gresik. The computer-based student worksheet for statistics, grounded in mathematical literacy, achieved good quality: it met the validity criterion with an average of 3.79 (94.72%), the practicality criterion with an average of 2.85 (71.43%), and the effectiveness criterion with a classical student completeness of 94.74% and a positive student response of 75%.

  14. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR) to outsource computation tasks to computing resource providers (CRPs). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction between the CRR and CRPs, in this paper we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the behaviors of the CRR and CRPs in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm based on the exchange of control messages, which contain information on computing resource demand/supply and real-time prices. We show that there exist real-time prices that align individual optimality with systemic optimality. Finally, we also take account of the interaction among CRPs and formulate computing resource management as a game whose Nash equilibrium is achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can benefit both the CRR and the CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources for the benefit of the overall system.
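
    The price-coordination idea, adjusting a real-time price until the requestor's utility-maximizing demand meets the provider's cost-based supply, can be sketched as a simple tatonnement loop. The log-utility and quadratic-cost forms are assumptions for illustration, not the paper's exact functions:

```python
import math

def best_response_price(a=10.0, c=1.0, price=1.0, step=0.1, iters=500):
    """Iteratively adjust the price toward balancing demand and supply.

    Requestor best response: d* = argmax_d a*log(1+d) - price*d = a/price - 1.
    Provider best response:  s* = argmax_s price*s - c*s^2/2 = price/c.
    The price then moves in the direction of excess demand.
    """
    demand = supply = 0.0
    for _ in range(iters):
        demand = max(a / price - 1.0, 0.0)
        supply = price / c
        price = max(price + step * (demand - supply), 1e-6)
    return price, demand, supply

price, demand, supply = best_response_price()
```

    At the fixed point the market clears (demand equals supply), which is the kind of price that aligns individual optimality with systemic optimality in the abstract's sense.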

  15. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept to the cloud services so smart learning services are possible. The E4S focuses on meeting the users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  16. Smart Learning Services Based on Smart Cloud Computing

    Directory of Open Access Journals (Sweden)

    Yong-Ik Yoon

    2011-08-01

    Full Text Available Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user’s behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)—smart pull, smart prospect, smart content, and smart push—concept to the cloud services so smart learning services are possible. The E4S focuses on meeting the users’ needs by collecting and analyzing users’ behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users’ behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  17. Rediscovering the Economics of Keynes in an Agent-Based Computational Setting

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2016-01-01

    The aim of this paper is to use agent-based computational economics to explore the economic thinking of Keynes. Taking his starting point at the macroeconomic level, Keynes argued that economic systems are characterized by fundamental uncertainty — an uncertainty that makes rule-based behavior...

  18. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers for the automation of physical experiments and installations are considered. Systems for data acquisition and processing based on microprocessors, micro- and mini-computers, CAMAC equipment, and real-time operating systems are described, as well as systems intended for the automation of physical experiments on accelerators, laser thermonuclear fusion installations, and plasma research installations. The problems of multimachine complexes and multi-user systems, the development of automated systems for collective use, the arrangement of intermachine data exchange, and the management of experimental data bases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, combined with the new possibilities offered to users by universal operating systems, substantially increases the efficiency of a scientist's work

  19. MEDUSA - An overset grid flow solver for network-based parallel computer systems

    Science.gov (United States)

    Smith, Merritt H.; Pallis, Jani M.

    1993-01-01

    Continuing improvement in processing speed has made it feasible to solve the Reynolds-Averaged Navier-Stokes equations for simple three-dimensional flows on advanced workstations. Combining multiple workstations into a network-based heterogeneous parallel computer allows the application of programming principles learned on MIMD (Multiple Instruction Multiple Data) distributed memory parallel computers to the solution of larger problems. An overset-grid flow solution code has been developed which uses a cluster of workstations as a network-based parallel computer. Inter-process communication is provided by the Parallel Virtual Machine (PVM) software. Solution speed equivalent to one-third of a Cray-YMP processor has been achieved from a cluster of nine commonly used engineering workstation processors. Load imbalance and communication overhead are the principal impediments to parallel efficiency in this application.

  20. Decomposition and Cross-Product-Based Method for Computing the Dynamic Equation of Robots

    Directory of Open Access Journals (Sweden)

    Ching-Long Shih

    2012-08-01

    Full Text Available This paper aims to demonstrate a clear relationship between Lagrange equations and Newton-Euler equations regarding computational methods for robot dynamics, from which we derive a systematic method for using either symbolic or on-line numerical computations. Based on the decomposition approach and cross-product operation, a computing method for robot dynamics can be easily developed. The advantages of this computing framework are that: it can be used for both symbolic and on-line numeric computation purposes, and it can also be applied to biped systems, as well as some simple closed-chain robot systems.
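    As an illustration of the cross-product operation in robot dynamics, the hedged sketch below computes only the static gravity torques of a planar serial arm with assumed point masses at the link tips; the full Newton-Euler recursion discussed in the paper also covers velocity and acceleration terms:

    ```python
    import numpy as np

    def gravity_torques(joint_pos, com_pos, masses, g=np.array([0.0, -9.81, 0.0])):
        """Joint torques needed to hold a serial arm static under gravity.

        tau_i is the z-component of sum_j (com_j - joint_i) x (m_j * g),
        summed over all links j at or beyond joint i (cross-product method).
        """
        n = len(joint_pos)
        tau = np.zeros(n)
        for i in range(n):
            for j in range(i, n):
                r = com_pos[j] - joint_pos[i]            # lever arm from joint i to mass j
                tau[i] += np.cross(r, masses[j] * g)[2]  # planar arm: z-axis torque
        return tau
    ```

    For a horizontal two-link arm with unit link lengths and unit masses at the tips, this reproduces the textbook holding torques of 3mgL at the shoulder and mgL at the elbow.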

  1. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Real-time visualization applications require a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool for investigating the types of applications and the computing resources required to provide an uninterrupted flow of processed data for real-time visualization. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.
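    A minimal event-based simulation in the spirit of SIMPAR can be sketched with a timestamp-ordered event queue; the fixed per-task communication delay is a simplifying assumption:

    ```python
    import heapq

    def simulate_parallel_wan(task_times, comm_delay):
        """Event-driven simulation of tasks on parallel workers over a WAN.

        Each worker computes its task, then ships the result back with a
        fixed communication delay; events are processed in timestamp order.
        """
        events = []  # (timestamp, kind, worker)
        for w, t in enumerate(task_times):
            heapq.heappush(events, (t, "done", w))
        finish = 0.0
        while events:
            t, kind, w = heapq.heappop(events)
            if kind == "done":                            # computation finished
                heapq.heappush(events, (t + comm_delay, "recv", w))
            else:                                         # result received at coordinator
                finish = max(finish, t)
        return finish
    ```

    The makespan is dominated by the slowest worker plus one WAN transfer, which is exactly the kind of overhead trade-off the simulator is meant to expose.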

  2. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less-studied application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e., what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue elements was to conduct a baseline study in which affordances related to the current use of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  3. Read-only-memory-based quantum computation: Experimental explorations using nuclear magnetic resonance and future prospects

    International Nuclear Information System (INIS)

    Sypher, D.R.; Brereton, I.M.; Wiseman, H.M.; Hollis, B.L.; Travaglione, B.C.

    2002-01-01

    Read-only-memory-based (ROM-based) quantum computation (QC) is an alternative to oracle-based QC. It has the advantages of being less 'magical', and being more suited to implementing space-efficient computation (i.e., computation using the minimum number of writable qubits). Here we consider a number of small (one- and two-qubit) quantum algorithms illustrating different aspects of ROM-based QC. They are: (a) a one-qubit algorithm to solve the Deutsch problem; (b) a one-qubit binary multiplication algorithm; (c) a two-qubit controlled binary multiplication algorithm; and (d) a two-qubit ROM-based version of the Deutsch-Jozsa algorithm. For each algorithm we present experimental verification using nuclear magnetic resonance ensemble QC. The average fidelities for the implementation were in the ranges 0.9-0.97 for the one-qubit algorithms, and 0.84-0.94 for the two-qubit algorithms. We conclude with a discussion of future prospects for ROM-based quantum computation. We propose a four-qubit algorithm, using Grover's iterate, for solving a miniature 'real-world' problem relating to the lengths of paths in a network
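    For readers unfamiliar with the Deutsch problem mentioned in (a), the sketch below simulates the standard oracle-based one-qubit algorithm with plain linear algebra; it is not the paper's ROM-based encoding, only the baseline it is compared against:

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    I2 = np.eye(2)

    def deutsch(f):
        """Decide whether f: {0,1} -> {0,1} is constant or balanced with one oracle call."""
        # Oracle U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix.
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        psi = np.kron(H, H) @ np.array([0, 1, 0, 0])  # start in |0>|1>, Hadamard both
        psi = np.kron(H, I2) @ (U @ psi)              # oracle, then H on the query qubit
        p1 = psi[2] ** 2 + psi[3] ** 2                # probability first qubit reads 1
        return "balanced" if p1 > 0.5 else "constant"
    ```

    The measurement outcome is deterministic: a constant f leaves the query qubit in |0>, a balanced f flips it to |1>.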

  4. Review of P-scan computer-based ultrasonic inservice inspection system. Supplement 1

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.

    1995-12-01

    This Supplement reviews the P-scan system, a computer-based ultrasonic system used for inservice inspection of piping and other components in nuclear power plants. The Supplement was prepared using the methodology described in detail in Appendix A of NUREG/CR-5985, and is based on one month of using the system in a laboratory. This Supplement describes and characterizes: computer system, ultrasonic components, and mechanical components; scanning, detection, digitizing, imaging, data interpretation, operator interaction, data handling, and record-keeping. It includes a general description, a review checklist, and detailed results of all tests performed

  5. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  6. ORGANIZATION OF CLOUD COMPUTING INFRASTRUCTURE BASED ON SDN NETWORK

    Directory of Open Access Journals (Sweden)

    Alexey A. Efimenko

    2013-01-01

    Full Text Available The article presents the main approaches to cloud computing infrastructure based on SDN networks in modern data processing centers (DPCs). The main indicators of the effectiveness of DPC network infrastructure management are determined. Examples of solutions for the creation of virtual network devices are provided.

  7. Dealing with media distractions: An observational study of computer-based multitasking among children and adults in the Netherlands

    NARCIS (Netherlands)

    Baumgartner, S.E.; Sumter, S.R.

    2017-01-01

    The aim of this observational study was to investigate differences in computer-based multitasking among children and adults. Moreover, the study investigated how attention problems are related to computer-based multitasking and how these individual differences interact with age. Computer-based

  8. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module which incorporates manufacturing, mechanical design planning, control through microprocessor technology, and maneuverability of the robot. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids (computer assisted learning) in teaching and learning activity. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: the Define Stage, Design Stage, Develop Stage, and Disseminate Stage. The research applied a development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest during the period 2009-2015 show that the developed modules confirm the fourth stage of the development method, the disseminate stage. The developed modules guide students to produce an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students’ responses also showed positive feedback on the robotics module and computer-based interactive learning.

  9. Teaching advance care planning to medical students with a computer-based decision aid.

    Science.gov (United States)

    Green, Michael J; Levi, Benjamin H

    2011-03-01

    Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n = 60) outperformed the Standard Group (n = 61) in students' knowledge and satisfaction with their learning experience, as well as in patient-rated student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients.

  10. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  11. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  12. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    Science.gov (United States)

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  13. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    Science.gov (United States)

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  14. Improving the learning of clinical reasoning through computer-based cognitive representation

    Directory of Open Access Journals (Sweden)

    Bian Wu

    2014-12-01

    Full Text Available Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach, implemented in an e-learning system, to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ reports of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge

  15. Computer-based testing of the modified essay question: the Singapore experience.

    Science.gov (United States)

    Lim, Erle Chuen-Hian; Seet, Raymond Chee-Seong; Oh, Vernon M S; Chia, Boon-Lock; Aw, Marion; Quak, Seng-Hock; Ong, Benjamin K C

    2007-11-01

    The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003. We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception. An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination, using either a computer-based or pen-and-paper format. With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for, and running, the MEQ examination using CBT, compared to 236.5 man-hours to run it using a pen-and-paper format. Despite being more labour-intensive, our students and staff prefer CBT to the pen-and-paper format. The MEQ can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, to the MEQ examination in the near future.

  16. Noise filtering algorithm for the MFTF-B computer based control system

    International Nuclear Information System (INIS)

    Minor, E.G.

    1983-01-01

    An algorithm to reduce the message traffic in the MFTF-B computer based control system is described. The algorithm filters analog inputs to the control system. Its purpose is to distinguish between changes in the inputs due to noise and changes due to significant variations in the quantity being monitored. Noise is rejected while significant changes are reported to the control system data base, thus keeping the data base updated with a minimum number of messages. The algorithm is memory efficient, requiring only four bytes of storage per analog channel, and computationally simple, requiring only subtraction and comparison. Quantitative analysis of the algorithm is presented for the case of additive Gaussian noise. It is shown that the algorithm is stable and tends toward the mean value of the monitored variable over a wide variety of additive noise distributions
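    A deadband filter of the kind described, using only subtraction and comparison and a single stored value per channel, might look like this:

    ```python
    def deadband_filter(samples, threshold):
        """Report a sample only when it differs from the last reported value
        by more than the threshold; noise within the band is suppressed.

        Per-channel state is one last-reported value (the MFTF-B scheme fit
        its state in four bytes per channel), and the significance test needs
        only subtraction and comparison.
        """
        last = None
        reported = []
        for s in samples:
            if last is None or abs(s - last) > threshold:
                reported.append(s)   # significant change: update the data base
                last = s
        return reported
    ```

    With the threshold set above the noise amplitude, small fluctuations around a steady value generate no messages, while genuine steps in the monitored quantity are reported immediately.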

  17. Comparison of Computer-Based Versus Counselor-Based Occupational Information Systems with Disadvantaged Vocational Students

    Science.gov (United States)

    Maola, Joseph; Kane, Gary

    1976-01-01

    Subjects, who were Occupational Work Experience students, were randomly assigned to receive individual guidance from either a computerized occupational information system or a counselor-based information system, or to a control group. Results demonstrate a hierarchical learning effect: the computer group learned more than the counseled group, which…

  18. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    Science.gov (United States)

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using phantoms with artificial nodules. Such phantom studies are limited, however, in their ability to reproduce nodules accurately and in the required variety of sizes and densities. We therefore propose a new approach using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement between computer-simulated nodules and phantom nodules in volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software proved accurate to within a 20% error for nodules >5 mm when the CT-value difference between nodule density and lung background was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We conclude that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
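    The idea of PSF-based simulated nodules can be illustrated in 2D: blur an ideal disk with an assumed Gaussian PSF and measure its area by half-maximum thresholding. The paper uses the measured 3D PSF of a real CT system; the Gaussian kernel, sizes, and threshold below are illustrative assumptions:

    ```python
    import numpy as np

    def simulated_nodule_area(radius=20, sigma=2.0, n=128):
        """Blur an ideal disk with a Gaussian PSF, then measure its area
        by thresholding at half the plateau intensity (full-width-half-max)."""
        y, x = np.mgrid[:n, :n] - n // 2
        disk = (x**2 + y**2 <= radius**2).astype(float)   # ideal nodule
        psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        psf /= psf.sum()                                  # unit-gain PSF
        blurred = np.real(np.fft.ifft2(
            np.fft.fft2(disk) * np.fft.fft2(np.fft.ifftshift(psf))))
        measured = (blurred >= 0.5 * blurred.max()).sum() # half-max threshold
        return measured, np.pi * radius**2                # measured vs analytic area
    ```

    For a nodule much larger than the PSF width, the half-maximum contour sits close to the true edge, so the measured area tracks the analytic one; shrinking the radius toward the PSF width reproduces the partial-volume errors that make small-nodule volumetry hard.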

  19. A MULTICORE COMPUTER SYSTEM FOR DESIGN OF STREAM CIPHERS BASED ON RANDOM FEEDBACK

    Directory of Open Access Journals (Sweden)

    Borislav BEDZHEV

    2013-01-01

    Full Text Available Stream ciphers are an important tool for providing information security in present-day communication and computer networks. For this reason, our paper describes a multicore computer system for the design of stream ciphers based on so-called random feedback shift registers (RFSRs). Interest in this topic is motivated by the following facts. First, RFSRs are a relatively new type of stream cipher which demonstrates a significant enhancement of crypto-resistance in comparison with classical stream ciphers. Second, the study of the properties of RFSRs is at a very early stage. Third, the theory of RFSRs appears to be very hard, so RFSRs must be explored mainly by means of computer models. The paper is organized as follows. First, the basics of RFSRs are recalled. After that, our multicore computer system for the design of stream ciphers based on RFSRs is presented. Finally, the advantages and possible areas of application of the computer system are discussed.
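    Since RFSRs vary their feedback over time and their theory is still young, the well-documented baseline is the classical linear feedback shift register (LFSR). The sketch below shows an LFSR keystream XOR cipher; the register width and tap positions are chosen for illustration only:

    ```python
    def lfsr_keystream(state, taps, nbits):
        """Generate nbits of keystream from a Fibonacci LFSR.

        state: initial register contents (nonzero int); taps: bit positions
        XORed to form the feedback. A classical LFSR has *fixed* taps; an
        RFSR varies the feedback over time.
        """
        width = max(taps) + 1
        out = []
        for _ in range(nbits):
            out.append(state & 1)                    # output the low bit
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1               # linear feedback
            state = (state >> 1) | (fb << (width - 1))
        return out

    def xor_cipher(data, key_bits):
        """Encrypt/decrypt by XORing each byte with 8 keystream bits."""
        out = bytearray()
        for i, b in enumerate(data):
            k = 0
            for j in range(8):
                k = (k << 1) | key_bits[8 * i + j]
            out.append(b ^ k)
        return bytes(out)
    ```

    Because XOR is its own inverse, running the ciphertext back through `xor_cipher` with the same keystream recovers the plaintext; the cryptographic weakness of plain LFSRs is precisely what motivates stronger constructions such as RFSRs.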

  20. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  1. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing

    NARCIS (Netherlands)

    van der Velde, Frank; van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are “in situ.” In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain

  2. Connection machine: a computer architecture based on cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hillis, W D

    1984-01-01

    This paper describes the connection machine, a programmable computer based on cellular automata. The essential idea behind the connection machine is that a regular locally-connected cellular array can be made to behave as if the processing cells are connected into any desired topology. When the topology of the machine is chosen to match the topology of the application program, the result is a fast, powerful computing engine. The connection machine was originally designed to implement knowledge retrieval operations in artificial intelligence programs, but the hardware and the programming techniques are apparently applicable to a much larger class of problems. A machine with 100000 processing cells is currently being constructed. 27 references.

  3. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyzes operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and the arrangement of instruments in a control room can be detected from simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  4. Fast parallel molecular algorithms for DNA-based computation: factoring integers.

    Science.gov (United States)

    Chang, Weng-Long; Guo, Minyi; Ho, Michael Shan-Hui

    2005-06-01

    The RSA public-key cryptosystem is an algorithm that converts input data to an unrecognizable encryption and converts the unrecognizable data back into its original decryption form. The security of the RSA public-key cryptosystem is based on the difficulty of factoring the product of two large prime numbers. This paper demonstrates how to factor the product of two large prime numbers using basic biological operations on a molecular computer, which is a breakthrough. To achieve this, we propose three DNA-based algorithms, for a parallel subtractor, a parallel comparator, and parallel modular arithmetic, and formally verify the designed molecular solutions for factoring the product of two large prime numbers. Furthermore, this work indicates that public-key cryptosystems are perhaps insecure, and it presents clear evidence of the ability of molecular computing to perform complicated mathematical operations.
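    Why factoring breaks RSA can be shown with toy parameters: once the modulus n = p*q is factored, the private exponent follows from Euler's totient. The small primes below are standard textbook values, not realistic key sizes:

    ```python
    def rsa_private_key(p, q, e):
        """Given the prime factors of the modulus, recover the RSA private exponent."""
        phi = (p - 1) * (q - 1)
        return pow(e, -1, phi)   # modular inverse (Python 3.8+)

    # Toy parameters: anyone who can factor n = p*q can decrypt.
    p, q, e = 61, 53, 17
    n = p * q                    # 3233, the public modulus
    d = rsa_private_key(p, q, e)
    m = 1234
    c = pow(m, e, n)             # encrypt with the public key (e, n)
    recovered = pow(c, d, n)     # decrypt with the recovered private key
    ```

    This is exactly the attack surface the record describes: a machine (molecular or otherwise) that factors n efficiently makes the private key computable from public data.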

  5. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
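    The pause/rollback capability described above can be sketched for a compartmental SIR model by checkpointing every simulated day; the model and the parameters below are illustrative, not the environment's individual-based model:

    ```python
    def sir_step(state, beta, gamma):
        """One discrete-time SIR update (fractions of the population)."""
        s, i, r = state
        new_inf = beta * s * i    # new infections this day
        new_rec = gamma * i       # new recoveries this day
        return (s - new_inf, i + new_inf - new_rec, r + new_rec)

    def run_with_checkpoints(state, beta, gamma, days):
        """Advance the epidemic day by day, keeping every state so an analyst
        can pause, inspect, and roll back to any earlier day."""
        history = [state]
        for _ in range(days):
            state = sir_step(state, beta, gamma)
            history.append(state)
        return history
    ```

    Rolling back is just resuming from an earlier entry of `history`; a dynamic intervention, such as halving `beta` from day 30, replays the remaining days from that checkpoint under the new parameters.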

  6. Evaluation of computer-based NDE techniques and regional support of inspection activities

    International Nuclear Information System (INIS)

    Taylor, T.T.; Kurtz, R.J.; Heasler, P.G.; Doctor, S.R.

    1991-01-01

    This paper describes the technical progress during fiscal year 1990 for the program entitled 'Evaluation of Computer-Based nondestructive evaluation (NDE) Techniques and Regional Support of Inspection Activities.' Highlights of the technical progress include: development of a seminar to provide the basic knowledge required to review and evaluate computer-based systems; review of a typical computer-based field procedure to determine compliance with applicable codes, ambiguities in procedure guidance, and overall effectiveness and utility; design and fabrication of a series of three test blocks for NRC staff use for training or audit of UT systems; technical assistance in reviewing (1) San Onofre ten-year reactor pressure vessel inservice inspection activities and (2) the capability of a proposed phased array inspection of the feedwater nozzle at Oyster Creek; completion of design calculations to determine the feasibility and significance of various sizes of mockup assemblies that could be used to evaluate the effectiveness of eddy current examinations performed on steam generators; and discussion of initial mockup design features and methods for fabricating flaws in steam generator tubes.

  7. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base-isolated NPPs is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo Simulation are adopted. • A refined approach for modeling the non-linear behavior of the isolators is proposed. • Beyond-design conditions are addressed. • The procedure is applied to the preliminary design of the isolated IRIS. -- Abstract: The research work here described is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made, in the present paper, to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the Response Surface, the Monte Carlo method is used to compute the failure probability. The procedure is here applied to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non-linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models.
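    The response-surface-plus-Monte-Carlo workflow can be illustrated with a toy example. The polynomial surface, the displacement capacity, and all coefficients below are invented for illustration; they stand in for a surface actually fitted to dynamic integrations:

```python
import random

# Hypothetical fitted response surface: predicted isolator displacement (m)
# as a function of peak ground acceleration (g) and a normalized random
# variable x (e.g. bearing-property deviation). Coefficients are illustrative.
def response_surface(pga, x):
    return 0.30 * pga + 0.05 * pga * pga - 0.10 * x

def fragility(pga, capacity=0.45, n=100_000, seed=1):
    """P(failure | PGA): fraction of Monte Carlo samples whose predicted
    demand exceeds the displacement capacity of the isolators."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)   # sample of the random variable
        if response_surface(pga, x) > capacity:
            failures += 1
    return failures / n

# The fragility curve is obtained by repeating this over a range of PGA levels:
curve = {pga: fragility(pga) for pga in (0.5, 1.0, 1.5)}
```

    Because the expensive dynamic analyses are replaced by the cheap surrogate surface, sampling the failure probability at many intensity levels becomes tractable.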

  8. MOMCC: Market-Oriented Architecture for Mobile Cloud Computing Based on Service Oriented Architecture

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah; Shiraz, Muhammad

    2012-01-01

    The vision of augmenting the computing capabilities of mobile devices, especially smartphones, at the least cost is being transformed into reality by leveraging cloud computing. Cloud exploitation by mobile devices has bred a new research domain called Mobile Cloud Computing (MCC). However, issues like portability and interoperability should be addressed for mobile augmentation, which is a non-trivial task using component-based approaches. Service Oriented Architecture (SOA) is a promising design philosop...

  9. A Performance/Cost Evaluation for a GPU-Based Drug Discovery Application on Volunteer Computing

    Science.gov (United States)

    Guerrero, Ginés D.; Imbernón, Baldomero; García, José M.

    2014-01-01

    Bioinformatics is an interdisciplinary research field that develops tools for the analysis of large biological databases, and, thus, the use of high performance computing (HPC) platforms is mandatory for the generation of useful biological knowledge. The latest generation of graphics processing units (GPUs) has democratized the use of HPC as they push desktop computers to cluster-level performance. Many applications within this field have been developed to leverage these powerful and low-cost architectures. However, these applications still need to scale to larger GPU-based systems to enable remarkable advances in the fields of healthcare, drug discovery, genome research, etc. The inclusion of GPUs in HPC systems exacerbates power and temperature issues, increasing the total cost of ownership (TCO). This paper explores the benefits of volunteer computing to scale bioinformatics applications as an alternative to owning large GPU-based local infrastructures. We use as a benchmark a GPU-based drug discovery application called BINDSURF whose computational requirements go beyond a single desktop machine. Volunteer computing is presented as a cheap and valid HPC system for those bioinformatics applications that need to process huge amounts of data and where the response time is not a critical factor. PMID:25025055

  10. Computer Vision Based Measurement of Wildfire Smoke Dynamics

    Directory of Open Access Journals (Sweden)

    BUGARIC, M.

    2015-02-01

    Full Text Available This article presents a novel method for measurement of wildfire smoke dynamics based on computer vision and augmented reality techniques. Smoke dynamics is an important feature in video smoke detection that can distinguish smoke from visually similar phenomena. However, most existing smoke detection systems are not capable of measuring the real-world size of the detected smoke regions. Using computer vision and GIS-based augmented reality, we measure the real dimensions of smoke plumes and observe the change in size over time. The measurements are performed on offline video data with known camera parameters and location. The observed data is analyzed in order to create a classifier that can eliminate certain categories of false alarms induced by phenomena with dynamics different from smoke. We carried out an offline evaluation where we measured the improvement in the detection process achieved using the proposed smoke dynamics characteristics. The results show a significant increase in algorithm performance, especially in terms of reducing the false-alarm rate. It follows that the proposed method for measurement of smoke dynamics could be used to improve existing smoke detection algorithms, or taken into account when designing new ones.

  11. Field microcomputerized multichannel γ ray spectrometer based on notebook computer

    International Nuclear Information System (INIS)

    Jia Wenyi; Wei Biao; Zhou Rongsheng; Li Guodong; Tang Hong

    1996-01-01

    Currently, field γ-ray spectrometry cannot rapidly measure the full γ-ray spectrum. A field microcomputerized multichannel γ-ray spectrometer based on a notebook computer is therefore described, with which the full γ-ray spectrum can be measured rapidly in the field.

  12. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  13. Application of data base management systems for developing experimental data base using ES computers

    International Nuclear Information System (INIS)

    Vasil'ev, V.I.; Karpov, V.V.; Mikhajlyuk, D.N.; Ostroumov, Yu.A.; Rumyantsev, A.N.

    1987-01-01

    Modern data base management systems (DBMS) are widely used for the development and operation of various data bases in data processing systems for economics, planning, and management. Until now, however, the development and operation of collections of experimental physics data on ES computers has been based mainly on the traditional technology of sequential or index-sequential files. Based on an analysis of DBMS capabilities, the principal conditions for applying DBMS technology to the compilation and operation of data bases of physics-experiment data are formulated. It is shown that applying a DBMS substantially reduces the overall computational cost of developing and operating such data bases and decreases the volume of experimental data that must be stored when analyzing their information content.

  14. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Directory of Open Access Journals (Sweden)

    Akitoshi Ogawa

    Full Text Available The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  15. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    Science.gov (United States)

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  16. A computer-based anaglyphic system for the treatment of amblyopia

    Directory of Open Access Journals (Sweden)

    Rastegarpour A

    2011-09-01

    Full Text Available Ali Rastegarpour Ophthalmic Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran Purpose: Virtual reality (VR)-based treatment has been introduced as a potential option for amblyopia management, presumably without involving the problems of occlusion and penalization, including variable and unsatisfactory outcomes, long duration of treatment, poor compliance, psychological impact, and complications. However, VR-based treatment is costly and not accessible for most children. This paper introduces a method that encompasses the advantages of VR-based treatment at a lower cost. Methods: The presented system consists of a pair of glasses with two color filters and software for use on a personal computer. The software is designed such that some active graphic components can only be seen by the amblyopic eye and are filtered out for the other eye. Some components would be seen by both to encourage fusion. The result is that the patient must use both eyes, and specifically the amblyopic eye, to play the games. Results: A prototype of the system, the ABG InSight, was found capable of successfully filtering out elements of a certain color and could therefore prove to be a viable alternative to VR-based treatment for amblyopia. Conclusion: The anaglyphic system maintains most of the advantages of VR-based systems, but is less costly and highly accessible. It achieves the aims that VR-based systems are designed to fulfill, and warrants further investigation. Keywords: amblyopia, computer-based, open source, virtual reality, color filters, 3-D
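    The core filtering idea can be sketched as follows, assuming ideal red/cyan filters (the actual filter colors of ABG InSight are not specified here): an element drawn in a color that matches the filtered background through one filter, but not the other, is visible to only one eye:

```python
# Ideal anaglyph filters: channel-wise multipliers on an RGB color.
RED_FILTER = (1, 0, 0)    # transmits only the R channel
CYAN_FILTER = (0, 1, 1)   # transmits only the G and B channels

def seen_through(color, filt):
    """Channel-wise transmission of an RGB color through an ideal filter."""
    return tuple(c * f for c, f in zip(color, filt))

def visible(color, filt, background=(255, 255, 255)):
    """An element is visible if it differs from the filtered background."""
    return seen_through(color, filt) != seen_through(background, filt)

# A cyan game element on a white background is seen only by the eye behind
# the red filter (it vanishes into the background through the cyan filter),
# while a gray element is seen by both eyes, encouraging fusion.
game_target = (0, 255, 255)   # seen only through the red filter
fusion_lock = (80, 80, 80)    # seen through both filters
```

    Assigning the amblyopic eye the filter that reveals the active game elements forces that eye to do the work, while shared gray elements keep both eyes fusing the scene.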

  17. Unit cell-based computer-aided manufacturing system for tissue engineering

    International Nuclear Information System (INIS)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-01-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering. (paper)

  18. Unit cell-based computer-aided manufacturing system for tissue engineering.

    Science.gov (United States)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.

  19. [Computer-aided Prognosis for Breast Cancer Based on Hematoxylin & Eosin Histopathology Image].

    Science.gov (United States)

    Chen, Jiamei; Qu, Aiping; Liu, Wenlou; Wang, Linwei; Yuan, Jingping; Liu, Juan; Li, Yan

    2016-06-01

    Quantitatively analyzing hematoxylin & eosin (H&E) histopathology images is an emerging field that has attracted increasing attention in recent years. This paper reviews the application of computer-aided image analysis to breast cancer prognosis. The traditional prognosis based on H&E histopathology images for breast cancer is first sketched, followed by a detailed description of the workflow of computer-aided prognosis, including image acquisition, image preprocessing, detection of regions of interest and object segmentation, feature extraction, and computer-aided prognosis. In the end, major technical challenges and future directions in this field are summarized.

  20. A cloud computing based 12-lead ECG telemedicine service

    Science.gov (United States)

    2012-01-01

    Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan. PMID:22838382

  1. A cloud computing based 12-lead ECG telemedicine service

    Directory of Open Access Journals (Sweden)

    Hsieh Jui-chien

    2012-07-01

    Full Text Available Abstract Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  2. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    Science.gov (United States)

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English major college students at a Taiwanese University. Foreign language listening comprehension of computer-based tests designed by MOODLE, a dynamic e-learning environment, with or without immediate feedback together with the state-trait anxiety inventory (STAI) were tested and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect the test anxiety and listening scores. Computer-based immediate feedback did not lower debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners to increase attention in foreign language listening comprehension.

  3. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  4. ARAC: a computer-based emergency dose-assessment service

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1990-01-01

    Over the past 15 years, the Lawrence Livermore National Laboratory's Atmospheric Release Advisory Capability (ARAC) has developed and evolved a computer-based, real-time, radiological-dose-assessment service for the United States Departments of Energy and Defense. This service is built on the integrated components of real-time computer-acquired meteorological data, extensive computer databases, numerical atmospheric-dispersion models, graphical displays, and operational-assessment-staff expertise. The focus of ARAC is the off-site problem, where regional meteorology and topography are dominant influences on transport and dispersion. Through application to numerous radiological accidents/releases on scales from small accidental ventings to the Chernobyl reactor disaster, ARAC has developed methods to provide emergency dose assessments from the local to the hemispheric scale. As computer power has grown while cost and size have fallen, ARAC has expanded its service and reduced the response time from hours to minutes for an accident within the United States. Concurrently, the quality of the assessments has improved as more advanced models have been developed and incorporated into the ARAC system. Over the past six years, the number of directly connected facilities has increased from 6 to 73. All major U.S. Federal agencies now have access to ARAC via the Department of Energy. This assures a level of consistency as well as experience. ARAC maintains its real-time skills by participation in approximately 150 exercises per year; ARAC also continuously validates its modeling systems by application to all available tracer experiments and data sets.

  5. The data base management system alternative for computing in the human services.

    Science.gov (United States)

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

  6. An agent-based computational model for tuberculosis spreading on age-structured populations

    Science.gov (United States)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease on age-structured populations. The proposed model is a merge of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. Combining TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is preserved. Finally, the population distribution as a function of age shows the prevalence of TB mostly in elders, for high-efficacy treatments.
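    The bit-string aging ingredient (the Penna model) can be sketched in a few lines; the mutation threshold and genome length below are illustrative values, not the paper's parameters:

```python
import random

T = 3             # death threshold: an individual dies once it has
                  # accumulated T deleterious mutations up to its age
GENOME_BITS = 32  # one genome bit per possible year of life

def survives_to(genome, age):
    """True if the bit-string genome carries fewer than T set bits among
    positions 0..age (one position is 'read' per year of life)."""
    mask = (1 << (age + 1)) - 1
    return bin(genome & mask).count("1") < T

def reproduce(genome, mutations=1, rng=random):
    """Offspring inherit the parent genome plus new random bad mutations."""
    child = genome
    for _ in range(mutations):
        child |= 1 << rng.randrange(GENOME_BITS)
    return child

# A genome with bad mutations active at ages 2, 5 and 9 dies at age 9:
g = (1 << 2) | (1 << 5) | (1 << 9)
ages = [a for a in range(GENOME_BITS) if survives_to(g, a)]
# ages == [0, 1, 2, 3, 4, 5, 6, 7, 8]
```

    Coupling each agent's TB health state to this age-of-death mechanism is what lets the merged model reproduce realistic mortality curves alongside the epidemic dynamics.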

  7. Development of multimedia computer-based training for VXI integrated fuel monitors

    International Nuclear Information System (INIS)

    Keeffe, R.; Ellacott, T.; Truong, Q.S.

    1999-01-01

    The Canadian Safeguards Support Program has developed the VXI Integrated Fuel Monitor (VFIM) which is based on the international VXI instrument bus standard. This equipment is a generic radiation monitor which can be used in an integrated mode where several detection systems can be connected to a common system where information is collected, displayed, and analyzed via a virtual control panel with the aid of computers, trackball and computer monitor. The equipment can also be used in an autonomous mode as a portable radiation monitor with a very low power consumption. The equipment has been described at previous international symposia. Integration of several monitoring systems (bundle counter, core discharge monitor, and yes/no monitor) has been carried out at Wolsong 2. Performance results from one of the monitoring systems which was installed at CANDU nuclear stations are discussed in a companion paper at this symposium. This paper describes the development of an effective multimedia computer-based training package for the primary users of the equipment; namely IAEA inspectors and technicians. (author)

  8. Template based parallel checkpointing in a massively parallel computer system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN; Inglett, Todd Alan [Rochester, MN

    2009-01-13

    A method and apparatus for a template-based parallel checkpoint save for a massively parallel supercomputer system using a parallel variation of the rsync protocol and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored, for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high-speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional lossless data compression algorithms to further reduce the overall checkpoint size.
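    The per-node template comparison can be sketched as follows (block size, hash choice, and data are illustrative; the patented scheme runs this as a parallel rsync variant over the cluster interconnect):

```python
import hashlib
import zlib

BLOCK = 4096  # checkpoint block size (illustrative)

def checksums(data):
    """Per-block digests, to be compared against the template checkpoint."""
    return [hashlib.md5(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]

def delta_against_template(checkpoint, template_sums):
    """Keep only the blocks whose checksum differs from the template's;
    unchanged blocks need not be transmitted or stored again."""
    delta = {}
    for idx, digest in enumerate(checksums(checkpoint)):
        if idx >= len(template_sums) or digest != template_sums[idx]:
            block = checkpoint[idx * BLOCK:(idx + 1) * BLOCK]
            delta[idx] = zlib.compress(block)  # lossless compression
    return delta

# A node whose state differs from the template in one block saves only
# that block (compressed), not the whole checkpoint:
template = b"A" * BLOCK * 4
state = b"A" * BLOCK * 2 + b"B" * BLOCK + b"A" * BLOCK
delta = delta_against_template(state, checksums(template))
```

    Since the template can be broadcast once to all nodes, each node pays only for the blocks it has actually changed since the template was produced.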

  9. Online LDA BASED brain-computer interface system to aid disabled people

    OpenAIRE

    Apdullah Yayık; Yakup Kutlu

    2017-01-01

    This paper aims to develop a brain-computer interface system based on electroencephalography that can aid disabled people in daily life. The system relies on one of the most effective event-related potential waves, the P300, which can be elicited by the oddball paradigm. The developed application has a basic interaction tool that enables disabled people to convey their needs to other people by selecting related objects. These objects flash pseudo-randomly in a visual interface on a computer screen. The user must...
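    One plausible form of the LDA classification step is Fisher's closed-form two-class discriminant, sketched below in pure Python for 2-D features; the feature vectors are synthetic placeholders, not the paper's actual EEG features:

```python
def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def fit_lda(pos, neg):
    """Fisher LDA for 2-D features: w = Sw^-1 (m1 - m0), closed form."""
    m1, m0 = mean(pos), mean(neg)
    # pooled within-class scatter matrix (2x2)
    s = [[0.0, 0.0], [0.0, 0.0]]
    for vs, m in ((pos, m1), (neg, m0)):
        for v in vs:
            d = [v[0] - m[0], v[1] - m[1]]
            s[0][0] += d[0] * d[0]; s[0][1] += d[0] * d[1]
            s[1][0] += d[1] * d[0]; s[1][1] += d[1] * d[1]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    # w = inverse(s) @ dm, written out for the 2x2 case
    w = [(s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
         (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det]
    # threshold at the projected midpoint between the class means
    b = -sum(wi * (a + c) / 2 for wi, a, c in zip(w, m1, m0))
    return w, b

def predict(w, b, v):
    """1 = 'target flashed' (P300 present), 0 = non-target."""
    return 1 if w[0] * v[0] + w[1] * v[1] + b > 0 else 0

targets = [[2.0, 2.2], [2.4, 1.9], [1.8, 2.1]]      # P300 present (synthetic)
non_targets = [[0.1, 0.0], [0.3, 0.2], [-0.1, 0.1]] # P300 absent (synthetic)
w, b = fit_lda(targets, non_targets)
```

    In an online P300 speller, each flash's epoch is reduced to such a feature vector and scored with the learned w; the object whose flashes score highest is taken as the user's selection.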

  10. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue for large-scale flood simulations that must respond in real time for disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers because of the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
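    The numerical core of a Godunov-type finite-volume shallow-water scheme can be illustrated in one dimension with the Rusanov (local Lax–Friedrichs) flux; the paper's model is 2-D, unstructured, and GPU-parallel, so this serial first-order sketch only shows the basic update:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def flux(h, hu):
    """Physical flux of the 1-D shallow water equations."""
    u = hu / h
    return hu, hu * u + 0.5 * G * h * h

def step(h, hu, dx, dt):
    """One first-order finite-volume step with the Rusanov numerical flux
    and simple extrapolation (outflow) boundaries."""
    n = len(h)
    fh = [0.0] * (n + 1)
    fhu = [0.0] * (n + 1)
    for i in range(n + 1):
        l, r = max(i - 1, 0), min(i, n - 1)
        fl, fr = flux(h[l], hu[l]), flux(h[r], hu[r])
        a = max(abs(hu[l] / h[l]) + math.sqrt(G * h[l]),
                abs(hu[r] / h[r]) + math.sqrt(G * h[r]))  # max wave speed
        fh[i] = 0.5 * (fl[0] + fr[0]) - 0.5 * a * (h[r] - h[l])
        fhu[i] = 0.5 * (fl[1] + fr[1]) - 0.5 * a * (hu[r] - hu[l])
    return ([h[i] - dt / dx * (fh[i + 1] - fh[i]) for i in range(n)],
            [hu[i] - dt / dx * (fhu[i + 1] - fhu[i]) for i in range(n)])

# Dam break: deep water on the left, shallow on the right.
h = [2.0] * 50 + [1.0] * 50
hu = [0.0] * 100
for _ in range(40):
    h, hu = step(h, hu, dx=1.0, dt=0.1)
```

    Each cell's update depends only on its two interface fluxes, which is exactly the data-parallel structure that maps one-thread-per-cell onto a GPU in the 2-D unstructured case.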

  11. Demonstration of optical computing logics based on binary decision diagram.

    Science.gov (United States)

    Lin, Shiyun; Ishikawa, Yasuhiko; Wada, Kazumi

    2012-01-16

    Optical circuits are a low-power-consumption, high-speed alternative to current information processing based on transistor circuits. However, because optics offers no transistor-like function, an architecture suited to optics must be chosen. One such architecture is the binary decision diagram (BDD), in which information is processed by sending an optical signal from the root through a series of switching nodes to a leaf (terminal). The speed of optical computing is limited by either the transmission time of optical signals from root to leaf or the switching time of a node. We have designed and experimentally demonstrated 1-bit and 2-bit adders based on the BDD architecture. The switching nodes are silicon ring resonators with a modulation depth of 10 dB, whose states are changed by the plasma dispersion effect. The quality factor Q of the rings designed is 1500, which allows fast transmission of the signal, e.g., 1.3 ps as calculated from the photon escape time. The total processing time is thus estimated to be ~9 ps for a 2-bit adder and would scale linearly with the number of bits. This is two orders of magnitude faster than conventional CMOS circuitry with its ~ns-scale delay. The presented results show the potential of high-speed optical computing circuits.
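    The root-to-leaf routing that the BDD architecture performs optically can be mimicked in software. The sketch below is a minimal logical model (not the authors' photonic implementation): each node tests one input bit and forwards the "signal" along its 0- or 1-branch until a terminal is reached, here for a 1-bit half adder.

```python
# Minimal software model of BDD evaluation: a node is (variable, low, high);
# terminals are the integers 0 or 1. Evaluating means following branches
# from the root to a terminal, as the optical signal does in the device.

def evaluate(node, inputs):
    """Follow branches according to the input bits until a terminal."""
    while not isinstance(node, int):
        var, low, high = node
        node = high if inputs[var] else low
    return node

# BDDs for a 1-bit half adder: sum = a XOR b, carry = a AND b.
sum_bdd = ('a', ('b', 0, 1), ('b', 1, 0))
carry_bdd = ('a', 0, ('b', 0, 1))

for a in (0, 1):
    for b in (0, 1):
        s = evaluate(sum_bdd, {'a': a, 'b': b})
        c = evaluate(carry_bdd, {'a': a, 'b': b})
        assert (c, s) == divmod(a + b, 2)  # matches binary addition
```

    The path length from root to terminal grows linearly with the number of input bits, which is the structural reason the abstract's processing time scales linearly with bit width.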

  12. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis, and graphic display relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration, and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  13. An Intelligent Computer-Based System for Sign Language Tutoring

    Science.gov (United States)

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  14. JAX: a micro-computer based X-ray diffractometer controller

    International Nuclear Information System (INIS)

    Naval, P.C. Jr.

    1987-05-01

    This paper describes a micro-computer-based X-ray diffractometer controller and explores its possibilities in simplifying the acquisition and analysis of X-ray powder diffraction data. The interrupt-driven controller can operate in both preset-time and preset-count data acquisition modes and allows a data analysis program to execute concurrently with data collection. (Auth.). 16 figs.; 2 tabs

  15. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Quantitative exposure data are important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject’s true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics, and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanisms by which xenobiotics leave the blood and enter saliva are paracellular transport, passive transcellular diffusion, and transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm, which calculates partitioning based upon tissue composition, pH, chemical pKa, and plasma protein binding. Sensitivity analysis identified that both protein binding and pKa (for weak acids and bases) have a significant impact on partitioning, with species-dependent differences arising from physiological variance. Future strategies are focused on an in vitro salivary acinar cell-based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in humans
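    The pH- and protein-binding dependence described above can be illustrated with the classic pH-partition calculation. The sketch below is a hedged simplification, not the full Schmitt algorithm: it uses the Henderson-Hasselbalch relation and assumes that only the unbound, un-ionized species equilibrates across the membrane and that saliva binding is negligible. All parameter values are illustrative.

```python
# Hedged pH-partition sketch for plasma:saliva partitioning of a weak acid
# or base. Not the Schmitt algorithm; tissue composition terms are omitted.

def ionized_fraction(pka: float, ph: float, acid: bool = True) -> float:
    """Henderson-Hasselbalch: fraction of molecules ionized at a given pH."""
    if acid:
        return 1.0 / (1.0 + 10 ** (pka - ph))
    return 1.0 / (1.0 + 10 ** (ph - pka))

def saliva_to_plasma_ratio(pka, ph_plasma=7.4, ph_saliva=6.5,
                           fu_plasma=0.1, acid=True):
    """Saliva:plasma total-concentration ratio, assuming only the unbound,
    un-ionized species crosses and equilibrates. fu_plasma is the unbound
    fraction in plasma (i.e., 1 - protein-bound fraction)."""
    un_ionized_plasma = 1.0 - ionized_fraction(pka, ph_plasma, acid)
    un_ionized_saliva = 1.0 - ionized_fraction(pka, ph_saliva, acid)
    return fu_plasma * un_ionized_plasma / un_ionized_saliva
```

    Even this toy version reproduces the sensitivity result quoted in the abstract: the ratio is directly proportional to the unbound fraction, and for weak acids the lower salivary pH suppresses the saliva concentration (ion trapping on the plasma side), so both protein binding and pKa dominate the outcome.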

  16. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy

    OpenAIRE

    Morimoto, Satoshi; Remijn, Gerard B.; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy, and how it relates to tonality, need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects ...

  17. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the se...

  18. A PC-based computer program for simulation of containment pressurization

    International Nuclear Information System (INIS)

    Seifaee, F.

    1990-01-01

    This paper reports that a PC-based computer program has been developed to simulate a pressurized water reactor (PWR) containment during various transients. The containment model is capable of determining the pressure and temperature history of a PWR containment in the event of a loss-of-coolant accident, as well as main steam line breaks inside the containment. Conservation of mass and energy equations are applied to the containment model. Development of the program is based on minimizing the required input information and on user friendliness. Calculation efficiency is maximized by superseding the traditional trial-and-error procedure for determining the state variables with an explicit solution for pressure. The program includes simplified models for active heat removal systems. The results of the present model are in close agreement with those of the CONTEMPT-MOD5 computer code for the pressure and temperature inside the containment
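    The appeal of an explicit pressure solution can be seen in a drastically simplified case. The sketch below is a hedged illustration, not the paper's model: it treats the containment atmosphere as a single ideal gas (dry air), so temperature follows directly from the conserved energy inventory and pressure from the ideal-gas law, with no iteration on the state variables. Real containment codes track a steam/air/liquid mixture; all numbers here are illustrative.

```python
# Hedged illustration of an "explicit" state solution from conserved mass
# and energy in a fixed containment volume, using a single ideal gas.
R_AIR = 287.0    # J/(kg K), specific gas constant of air
CV_AIR = 718.0   # J/(kg K), specific heat at constant volume

def containment_pressure(mass_kg, internal_energy_j, volume_m3):
    """T from the energy inventory, then P from the ideal-gas law --
    no trial-and-error on the state variables."""
    temperature = internal_energy_j / (mass_kg * CV_AIR)
    pressure = mass_kg * R_AIR * temperature / volume_m3
    return pressure, temperature

def step(mass, energy, volume, m_dot_in, h_in, q_wall, dt):
    """Advance the conserved mass and energy over dt (break inflow at
    specific enthalpy h_in, heat loss q_wall to structures and coolers),
    then recover the state explicitly."""
    mass += m_dot_in * dt
    energy += (m_dot_in * h_in - q_wall) * dt
    return mass, energy, containment_pressure(mass, energy, volume)
```

    In a two-phase steam/air model the same structure survives, but the explicit pressure expression replaces what would otherwise be an iterative saturation-state search at every time step.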

  19. Efficacy of navigation in skull base surgery using composite computer graphics of magnetic resonance and computed tomography images

    International Nuclear Information System (INIS)

    Hayashi, Nakamasa; Kurimoto, Masanori; Hirashima, Yutaka; Ikeda, Hiroaki; Shibata, Takashi; Tomita, Takahiro; Endo, Shunro

    2001-01-01

    The efficacy of a neurosurgical navigation system using three-dimensional composite computer graphics (CGs) of magnetic resonance (MR) and computed tomography (CT) images was evaluated in skull base surgery. Three-point transformation was used for integration of MR and CT images. MR and CT image data were obtained with three skin markers placed on the patient's scalp. Volume-rendering manipulations of the data produced three-dimensional CGs of the scalp, brain, and lesions from the MR images, and the scalp and skull from the CT. Composite CGs of the scalp, skull, brain, and lesion were created by registering the three markers on the three-dimensional rendered scalp images obtained from MR imaging and CT in the system. This system was used for 14 patients with skull base lesions. Three-point transformation using three-dimensional CGs was easily performed for multimodal registration. Simulation of surgical procedures on composite CGs aided in comprehension of the skull base anatomy and selection of the optimal approaches. Intraoperative navigation aided in determination of actual spatial position in the skull base and the optimal trajectory to the tumor during surgical procedures. (author)
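    The three-point transformation used for MR/CT registration above can be illustrated with a standard rigid point-set alignment. The sketch below uses the Kabsch (SVD-based) algorithm, a common way to compute such a transform from matched fiducial markers; it is an assumed stand-in, not necessarily the method implemented in the authors' navigation system.

```python
# Hedged sketch of rigid registration from matched fiducials: given the same
# three (or more) markers located in two coordinate systems, recover the
# rotation R and translation t with dst ~= src @ R.T + t (Kabsch algorithm).
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform between matched point sets."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

    Three non-collinear markers are the minimum that determines a unique rigid transform in 3-D, which is why three skin markers suffice for the registration described in the abstract.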

  20. Simulation of Si:P spin-based quantum computer architecture

    International Nuclear Information System (INIS)

    Chang Yiachung; Fang Angbo

    2008-01-01

    We present realistic simulations for single and double phosphorus donors in a silicon-based quantum computer design by solving a valley-orbit coupled effective-mass equation describing phosphorus donors in a strained silicon quantum well (QW). Using a generalized unrestricted Hartree-Fock method, we solve the two-electron effective-mass equation with quantum-well confinement and realistic gate potentials. The effects of QW width, gate voltages, donor separation, and donor position shift on the lowest singlet and triplet energies and their charge distributions for a neighboring donor pair in the quantum computer (QC) architecture are analyzed. The gate tunability is defined and evaluated for a typical QC design. Estimates are obtained for the duration of a spin half-swap gate operation.