WorldWideScience

Sample records for computing groebner bases

  1. Sweetening the sour taste of inhomogeneous signature-based Groebner basis computations

    CERN Document Server

    Eder, Christian

    2012-01-01

    In this paper we give insight into the rather poorly understood behaviour of signature-based Groebner basis algorithms, such as F5, G2V, or GVW, on inhomogeneous input. On the one hand, the restriction to sig-safe reductions in those algorithms appears to put a huge penalty on their performance: the lost connection between polynomial degree and signature degree can disallow many reductions and lead to a huge overhead in the computations. On the other hand, the way critical pairs are sorted and the corresponding s-polynomials are handled turns out to be well chosen. We show in detail its strong connection to the sorting of critical pairs w.r.t. the well-known sugar degree of polynomials. These properties hold for signature-based Groebner basis algorithms in general and do not depend on the specific implementation of the underlying criteria used to discard useless critical pairs.
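
    The sugar strategy mentioned above is simple to state in code. The following is a minimal sympy sketch (names and the example polynomials are illustrative, not taken from the paper): every input polynomial starts with sugar equal to its total degree, and an s-pair inherits the maximum sugar of its two multiplied components; critical pairs are then processed in ascending sugar order.

        from sympy import symbols, Poly, lcm, LM

        x, y = symbols('x y')

        def total_degree(m):
            return Poly(m, x, y).total_degree()

        def pair_sugar(f, g, sugar_f, sugar_g, order='grevlex'):
            # lcm of the leading monomials, as in the s-polynomial construction
            lm_f, lm_g = LM(f, x, y, order=order), LM(g, x, y, order=order)
            t = lcm(lm_f, lm_g)
            return max(total_degree(t) - total_degree(lm_f) + sugar_f,
                       total_degree(t) - total_degree(lm_g) + sugar_g)

        f1, f2 = x**2*y - 1, x*y**2 - x      # inhomogeneous input polynomials
        print(pair_sugar(f1, f2, 3, 3))      # sugar degree 4 for this pair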

  2. Groebner bases via linkage

    CERN Document Server

    Gorla, Elisa; Nagel, Uwe

    2010-01-01

    In this paper, we give a sufficient condition for a set $\mathcal{G}$ of polynomials to be a Gröbner basis, with respect to a given term-order, for the ideal $I$ that it generates. Our criterion depends on the linkage pattern of the ideal $I$ and of the ideal generated by the initial terms of the elements of $\mathcal{G}$. We then apply this criterion to ideals generated by minors and pfaffians. More precisely, we consider large families of ideals generated by minors or pfaffians in a matrix or a ladder, where the size of the minors or pfaffians is allowed to vary in different regions of the matrix or the ladder. We use the sufficient condition that we established to prove that the minors or pfaffians form a reduced Gröbner basis for the ideal that they generate, with respect to any diagonal or anti-diagonal term-order. We also show that the corresponding initial ideal is Cohen-Macaulay. Our proof relies on known results in liaison theory, combined with a simple Hilbert function computation. In particular, our…
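
    The phenomenon described above can be checked directly in a computer algebra system for the smallest interesting case: the 2x2 minors of a generic 2x3 matrix. A sympy sketch (our example, not the paper's; the lex order below, with x11 > x12 > x13 > x21 > x22 > x23, is one choice of "diagonal" term-order):

        from sympy import symbols, groebner

        x11, x12, x13, x21, x22, x23 = symbols('x11 x12 x13 x21 x22 x23')
        minors = [x11*x22 - x12*x21,
                  x11*x23 - x13*x21,
                  x12*x23 - x13*x22]
        G = groebner(minors, x11, x12, x13, x21, x22, x23, order='lex')
        print(G.exprs)   # the three minors reappear unchanged: they are
                         # already a reduced Groebner basis for this order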

  3. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    This work presents an efficient solution using a computer algebra system to verify linear temporal properties of synchronous digital systems. The method is based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations of both circuit descriptions and assertions is studied. We then present a complete checking-algorithm framework that operates on these algebraic representations using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and a useful supplement to existing verification methods based on simulation.
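
    The core algebraic step can be illustrated on a toy circuit under the usual Boolean-to-polynomial encoding (an assumption here; the paper's exact encoding for SEREs is richer): every signal v satisfies v**2 - v = 0, an AND gate with inputs x, y and output z contributes z - x*y, and an assertion holds for every consistent assignment iff its polynomial reduces to zero modulo a Groebner basis of the circuit ideal.

        from sympy import symbols, groebner

        x, y, z = symbols('x y z')
        circuit = [x**2 - x, y**2 - y,   # Boolean signal constraints
                   z - x*y]              # z = AND(x, y)
        G = groebner(circuit, z, x, y, order='lex')
        # Assertion "z implies x", i.e. z*(1 - x) == 0 on all assignments:
        _, remainder = G.reduce(z*(1 - x))
        print(remainder)                 # 0, so the assertion holds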

  4. Mechanical Geometry Theorem Proving Based on Groebner Bases

    Institute of Scientific and Technical Information of China (English)

    吴尽昭

    1997-01-01

    A new method for mechanical elementary geometry theorem proving is presented, using Groebner bases of polynomial ideals. It has two main advantages over the approach proposed in the literature: (i) it is complete, not a refutational procedure; (ii) the subcases of geometry statements which are not generally true can be differentiated clearly.
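
    A minimal sympy illustration of this style of proof, on a textbook statement (our example, not the paper's): the midpoint M = (u, v) of the hypotenuse of the right triangle with vertices (0,0), (a,0), (0,b) is equidistant from (0,0) and (a,0). The hypotheses become polynomials, and the conclusion is proved by reducing it to zero modulo a Groebner basis of the hypothesis ideal.

        from sympy import symbols, groebner, expand

        a, b, u, v = symbols('a b u v')
        hypotheses = [2*u - a, 2*v - b]   # M is the midpoint of (a,0)-(0,b)
        # Conclusion: |M - (0,0)|^2 equals |M - (a,0)|^2
        conclusion = expand((u**2 + v**2) - ((u - a)**2 + v**2))
        G = groebner(hypotheses, u, v, order='lex')
        _, remainder = G.reduce(conclusion)
        print(remainder)   # 0: the conclusion follows from the hypotheses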

  5. The Closed-form Solution of Groebner Bases for the Spatial Burmester Problem

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Groebner bases are an important concept in the theory of polynomial ideals. In this paper the method of Groebner bases is applied to the spatial Burmester problem for the first time, and a symbolic "triangular" Groebner basis, i.e., the closed-form solution of the problem, is obtained. An example of the synthesis of rigid-body guidance of a spatial 5s-s mechanism which can realize spatial Burmester points is given to demonstrate the efficiency of the method.

  6. THE UNIVERSAL GROEBNER BASIS UNDER COMPOSITION

    Institute of Scientific and Technical Information of China (English)

    LIU Jinwang; HOU Jinjun; PENG Zhenyun; CAO Wensheng

    2005-01-01

    Composition is the operation of replacing variables in a polynomial with other polynomials. The main question in this paper is: when does composition commute with universal Groebner basis computation? We prove that this happens iff the composition is single variable. This has a natural application in the computation of universal Groebner bases of composed polynomials.

  7. Groebner Basis Solutions to Satellite Trajectory Control by Pole Placement

    Science.gov (United States)

    Kukelova, Z.; Krsek, P.; Smutny, V.; Pajdla, T.

    2013-09-01

    Satellites play an important role in, e.g., telecommunication, navigation and weather monitoring, and controlling their trajectories is an important problem. In [1], an approach to pole placement for the synthesis of a linear controller was presented. It leads to solving five polynomial equations in nine unknown elements of the state-space matrices of a compensator. This is an underconstrained system, so four of the unknown elements must be treated as free parameters and set to prior values to obtain a system of five equations in five unknowns. In [1], this system was solved for one chosen set of free parameters with the help of Dixon resultants. In this work, we study and present Groebner basis solutions to this problem of computing a dynamic compensator for the satellite, for different combinations of input free parameters. We show that the Groebner basis method for solving systems of polynomial equations leads to very simple solutions for all combinations of free parameters. These solutions require only Gauss-Jordan elimination of a small matrix and the computation of the roots of a single univariate polynomial. The maximum degree of this polynomial is not greater than six in general, and for most combinations of the input free parameters its degree is even lower. [1] B. Palancz. Application of Dixon resultant to satellite trajectory control by pole placement. Journal of Symbolic Computation, Volume 50, March 2013, Pages 79-99, Elsevier.
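
    The mechanism the record relies on, a lex Groebner basis ending in a univariate polynomial whose roots are back-substituted, can be seen on a small assumed system (not the compensator equations themselves):

        from sympy import symbols, groebner, Poly

        x, y = symbols('x y')
        eqs = [x**2 + y**2 - 5, x*y - 2]
        G = groebner(eqs, x, y, order='lex')
        print(G.exprs)               # [x + y**3/2 - 5*y/2, y**4 - 5*y**2 + 4]
        last = Poly(G.exprs[-1], y)  # univariate eliminant in y
        print(last.all_roots())      # y in {-2, -1, 1, 2}; each root gives x
                                     # from the first basis element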

  8. Groebner basis, resultants and the generalized Mandelbrot set

    Energy Technology Data Exchange (ETDEWEB)

    Geum, Young Hee [Centre of Research for Computational Sciences and Informatics in Biology, Bioindustry, Environment, Agriculture and Healthcare, University of Malaya, 50603 Kuala Lumpur (Malaysia)], E-mail: conpana@empal.com; Hare, Kevin G. [Department of Pure Mathematics, University of Waterloo, Waterloo, Ont., N2L 3G1 (Canada)], E-mail: kghare@math.uwaterloo.ca

    2009-10-30

    This paper demonstrates how the Groebner basis algorithm can be used for finding the bifurcation points in the generalized Mandelbrot set. It also shows how resultants can be used to find components of the generalized Mandelbrot set.
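
    The flavour of the computation is easy to reproduce with sympy (this toy instance is ours): for z -> z**2 + c, fixed points satisfy p = z**2 - z + c = 0, and a tangent (multiplier-one) bifurcation additionally requires q = 2*z - 1 = 0. Eliminating z with a resultant yields the c-values where the bifurcation occurs, here the cusp of the main cardioid.

        from sympy import symbols, resultant, solve

        z, c = symbols('z c')
        p = z**2 - z + c          # fixed-point condition for z**2 + c
        q = 2*z - 1               # derivative of z**2 + c equals 1
        r = resultant(p, q, z)    # eliminate z
        print(r, solve(r, c))     # 4*c - 1 and c = 1/4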

  9. Computation of unirational fields (extended abstract)

    CERN Document Server

    Gutierrez, Jaime

    2008-01-01

    In this paper we present an algorithm for computing all algebraic intermediate subfields in a separably generated unirational field extension (which in particular includes the zero characteristic case). One of the main tools is Groebner bases theory. Our algorithm also requires computing primitive elements and factoring over algebraic extensions. Moreover, the method can be extended to finitely generated K-algebras.

  10. Classical versus Computer Algebra Methods in Elementary Geometry

    Science.gov (United States)

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra, such as Groebner bases of ideals and elimination of variables, make it possible to solve complex elementary and non-elementary problems of geometry which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…

  11. Computations in intersection rings of flag bundles

    CERN Document Server

    Grayson, Daniel R; Stillman, Michael E

    2012-01-01

    Intersection rings of flag varieties and of isotropic flag varieties are generated by Chern classes of the tautological bundles modulo the relations coming from multiplicativity of total Chern classes. In this paper we describe the Groebner bases of the ideals of relations and give applications to computation of intersections, as implemented in Macaulay2.

  12. GB-hash: Hash Functions Using Groebner Basis

    CERN Document Server

    Dey, Dhananjoy; Sengupta, Indranath

    2010-01-01

    In this paper we present an improved version of HF-hash, viz., GB-hash: Hash Functions Using Groebner Basis. In the case of HF-hash, the compression function consists of 32 polynomials in 64 variables, taken from the first 32 polynomials of the Hidden Field Equations Challenge-1 by setting the last 16 variables to 0. In GB-hash we have designed the compression function in such a way that these 32 polynomials in 64 variables form a minimal Groebner basis of the ideal they generate, with respect to the graded lexicographic (grlex) ordering as well as the graded reverse lexicographic (grevlex) ordering. In this paper we prove that GB-hash is more secure than HF-hash as well as SHA-256. We also compare the efficiency of GB-hash with that of SHA-256 and HF-hash.

  13. Connecting Gröbner Bases Programs with Coq to do Proofs in Algebra, Geometry and Arithmetics

    CERN Document Server

    Pottier, Loïc

    2010-01-01

    We describe how we connected three programs that compute Groebner bases to Coq, in order to do automated proofs on algebraic, geometrical and arithmetical expressions. The result is a set of Coq tactics and a certificate mechanism (downloadable at http://www-sop.inria.fr/marelle/Loic.Pottier/gb-keappa.tgz). The programs are F4, GB, and gbcoq. F4 and GB are, to our knowledge, the fastest available programs that compute Groebner bases. Gbcoq is slow in general but is proved correct (in Coq), and we adapted it to our specific problem to make it efficient. The automated proofs concern equalities and non-equalities on polynomials with coefficients and indeterminates in R or Z, and are done by reduction to a Groebner computation, via Hilbert's Nullstellensatz. We also adapted results of Harrison to prove some theorems about modular arithmetic. The connection between Coq and the programs that compute Groebner bases uses the "external" tactic of Coq, which allows one to call arbitrary programs accepting…
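
    The logical core of such certificates is Hilbert's Nullstellensatz: a polynomial system has no common solution (over an algebraically closed field) iff 1 lies in the ideal it generates, and a Groebner basis detects this immediately. A one-line sympy sketch of that reduction (the Coq side is of course not reproduced here):

        from sympy import symbols, groebner

        x = symbols('x')
        G = groebner([x**2 - 1, x + 2], x, order='lex')
        print(G.exprs)   # [1]: the two equations have no common root,
                         # which certifies the corresponding non-equality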

  14. Closed form solution for a double quantum well using Groebner basis

    Energy Technology Data Exchange (ETDEWEB)

    Acus, A [Institute of Theoretical Physics and Astronomy, Vilnius University, A Gostauto 12, LT-01108 Vilnius (Lithuania); Dargys, A, E-mail: dargys@pfi.lt [Center for Physical Sciences and Technology, Semiconductor Physics Institute, A Gostauto 11, LT-01108 Vilnius (Lithuania)

    2011-07-01

    Analytical expressions for the spectrum, eigenfunctions and dipole matrix elements of a square double quantum well (DQW) are presented for a general case when the potential in different regions of the DQW has different heights and the effective masses are different. This was achieved by using a Groebner basis algorithm that allowed us to disentangle the resulting coupled polynomials without explicitly solving the transcendental eigenvalue equation.

  15. Discrimination of Neutral Postures in Computer Based Work

    Science.gov (United States)

    2013-03-01

    …6 equations and 6 unknowns for the 4 relevant joints: hip, knee, ankle, and base of the toes. The solutions yielded by the Groebner basis will be applied… comprehension, application, analysis, and synthesis, exercising skills that are not just simple recall. The experiment was conducted in several stages… the second central moments of the two hand regions include 1) orientation in degrees and 2) eccentricity, where a value of 0 specifies a circle, and…

  16. Groebner basis methods for stationary solutions of a low-dimensional model for a shear flow

    CERN Document Server

    Pausch, Marina; Eckhardt, Bruno; Romanovski, Valery G

    2014-01-01

    We use Groebner basis methods to extract all stationary solutions of the 9-mode shear flow model described in Moehlis et al, New J. Phys. 6, 54 (2004). Using rational approximations to irrational wave numbers and algebraic manipulation techniques, we reduce the problem of determining all stationary states to finding the roots of a polynomial of degree 30. The coefficients differ by 30 powers of 10, so that algorithms for extended precision are needed to extract the roots reliably. We find that there are eight stationary solutions, consisting of two distinct states that each appear in four symmetry-related phases. We discuss extensions of these results to other flows.
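
    The extended-precision step is the part that standard double-precision root finders get wrong. A small mpmath sketch (the degree-30 polynomial itself is not reproduced; the coefficients below are stand-ins with a similarly wide dynamic range):

        from mpmath import mp, polyroots

        mp.dps = 60                        # work with 60 significant digits
        coeffs = [1, 0, -1e15, 0, 1e-15]   # wildly scaled example coefficients
        roots, err = polyroots(coeffs, maxsteps=200, error=True)
        print(err)                         # error estimate for the roots
        print(roots)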

  17. Extraction of human gait signatures: an inverse kinematic approach using Groebner basis theory applied to gait cycle analysis

    Science.gov (United States)

    Barki, Anum; Kendricks, Kimberly; Tuttle, Ronald F.; Bunker, David J.; Borel, Christoph C.

    2013-05-01

    This research highlights the results obtained from applying the method of inverse kinematics, using Groebner basis theory, to the human gait cycle in order to extract and identify lower-extremity gait signatures. The increased threat from suicide bombers and today's force protection issues have motivated a team at the Air Force Institute of Technology (AFIT) to research pattern recognition in the human gait cycle. The purpose of this research is to identify gait signatures of human subjects and to distinguish subjects carrying a load from those without a load. These signatures were investigated via a model of the lower extremities based on motion-capture observations, in particular foot placement and the joint angles of subjects affected by carrying extra load on the body. The human gait cycle was captured and analyzed using a developed toolkit consisting of an inverse kinematic motion model of the lower extremity and a graphical user interface. Hip, knee, and ankle angles were analyzed to identify gait angle variance and range of motion. Female subjects exhibited the most knee angle variance and produced a proportional correlation between knee flexion and load carriage.

  18. Fraction-free algorithm for the computation of diagonal forms of matrices over Ore domains using Gröbner bases

    CERN Document Server

    Levandovskyy, Viktor

    2011-01-01

    This paper is a sequel to "Computing diagonal form and Jacobson normal form of a matrix using Groebner bases", J. of Symb. Computation, 46 (5), 2011. We present a new fraction-free algorithm for the computation of a diagonal form of a matrix over a certain non-commutative Euclidean domain over a computable field, with the help of Gröbner bases. The algorithm is formulated in the general constructive framework of non-commutative Ore localizations of $G$-algebras (OLGAs). We split the computation of a normal form of a matrix into diagonalization and normalization processes; both can be made fraction-free. For a matrix $M$ over an OLGA we provide a diagonalization algorithm that computes $U$, $V$ and $D$ with fraction-free entries such that $UMV=D$ holds and $D$ is diagonal. The fraction-free approach gives us more information on the system of linear functional equations and its solutions than the classical setup of an operator algebra with rational-function coefficients. In particular, one can handle…

  19. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  20. Computer based satellite design

    Science.gov (United States)

    Lashbrook, David D.

    1992-06-01

    A computer program to design geosynchronous spacecraft has been developed. The program consists of four separate but interrelated executable computer programs, compiled to run on a DOS-based personal computer. The source code is written in the DoD-mandated Ada programming language. The thesis presents the design technique and design equations used in the program. Detailed analysis is performed in the following areas for both dual-spin and three-axis-stabilized spacecraft configurations: (1) mass propellant budget and mass summary; (2) battery cell and solar cell requirements for a payload power requirement; and (3) passive thermal control requirements. A user's manual is included as Appendix A, and the source code for the computer programs as Appendix B.

  1. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. The book is accessible to a variety of readers, and little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveals to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  2. Trust Based Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LI Shiqun; Shane Balfe; ZHOU Jianying; CHEN Kefei

    2006-01-01

    A pervasive computing environment is a distributed and mobile space. Trust relationships must be established and ensured between devices and systems in such an environment. The trusted computing (TC) technology introduced by the Trusted Computing Group is a distributed-system-wide approach to the provision of integrity protection for resources. The TC notion of trust and security can be described as conformant system behaviors of a platform environment, such that the conformance can be attested to a remote challenger. In this paper the trust requirements in a pervasive/ubiquitous environment are analyzed, and security schemes for pervasive computing are proposed using primitives offered by TC technology.

  3. Computational algebraic geometry of epidemic models

    Science.gov (United States)

    Rodríguez Vega, Martín.

    2014-06-01

    Computational algebraic geometry is applied to the analysis of various epidemic models for schistosomiasis and dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension and Hilbert polynomials; these computational tools are included automatically in Maple. Each of these models is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through the changes in the algebraic structure of R0, in the Groebner basis, in the Hilbert dimension, and in the Hilbert polynomials. It is hoped that the results obtained in this paper will prove important for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.

  4. Performance of Buchberger's Improved Algorithm using Prime Based Ordering

    CERN Document Server

    Horan, Peter

    2009-01-01

    Prime-based ordering, which is proved to be admissible, encodes the indeterminates in power products as prime numbers and orders the power products using the natural order on the integers. Using Eiffel, four versions of Buchberger's improved algorithm for obtaining Groebner bases have been developed: two total-degree versions representing power products as strings, and two representing them as integers under prime-based ordering. The versions are further distinguished by implementing coefficients as 64-bit integers or as multiple-precision integers. With prime-based power-product coding, iterative or recursive operations on power products are replaced by integer operations. It is found that, on a series of example polynomial sets, significant reductions in computation time of 30% or more are almost always obtained.
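
    A compact model of this encoding, assuming three indeterminates mapped to the primes 2, 3 and 5: the power product x1**a * x2**b * x3**c becomes the integer 2**a * 3**b * 5**c, so multiplication, divisibility tests and the ordering itself all become single integer operations.

        PRIMES = [2, 3, 5]

        def encode(exponents):
            # x1**a * x2**b * x3**c  ->  2**a * 3**b * 5**c
            code = 1
            for p, e in zip(PRIMES, exponents):
                code *= p**e
            return code

        u = encode([2, 1, 0])    # x1^2 * x2  ->  12
        v = encode([0, 0, 1])    # x3         ->  5
        print(u * v)             # 60: the product power product x1^2 * x2 * x3
        print(u % v == 0)        # False: x3 does not divide x1^2 * x2
        print(sorted([u, v]))    # [5, 12]: the prime-based order on terms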

  5. Agent Based Computing Machine

    Science.gov (United States)

    2005-12-09

    …be used in Phase 2 to accomplish the following enhancements. Due to the speed and support of MPI for C/C++ on Beowulf clusters, these languages could… options for hardware implementation are explored, including an emulation with a high-performance cluster, a high-performance silicon chip and the…

  6. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, the object-based approach, and a summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures: Dennis and Van Horn's Supervisor, the CAL-TSS System, the MIT PDP-1 Timesharing System, and the Chicago Magic Number Machine are discussed. The book then describes the Plessey System 25…

  7. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  8. Agent-Based Cloud Computing

    OpenAIRE

    Sim, Kwang Mong

    2012-01-01

    Agent-based cloud computing is concerned with the design and development of software agents for bolstering cloud service discovery, service negotiation, and service composition. The significance of this work lies in introducing an agent-based paradigm for constructing software tools and testbeds for cloud resource management. The novel contributions of this work include: 1) developing Cloudle, an agent-based search engine for cloud service discovery, and 2) showing that agent-based negotiation…

  9. Hypercomputation based on quantum computing

    CERN Document Server

    Sicard, Andrés; Vélez, Mario; Ospina, Juan

    2004-01-01

    We present a quantum algorithm for a (classically) incomputable decision problem, Hilbert's tenth problem; that is, we present a hypercomputation model based on quantum computation. The model is inspired by the one proposed by Tien D. Kieu. Our model exploits the quantum adiabatic process and the characteristics of the representation of the dynamical algebra su(1,1) associated with the infinite square well. Furthermore, it is demonstrated that the proposed model is a universal quantum computation model.

  10. Inversion based on computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
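
    The key claim, that an adjoint sweep delivers the full gradient for roughly the cost of one extra pass, can be seen on a toy recursion (all names here are illustrative; real simulations apply the same pattern to their discretized operators). For x_{k+1} = a*x_k*(1 - x_k) and a scalar loss on the final state, the reverse sweep accumulates d(loss)/da exactly:

        def loss_and_gradient(a, x0, target, steps):
            xs = [x0]
            for _ in range(steps):                  # forward simulation
                xs.append(a * xs[-1] * (1 - xs[-1]))
            loss = 0.5 * (xs[-1] - target) ** 2
            lam, grad = xs[-1] - target, 0.0        # adjoint sweep, backwards
            for k in range(steps - 1, -1, -1):
                grad += lam * xs[k] * (1 - xs[k])   # d x_{k+1} / d a
                lam *= a * (1 - 2 * xs[k])          # d x_{k+1} / d x_k
            return loss, grad

        print(loss_and_gradient(2.5, 0.3, 0.6, 50))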

  11. Computation of Difference Gröbner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials, we adapt to difference polynomial rings the involutive algorithm based on Janet-like division. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra), and we describe the main features of the package. Its applications are illustrated by the generation of finite-difference approximations to linear partial differential equations and by the reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, it constructs a Gröbner basis of the ideal.

  12. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  14. Web-Based Computing Resource Agent Publishing

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Web-based computing resource publishing is an efficient way to provide additional computing capacity for users who need more computing resources than they themselves can afford, by making use of idle computing resources on the Web. Extensibility and reliability are crucial for agent publishing. A parent-child agent framework and a primary-slave agent framework are proposed and discussed in detail.

  15. COMPUTER BASED HEART PULSES MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ali N. Hamoodi

    2013-05-01

    In this work the measurement and display of blood oxygen saturation and pulse rate are investigated practically using a computer. The analysis involves the variation in the blood oxygen saturation ratio and the pulse rate. The results obtained are compared with the Kontron 7840 pulse oximeter. The pulse rate obtained for the same person is approximately equal to that obtained by the Kontron device. The sensor used in this work is a finger clip. The advantage of using a computer over the Kontron 7840 pulse oximeter is that patient data can be saved in the computer for many years and displayed at any time, so that the doctor has a file containing all the data for each patient.

  16. A novel algorithm for computer based assessment

    OpenAIRE

    2012-01-01

    Most paper-based assessment systems evaluate student learning outcomes through graded assignments and tests, but computer-based assessment offers the opportunity to improve the efficiency of the assessment process. The use of the internet also makes possible…

  17. Immune based computer virus detection approaches

    Institute of Scientific and Technical Information of China (English)

    TAN Ying; ZHANG Pengtao

    2013-01-01

    The computer virus is considered one of the most horrifying threats to the security of computer systems worldwide. The rapid development of evasion techniques used in viruses causes signature-based detection techniques to be ineffective. Many novel detection approaches have been proposed in the past to cope with this ineffectiveness, mainly classified into three categories: static, dynamic and heuristic techniques. Given the natural similarities between the biological immune system (BIS) and computer security systems (CSS), the artificial immune system (AIS) was developed as a new prototype in the anti-virus research community. The immune mechanisms in the BIS provide the opportunity to construct computer virus detection models that are robust and adaptive, with the ability to detect unseen viruses. In this paper, a variety of classic computer virus detection approaches are introduced and reviewed against the background of computer virus history. Next, a variety of immune-based computer virus detection approaches are discussed in detail. Promising experimental results suggest that immune-based approaches are able to detect new variants and unseen viruses at lower false-positive rates, which has paved a new way for anti-virus research.

  18. Distributed measurement-based quantum computation

    CERN Document Server

    Danos, V; Kashefi, E; Panangaden, P; Danos, Vincent; Hondt, Ellie D'; Kashefi, Elham; Panangaden, Prakash

    2005-01-01

    We develop a formal model for distributed measurement-based quantum computations, adopting an agent-based view, such that computations are described locally where possible. Because the network quantum state is in general entangled, we need to model it as a global structure, reminiscent of global memory in classical agent systems. Local quantum computations are described as measurement patterns. Since measurement-based quantum computation is inherently distributed, this allows us to extend naturally several concepts of the measurement calculus, a formal model for such computations. Our goal is to define an assembly language, i.e., we assume that computations are well-defined and we do not concern ourselves with verification techniques. The operational semantics for systems of agents is given by a probabilistic transition system, and we define operational equivalence in a way that corresponds to the notion of bisimilarity. With this in place, we prove that teleportation is bisimilar to a direct quantum channel…

  19. Reversible Data Hiding Based on DNA Computing

    Directory of Open Access Journals (Sweden)

    Bin Wang

    2017-01-01

    Biocomputing, especially DNA computing, has seen great development and is widely used in information security. In this paper, a novel algorithm for reversible data hiding based on DNA computing is proposed. Inspired by the algorithm of histogram modification, a classical algorithm for reversible data hiding, we combine it with DNA computing to realize this algorithm with biological technology. Compared with previous results, our experimental results significantly improve the ER (embedding rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the method is suitable for protecting the copyright of a cover image in DNA-based information security.
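
    For reference, the histogram-modification idea the record builds on can be sketched in a few lines of numpy (a minimal embedding sketch, not the paper's DNA variant; true reversibility additionally requires the chosen "zero" bin to be genuinely empty, and capacity equals the peak-bin count):

        import numpy as np

        def embed(img, bits):
            hist = np.bincount(img.ravel(), minlength=256)
            peak = int(hist.argmax())                        # carrier gray level
            zero = peak + 1 + int(hist[peak + 1:].argmin())  # emptiest bin above
            out = img.copy()
            out[(out > peak) & (out < zero)] += 1            # shift to free peak+1
            flat = out.ravel()
            carriers = np.flatnonzero(flat == peak)[:len(bits)]
            flat[carriers] += np.asarray(bits, dtype=flat.dtype)  # 0->peak, 1->peak+1
            return out, peak, zero                           # keys needed to invert

        rng = np.random.default_rng(0)
        image = rng.integers(0, 255, size=(64, 64), dtype=np.uint8)
        marked, peak, zero = embed(image, [1, 0, 1, 1])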

  1. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  2. Snapshot Based Virtualization Mechanism for Cloud Computing

    Directory of Open Access Journals (Sweden)

    A. Rupa

    2012-09-01

    Virtualization is the latest evolutionary technology in cloud computing, and IT firms in various industries are adopting cloud technology. Although the concept of cloud computing was introduced long ago, there have since been a great number of innovations implemented by different experts and researchers. Virtualization is a very effective approach to gaining operational advantages in cloud computing. In this paper we propose a snapshot-based virtualization mechanism, and discuss both memory virtualization and storage virtualization.

  3. QPSO-based adaptive DNA computing algorithm.

    Science.gov (United States)

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This new approach aims to run the DNA computing algorithm with adaptive parameters towards the desired goal, using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for the adaptive process, (2) the adaptive algorithm is driven by QPSO for goal-driven progress, faster operation, and flexibility in data, and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate its ability to provide effective optimization, considerable convergence speed, and high accuracy compared to the plain DNA computing algorithm.

  4. QPSO-Based Adaptive DNA Computing Algorithm

    Directory of Open Access Journals (Sweden)

    Mehmet Karakose

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed, aiming to run the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for the adaptive process, (2) the adaptive algorithm is driven by QPSO for goal-driven progress, faster operation, and flexibility in data, and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate its ability to provide effective optimization, considerable convergence speed, and high accuracy compared to the plain DNA computing algorithm.

  5. On computer-based assessment of mathematics

    OpenAIRE

    Pead, Daniel

    2010-01-01

    This work explores some issues arising from the widespread use of computer-based assessment of mathematics in primary and secondary education. In particular, it considers the potential of computer-based assessment for testing "process skills" and "problem solving". This is discussed through a case study of the World Class Tests project, which set out to test problem-solving skills. The study also considers how on-screen "eAssessment" differs from conventional paper tests and how transferring…

  6. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  8. Computational Scenario-based Capability Planning

    CERN Document Server

    Abbass, Hussein; Dam, Helen; Baker, Stephen; Whitacre, James M; Sarker, Ruhul (doi:10.1145/1389095.1389378)

    2009-01-01

    Scenarios are pen-pictures of plausible futures, used for strategic planning. The aim of this investigation is to expand the horizon of scenario-based planning through computational models that are able to aid the analyst in the planning process. The investigation builds upon the advances of Information and Communication Technology (ICT) to create a novel, flexible and customizable computational capability-based planning methodology that is practical and theoretically sound. We will show how evolutionary computation, in particular evolutionary multi-objective optimization, can play a central role - both as an optimizer and as a source for innovation.

  9. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    A variation on the data-flow model is proposed for developing parallel architectures. While the model is data-driven, it has significant differences from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them, and the model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model, using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  10. Towards applied theories based on computability logic

    CERN Document Server

    Japaridze, Giorgi

    2008-01-01

    Computability logic (CL) (see http://www.cis.upenn.edu/~giorgi/cl.html) is a recently launched program for redeveloping logic as a formal theory of computability, as opposed to the formal theory of truth that logic has more traditionally been. Formulas in it represent computational problems, "truth" means existence of an algorithmic solution, and proofs encode such solutions. Within the line of research devoted to finding axiomatizations for ever more expressive fragments of CL, the present paper introduces a new deductive system CL12 and proves its soundness and completeness with respect to the semantics of CL. Conservatively extending classical predicate calculus and offering considerable additional expressive and deductive power, CL12 presents a reasonable, computationally meaningful, constructive alternative to classical logic as a basis for applied theories. To obtain a model example of such theories, this paper rebuilds the traditional, classical-logic-based Peano arithmetic into a computability-logic-based…

  11. Moment matrices, border bases and radical computation

    OpenAIRE

    Mourrain, B.; J. B. Lasserre; Laurent, Monique; Rostalski, P.; Trebuchet, Philippe

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semidefinite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorporation of additional polynomials, …

  12. Data Cloud Computing based on LINQ

    Institute of Scientific and Technical Information of China (English)

    Junwen Lu; Yongsheng Hao; Lubin Zheng; Guanfeng Liu

    2015-01-01

    Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming structure. In the work to date, however, the many available choices make it difficult to select the best approach. The LINQ (Language Integrated Query) programming model can be extended to massively-parallel, data-driven computations. It not only provides a seamless transition path from computing on top of traditional stores like relational databases or XML to computing on the Cloud, but also offers an object-oriented, compositional model. In this paper, we introduce LINQ into the Cloud, argue that LINQ is a good choice for the Data Cloud, and then describe the details of LINQ-based file system management.

  13. Brain emotional learning based Brain Computer Interface

    Directory of Open Access Journals (Sweden)

    Abdolreza Asadi Ghanbari

    2012-09-01

    A brain-computer interface (BCI) enables direct communication between a brain and a computer, translating brain activity into computer commands using preprocessing, feature extraction and classification operations. Classification is crucial, as it has a substantial effect on BCI speed and bit rate. Recent developments of brain-computer interfaces (BCIs) bring forward some challenging problems to the machine learning community, of which classification of time-varying electrophysiological signals is a crucial one. Constructing adaptive classifiers is a promising approach to deal with this problem. In this paper, we introduce adaptive classifiers for classifying electroencephalogram (EEG) signals. The adaptive classifier is the brain emotional learning based adaptive classifier (BELBAC), which is based on the emotional learning process. The main purpose of this research is to use a structural model based on the limbic system of the mammalian brain for decision making and control engineering applications. We have adopted the network model developed by Moren and Balkenius as a computational model that mimics the amygdala, orbitofrontal cortex, thalamus, sensory input cortex and, generally, those parts of the brain thought responsible for processing emotions. The developed method was compared with other methods used for EEG signal classification (a support vector machine (SVM) and two different neural network types (MLP, PNN)). The analysis of the results demonstrated the efficiency of the proposed approach.

  14. Understanding Computer-Based Digital Video.

    Science.gov (United States)

    Martindale, Trey

    2002-01-01

    Discussion of new educational media and technology focuses on producing and delivering computer-based digital video. Highlights include video standards, including international standards and aspect ratio; camera formats and features, including costs; shooting digital video; editing software; compression; and a list of informative Web sites. (LRW)

  15. Agent based computational model of trust

    NARCIS (Netherlands)

    A. Gorobets (Alexander); B. Nooteboom (Bart)

    2004-01-01

    This paper employs the methodology of Agent-Based Computational Economics (ACE) to investigate under what conditions trust can be viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents adapt their trust in a partner…

  16. Single electron tunneling based arithmetic computation

    NARCIS (Netherlands)

    Lageweg, C.R.

    2004-01-01

    In this dissertation we investigate the implementation of computer arithmetic operations with circuits based on Single Electron Tunneling (SET) technology. In our research we focus on the effective utilization of the SET technology's specific characteristic, i.e., the ability to control the transport of individual electrons.

  17. Educator Beliefs Regarding Computer-Based Instruction.

    Science.gov (United States)

    Swann, D. LaDon; Branson, Floyd, Jr.; Talbert, B. Allen

    2003-01-01

    Extension educators (n=17) completed two of five technical sections from an aquaculture CD-ROM tutorial. Evidence from pre/post-training questionnaires, content assessments, and follow-up interviews reveals favorable attitudes toward computer-based inservice training. The ability to spend less time out of their county and to review materials after…

  18. Evaluation of a Computer-Based Narrative

    Science.gov (United States)

    Sharf, Richard S.

    1978-01-01

    A computer-based narrative report integrating results from the Strong Vocational Interest Blank, the Opinion Attitude and Interest Survey, and the Cooperative English Test was compared with a standard profile format. No differences were found between the two methods for males or females. (Author)

  19. WEB BASED LEARNING OF COMPUTER NETWORK COURSE

    Directory of Open Access Journals (Sweden)

    Hakan KAPTAN

    2004-04-01

    As a result of developments in the Internet and computer fields, web-based education has become one of the areas in which many improvement and research studies are being done. In this study, web-based education materials are described for a multimedia animation- and simulation-aided Computer Networks course in Technical Education Faculties. The course content is formed from university course books, web-based education materials and the technology web pages of companies. The content combines texts, pictures and figures to increase the motivation of students, and to facilitate learning some topics are supported by animations. Furthermore, to help convey the working principles of routing algorithms and congestion-control algorithms, simulators were constructed for interactive learning.

  20. A Comparative Evaluation of Computer Based and Non-Computer Based Instructional Strategies.

    Science.gov (United States)

    Emerson, Ian

    1988-01-01

    Compares the computer-assisted instruction (CAI) tutorial with its non-computerized pedagogical roots: the Socratic dialog and Skinner's programmed instruction. Tests the effectiveness of a CAI tutorial on diffusion and osmosis against four other interactive and non-interactive instructional strategies. Notes computer-based strategies were…

  1. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  2. Computing negentropy based signatures for texture recognition

    Directory of Open Access Journals (Sweden)

    Daniela COLTUC

    2007-12-01

    The proposed method aims to provide a new tool for texture recognition. For this purpose, a set of texture samples is decomposed using the FastICA algorithm and characterized by a negentropy-based signature. For recognition, texture signatures are compared by means of the Minkowski distance. The recognition rates, computed for a set of 320 texture samples, show medium recognition accuracy, and the method may be further improved.
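
    A sketch of this pipeline under stated assumptions (scikit-learn's FastICA for the decomposition, the standard log-cosh approximation J(y) ~ (E[G(y)] - E[G(nu)])**2 for negentropy, and random data in place of real texture patches):

        import numpy as np
        from sklearn.decomposition import FastICA
        from scipy.spatial.distance import minkowski

        def negentropy_signature(patches, n_components=8, seed=0):
            ica = FastICA(n_components=n_components, random_state=seed)
            sources = ica.fit_transform(patches)       # rows are samples
            G = lambda t: np.log(np.cosh(t))
            gauss = np.random.default_rng(seed).standard_normal(100000)
            y = (sources - sources.mean(0)) / sources.std(0)
            # one negentropy estimate per independent component
            return (G(y).mean(axis=0) - G(gauss).mean()) ** 2

        sig_a = negentropy_signature(np.random.rand(500, 64))
        sig_b = negentropy_signature(np.random.rand(500, 64))
        print(minkowski(sig_a, sig_b, p=2))             # signature distance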

  3. A polyhedral approach to computing border bases

    CERN Document Server

    Braun, Gábor

    2009-01-01

    Border bases can be considered the natural extension of Gröbner bases, and they have several advantages. Unfortunately, to date the classical border basis algorithm relies on (degree-compatible) term orderings and, implicitly, on reduced Gröbner bases. We adapt the classical border basis algorithm to allow the calculation of border bases for arbitrary degree-compatible order ideals, which is independent of term orderings. Moreover, the algorithm also supports calculating degree-compatible order ideals with preference on contained elements, even though finding a preferred order ideal is NP-hard. Effectively, we retain degree-compatibility only to successively extend our computation degree by degree. The adaptation is based on our polyhedral characterization: order ideals that support a border basis correspond one-to-one to integral points of the order ideal polytope. This establishes a crucial connection between the ideal and the combinatorial structure of the associated factor spaces.

  4. ICOHR: intelligent computer based oral health record.

    Science.gov (United States)

    Peterson, L C; Cobb, D S; Reynolds, D C

    1995-01-01

    The majority of work on computer use in the dental field has focused on non-clinical practice management information needs. Very few computer-based dental information systems provide management support of the clinical care process, particularly with respect to quality management. Traditional quality assurance methods rely on the paper record and provide only retrospective analysis. Today, proactive quality management initiatives are on the rise. Computer-based dental information systems are being integrated into the care environment, actively providing decision support as patient care is being delivered. These new systems emphasize assessment and improvement of patient care at the time of treatment, thus building internal quality management into the caregiving process. The integration of real-time quality management and patient care will be expedited by the introduction of an information system architecture that emulates the gathering and storage of clinical care data currently provided by the paper record. As a proposed solution to the problems associated with existing dental record systems, the computer-based patient record has emerged as a possible alternative to the paper dental record. The Institute of Medicine (IOM) recently conducted a study on improving the efficiency and accuracy of patient record keeping and, as a result, advocates the development and implementation of computer-based patient records as the standard for all patient care records. This project represents the ongoing efforts of The University of Iowa College of Dentistry's collaboration with the University of Uppsala Data Center, Uppsala, Sweden, on a computer-based patient dental record model. ICOHR (Intelligent Computer Based Oral Health Record) is an information system which brings together five important parts of the patient's dental record: medical and dental history; oral status; treatment planning; progress notes; and a Patient Care Database, generated from their…

  5. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized, large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above-mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high-performance "green computing" systems. The recent evolutionary and general metaheuristic-based solutions…

  6. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
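
    A small numpy rendition of the idea (SORREL itself is not reproduced; names and thresholds here are illustrative): fit a linear model that predicts a 1D heat-stencil output from its three inputs, then flag updates whose residual falls far outside the training distribution.

        import numpy as np

        rng = np.random.default_rng(1)
        u = rng.random((200, 64))
        left, mid, right = u[:, :-2], u[:, 1:-1], u[:, 2:]
        out = mid + 0.1 * (left - 2*mid + right)       # the stencil being guarded

        X = np.stack([left, mid, right], axis=-1).reshape(-1, 3)
        coef, *_ = np.linalg.lstsq(X, out.ravel(), rcond=None)  # learn weights
        resid = np.abs(X @ coef - out.ravel())
        threshold = resid.mean() + 6 * resid.std()     # detection threshold

        corrupted = out.copy()
        corrupted.ravel()[1234] += 0.5                 # inject a "soft error"
        flags = np.abs(X @ coef - corrupted.ravel()) > threshold
        print(flags.sum(), np.flatnonzero(flags))      # the injected fault is caught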

  7. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt;

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stay confidential; the banks, as clients, learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping…

  8. Secure information transfer based on computing reservoir

    Science.gov (United States)

    Szmoski, R. M.; Ferrari, F. A. S.; de S. Pinto, S. E.; Baptista, M. S.; Viana, R. L.

    2013-04-01

    There is a broad area of research on ensuring that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on-off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system's capability for sending messages, and we obtain expressions for the operating time. We demonstrate the system's efficiency for a wide range of parameters. We find similarities between our method and reservoir computing.
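
    The record recovers the message with transfer entropy. As a hedged illustration of that quantity alone (not the chaotic cipher itself; the binary series below are synthetic), here is a plug-in estimator: TE(X -> Y) measures how much knowing x_t improves the prediction of y_{t+1} beyond what y_t already tells us.

```python
# Toy plug-in transfer-entropy estimator for binary time series.
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(X -> Y) from empirical counts over (y_next, y, x) triples."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_full = c / pairs_yx[(yp, xp)]              # p(y_next | y, x)
        p_self = pairs_yy[(yn, yp)] / singles[yp]    # p(y_next | y)
        te += (c / n) * log2(p_full / p_self)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                         # y copies x with one step of delay
print(round(transfer_entropy(x, y), 2))  # close to 1 bit: x drives y
print(round(transfer_entropy(y, x), 2))  # close to 0 bits: no back-coupling
```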

  9. Prestandardisation Activities for Computer Based Safety Systems

    DEFF Research Database (Denmark)

    Taylor, J. R.; Bologna, S.; Ehrenberger, W.

    1981-01-01

    Questions of technical safety are becoming more and more important. Due to the higher complexity of their functions, computer-based safety systems have special problems. Researchers, producers, licensing personnel and customers have met on a European basis to exchange knowledge and formulate positions. The Commission of the European Community supports the work. Major topics comprise hardware configuration and self-supervision, software design, verification and testing, documentation, system specification and concurrent processing. Preliminary results have been used for the draft of an IEC standard and for some…

  10. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. Together, these provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis, providing an investigator with the information needed to decide whether manual analysis is required.

  11. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main assumption of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other; the movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions, and their interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (the particle method and the cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers problems of load balancing, collision detection, process synchronization and distributed control of the animation.
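
    To make the particle method concrete, here is a hedged toy (the chain topology, constants and pinning are my own choices, not the paper's system): point masses joined by springs, advanced with velocity Verlet under gravity.

```python
# Toy particle-method integrator: a pinned chain of masses and springs.
import numpy as np

n, dt, g, k, rest, m = 5, 0.01, 9.81, 50.0, 1.0, 1.0
pos = np.stack([np.arange(n, dtype=float), np.zeros(n)], axis=1)
vel = np.zeros_like(pos)
springs = [(i, i + 1) for i in range(n - 1)]      # a simple chain of particles

def forces(p):
    f = np.zeros_like(p)
    f[:, 1] -= m * g                              # gravity on every particle
    for i, j in springs:                          # Hooke's law on each spring
        d = p[j] - p[i]
        length = np.linalg.norm(d)
        fs = k * (length - rest) * d / length
        f[i] += fs
        f[j] -= fs
    return f

f = forces(pos)
for _ in range(100):                              # velocity Verlet stepping
    pos += vel * dt + 0.5 * (f / m) * dt * dt
    pos[0] = (0.0, 0.0)                           # particle 0 stays pinned
    f_new = forces(pos)
    vel += 0.5 * (f + f_new) / m * dt
    vel[0] = 0.0
    f = f_new
print(pos.round(3))                               # positions after 1 s
```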

  12. HiPPI-based parallel computing

    Science.gov (United States)

    Jung, Charles C.

    1993-02-01

    The IBM Enhanced Clustered Fortran (ECF) advanced technology project combines parallel computing technology with a HiPPI-based LAN network. The ECF environment is a clustered, parallel computing environment which consists of IBM ES/9000 complexes and possibly other parallel machines connected by HiPPI. The ECF software, including the language processor, is independent of hardware architectures, operating systems, and the Fortran compiler and runtime library. The ECF software is highly portable because it is based on well-known, standard technology and transport protocols such as Remote Procedure Call (RPC), X/Open Transport Interface (XTI), and TCP/IP. The ECF software is transport-independent and can accommodate other transport protocols concurrently. This paper describes the IBM ECF environment, including the language extensions, the programming model, and the software layers and components. It also explains how portability and scalability are achieved. Lastly, it describes how effective task communication is accomplished in ECF through RPC, XTI, TCP/IP, and a customized enhancement over HiPPI. An analysis of network performance in terms of bottleneck conditions is presented, and empirical data indicating improved throughput is provided. Comparisons to alternative methodologies and technologies are also presented.

  13. Noise-based communication and computing

    CERN Document Server

    Kish, Laszlo B

    2008-01-01

    We discuss the speed-error-heat triangle and related problems of rapidly increasing energy dissipation and error rate during miniaturization. These problems, and the independently growing need for unconditional data security, have provoked non-conventional approaches in the physics of informatics. Noise-based informatics is a potentially promising possibility, and it is the way biological brains process information. Recently, it has been shown that thermal noise and its electronically enhanced versions (Johnson-like noises) can be utilized as information carriers with peculiar properties. Relevant examples are zero-power (stealth) communication, unconditionally secure communication with Johnson(-like) noise and a Kirchhoff loop, and noise-driven computing. Zero-power communication utilizes the equilibrium background noise in the channel to transfer information. The unconditionally secure communication is based on the properties of Johnson(-like) noise and those of a simple Kirchhoff loop. The scheme utilizes on…

  14. Computational algebraic topology-based video restoration

    Science.gov (United States)

    Rochel, Alban; Ziou, Djemel; Auclair-Fortier, Marie-Flavie

    2005-03-01

    This paper presents a scheme for video denoising by diffusion of gray levels, based on the Computational Algebraic Topology (CAT) image model. The diffusion approach is similar to the one used to denoise static images. Rather than using the heat transfer partial differential equation, discretizing it and solving it by a purely mathematical process, the CAT approach considers the global expression of the heat transfer and decomposes it into elementary physical laws. Some of these laws describe conservative relations, leading to error-free expressions, whereas others depend on metric quantities and require approximation. This scheme allows a physical interpretation of each step of the resolution process. We propose nonlinear and anisotropic diffusion algorithms based on extending an existing 2D algorithm to video, thanks to the flexibility of the topological support. Finally, the scheme is validated with experimental results.

  15. Computer Based Information Systems and the Middle Manager.

    Science.gov (United States)

    Why do some computer-based information systems succeed while others fail? The report concludes with eleven recommended areas that middle management must understand in order to use computer-based information systems effectively. (Modified author abstract)

  16. Object Based Middleware for Grid Computing

    Directory of Open Access Journals (Sweden)

    S. Muruganantham

    2010-01-01

    Full Text Available Problem statement: “Grid” computing has emerged as an important new field, distinguished from conventional distributed computing by its focus on large-scale resource sharing, innovative applications and, in some cases, high-performance orientation. The role of middleware is to ease the task of designing, programming and managing distributed applications by providing a simple, consistent and integrated distributed programming environment. Essentially, middleware is a distributed software layer which abstracts over the complexity and heterogeneity of the underlying distributed environment, with its multitude of network technologies, machine architectures, operating systems and programming languages. Approach: This study showed that the development of supportive middleware to manage resources and distributed workloads across multiple administrative boundaries is of central importance to Grid computing. Active middleware services that perform look-up, scheduling and staging are being developed that allow users to identify and utilize appropriate resources that provide sustainable system- and user-level qualities of service. Results: Different middleware platforms support different programming models. Perhaps the most popular model is object-based middleware, in which applications are structured into objects that interact via location-transparent method invocation. Conclusion: The Object Management Group’s CORBA platform offers an Interface Definition Language (IDL), which is used to abstract over the fact that objects can be implemented in any suitable programming language, an object request broker which is responsible for transparently directing method invocations to the appropriate target object, and a set of services (such as naming, time, transactions and replication) which further enhance the programming environment.

  17. Computing Gröbner Bases within Linear Algebra

    Science.gov (United States)

    Suzuki, Akira

    In this paper, we present an alternative algorithm to compute Gröbner bases, which is based on computations in sparse linear algebra. Both S-polynomial computations and monomial reductions are carried out simultaneously as linear algebra in this algorithm, so it can be implemented on any computational system which can handle linear algebra. For a given ideal in a polynomial ring, it calculates a Gröbner basis along with a corresponding, appropriately chosen term order.
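
    The linear-algebra view can be sketched in a few lines. The toy below is my own construction in the spirit of the record, not the paper's exact algorithm (the ideal, the degree bound d, and the use of sympy are arbitrary choices): it builds a Macaulay-style matrix whose rows are monomial multiples of the generators, so that a single row reduction performs all S-polynomial reductions at once. The last line cross-checks against sympy's own Gröbner routine.

```python
# Macaulay-matrix sketch of Groebner basis computation via row reduction.
from itertools import product
from sympy import Matrix, Poly, groebner, symbols

x, y = symbols('x y')
gens = [Poly(x**2 + y**2 - 1, x, y), Poly(x*y - 2, x, y)]
d = 4                                   # degree bound for this toy example

# All monomials of total degree <= d, leading (graded-lex) monomial first.
monoms = sorted((e for e in product(range(d + 1), repeat=2) if sum(e) <= d),
                key=lambda e: (sum(e), e), reverse=True)
col = {mo: i for i, mo in enumerate(monoms)}

# One row per product monomial * generator that stays within degree d.
rows = []
for g in gens:
    for mo in monoms:
        if sum(mo) + g.total_degree() <= d:
            p = Poly(x**mo[0] * y**mo[1], x, y) * g
            row = [0] * len(monoms)
            for mono, c in p.terms():
                row[col[mono]] = c
            rows.append(row)

# Row reduction plays the role of all S-polynomial reductions at once.
rref, _ = Matrix(rows).rref()
found = [sum(c * x**mo[0] * y**mo[1] for c, mo in zip(r, monoms))
         for r in rref.tolist() if any(r)]
print(found)                                        # elements up to degree d
print(groebner([g.as_expr() for g in gens], x, y, order='grlex'))
```

    In a full algorithm the degree bound is raised until no new leading terms appear; the sketch stops at a fixed d for brevity.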

  18. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g., communication or critical thinking) or attitudes. However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors specialized in argumentation. Intelligent computer systems (i.e., systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map showing the benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  19. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  20. Intelligent Image Based Computer Aided Education (IICAE)

    Science.gov (United States)

    David, Amos A.; Thiery, Odile; Crehange, Marion

    1989-03-01

    Artificial Intelligence (AI) has found its way into Computer Aided Education (CAE), and several systems have been constructed to demonstrate its advantages. We believe that images (graphic or real) play an important role in learning. However, the use of images beyond mere illustration makes applications such as AI necessary. We shall develop the application of AI in image-based CAE and briefly present the system under construction to demonstrate our concept. We shall also elaborate a methodology for constructing such a system. Furthermore, we shall briefly present the pedagogical and psychological activities in a learning process. Under the pedagogical and psychological aspects of learning, we shall develop areas such as the importance of images in learning, both as pedagogical objects and as means for obtaining psychological information about the learner. We shall develop the learner's model: its use, what to build into it, and how. Under the application of AI in image-based CAE, we shall develop the importance of AI in exploiting the knowledge base in the learning environment and its application as a means of implementing pedagogical strategies.

  1. Demystifying the GMAT: Computer-Based Testing Terms

    Science.gov (United States)

    Rudner, Lawrence M.

    2012-01-01

    Computer-based testing can be a powerful means to make all aspects of test administration not only faster and more efficient, but also more accurate and more secure. While the Graduate Management Admission Test (GMAT) exam is a computer adaptive test, there are other approaches. This installment presents a primer of computer-based testing terms.

  2. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

    Granular computing is a new intelligent computing theory based on the partitioning of problem concepts. Processing incomplete information systems directly is an important problem in rough set theory. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.
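
    The tolerance relation itself is simple to state: two objects of an incomplete information system are tolerant if their known attribute values never conflict. A hedged sketch (the tiny table is invented; None marks a missing value):

```python
# Tolerance classes (granules) in a toy incomplete information system.
objects = {
    "o1": (1, 0, None),
    "o2": (1, None, 2),
    "o3": (0, 0, 2),
    "o4": (1, 0, 2),
}

def tolerant(a, b):
    """Objects tolerate each other if their known values agree everywhere."""
    return all(u is None or v is None or u == v for u, v in zip(a, b))

for name, vals in objects.items():
    granule = [other for other, w in objects.items() if tolerant(vals, w)]
    print(name, "->", granule)   # e.g. o1 -> ['o1', 'o2', 'o4']
```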

  3. Measurement-Based and Universal Blind Quantum Computation

    Science.gov (United States)

    Broadbent, Anne; Fitzsimons, Joseph; Kashefi, Elham

    Measurement-based quantum computation (MBQC) is a novel approach to quantum computation where the notion of measurement is the main driving force of computation. This is in contrast with the more traditional circuit model, which is based on unitary operations. We review here the mathematical model underlying MBQC and the first quantum cryptographic protocol designed using the unique features of MBQC.
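
    The flavour of measurement-driven computation can be shown with the standard one-bit teleportation step (a textbook construction, not the protocol reviewed above): entangle the input qubit with |+> via a controlled-Z, measure the first qubit in the X basis, and the second qubit is left holding H applied to the input, up to an outcome-dependent Pauli correction.

```python
# One-bit teleportation, the elementary step behind MBQC, in numpy.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
CZ = np.diag([1, 1, 1, -1])

psi = np.array([0.6, 0.8])                    # arbitrary input qubit
plus = np.array([1, 1]) / np.sqrt(2)
state = CZ @ np.kron(psi, plus)               # entangle input with |+>

for s in (0, 1):                              # both X-measurement outcomes
    bra = np.array([1, 1 - 2 * s]) / np.sqrt(2)      # <+| or <-|
    out = np.kron(bra, np.eye(2)) @ state     # project qubit 1
    out = out / np.linalg.norm(out)
    fixed = np.linalg.matrix_power(X, s) @ out       # Pauli correction X**s
    print(s, np.allclose(fixed, H @ psi))     # True: the implemented "gate" is H
```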

  4. Computer-Based Cognitive Tools: Description and Design.

    Science.gov (United States)

    Kennedy, David; McNaught, Carmel

    With computers, tangible tools are represented by the hardware (e.g., the central processing unit, scanners, and video display unit), while intangible tools are represented by the software. There is a special category of computer-based software tools (CBSTs) that have the potential to mediate cognitive processes--computer-based cognitive tools…

  5. Hamilton Graph Based on DNA Computing

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jia-xiu

    2004-01-01

    DNA computing is a novel method for solving a class of intractable computational problems in which the computation can grow exponentially with problem size. Up to now, many accomplishments have been achieved to improve its performance and increase its reliability. The Hamilton Graph Problem has been solved by means of molecular biology techniques. A small graph was encoded in molecules of DNA, and the “operations” of the computation were performed with standard protocols and enzymes. This work represents further evidence for the ability of DNA computing to solve NP-complete search problems.
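
    In the spirit of Adleman's experiment, the generate-and-filter logic can be caricatured in software. A hedged sketch (the directed graph and counts are invented; random paths stand in for randomly ligated DNA strands):

```python
# Generate-and-filter caricature of the DNA Hamiltonian path computation.
import random

edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (1, 3), (2, 4)}
n = 5
random.seed(1)

def random_path():
    """Grow a random path from vertex 0, like random ligation of strands."""
    v, path = 0, [0]
    while len(path) < n:
        nxt = [w for w in range(n) if (v, w) in edges]
        if not nxt:
            break
        v = random.choice(nxt)
        path.append(v)
    return path

soup = (random_path() for _ in range(10000))   # the massively parallel "tube"
hits = {tuple(p) for p in soup
        if len(p) == n and len(set(p)) == n}   # keep Hamiltonian paths only
print(hits)                                    # {(0, 1, 2, 3, 4)}
```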

  6. A quantum computer based on recombination processes in microelectronic devices

    Science.gov (United States)

    Theodoropoulos, K.; Ntalaperas, D.; Petras, I.; Konofaos, N.

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A "data element" and a "computational element" are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is demonstrated by applying the proposed computer to a well-known physical system involving traps in semiconductor devices.

  7. A quantum computer based on recombination processes in microelectronic devices

    Energy Technology Data Exchange (ETDEWEB)

    Theodoropoulos, K [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Ntalaperas, D [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Research Academic Computer Technology Institute, Riga Feraiou 61, 26110, Patras (Greece); Petras, I [Computer Engineering and Informatics Department, University of Patras, Patras (Greece); Konofaos, N [Computer Engineering and Informatics Department, University of Patras, Patras (Greece)

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is demonstrated by applying the proposed computer to a well-known physical system involving traps in semiconductor devices.

  8. Natural Languages Processing for Building Computer-based Learning Tools

    Institute of Scientific and Technical Information of China (English)

    张颖; 李娜

    2015-01-01

    This paper outlines a framework for using computer and natural language techniques to help learners at various levels learn foreign languages in a computer-based learning environment. We propose some ideas for using the computer as a practical tool for learning a foreign language where most of the courseware is generated automatically. We then describe how to build computer-based learning tools, discuss their effectiveness, and conclude with some possibilities for using on-line resources.

  9. Data Mining Based on Computational Intelligence

    Institute of Scientific and Technical Information of China (English)

    WANG Yuan-zhen; ZHANG Zhi-bing; YI Bao-lin; LI Hua-yang

    2005-01-01

    This paper combines the computational intelligence tools of neural networks, fuzzy logic, and genetic algorithms to develop a data mining architecture (NFGDM), which discovers patterns and represents them in understandable forms. In the NFGDM, input data are preprocessed by fuzzification, and the preprocessed data of the input variables are then used to train a radial basis probabilistic neural network to classify the dataset according to the classes considered. A rule extraction technique is then applied in order to extract explicit knowledge from the trained neural networks and represent it in the form of fuzzy if-then rules. In the final stage, a genetic algorithm is used as a rule-pruning module to eliminate weak rules that remain in the rule bases. Compared with some known neural network classifiers, the architecture has a fast learning speed and is characterized by the incorporation of possibility information into the consequents of classification rules in human-understandable forms. The experiments show that the NFGDM is more efficient and more robust than the traditional decision tree method.

  10. Progress in silicon-based quantum computing.

    Science.gov (United States)

    Clark, R G; Brenner, R; Buehler, T M; Chan, V; Curson, N J; Dzurak, A S; Gauja, E; Goan, H S; Greentree, A D; Hallam, T; Hamilton, A R; Hollenberg, L C L; Jamieson, D N; McCallum, J C; Milburn, G J; O'Brien, J L; Oberbeck, L; Pakes, C I; Prawer, S D; Reilly, D J; Ruess, F J; Schofield, S R; Simmons, M Y; Stanley, F E; Starrett, R P; Wellard, C; Yang, C

    2003-07-15

    We review progress at the Australian Centre for Quantum Computer Technology towards the fabrication and demonstration of spin qubits and charge qubits based on phosphorus donor atoms embedded in intrinsic silicon. Fabrication is being pursued via two complementary pathways: a 'top-down' approach for near-term production of few-qubit demonstration devices and a 'bottom-up' approach for large-scale qubit arrays with sub-nanometre precision. The 'top-down' approach employs a low-energy (keV) ion beam to implant the phosphorus atoms. Single-atom control during implantation is achieved by monitoring on-chip detector electrodes, integrated within the device structure. In contrast, the 'bottom-up' approach uses scanning tunnelling microscope lithography and epitaxial silicon overgrowth to construct devices at an atomic scale. In both cases, surface electrodes control the qubit using voltage pulses, and dual single-electron transistors operating near the quantum limit provide fast read-out with spurious-signal rejection.

  11. Computer Based Training Authors' and Designers' training

    Directory of Open Access Journals (Sweden)

    Frédéric GODET

    2016-03-01

    Full Text Available This communication, through a couple of studies conducted over the past 10 years, tries to show how important the training of authors is in Computer Based Training (CBT). We submit here an approach to preparing designers to master interactive multimedia modules in this domain. Which institutions are really dedicating their efforts to training authors and designers in the area of CBT? Television devices and broadcast organisations have offered support for Distance Learning since the 1960s. New media and New Information and Communication Technologies (NICT) allowed several public and private organisations to start Distance Learning projects. As usual, some of them met their training objectives and others failed. Did they really fail? Currently, nobody has the right answer. Today, we do not have tools efficient enough to evaluate trainees' acquisition in the short term. Training evaluation needs more than 10 to 20 years of elapsed time to produce reliable measures. Nevertheless, given the high investments already made in this area, we cannot wait for the final results of the pedagogical evaluation. Many analyses have shown relevant issues which can be used as directions for training CBT authors and designers. Warning - our studies and the derived conclusions are mainly based on projects conducted in the field. We additionally bring several years of experience in training movie film authors in the design of interactive multimedia products. Some of our examples are extracted from vocational training projects in which we were involved in all development phases, from the analysis of needs to the evaluation of what trainees acquired on the job. Obviously, we cannot offer an exhaustive treatment of a domain in which so many parameters frame the training of authors and designers of CBT interactive multimedia modules.

  12. Geometric Computing Based on Computerized Descriptive Geometric

    Institute of Scientific and Technical Information of China (English)

    YU Hai-yan; HE Yuan-Jun

    2011-01-01

    Computer-aided Design (CAD), video games and other computer-graphics-related technologies involve substantial processing of geometric elements. A novel geometric computing method is proposed, integrating descriptive geometry, mathematics and computer algorithms. Firstly, geometric elements in general position are transformed to a special position in a new coordinate system. Then a 3D problem is projected onto the new coordinate planes. Finally, according to the 2D/3D correspondence principle of descriptive geometry, the solution is constructed as a computerized drawing process with ruler and compasses. In order to make this method a regular operation, a two-level pattern is established. The basic layer is a set of packaged algebraic functions, including about ten Primary Geometric Functions (PGFs) and one projection transformation. In the application layer, a proper coordinate system is established and a sequence of PGFs is sought to obtain the final result. Examples illustrate the advantages of the method in dimension reduction, regularity, visual computing and robustness.

  13. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  14. Measurement Based Quantum Computation on Fractal Lattices

    Directory of Open Access Journals (Sweden)

    Michal Hajdušek

    2010-06-01

    Full Text Available In this article we extend work which establishes an analogy between one-way quantum computation and thermodynamics to see how the former can be performed on fractal lattices. We find fractal lattices of arbitrary dimension greater than one which all act as good resources for one-way quantum computation, and sets of fractal lattices with dimension greater than one none of which do. The difference is attributed to other topological factors such as ramification and connectivity. This work adds confidence to the analogy and highlights new features of what we require of universal resources for one-way quantum computation.

  15. Effectiveness of Computer-Based Education in Elementary Schools.

    Science.gov (United States)

    Kulik, James A.; And Others

    1985-01-01

    This meta-analysis of 32 comparative studies shows that computer-based education has generally had positive effects on the achievement of elementary school pupils. However, these effects differ for off-line computer-managed instruction and interactive computer-assisted instruction (CAI); interactive CAI produces greater increases in student…

  16. Design & implementation of distributed spatial computing node based on WPS

    Science.gov (United States)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work is completed.

  17. Efficient Tate pairing computation using double-base chains

    Institute of Scientific and Technical Information of China (English)

    ZHAO ChangAn; ZHANG FangGuo; HUANG JiWu

    2008-01-01

    Pairing-based cryptosystems have developed very fast in the last few years. The efficiency of these cryptosystems depends on the computation of the bilinear pairings. In this paper, a new efficient algorithm based on double-base chains for computing the Tate pairing is proposed for odd characteristic p > 3. The inherent sparseness of the double-base number system clearly reduces the computational cost of the Tate pairing. The new algorithm is 9% faster than the previous fastest method for embedding degree k = 6.
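
    To see why double-base representations help, consider a greedy expansion of an integer as a sum of terms 2^a * 3^b. This hedged toy shows only the sparseness; the chains used in the paper additionally impose monotonicity conditions on the exponents.

```python
# Greedy double-base expansion: n as a short sum of terms 2**a * 3**b.
def greedy_double_base(n):
    terms = []
    while n > 0:
        # Largest number of the form 2**a * 3**b that still fits in n.
        best = max(2 ** a * 3 ** b
                   for a in range(n.bit_length() + 1)
                   for b in range(20)
                   if 2 ** a * 3 ** b <= n)
        terms.append(best)
        n -= best
    return terms

t = greedy_double_base(314159)
print(t, sum(t) == 314159)
print(len(t), "terms vs", bin(314159).count("1"), "in the binary expansion")
```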

  18. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory, and supports processor-to-processor data access based on an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions, and vector instructions; the network instructions are mainly introduced. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is presented. It includes functions to load ABC95 array computer programs and data, store results, run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies support programming the ABC95 array computer effectively.

  19. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  20. Evaluation of computer-based ultrasonic inservice inspection systems

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T. [Pacific Northwest Lab., Richland, WA (United States)

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems.

  1. Telemedicine Based on Mobile Devices and Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lidong Wang

    2014-04-01

    Full Text Available Mobile devices such as smartphones and tablets support many kinds of mobile computing and services. They can access the cloud or offload their computation-intensive parts to cloud computing resources. Mobile cloud computing (MCC) integrates cloud computing into the mobile environment, which extends mobile devices’ battery lifetime, improves their data storage capacity and processing power, and improves their reliability and information security. In this paper, the applications of smartphones in telemedicine and MCC-based telemedicine are presented. Issues concerning the information security of smartphones and tablets, challenges of smartphones in telemedicine, and challenges of MCC-based telemedicine are also introduced.

  2. Evolving technologies for Space Station Freedom computer-based workstations

    Science.gov (United States)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  3. A DNA based model for addition computation

    Institute of Scientific and Technical Information of China (English)

    GAO Lin; YANG Xiao; LIU Wenbin; XU Jin

    2004-01-01

    Much effort has been made to solve computing problems by using DNA, an organic simulating method which in some cases is preferable to the current electronic computer. However, no one at present has proposed an effective and applicable method to solve the addition problem with a molecular algorithm, due to the difficulty of the carry problem, which can be easily solved by the hardware of an electronic computer. In this article, we solved this problem by employing two kinds of DNA strings: one is called the result-and-operation string, which contains some carry information of its own and denotes the ultimate result, while the other, named the carrier, is just for carrying use. The significance of this algorithm lies in its original encoding, the fairly easy steps to follow, and its feasibility under current molecular biology technology.

  5. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of millions of computers on the Internet, directing them towards large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.

  6. Computer-based speech therapy for childhood speech sound disorders.

    Science.gov (United States)

    Furlong, Lisa; Erickson, Shane; Morris, Meg E

    2017-07-01

    With the current worldwide workforce shortage of Speech-Language Pathologists, new and innovative ways of delivering therapy to children with speech sound disorders are needed. Computer-based speech therapy may be an effective and viable means of addressing service access issues for children with speech sound disorders. To evaluate the efficacy of computer-based speech therapy programs for children with speech sound disorders, studies reporting the efficacy of computer-based speech therapy programs were identified via a systematic, computerised database search. Key study characteristics, results, main findings and details of computer-based speech therapy programs were extracted. The methodological quality was evaluated using a structured critical appraisal tool. 14 studies were identified and a total of 11 computer-based speech therapy programs were evaluated. The results showed that computer-based speech therapy is associated with positive clinical changes for some children with speech sound disorders. There is a need for collaborative research between computer engineers and clinicians, particularly during the design and development of computer-based speech therapy programs. Evaluation using rigorous experimental designs is required to understand the benefits of computer-based speech therapy. The reader will be able to 1) discuss how computer-based speech therapy has the potential to improve service access for children with speech sound disorders, 2) explain the ways in which computer-based speech therapy programs may enhance traditional tabletop therapy and 3) compare the features of computer-based speech therapy programs designed for different client populations. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Computer-based studies on enzyme catalysis

    NARCIS (Netherlands)

    Ridder, L.

    2000-01-01

    Theoretical simulations are becoming increasingly important for our understanding of how enzymes work. The aim of the research presented in this thesis is to contribute to this development by applying various computational methods to three enzymes of the β-ketoadipate pathway, and to validate the mod…

  8. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  9. Computer-Based Instruction in Elementary Hindi.

    Science.gov (United States)

    Kachru, Yamuna; And Others

    1981-01-01

    Computer-assisted instruction for Hindi courses at the University of Illinois is described in relation to the technical aspects of programming Hindi on the PLATO system and the curriculum components. The program focuses on review of the materials already covered in class and building understanding of a number of grammatical constructions by using…

  10. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Mourrain, B.; Lasserre, J.B.; Laurent, M.; Rostalski, P.; Trebuchet, P.

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming.

  11. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Mourrain, B.; Lasserre, J.B.; Laurent, M.; Rostalski, P.; Trebuchet, P.

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming.

  12. Efficient GPU-based skyline computation

    DEFF Research Database (Denmark)

    Bøgh, Kenneth Sejdenfaden; Assent, Ira; Magnani, Matteo

    2013-01-01

    The skyline operator for multi-criteria search returns the most interesting points of a data set with respect to any monotone preference function. Existing work has almost exclusively focused on efficiently computing skylines on one or more CPUs, ignoring the high parallelism possible in GPUs. In...
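
    For reference, the operator itself fits in a dozen lines of CPU code (a block-nested-loop flavour sketch; the record's contribution, the GPU algorithm, is not shown here). A point is in the skyline if no other point is at least as good in every dimension and strictly better in one.

```python
# Reference skyline computation (minimising every dimension).
def dominates(p, q):
    """p dominates q: p <= q everywhere and p < q somewhere."""
    return all(a <= b for a, b in zip(p, q)) and \
           any(a < b for a, b in zip(p, q))

def skyline(points):
    result = []
    for p in points:
        if any(dominates(q, p) for q in result):
            continue                            # p is dominated, drop it
        result = [q for q in result if not dominates(p, q)] + [p]
    return result

hotels = [(50, 3.0), (45, 4.5), (80, 0.5), (60, 0.4), (70, 3.5)]
print(skyline(hotels))   # (price, distance) trade-offs nobody dominates
```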

  14. An Overview of Innovative Computer-Based Testing

    NARCIS (Netherlands)

    Klerk, Sebastiaan; Eggen, Theodorus Johannes Hendrikus Maria; Veldkamp, Bernard P.

    2012-01-01

    Driven by the technological revolution, computer-based testing (CBT) has witnessed an explosive rise in the last decades, in both psychological and educational assessment. Many paper-and-pencil tests now have a computer-based equivalent. Innovations in CBT are almost innumerable, and innovative and new…

  15. Computer-Based Integrated Learning Systems: Research and Theory.

    Science.gov (United States)

    Hativa, Nira, Ed.; Becker, Henry Jay, Ed.

    1994-01-01

    The eight chapters of this theme issue discuss recent research and theory concerning computer-based integrated learning systems. Following an introduction about their theoretical background and current use in schools, the effects of using computer-based integrated learning systems in the elementary school classroom are considered. (SLD)

  16. Computer-Based Science Education. CERL Report X-37.

    Science.gov (United States)

    Bitzer, Donald L.; And Others

    The PLATO IV system of computer-based education developed at the University of Illinois is discussed. A brief description of the PLATO system operation is given, and lesson examples are provided for the areas of biology, geometry, chemistry, and physics. Basic problems in the field of computer-based education are discussed, along with possible…

  17. Computer-Based Instruction in Basic Medical Science Education.

    Science.gov (United States)

    Marion, Roger; And Others

    1982-01-01

    Literature on computer-based instruction shows student performance improves with this method, although students spend less time studying. It is recommended that future research be designed to better detect the influence of computer-based instruction and that greater attention be given to methodological issues like test construction and research…

  18. Chaos-based Cryptography for Cloud Computing

    OpenAIRE

    Tobin, Paul; Tobin, Lee; McKeever, Michael; Blackledge, Jonathan

    2017-01-01

    Cloud computing and poor security issues have quadrupled over the last six years, and with the alleged presence of backdoors in common encryption ciphers, this has created a need for personalising the encryption process by the client. In 2007, two Microsoft employees gave a presentation, “On the Possibility of a Backdoor in the NIST SP800-90 Dual Elliptic Curve Pseudo Random Number Generators”, which was linked in 2013 by the New York Times with notes leaked by Edward Snowden. This confirmed backdoor…

  19. Parallel CFD design on network-based computer

    Science.gov (United States)

    Cheung, Samson

    1995-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows application of aerodynamic optimization with advanced computational fluid dynamics codes, which can be computationally expensive on mainframe supercomputers. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computing environment utilizing a software called Parallel Virtual Machine. This paper will introduce the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package is applied to reduce the wave drag of a body of revolution and a wing/body configuration with results of 5% to 6% drag reduction.

  20. Inquiry-Based Learning Case Studies for Computing and Computing Forensic Students

    Science.gov (United States)

    Campbell, Jackie

    2012-01-01

    Purpose: The purpose of this paper is to describe and discuss the use of specifically-developed, inquiry-based learning materials for Computing and Forensic Computing students. Small applications have been developed which require investigation in order to de-bug code, analyse data issues and discover "illegal" behaviour. The applications…

  1. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  2. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  3. Learning with Computer-Based Learning Environments: A Literature Review of Computer Self-Efficacy

    Science.gov (United States)

    Moos, Daniel C.; Azevedo, Roger

    2009-01-01

    Although computer-based learning environments (CBLEs) are becoming more prevalent in the classroom, empirical research has demonstrated that some students have difficulty learning with these environments. The motivation construct of computer-self efficacy plays an integral role in learning with CBLEs. This literature review synthesizes research…

  5. Algebraic and computational aspects of real tensor ranks

    CERN Document Server

    Sakata, Toshio; Miyazaki, Mitsuhiro

    2016-01-01

    This book provides comprehensive summaries of theoretical (algebraic) and computational aspects of tensor ranks, maximal ranks, and typical ranks, over the real number field. Although tensor ranks have often been studied over the complex number field, it should be emphasized that this book treats real tensor ranks, which have direct applications in statistics. The book provides several interesting ideas, including determinant polynomials, determinantal ideals, absolutely nonsingular tensors, absolutely full column rank tensors, and their connection to bilinear maps and Hurwitz-Radon numbers. In addition to detailed reviews of methods to determine real tensor ranks, global theories such as the Jacobian method are also reviewed in detail. The book also includes an accessible and comprehensive introduction to the mathematical background, with basics of positive polynomials and calculations using Groebner bases. Furthermore, this book provides insights into numerical methods of finding tensor ranks through…

  6. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    An important research topic in artificial intelligence is automatic sensing and inferencing of contextual information, which is used to build computer models of the user’s activity. One approach to building such activity-aware systems is the notion of activity-based computing (ABC). ABC is a computing paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity context spanning heterogeneous devices, multiple applications, services, and information sources. In this article, we present ABC as an approach to contextualize information, and present our research into designing activity-centric computing technologies.

  8. A Logistics Distribution Plan Based on Cloud Computing

    OpenAIRE

    2013-01-01

    Aiming at the problems of lowing informatization level and degree of specialization, high consumption and low efficiency in logistics and distribution industry, this paper analyzes the characteristics of cloud computing and the actual needs of enterprise logistics. On this basis, depth study of the logistics and distribution needs of the cloud computing architecture, depth study of the cloud computing architecture in the logistics and distribution needs, and then propose a cloud-based modern ...

  9. Physical Optics Based Computational Imaging Systems

    Science.gov (United States)

    Olivas, Stephen Joseph

    There is an ongoing demand on behalf of the consumer, medical and military industries to make lighter weight, higher resolution, wider field-of-view and extended depth-of-focus cameras. This leads to design trade-offs between performance and cost, be it size, weight, power, or expense. This has brought attention to finding new ways to extend the design space while adhering to cost constraints. Extending the functionality of an imager in order to achieve extraordinary performance is a common theme of computational imaging, a field of study which uses additional hardware along with tailored algorithms to formulate and solve inverse problems in imaging. This dissertation details four specific systems within this emerging field: a Fiber Bundle Relayed Imaging System, an Extended Depth-of-Focus Imaging System, a Platform Motion Blur Image Restoration System, and a Compressive Imaging System. The Fiber Bundle Relayed Imaging System is part of a larger project in which the work presented in this thesis was to use image processing techniques to mitigate problems inherent to fiber bundle image relay and then form high-resolution, wide field-of-view panoramas captured from multiple sensors within a custom state-of-the-art imager. The goals of the Extended Depth-of-Focus System were to characterize the angular and depth dependence of the PSF of a focal-swept imager in order to increase the depth over which the imaged scene remains acceptably focused. The goal of the Platform Motion Blur Image Restoration System was to build a system that can capture a high signal-to-noise ratio (SNR), long-exposure image, which is inherently blurred, while simultaneously capturing motion data using additional optical sensors in order to deblur the degraded images. Lastly, the objective of the Compressive Imager was to design and build a system functionally similar to the Single Pixel Camera and use it to test new sampling methods for image generation and to characterize it against a traditional camera. These computational…

  10. Activity-based computing for medical work in hospitals

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2009-01-01

    Studies have revealed that people organize and think of their work in terms of activities that are carried out in pursuit of some overall objective, often in collaboration with others. Nevertheless, modern computer systems are typically single-user oriented, that is, designed to support individual tasks such as word processing while sitting at a desk. This article presents the concept of Activity-Based Computing (ABC), which seeks to create computational support for human activities. The ABC approach has been designed to address activity-based computing support for clinical work in hospitals. In a hospital, the challenges arising from the management of parallel activities and interruptions are amplified because multitasking is now combined with a high degree of mobility, collaboration, and urgency. The article presents the empirical and theoretical background for activity-based computing, its…

  11. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced methods

  12. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

    A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and on digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function. Based on this model, speech features are extracted. At each stage, features at different levels are extracted. Further processing of the primary auditory spectrum, based on lateral inhibition, is proposed to extract even more robust speech features. All these features can be regarded as internal representations of speech stimulation in the hearing system. Robust speech recognition experiments are conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.
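
    The lateral inhibition step lends itself to a compact illustration. The following Python sketch is not the authors' implementation: it sharpens one spectral frame by subtracting from each channel a fraction of its neighbours' energy, a common first-order form of lateral inhibition; the coefficient alpha and the three-channel neighbourhood are illustrative assumptions.

        import numpy as np

        def lateral_inhibition(spectrum, alpha=0.4):
            """Sharpen a spectral frame by first-order lateral inhibition.

            Each channel is reduced by a fraction `alpha` of the mean of its
            two immediate neighbours; edge channels are padded by replication.
            """
            padded = np.pad(spectrum, 1, mode="edge")               # replicate edges
            neighbours = 0.5 * (padded[:-2] + padded[2:])           # mean of left/right channels
            return np.maximum(spectrum - alpha * neighbours, 0.0)   # half-wave rectification

        # Example: a smooth spectral peak becomes narrower after inhibition.
        frame = np.array([0.1, 0.3, 0.9, 1.0, 0.9, 0.3, 0.1])
        print(lateral_inhibition(frame))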

  13. High Available COTS Based Computer for Space

    Science.gov (United States)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfil the availability and reliability demands as well as the increase in required data processing power. In contrast to the increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfilment of customer requirements, and reuse of available computer systems has not always been possible because of the obsolescence of EEE parts, insufficient IO capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  14. Milestones Toward Majorana-Based Quantum Computing

    Science.gov (United States)

    Aasen, David; Hell, Michael; Mishmash, Ryan V.; Higginbotham, Andrew; Danon, Jeroen; Leijnse, Martin; Jespersen, Thomas S.; Folk, Joshua A.; Marcus, Charles M.; Flensberg, Karsten; Alicea, Jason

    2016-07-01

    We introduce a scheme for preparation, manipulation, and read out of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system's excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and read out schemes as well.

  15. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  16. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended stimuli

  17. Individual Differences and Learning Performance in Computer-based Training

    Science.gov (United States)

    2011-02-01

  18. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  19. Determination of Absolute Zero Using a Computer-Based Laboratory

    Science.gov (United States)

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, the absolute zero apparatus can help demonstrators or students observe the relationship between temperature and pressure and use…

  20. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  1. Severe Neglect and Computer-based Home Training

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2014-01-01

    For this reason, this case study tests the possibility of using computer-based training in the rehabilitation efforts for a patient with severe neglect who had no previous skills in computer usage. The article describes the results of the training both in terms of neuro-psychological tests and the reading ability...

  2. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended stimuli

  3. A Swarm Intelligence Based Model for Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ahmed S. Salama

    2015-01-01

    Full Text Available Mobile Computing (MC) provides multiple services and many advantages for millions of users across the world over the internet. Millions of business customers have leveraged cloud computing services through mobile devices to get what is called Mobile Cloud Computing (MCC). MCC aims at using cloud computing techniques for the storage and processing of data on mobile devices, thereby reducing their limitations. This paper proposes an architecture for a Swarm Intelligence Based Mobile Cloud Computing Model (SIBMCCM). The model uses a proposed Parallel Particle Swarm Optimization (PPSO) algorithm to enhance the access time for mobile cloud computing services which support different E-Commerce models, and to better secure communication through the mobile cloud and the mobile commerce transactions.
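
    The record does not disclose the internals of the PPSO variant, but the particle swarm core it parallelizes is standard. Below is a minimal single-threaded sketch of that core, minimizing a placeholder objective standing in for mean service access time; the inertia weight and acceleration constants are conventional textbook values, not parameters from the paper.

        import random

        def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            """Minimize `cost` over [-10, 10]^dim with a basic particle swarm."""
            pos = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]          # each particle's best-known position
            gbest = min(pbest, key=cost)         # swarm-wide best-known position
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] += vel[i][d]
                    if cost(pos[i]) < cost(pbest[i]):
                        pbest[i] = pos[i][:]
                gbest = min(pbest + [gbest], key=cost)
            return gbest

        # Placeholder objective: a quadratic bowl standing in for mean access time.
        print(pso(lambda x: sum(v * v for v in x), dim=3))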

  4. Sustainable manufacturing for obsolete computers based on 3R engineering

    Institute of Scientific and Technical Information of China (English)

    SHI Pei-jing; XU Yi; WANG Hong-mei; XU Bin-shi

    2005-01-01

    The volume tendencies of in-use and end-of-life computers in China were analyzed; the emerging dangers posed by incorrectly treated obsolete computers were summarized; integrated disposal technologies based on 3R (recycle, remanufacture and reuse) engineering, aimed at monitors, electronic devices, metals, plastics and whole computers, were put forward; and the economic and social benefits were also analyzed. The results show that the integrated disposal process for obsolete computers is an optimal approach to saving the resources embodied in electromechanical products. Remanufacturing and disposing of 100 thousand obsolete computers per year can create profits of about RMB 10 million yuan and provide employment for 300 persons. It can be deduced that there are great potential opportunities for the obsolete computer disposal industry, encompassing recycle, remanufacture and reuse engineering.

  5. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    whether the PAT method can be executed with similar effect using a computer with a touch screen. 62 healthy subjects were subjected to two experimental conditions: 1) pointing at targets using the original box, 2) pointing at targets on a computer-attached touch screen. In both conditions … on the touch screen (indirect feedback), 2) the feedback was provided by seeing one's own pointing finger, with no graphical feedback on the computer screen (direct feedback). The results show that it is possible to obtain similar aftereffects from PAT by using a computer method, but only when providing … direct feedback (the physical finger) on pointing precision. Attempts to provide feedback indirectly via icons on the computer screen fail to create the aftereffects observed in the original PAT. The findings have direct implications for future implementations of computer-based methods in treatment…

  6. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.

  7. A Computer-based Course in Classical Mechanics.

    Science.gov (United States)

    Kane, D.; Sherwood, B.

    1980-01-01

    Describes and illustrates the tutorial and homework exercise lessons, student routing, course organization, administration, and evaluation of a PLATO computer-based course in classical mechanics. An appendix lists 41 lessons developed for the course. (CMV)

  8. [Computational chemistry in structure-based drug design].

    Science.gov (United States)

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly, and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small molecular ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have come to play more important roles in structure-based drug design. Here we review the newly developed computational chemistry methods in structure-based drug design as well as their elegant applications, including binding-site druggability assessment, large-scale virtual screening of chemical databases, and lead compound optimization. Importantly, we address the current bottlenecks and propose practical solutions.

  9. Computer-based learning for the enhancement of breastfeeding ...

    African Journals Online (AJOL)

    Computer-based learning for the enhancement of breastfeeding training. ... Methods and materials: The Indian module was adapted to suit the South African ... Results: All reviewers rated their information technology (IT) skills as sufficient and ...

  10. Problems of Indexing Classes of News Based on the Computed ...

    African Journals Online (AJOL)

    Problems of Indexing Classes of News Based on the Computed Importance of Words. ... Interestingly, most readers and patrons of newspapers adopt the rule of thumb in choosing a suitable newspaper to read/buy. ...

  11. Milestones Toward Majorana-Based Quantum Computing

    Directory of Open Access Journals (Sweden)

    David Aasen

    2016-08-01

    Full Text Available We introduce a scheme for preparation, manipulation, and read out of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system’s excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and read out schemes as well.

  12. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the super-large scale, discreteness and non- or semi-structured nature of big data have gone far beyond what traditional data management can handle. With the arrival of the cloud computing era, cloud computing provides a new technical means for massive data mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
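
    The record gives no algorithmic details, but the MapReduce pattern underlying parallel association rule mining is well known: mappers emit the candidate itemsets found in each transaction and reducers sum their counts per candidate. The sketch below simulates one such counting pass in-process for frequent 1-itemsets; the toy transactions and support threshold are invented for illustration.

        from collections import defaultdict
        from itertools import chain

        # Toy transaction database; in a real deployment these would be HDFS splits.
        transactions = [
            {"bread", "milk"},
            {"bread", "diapers", "beer"},
            {"milk", "diapers", "beer"},
            {"bread", "milk", "diapers"},
        ]

        def map_phase(transaction):
            """Emit (item, 1) pairs, as a MapReduce mapper would."""
            return [(item, 1) for item in transaction]

        def reduce_phase(pairs):
            """Sum the counts per key, as a MapReduce reducer would."""
            counts = defaultdict(int)
            for item, n in pairs:
                counts[item] += n
            return counts

        min_support = 3
        counts = reduce_phase(chain.from_iterable(map_phase(t) for t in transactions))
        frequent = {item for item, n in counts.items() if n >= min_support}
        print(frequent)   # frequent 1-itemsets; higher-order passes repeat the pattern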

  13. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer based Natural Language Processing (NLP) is the key to enabling humans and their computer based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  15. A novel bit-quad-based Euler number computing algorithm.

    Science.gov (United States)

    Yao, Bin; He, Lifeng; Kang, Shiying; Chao, Yuyan; Zhao, Xiao

    2015-01-01

    The Euler number of a binary image is an important topological property in computer vision and pattern recognition. This paper proposes a novel bit-quad-based Euler number computing algorithm. Based on graph theory and analysis of bit-quad patterns, our algorithm only needs to count two bit-quad patterns. Moreover, by use of the information obtained while processing the previous bit-quad, the average number of pixels to be checked when processing a bit-quad is only 1.75. Experimental results demonstrated that our method significantly outperforms conventional Euler number computing algorithms.
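
    The paper's contribution is the optimization (counting only two quad patterns, checking 1.75 pixels per quad on average); the classical bit-quad approach it improves upon is easy to state. The sketch below implements Gray's original formula for 8-connectivity by scanning every 2x2 window; it is background to the record, not the authors' algorithm.

        import numpy as np

        def euler_number_8(img):
            """Euler number of a binary image (8-connectivity) via bit-quads.

            Scans every 2x2 window of the zero-padded image and counts:
              q1 - quads with exactly one foreground pixel,
              q3 - quads with exactly three foreground pixels,
              qd - quads with two diagonally opposed foreground pixels.
            Gray's formula then gives E = (q1 - q3 - 2*qd) / 4.
            """
            p = np.pad(np.asarray(img, dtype=int), 1)
            q1 = q3 = qd = 0
            for i in range(p.shape[0] - 1):
                for j in range(p.shape[1] - 1):
                    quad = p[i:i + 2, j:j + 2]
                    s = quad.sum()
                    if s == 1:
                        q1 += 1
                    elif s == 3:
                        q3 += 1
                    elif s == 2 and quad[0, 0] == quad[1, 1]:
                        qd += 1   # the two foreground pixels lie on a diagonal
            return (q1 - q3 - 2 * qd) // 4

        # One solid object containing one hole: Euler number 1 - 1 = 0.
        ring = np.array([[1, 1, 1],
                         [1, 0, 1],
                         [1, 1, 1]])
        print(euler_number_8(ring))   # -> 0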

  16. Domain Decomposition Based High Performance Parallel Computing

    CERN Document Server

    Raju, Mandhapati P

    2009-01-01

    The study deals with the parallelization of finite element based Navier-Stokes codes using domain decomposition and state-of-the-art sparse direct solvers. There has been significant improvement in the performance of sparse direct solvers. Parallel sparse direct solvers, however, are not found to exhibit good scalability. Hence, the parallelization of sparse direct solvers is done using domain decomposition techniques. A highly efficient sparse direct solver, PARDISO, is used in this study. The scalability of both Newton and modified Newton algorithms is tested.

  17. Safeguards instrumentation: a computer-based catalog

    Energy Technology Data Exchange (ETDEWEB)

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available.

  18. Towards a fullerene-based quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Benjamin, Simon C [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Ardavan, Arzhang [Clarendon Laboratory, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Briggs, G Andrew D [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Britz, David A [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Gunlycke, Daniel [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jefferson, John [QinetiQ, St Andrews Road, Malvern, WR14 3PS (United Kingdom); Jones, Mark A G [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Leigh, David F [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Lovett, Brendon W [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Khlobystov, Andrei N [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Lyon, S A [Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 (United States); Morton, John J L [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Porfyrakis, Kyriakos [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Sambrook, Mark R [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Tyryshkin, Alexei M [Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 (United States)

    2006-05-31

    Molecular structures appear to be natural candidates for a quantum technology: individual atoms can support quantum superpositions for long periods, and such atoms can in principle be embedded in a permanent molecular scaffolding to form an array. This would be true nanotechnology, with dimensions of order of a nanometre. However, the challenges of realizing such a vision are immense. One must identify a suitable elementary unit and demonstrate its merits for qubit storage and manipulation, including input/output. These units must then be formed into large arrays corresponding to a functional quantum architecture, including a mechanism for gate operations. Here we report our efforts, both experimental and theoretical, to create such a technology based on endohedral fullerenes or 'buckyballs'. We describe our successes with respect to these criteria, along with the obstacles we are currently facing and the questions that remain to be addressed.

  19. Risk-Based Computational Prototyping (Briefing Charts)

    Science.gov (United States)

    2010-10-01

    [Briefing-chart extraction residue; the recoverable content concerns Reliability-Based Design Optimization (RBDO) with limit-cycle oscillation (LCO) constraints, with response variables x, design variables d, and constraints g(x(d, E, M∞)) > 0 together with side constraints on d.]

  20. Towards a fullerene-based quantum computer

    CERN Document Server

    Benjamin, S C; Briggs, G A D; Britz, D A; Gunlycke, D; Jefferson, J; Jones, M A G; Khlobystov, A N; Leigh, D F; Lovett, B W; Lyon, S A; Morton, J J L; Porfyrakis, K; Sambrook, M R; Tyryshkin, A M; Ardavan, Arzhang; Benjamin, Simon C; Britz, David A; Gunlycke, Daniel; Jefferson, John; Jones, Mark A G; Khlobystov, Andrei N; Leigh, David F; Lovett, Brendon W; Morton, John J L; Porfyrakis, Kyriakos; Sambrook, Mark R; Tyryshkin, Alexei M

    2005-01-01

    Molecular structures appear to be natural candidates for a quantum technology: individual atoms can support quantum superpositions for long periods, and such atoms can in principle be embedded in a permanent molecular scaffolding to form an array. This would be true nanotechnology, with dimensions of order of a nanometre. However, the challenges of realising such a vision are immense. One must identify a suitable elementary unit and demonstrate its merits for qubit storage and manipulation, including input/output. These units must then be formed into large arrays corresponding to a functional quantum architecture, including a mechanism for gate operations. Here we report our efforts, both experimental and theoretical, to create such a technology based on endohedral fullerenes or 'buckyballs'. We describe our successes with respect to these criteria, along with the obstacles we are currently facing and the questions that remain to be addressed.

  1. A community-based study of asthenopia in computer operators

    Directory of Open Access Journals (Sweden)

    Bhanderi Dinesh

    2008-01-01

    Full Text Available Context: There is a growing body of evidence that use of computers can adversely affect visual health. Considering the rising number of computer users in India, computer-related asthenopia might take an epidemic form. In view of that, this study was undertaken to find out the magnitude of asthenopia in computer operators and its relationship with various personal and workplace factors. Aims: To study the prevalence of asthenopia among computer operators and its association with various epidemiological factors. Settings and Design: Community-based cross-sectional study of 419 subjects who work on computers for varying periods of time. Materials and Methods: Four hundred forty computer operators working in different institutes were selected randomly. Twenty-one did not participate in the study, making the nonresponse rate 4.8%. The rest of the subjects (n = 419) were asked to fill in a pre-tested questionnaire, after obtaining their verbal consent. Other relevant information was obtained by personal interview and inspection of the workstation. Statistical Analysis Used: Simple proportions and the Chi-square test. Results: Among the 419 subjects studied, 194 (46.3%) suffered from asthenopia during or after work on the computer. A marginally higher proportion of asthenopia was noted in females compared to males. Occurrence of asthenopia was significantly associated with age of starting computer use, presence of refractive error, viewing distance, level of the top of the computer screen with respect to the eyes, use of an antiglare screen, and adjustment of the contrast and brightness of the monitor screen. Conclusions: The prevalence of asthenopia was noted to be quite high among computer operators, particularly in those who started its use at an early age. Individual as well as work-related factors were found to be predictive of asthenopia.

  2. Redesigning Computer-based Learning Environments: Evaluation as Communication

    CERN Document Server

    Brust, Matthias R; Ricarte, Ivan M L

    2007-01-01

    In the field of evaluation research, computer scientists live constantly upon dilemmas and conflicting theories. As evaluation is differently perceived and modeled among educational areas, it is not difficult to become trapped in dilemmas, which reflects an epistemological weakness. Additionally, designing and developing a computer-based learning scenario is not an easy task. Advancing further, with end-users probing the system in realistic settings, is even harder. Computer science research in evaluation faces an immense challenge, having to cope with contributions from several conflicting and controversial research fields. We believe that deep changes must be made in our field if we are to advance beyond the CBT (computer-based training) learning model and to build an adequate epistemology for this challenge. The first task is to relocate our field by building upon recent results from philosophy, psychology, social sciences, and engineering. In this article we locate evaluation in respect to communication s...

  3. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

    Science.gov (United States)

    Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)

    2003-01-01

    Advanced high-temperature Ceramic Matrix Composites (CMC) hold enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost and time prohibitive. Computational simulation then becomes very useful for limiting the experimental effort and reducing the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples along with their use in the design/analysis of certain structural components, is the subject matter of this presentation.

  4. Grid Computing based on Game Optimization Theory for Networks Scheduling

    Directory of Open Access Journals (Sweden)

    Peng-fei Zhang

    2014-05-01

    Full Text Available The resource sharing mechanism is introduced into the grid computing algorithm so as to solve complex computational tasks in heterogeneous network-computing problems. However, in the grid environment, the available network resources must be reasonably scheduled and coordinated in order to obtain a good workflow and appropriate network performance and response time. In order to improve the performance of resource allocation and task scheduling in grid computing methods, a game model based on a non-cooperative game is proposed. Setting the time and cost of users' resource allocation can increase network performance, and network resource incentives use an optimized scheduling algorithm which minimizes the time and cost of resource scheduling. Simulation experiment results show the feasibility and suitability of the model. In addition, the experimental results show that the model-based genetic algorithm is the best resource scheduling algorithm.

  5. An Introduction to the Computer-based TOEFL

    Institute of Scientific and Technical Information of China (English)

    CHEN Jing

    2001-01-01

    TOEFL, which aims to measure the English proficiency of test-takers whose first language is not English, is a familiar test to students around the world. The number of people who take the TOEFL is growing rapidly as the influence of the TOEFL expands. In 2002-2003, the computer-based TOEFL test will be introduced into China to replace the old paper-based test. So it is quite necessary for people who are preparing to take the computer-based TOEFL test to learn something about it.

  6. Identifying barriers for implementation of computer based nursing documentation.

    Science.gov (United States)

    Vollmer, Anna-Maria; Prokosch, Hans-Ulrich; Bürkle, Thomas

    2014-01-01

    This study was undertaken in the planning phase of the introduction of a comprehensive computer-based nursing documentation system at Erlangen University Hospital. There, we expected a wide range of difficult organizational changes, because the nurses currently neither use computer-based nursing documentation nor strongly follow the nursing process model within paper-based documentation. Thus we were eager to recognize potential pitfalls early and to identify potential barriers to digital nursing documentation. In a questionnaire study we surveyed all German university hospitals about their experience with the implementation of computer-based nursing documentation. We received answers from 11 of the 23 hospitals. Furthermore, we performed a questionnaire study of the expectations and fears of the nurses on four pilot wards of our hospital. Most respondents stated a positive attitude towards nursing process documentation, but many respondents noted technical (e.g., poor software performance) and organizational (e.g., lack of time) barriers.

  7. Randomized benchmarking in measurement-based quantum computing

    Science.gov (United States)

    Alexander, Rafael N.; Turner, Peter S.; Bartlett, Stephen D.

    2016-09-01

    Randomized benchmarking is routinely used as an efficient method for characterizing the performance of sets of elementary logic gates in small quantum devices. In the measurement-based model of quantum computation, logic gates are implemented via single-site measurements on a fixed universal resource state. Here we adapt the randomized benchmarking protocol for a single qubit to a linear cluster state computation, which provides partial, yet efficient characterization of the noise associated with the target gate set. Applying randomized benchmarking to measurement-based quantum computation exhibits an interesting interplay between the inherent randomness associated with logic gates in the measurement-based model and the random gate sequences used in benchmarking. We consider two different approaches: the first makes use of the standard single-qubit Clifford group, while the second uses recently introduced (non-Clifford) measurement-based 2-designs, which harness inherent randomness to implement gate sequences.

  8. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jingyu Wang

    2011-02-01

    Full Text Available Service integration and on-demand supply arising from cloud computing can significantly improve the utilization of computing resources, reduce the power consumption of each service, and effectively avoid errors in computing resources. However, cloud computing still faces the problem of intrusion tolerance for the cloud computing platform and for the sensitive data of the new enterprise data center. In order to address this problem, this paper constructs a virtualization intrusion tolerance system based on cloud computing by researching existing virtualization technology, and then presents a method of intrusion tolerance, based on a virtual adversary structure and utilizing secret sharing, to protect sensitive data in the cloud data center. The system adopts a hybrid fault model, active and passive replicas, state update and transfer, proactive recovery and diversity, and initially achieves tolerance of F faulty replicas among N=2F+1 replicas while ensuring that only F+1 active replicas execute during the intrusion-free stage. The remaining replicas are all put into passive mode, which significantly reduces the resource consumption of the cloud platform. Finally, we prove the reconstruction and confidentiality properties of the sensitive data by utilizing secret sharing.
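
    The record does not say which secret sharing scheme is used; Shamir's classic threshold scheme fits the stated parameters, since splitting a secret with threshold F+1 among N=2F+1 replicas lets any F+1 correct replicas reconstruct it while any F compromised replicas learn nothing. The following is a minimal sketch over a small prime field; the prime and the example values are illustrative.

        import random

        P = 2**31 - 1   # a Mersenne prime; a production system would use a larger field

        def share(secret, f):
            """Split `secret` into n = 2f + 1 shares; any f + 1 of them reconstruct it."""
            coeffs = [secret] + [random.randrange(P) for _ in range(f)]   # degree-f polynomial
            def poly(x):
                return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
            return [(x, poly(x)) for x in range(1, 2 * f + 2)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 recovers the secret from f + 1 shares."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num = den = 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                secret = (secret + yi * num * pow(den, P - 2, P)) % P   # den^-1 via Fermat
            return secret

        f = 2                                  # tolerate F = 2 faulty replicas
        shares = share(123456789, f)           # N = 2F + 1 = 5 shares
        print(reconstruct(random.sample(shares, f + 1)))   # any 3 shares yield 123456789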

  9. Radiation Tolerant, FPGA-Based SmallSat Computer System

    Science.gov (United States)

    LaMeres, Brock J.; Crum, Gary A.; Martinez, Andres; Petro, Andrew

    2015-01-01

    The Radiation Tolerant, FPGA-based SmallSat Computer System (RadSat) computing platform exploits a commercial off-the-shelf (COTS) Field Programmable Gate Array (FPGA) with real-time partial reconfiguration to provide increased performance, power efficiency and radiation tolerance at a fraction of the cost of existing radiation hardened computing solutions. This technology is ideal for small spacecraft that require state-of-the-art on-board processing in harsh radiation environments but where using radiation hardened processors is cost prohibitive.

  10. Agent-based computational economics using NetLogo

    CERN Document Server

    Damaceanu, Romulus-Catalin

    2013-01-01

    Agent-based Computational Economics using NetLogo explores how researchers can create, use and implement multi-agent computational models in Economics using the NetLogo software platform. Problems of economic science can be solved using multi-agent modelling (MAM). This technique uses a computer model to simulate the actions and interactions of autonomous entities in a network, in order to analyze the effects on the entire economic system. MAM combines elements of game theory, complex systems, emergence and evolutionary programming. The Monte Carlo method is also used in this e-book to introduce

  11. A Logistics Distribution Plan Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhou Feng

    2013-12-01

    Full Text Available Aiming at the problems of low informatization level, low degree of specialization, high consumption and low efficiency in the logistics and distribution industry, this paper analyzes the characteristics of cloud computing and the actual needs of enterprise logistics. On this basis, it studies in depth a cloud computing architecture for logistics and distribution needs, and then proposes a cloud-based modern logistics solution, which provides a new operating mode for the development of modern logistics

  12. Protecting Terminals by Security Domain Mechanism Based on Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    ZHOU Zheng; ZHANG Jun; LI Jian; LIU Yi

    2006-01-01

    Networks are composed of servers and rather large numbers of terminals, and most menaces of attack and viruses come from the terminals. Eliminating malicious code and access, or breaking the conditions under which attacks or viruses can be invoked in those terminals, would be the most effective way to protect information systems. The concept of trusted computing was first introduced into terminal virus immunity. Then a model of a security domain mechanism based on trusted computing to protect computers was proposed by abstracting general information systems. The principles of attack resistance and venture limitation of the model were demonstrated by means of mathematical analysis, and the realization of the model was proposed.

  13. Nanophotonic quantum computer based on atomic quantum transistor

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, S N [Institute of Advanced Research, Academy of Sciences of the Republic of Tatarstan, Kazan (Russian Federation); Moiseev, S A [Kazan E. K. Zavoisky Physical-Technical Institute, Kazan Scientific Center, Russian Academy of Sciences, Kazan (Russian Federation)

    2015-10-31

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  14. Assessment of Clinical Competence: Written and Computer-Based Simulations.

    Science.gov (United States)

    Swanson, David B.; And Others

    1987-01-01

    Literature concerning the validity and reliability of both written and computer-based simulations in assessing clinical competence in the health professions is reviewed, and suggestions are given for the improvement of the psychometric qualities of simulation-based tests. (MSE)

  15. The Mediated Museum: Computer-Based Technology and Museum Infrastructure.

    Science.gov (United States)

    Sterman, Nanette T.; Allen, Brockenbrough S.

    1991-01-01

    Describes the use of computer-based tools and techniques in museums. The integration of realia with media-based advice and interpretation is described, electronic replicas of ancient Greek vases in the J. Paul Getty Museum are explained, examples of mediated exhibits are presented, and the use of hypermedia is discussed. (five references) (LRW)

  16. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into a pervasive computing environment, a novel pervasive computing location-aware model based on ontology is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning, and dynamically adjusting the usage policies of services through a unified semantic location manner. Finally, the working process of our proposed location-aware model is explained through an application scenario.

  17. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    Full Text Available There are performance bottlenecks and scalability problems when a traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data-mining system, this platform is highly scalable, has massive data processing capacities, is service-oriented, and has low hardware cost. This platform can support the design and applications of a wide range of distributed data-mining systems.

  18. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a wide variety of approaches, sensors and techniques for indoor and GPS-denied environments has been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems, focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  19. Moment Matrices, Border Bases and Real Radical Computation

    OpenAIRE

    Lasserre, Jean-Bernard; Laurent, Monique; Mourrain, Bernard; Rostalski, Philipp; Trébuchet, Philippe

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorpora...

  20. Establishing performance requirements of computer based systems subject to uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D.

    1997-02-01

    An organized systems design approach is dictated by the increasing complexity of computer based systems. Computer based systems are unique in many respects but share many of the same problems that have plagued design engineers for decades. The design of complex systems is difficult at best, but as a design becomes intensively dependent on the computer processing of external and internal information, the design process quickly borders on chaos. This situation is exacerbated by the requirement that these systems operate with a minimal quantity of information, generally corrupted by noise, regarding the current state of the system. Establishing performance requirements for such systems is particularly difficult. This paper briefly sketches a general systems design approach with emphasis on the design of computer based decision processing systems subject to parameter and environmental variation. The approach will be demonstrated with application to an on-board diagnostic (OBD) system for automotive emissions systems now mandated by the state of California and the Federal Clean Air Act. The emphasis is on an approach for establishing probabilistically based performance requirements for computer based systems.

  1. Designing for learner engagement with computer-based testing

    Directory of Open Access Journals (Sweden)

    Richard Walker

    2016-12-01

    Full Text Available The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students’ examination experience were analysed, leading to the identification of engagement issues in the delivery of high-stakes computer-based assessments. The exam combined short-answer open-response questions with multiple-choice-style items to assess knowledge and understanding of research methods. The findings suggest that engagement with computer-based testing depends, to a lesser extent, on students’ general levels of digital literacy and, to a greater extent, on their information technology (IT) proficiency for assessment and their ability to adapt their test-taking strategies, including organisational and cognitive strategies, to the online assessment environment. The socialisation and preparation of students for computer-based testing therefore emerge as key responsibilities for instructors to address, with students requesting increased opportunities for practice and training to develop the IT skills and test-taking strategies necessary to succeed in computer-based examinations. These findings and their implications in terms of instructional responsibilities form the basis of a proposal for a framework for Learner Engagement with e-Assessment Practices.

  2. Moment Matrices, Border Bases and Real Radical Computation

    CERN Document Server

    Lasserre, Jean-Bernard; Mourrain, Bernard; Rostalki, Philipp; Trébuchet, Philippe

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite programming. While the border basis algorithms of [17] are efficient and numerically stable for computing complex roots, algorithms based on moment matrices [12] allow the incorporation of additional polynomials, e.g., to restrict the computation to real roots or to eliminate multiple solutions. The proposed algorithm can be used to compute a border basis of the input ideal and, as opposed to other approaches, it can also compute the quotient structure of the (real) radical ideal directly, i.e., without prior algebraic techniques such as Gr\"obner bases. It thus combines the strength of existing algorithms and provides a unified treatment for the computation of border bases for the ideal, the radical ideal and the real r...
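
    For readers unfamiliar with the machinery, the standard definition of a truncated moment matrix (textbook background, not this paper's specific notation) is: given a sequence y = (y_alpha) indexed by exponent vectors, the order-t moment matrix is

        M_t(y)_{\alpha,\beta} \;=\; y_{\alpha+\beta}, \qquad |\alpha| \le t, \; |\beta| \le t .

    Semidefinite programming then searches for such sequences with M_t(y) \succeq 0, and the kernel of a suitable maximal-rank moment matrix supplies polynomials lying in the real radical ideal.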

  3. The Validation of Computer-based Models in Engineering: Some Lessons from Computing Science

    Directory of Open Access Journals (Sweden)

    D. J. Murray-Smith

    2001-01-01

    Full Text Available Questions of the quality of computer-based models and the formal processes of model testing, involving internal verification and external validation, are usually given only passing attention in engineering reports and in technical publications. However, such models frequently provide a basis for analysis methods, design calculations or real-time decision-making in complex engineering systems. This paper reviews techniques used for external validation of computer-based models and contrasts the somewhat casual approach which is usually adopted in this field with the more formal approaches to software testing and documentation recommended for large software projects. Both activities require intimate knowledge of the intended application, a systematic approach and considerable expertise and ingenuity in the design of tests. It is concluded that engineering degree courses dealing with modelling techniques and computer simulation should put more emphasis on model limitations, testing and validation.

  4. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    Full Text Available This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
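
    The elastic allocation idea is compact enough to sketch. Below is a minimal Python illustration under the assumption of a standard exponential moving average; the smoothing factor, the request trace and the proportional provisioning rule are invented for the example and stand in for the paper's Poisson-based rule.

        import math

        def predict_load(observations, alpha=0.3):
            """Exponential moving average of past request-rate observations."""
            ema = observations[0]
            for x in observations[1:]:
                ema = alpha * x + (1 - alpha) * ema   # weight recent observations more
            return ema

        def provision(predicted_rate, capacity_per_node=50.0):
            """Map a predicted request rate to a node count (proportional rule)."""
            return max(1, math.ceil(predicted_rate / capacity_per_node))

        requests_per_min = [80, 95, 120, 160, 210]   # invented trace of past observations
        rate = predict_load(requests_per_min)
        print(rate, provision(rate))                 # about 143 requests/min -> 3 nodes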

  5. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Science.gov (United States)

    Tseng, Kevin C.; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service. PMID:24723842

  6. Computer-aided design–computer-aided engineering associative feature-based heterogeneous object modeling

    Directory of Open Access Journals (Sweden)

    Jikai Liu

    2015-12-01

    Full Text Available Conventionally, heterogeneous object modeling methods have paid limited attention to the concurrent modeling of geometry design and material composition distribution. A procedural method was normally employed to generate the geometry first and then determine the heterogeneous material distribution, which ignores their mutual influence. Additionally, only limited capability has been established for modeling irregular material composition distributions with strong local discontinuities. This article overcomes these limitations by developing a computer-aided design–computer-aided engineering associative feature-based heterogeneous object modeling method. Level set functions are applied to model the geometry within the computer-aided design module, which enables complex geometry modeling. A finite element mesh is applied to store the local material compositions within the computer-aided engineering module, which allows for arbitrary local discontinuities. Then, the associative feature concept builds the correspondence relationship between these modules. Additionally, a level set geometry and material optimization method is developed to concurrently generate the geometry and material information which fills the contents of the computer-aided design–computer-aided engineering associative feature model. Micro-geometry is investigated as well, instead of only the local material composition. A few cases are studied to prove the effectiveness of this new heterogeneous object modeling method.
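
    As a minimal illustration of the level set idea (assuming the usual signed-distance form; the grid and the disc are invented, and the actual method couples this representation to material optimization):

        import numpy as np

        # Implicit geometry: a disc of radius 0.5 is the sub-zero level set of phi.
        x, y = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
        phi = np.sqrt(x**2 + y**2) - 0.5   # signed distance to the disc boundary
        inside = phi < 0                   # the material region of the modeled object
        print(inside.astype(int))          # 1 marks grid cells inside the geometry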

  7. Computer Literacy and the Construct Validity of a High-Stakes Computer-Based Writing Assessment

    Science.gov (United States)

    Jin, Yan; Yan, Ming

    2017-01-01

    One major threat to validity in high-stakes testing is construct-irrelevant variance. In this study we explored whether the transition from a paper-and-pencil to a computer-based test mode in a high-stakes test in China, the College English Test, has brought about variance irrelevant to the construct being assessed in this test. Analyses of the…

  8. A novel bit-quad-based Euler number computing algorithm

    OpenAIRE

    Yao, Bin; He, Lifeng; Kang, Shiying; Chao, Yuyan; Xiao ZHAO

    2015-01-01

    The Euler number of a binary image is an important topological property in computer vision and pattern recognition. This paper proposes a novel bit-quad-based Euler number computing algorithm. Based on graph theory and analysis on bit-quad patterns, our algorithm only needs to count two bit-quad patterns. Moreover, by use of the information obtained during processing the previous bit-quad, the average number of pixels to be checked for processing a bit-quad is only 1.75. Experimental results ...

  9. A Separated Domain-Based Kernel Model for Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    FANG Yanxiang; SHEN Changxiang; XU Jingdong; WU Gongyi

    2006-01-01

    This paper first gives an investigation of trusted computing on mainstream operating systems (OSs). Based on the observations, it is pointed out that trusted computing cannot be achieved due to the lack of a separation mechanism for the components in mainstream OSs. In order to provide such a separation mechanism, this paper proposes a separated domain-based kernel model (SDBKM), and this model is verified by non-interference theory. By monitoring and simplifying the trust dependence between domains, this model can solve problems in trust measurement such as denial of service (DoS) attacks and host security, and can reduce the overhead of measurement.

  10. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system's spectrum allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens system security by reducing the communication burden on the communications network.

  11. Remote Sensing Image Deblurring Based on Grid Computation

    Institute of Scientific and Technical Information of China (English)

    LI Sheng-yang; ZHU Chong-guang; GE Ping-ju

    2006-01-01

    In general, there is a demand for real-time processing of large quantities of remote sensing images. However, the task is not only data-intensive but also computation-intensive. Distributed processing is a hot topic in remote sensing, and image deblurring is one of the most important needs. In order to satisfy the demand for quick processing and deblurring of large volumes of satellite images, we developed a distributed, grid computation-based platform as well as a corresponding middleware for grid computation. Both a constrained power spectrum equalization algorithm and effective block processing measures, which avoid boundary effects, were applied during the processing. The result is satisfactory, since computation efficiency and visual effect were greatly improved. It can be concluded that the technology of spatial information grids is effective for processing large quantities of remote sensing images.
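
    The abstract names two ingredients: a constrained power-spectrum-equalization filter and block processing that avoids boundary effects. Below is a minimal single-node NumPy sketch of both; the noise-to-signal constant k and the helper make_otf(shape), which is assumed to return the blur OTF at a given grid size, are illustrative assumptions, and the real platform distributes such tiles across grid nodes.

        import numpy as np

        def pse_filter(block, otf, k=0.01):
            # Power-spectrum equalization: |M| = 1 / sqrt(|H|^2 + k), where k
            # approximates the noise-to-signal power ratio and H is the blur OTF.
            B = np.fft.fft2(block)
            M = 1.0 / np.sqrt(np.abs(otf) ** 2 + k)
            return np.real(np.fft.ifft2(B * M))

        def deblur_tiled(image, make_otf, tile=256, pad=32, k=0.01):
            # Filter a large image tile by tile; each tile is read with a `pad`
            # margin that is cut away afterwards, so tile seams show no boundary effect.
            out = np.empty_like(image, dtype=float)
            H, W = image.shape
            for i in range(0, H, tile):
                for j in range(0, W, tile):
                    i0, j0 = max(i - pad, 0), max(j - pad, 0)
                    i1, j1 = min(i + tile + pad, H), min(j + tile + pad, W)
                    restored = pse_filter(image[i0:i1, j0:j1],
                                          make_otf((i1 - i0, j1 - j0)), k)
                    hi, wj = min(tile, H - i), min(tile, W - j)
                    out[i:i + hi, j:j + wj] = restored[i - i0:i - i0 + hi,
                                                       j - j0:j - j0 + wj]
            return out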

  12. GPU-based high-performance computing for radiation therapy.

    Science.gov (United States)

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B

    2014-02-21

    Recent developments in radiation therapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we first give a brief introduction to the GPU hardware structure and programming model. We then review the current applications of GPUs in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPUs with other platforms is also presented.

  13. Entanglement-based machine learning on a quantum computer.

    Science.gov (United States)

    Cai, X-D; Wu, D; Su, Z-E; Chen, M-C; Wang, X-L; Li, Li; Liu, N-L; Lu, C-Y; Pan, J-W

    2015-03-20

    Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, and is ubiquitous in fields such as computer science, financial analysis, robotics, and bioinformatics. A challenge is that machine learning with the rapidly growing "big data" could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv:1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of two-, four-, and eight-dimensional vectors into different clusters using a small-scale photonic quantum computer; these classifications are then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, the core mathematical routine in machine learning. The method can, in principle, be scaled to larger numbers of qubits, and may provide a new route to accelerate machine learning.

  14. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaborating in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about the engagement and behavior aspects and their contribution to learning.

  15. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  16. Linearized Aeroelastic Computations in the Frequency Domain Based on Computational Fluid Dynamics

    CERN Document Server

    Amsallem, David; Choi, Youngsoo; Farhat, Charbel

    2015-01-01

    An iterative, CFD-based approach for aeroelastic computations in the frequency domain is presented. The method relies on a linearized formulation of the aeroelastic problem and a fixed-point iteration approach and enables the computation of the eigenproperties of each of the wet aeroelastic eigenmodes. Numerical experiments on the aeroelastic analysis and design optimization of two wing configurations illustrate the capability of the method for the fast and accurate aeroelastic analysis of aircraft configurations and its advantage over classical time-domain approaches.

  17. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer-based testing (CBT) has many advantages over the conventional paper-and-pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first-year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated that many students accepted CBT without unpleasantness and considered it a positive factor that improved their motivation to study. CBT also decreased faculty workload in marking tests and processing the results.

  18. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning as a learning system has become very popular. But current e-learning systems cannot instruct students effectively, since they do not consider the emotional state in the context of instruction. The emergence of the theory of "affective computing" can address this problem: it extends the computer's intelligence beyond the purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with consideration of a teaching style based on the student's personality traits. A "man-to-man" learning environment is built to simulate the traditional classroom's pedagogy in the system.

  19. A Computationally Based Approach to Homogenizing Advanced Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We discuss this approach as it is applied both to Ni-based superalloys and to the (computationally) more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatments and their subsequent verification in real castings are presented.

  20. Formalization for Granular Computing Based on Logical Formulas

    Institute of Scientific and Technical Information of China (English)

    Lin Yan; Qing Liu

    2006-01-01

    In order to formalize granular computing, some kinds of formulas are constructed on a universe by a logical method. Every formula expresses a property and can separate a semantic set consisting of all of the objects satisfying the formula. Therefore, a granular space on the universe is produced based on the formulas, and the semantic sets separated by the formulas are taken as a formal definition of granules, called abstract granules. Furthermore, it is proved that any specific granule from an extended mathematical system can be formalized into an abstract granule; the conclusion is obtained that specific granules from approximation spaces and information systems can also be formalized into abstract granules. Based on a granular space and abstract granules, granular computing is defined, which finally realizes the goal of formalizing granular computing.

  1. A Security Kernel Architecture Based Trusted Computing Platform

    Institute of Scientific and Technical Information of China (English)

    CHEN You-lei; SHEN Chang-xiang

    2005-01-01

    A security kernel architecture built on a trusted computing platform is presented, in the light of current thinking about trusted computing. In this architecture, a new security module, the TCB (Trusted Computing Base), is added to the operating system kernel, and two operation interface modes are provided for the sake of self-protection. The security kernel is divided into two parts, and the trusted mechanism is separated from the security functionality. The TCB module implements trusted mechanisms such as measurement and attestation, while the other components of the security kernel provide security functionality based on these mechanisms. This architecture takes full advantage of the functions provided by the trusted platform and clearly defines the security perimeter of the TCB so as to assure self-security from an architectural viewpoint. We also present a functional description of the TCB and discuss its strengths and limitations compared with other related research.

  2. Improved Computational Model of Grid Cells Based on Column Structure

    Institute of Scientific and Technical Information of China (English)

    Yang Zhou; Dewei Wu; Weilong Li; Jia Du

    2016-01-01

    To simulate the firing pattern of biological grid cells, this paper presents an improved computational model of grid cells based on column structure. In this model, the displacement along different directions is processed by a modulus operation, and the obtained remainder is associated with the firing rate of the grid cell. Compared with the original model, the improvements are that the base of the modulus operation is changed and the firing rate in the firing field is encoded by a Gaussian-like function. Simulation validates that the firing pattern generated by the improved computational model is more consistent with biological characteristics than the original model. Besides, although the firing pattern is badly influenced by cumulative positioning error, the computational model can still generate the regular hexagonal firing pattern when the real-time positioning results are modified.
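
    A minimal sketch of the mechanism the abstract describes: displacement is projected onto three preferred directions, wrapped by a modulus operation, and the remainder is fed through a Gaussian-like tuning curve. The spacing and width parameters are assumed values, not the paper's; multiplying the three channel responses yields the hexagonal firing map.

        import numpy as np

        def grid_firing(pos, spacing=1.0, sigma=0.12):
            angles = np.deg2rad([0.0, 60.0, 120.0])     # three preferred directions
            rate = 1.0
            for a in angles:
                d = pos[..., 0] * np.cos(a) + pos[..., 1] * np.sin(a)  # projection
                r = np.mod(d, spacing)                  # modulus -> phase remainder
                dist = np.minimum(r, spacing - r)       # wrapped distance to field centre
                rate = rate * np.exp(-dist ** 2 / (2 * sigma ** 2))  # Gaussian-like tuning
            return rate

        xs, ys = np.meshgrid(np.linspace(0, 4, 200), np.linspace(0, 4, 200))
        rates = grid_firing(np.stack([xs, ys], axis=-1))   # hexagonal lattice of bumps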

  3. PHOTOREALISTIC COMPUTER GRAPHICS FORENSICS BASED ON LEADING DIGIT LAW

    Institute of Scientific and Technical Information of China (English)

    Xu Bo; Wang Junwen; Liu Guangjie; Dai Yuewei

    2011-01-01

    With the advent and growing popularity of image rendering software, photorealistic computer graphics are becoming more and more perceptually indistinguishable from photographic images. If such faked images are abused, they may lead to social, legal or privacy consequences. To this end, it is very necessary and also challenging to find effective methods to differentiate between them. In this paper, a novel method based on the leading digit law, also called Benford's law, is proposed to identify computer graphics. More specifically, statistics of the most significant digits are extracted from the image's Discrete Cosine Transform (DCT) coefficients and the magnitudes of the image's gradient, and then Support Vector Machine (SVM) based classifiers are built. Results of experiments on the image datasets indicate that the proposed method is comparable to prior works. Besides, it possesses low-dimensional features and low computational complexity.
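
    A sketch of the described pipeline, assuming a grayscale NumPy image and scikit-learn for the SVM stage; a global 2-D DCT stands in for whatever block transform the authors actually used, and the training arrays X, y are hypothetical.

        import numpy as np
        from scipy.fftpack import dct
        from sklearn.svm import SVC

        def first_digit_hist(values):
            # 9-bin histogram of the most significant digits of the nonzero magnitudes.
            v = np.abs(np.asarray(values, dtype=float).ravel())
            v = v[v > 1e-8]
            msd = (v / 10.0 ** np.floor(np.log10(v))).astype(int)   # digits 1..9
            return np.bincount(msd, minlength=10)[1:10] / len(msd)

        def benford_features(gray):
            gray = np.asarray(gray, dtype=float)
            coeffs = dct(dct(gray, axis=0, norm='ortho'), axis=1, norm='ortho')
            gy, gx = np.gradient(gray)
            return np.concatenate([first_digit_hist(coeffs),
                                   first_digit_hist(np.hypot(gx, gy))])

        # Hypothetical training set: rows of X are 18-dim features from labelled
        # images, y holds 0 = photograph, 1 = computer graphics.
        # clf = SVC(kernel='rbf').fit(X, y)
        # prediction = clf.predict([benford_features(test_image)])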

  4. Computational challenges of structure-based approaches applied to HIV.

    Science.gov (United States)

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  5. Cluster based parallel database management system for data intensive computing

    Institute of Scientific and Technical Information of China (English)

    Jianzhong LI; Wei ZHANG

    2009-01-01

    This paper describes a computer-cluster based parallel database management system (DBMS), InfiniteDB, developed by the authors. InfiniteDB aims at efficiently supporting data-intensive computing in response to the rapid growth in database size and the need for high-performance analysis of massive databases. It can be efficiently executed in computing systems composed of thousands of computers, such as cloud computing systems. It supports intra-query, inter-query, intra-operation, inter-operation and pipelined parallelism. It provides effective strategies for managing massive databases, including multiple data declustering methods, declustering-aware algorithms for relational and other database operations, and an adaptive query optimization method. It also provides functions for parallel data warehousing and data mining, a coordinator-wrapper mechanism to support the integration of heterogeneous information resources on the Internet, and fault-tolerant and resilient infrastructures. It has been used in many applications and has proved quite effective for data-intensive computing.

  6. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  7. Evaluating Computer-Based Test Accommodations for English Learners

    Science.gov (United States)

    Roohr, Katrina Crotts; Sireci, Stephen G.

    2017-01-01

    Test accommodations for English learners (ELs) are intended to reduce the language barrier and level the playing field, allowing ELs to better demonstrate their true proficiencies. Computer-based accommodations for ELs show promising results for leveling that field while also providing us with additional data to more closely investigate the…

  8. Impact of Computer Based Online Entrepreneurship Distance Education in India

    Science.gov (United States)

    Shree Ram, Bhagwan; Selvaraj, M.

    2012-01-01

    The success of Indian enterprises and professionals in the computer and information technology (CIT) domain during the past twenty years has been spectacular. Entrepreneurs, bureaucrats and technocrats are now advancing views about how India can ride the CIT bandwagon and leapfrog into a knowledge-based economy in the area of entrepreneurship distance…

  9. The Health of the Computer-Based Patient Record.

    Science.gov (United States)

    Frisse, Mark E.

    1992-01-01

    The newly incorporated Computer-Based Patient Record Institute (CPRI) is discussed in the context of the history of medical records, the need for change (mainly because of health care reimbursement and regulation), and the need for involvement by all medical professionals in the development of standards of data collection which reflect public…

  10. A computer-based registration system for geological collections

    NARCIS (Netherlands)

    Germeraad, J.H.; Freudenthal, M.; Boogaard, van den M.; Arps, C.E.S.

    1972-01-01

    The new computer-based registration system, a project of the National Museum of Geology and Mineralogy in the Netherlands, will considerably increase the accessibility of the Museum collection. This greater access is realized by computerisation of the data in great detail, so that an almost unlimited…

  11. Students' Motivation toward Computer-Based Language Learning

    Science.gov (United States)

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  12. Interface Design in Computer-Based Language Testing.

    Science.gov (United States)

    Fulcher, Glenn

    2003-01-01

    Describes a three-phase process model for interface design, drawing on practices developed in the software industry and adapting them for computer-based language tests. Describes good practice in initial design, emphasizes the importance of usability testing, and argues that only through following a principled approach to interface design can the…

  13. pyro: Python-based tutorial for computational methods for hydrodynamics

    Science.gov (United States)

    Zingale, Michael

    2015-07-01

    pyro is a simple python-based tutorial on computational methods for hydrodynamics. It includes 2-d solvers for advection, compressible, incompressible, and low Mach number hydrodynamics, diffusion, and multigrid. It is written with ease of understanding in mind. An extensive set of notes that is part of the Open Astrophysics Bookshelf project provides details of the algorithms.
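
    For flavor, here is the simplest scheme such a tutorial builds up from: first-order upwind advection on a periodic 1-d grid. This sketch is illustrative only and does not use pyro's own API (pyro's solvers are 2-d and higher-order).

        import numpy as np

        def advect_upwind(a0, c=1.0, dx=1.0 / 64, cfl=0.8, t_end=1.0):
            a, t = a0.copy(), 0.0
            while t < t_end:
                dt = min(cfl * dx / abs(c), t_end - t)  # respect the CFL limit
                a -= c * dt / dx * (a - np.roll(a, 1))  # upwind difference for c > 0
                t += dt
            return a

        x = np.linspace(0.0, 1.0, 64, endpoint=False)
        profile = np.exp(-60.0 * (x - 0.5) ** 2)        # smooth initial condition
        result = advect_upwind(profile)                 # one full period on the unit domain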

  14. Status Report on the NBME's Computer-Based Testing.

    Science.gov (United States)

    Clyman, Stephen G.; Orr, Nancy A.

    1990-01-01

    The process proposed for the development and use of computer-based testing, including simulation and multiple-choice questions, as part of the National Board of Medical Examiners' certification sequence is outlined. Summary reports of first-phase pilot testing in six medical schools are appended. (MSE)

  15. Computer-Based Dynamic Assessment of Multidigit Multiplication.

    Science.gov (United States)

    Gerber, Michael M.; And Others

    1994-01-01

    Design details, operation, and initial field test results are reported for DynaMath, a computer-based dynamic assessment system that provides individually tailored, instructionally useful assessment of students with disabilities. DynaMath organizes and outputs student performance data, graphically shows the "zone of proximal…

  16. A Micro-Computer Based Tutor for Teaching Arithmetic Skills.

    Science.gov (United States)

    Attisha, M.; Yazdani, M.

    1983-01-01

    Describes a knowledge-based tutoring system which provides pupil interaction with the microcomputer to diagnose pupils' errors in subtraction operations. Current subtraction methods; nature and origin of subtraction errors; and the structure, achievements, and future developments of the computer system are included. Thirteen references and a…

  17. Computed tomography of the human developing anterior skull base

    NARCIS (Netherlands)

    J. van Loosen (J.); A.I.J. Klooswijk (A. I J); D. van Velzen (D.); C.D.A. Verwoerd (Carel)

    1990-01-01

    The ossification of the anterior skull base, especially the lamina cribrosa, has been studied by computed tomography and histopathology. Sixteen human fetuses, referred to our laboratory for pathological examination after spontaneous abortion between 18 and 32 weeks of gestation…

  18. Marking Strategies in Metacognition-Evaluated Computer-Based Testing

    Science.gov (United States)

    Chen, Li-Ju; Ho, Rong-Guey; Yen, Yung-Chin

    2010-01-01

    This study aimed to explore the effects of marking and metacognition-evaluated feedback (MEF) in computer-based testing (CBT) on student performance and review behavior. Marking is a strategy, in which students place a question mark next to a test item to indicate an uncertain answer. The MEF provided students with feedback on test results…

  19. Secure data structures based on multi-party computation

    DEFF Research Database (Denmark)

    Toft, Tomas

    2011-01-01

    This work considers data structures based on multi-party computation (MPC) primitives: structuring secret (e.g. secret shared and potentially unknown) data such that it can both be queried and updated efficiently. Implementing an oblivious RAM (ORAM) using MPC allows any existing data structure...

  20. Replica-Based High-Performance Tuple Space Computing

    DEFF Research Database (Denmark)

    Andric, Marina; De Nicola, Rocco; Lluch Lafuente, Alberto

    2015-01-01

    We present the tuple-based coordination language RepliKlaim, which enriches Klaim with primitives for replica-aware coordination. Our overall goal is to offer suitable solutions to the challenging problems of data distribution and locality in large-scale high performance computing. In particular,...

  1. Computer-Based Information Services in Medicine: A Feasibility Study.

    Science.gov (United States)

    Cox, P. H.; And Others

    The objectives of this study were to examine the need and potential demand for computer-based information services in the University of Otago medical libraries, to evaluate the various databases of interest, and to recommend the best means of access to such services. Data were collected through user and library surveys, an extensive literature…

  2. THE STRATEGY OF RESOURCE MANAGEMENT BASED ON GRID COMPUTING

    Institute of Scientific and Technical Information of China (English)

    Wang Ruchuan; Han Guangfa; Wang Haiyan

    2006-01-01

    This paper analyzes the shortcomings of the traditional resource management method for grid computing based on virtual organizations. It advocates improving resource management with mobile agents and presents the improved resource management model. The methodology for improving resource management and the way to realize it in practice are also pointed out.

  3. Improving Computer Based Speech Therapy Using a Fuzzy Expert System

    OpenAIRE

    Ovidiu Andrei Schipor; Stefan Gheorghe Pentiuc; Maria Doina Schipor

    2012-01-01

    In this paper we present our work about Computer Based Speech Therapy systems optimization. We focus especially on using a fuzzy expert system in order to determine specific parameters of personalized therapy, i.e. the number, length and content of training sessions. The efficiency of this new approach was tested during an experiment performed with our CBST, named LOGOMON.
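
    A toy illustration of how a fuzzy expert system can map therapy observations to a session parameter. The membership shapes, rules and minute values below are invented for illustration and are not LOGOMON's actual rule base.

        def tri(x, a, b, c):
            # Triangular membership function with support [a, c] and peak at b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def session_length(progress, fatigue):
            # Toy Mamdani-style inference; both inputs lie in [0, 1].
            low_progress = tri(progress, -0.5, 0.0, 0.5)
            high_fatigue = tri(fatigue, 0.5, 1.0, 1.5)
            r_long = min(low_progress, 1.0 - high_fatigue)   # slow progress -> longer sessions
            r_short = high_fatigue                           # fatigue -> shorter sessions
            r_med = max(0.0, 1.0 - r_long - r_short)
            # Weighted-average defuzzification over 15 / 25 / 40 minute prototypes.
            total = r_long + r_short + r_med
            return (40 * r_long + 15 * r_short + 25 * r_med) / total

        print(session_length(progress=0.2, fatigue=0.3))     # leans toward longer sessions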

  4. An Intelligent Computer-Based System for Sign Language Tutoring

    Science.gov (United States)

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  5. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  6. Replication-based Inference Algorithms for Hard Computational Problems

    OpenAIRE

    Alamino, Roberto C.; Neirotti, Juan P.; Saad, David

    2013-01-01

    Inference algorithms based on evolving interactions between replicated solutions are introduced and analyzed on a prototypical NP-hard problem - the capacity of the binary Ising perceptron. The efficiency of the algorithm is examined numerically against that of the parallel tempering algorithm, showing improved performance in terms of the results obtained, computing requirements and simplicity of implementation.

  7. Tree Decomposition based Steiner Tree Computation over Large Graphs

    OpenAIRE

    2013-01-01

    In this paper, we present an exact algorithm for the Steiner tree problem. The algorithm is based on certain pre-computed index structures. Our algorithm offers a practical solution for the Steiner tree problems on graphs of large size and bounded number of terminals.

  8. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  9. Robot Animals Based on Brain-Computer Interface

    Institute of Scientific and Technical Information of China (English)

    Yang Xia; Lei Lei; Tie-Jun Liu; De-Zhong Yao

    2009-01-01

    The study of robot animals based on brain-computer interface (BCI) technology is currently an important field in robotics and neuroscience. In this paper, the development status, at home and abroad, of BCI-based robot motion control and the principles of robot animals are introduced; then a new animal behavior control method using photostimulation is presented. Finally, the application prospects are discussed.

  10. Tomato classification based on laser metrology and computer algorithms

    Science.gov (United States)

    Igno Rosario, Otoniel; Muñoz Rodríguez, J. Apolinar; Martínez Hernández, Haydeé P.

    2011-08-01

    An automatic technique for tomato classification is presented based on size and color. The size is determined based on surface contouring by laser line scanning. Here, a Bezier network computes the tomato height based on the line position. The tomato color is determined by CIELCH color space and the components red and green. Thus, the tomato size is classified in large, medium and small. Also, the tomato is classified into six colors associated with its maturity. The performance and accuracy of the classification system is evaluated based on methods reported in the recent years. The technique is tested and experimental results are presented.
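
    A toy version of the two decisions described, assuming the laser stage already yields a height in millimetres and the colour stage a CIELCH hue angle; all thresholds are illustrative, not the authors' calibrated values.

        def classify_tomato(height_mm, hue_deg):
            # Size from the laser-measured height (thresholds are assumptions).
            if height_mm >= 70:
                size = "large"
            elif height_mm >= 55:
                size = "medium"
            else:
                size = "small"
            # Six maturity stages spanning green (~130 deg) to red (~30 deg).
            stages = ["green", "breaker", "turning", "pink", "light red", "red"]
            hue = min(max(hue_deg, 20.0), 140.0)
            idx = int((140.0 - hue) / 120.0 * 5.99)   # map 140..20 deg onto 6 stages
            return size, stages[idx]

        print(classify_tomato(62.0, 35.0))            # -> ('medium', 'red')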

  11. Noise-based deterministic logic and computing: a brief survey

    CERN Document Server

    Kish, Laszlo B; Bezrukov, Sergey M; Peper, Ferdinand; Gingl, Zoltan; Horvath, Tamas

    2010-01-01

    A short survey is provided about our recent explorations of the young topic of noise-based logic. After outlining the motivation behind noise-based computation schemes, we present a short summary of our ongoing efforts in the introduction, development and design of several noise-based deterministic multivalued logic schemes and elements. In particular, we describe classical, instantaneous, continuum, spike and random-telegraph-signal based schemes with applications such as circuits that emulate the brain's functioning and string verification via a slow communication channel.

  12. Component-based software for high-performance scientific computing

    Science.gov (United States)

    Alexeev, Yuri; Allan, Benjamin A.; Armstrong, Robert C.; Bernholdt, David E.; Dahlgren, Tamara L.; Gannon, Dennis; Janssen, Curtis L.; Kenny, Joseph P.; Krishnan, Manojkumar; Kohl, James A.; Kumfert, Gary; Curfman McInnes, Lois; Nieplocha, Jarek; Parker, Steven G.; Rasmussen, Craig; Windus, Theresa L.

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  13. Using computer-based tests for information science

    Directory of Open Access Journals (Sweden)

    David Callear

    1997-12-01

    Full Text Available The introduction of objective testing using computer software does not necessarily represent innovative assessment. Where tests occur as an add-on to a course, are time-constrained, closed-book, invigilated, and where there is little (or no) feedback of results to the students, such testing is best regarded as an innovative technique for traditional summative assessment. A computer-based examination of this nature using the commercial software Question Mark has been operating for a number of years in the Department of Information Science at Portsmouth, in the second-year unit for Logic Programming, with student numbers up to 160.

  14. An Indoor Ubiquitous Computing Environment Based on Location awareness

    Institute of Scientific and Technical Information of China (English)

    PU Fang; SUN Dao-qing; CAO Qi-ying; CAI Hai-bin; LI Yong-ning

    2006-01-01

    To provide the right services or information to the right users, at the right time and in the right place in a ubiquitous computing environment, an Indoor Ubiquitous Computing Environment based on Location Awareness, IUCELA, is presented in this paper. A general architecture of IUCELA is designed to connect multiple sensing devices with location-aware applications. Then the function of the location-aware middleware, which is the core component of the proposed architecture, is elaborated. Finally, an indoor forum is taken as an example scenario to demonstrate the security, usefulness, flexibility and robustness of IUCELA.

  15. Intelligent Financial Portfolio Composition based on Evolutionary Computation Strategies

    CERN Document Server

    Gorgulho, Antonio; Horta, Nuno C G

    2013-01-01

    The management of financial portfolios or funds is a well-known problem in financial markets which normally requires rigorous analysis in order to select the most profitable assets. The subject is becoming popular among computer scientists, who try to adapt known Intelligent Computation techniques to the market's domain. This book proposes a system based on Genetic Algorithms, which aims to manage a financial portfolio by using technical analysis indicators. The results are promising, since the approach clearly outperforms the remaining approaches during the recent market crash.
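
    A minimal sketch of a genetic-algorithm loop in this spirit (selection, blend crossover, mutation over long-only weight vectors), scored here by a Sharpe-like fitness on hypothetical returns rather than by the book's technical-analysis indicators.

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(w, returns):
            # Sharpe-like score of a long-only portfolio on historical returns.
            series = returns @ w
            return series.mean() / (series.std() + 1e-9)

        def evolve(returns, pop=60, gens=200, mut=0.05):
            n = returns.shape[1]
            P = rng.dirichlet(np.ones(n), size=pop)           # valid random portfolios
            for _ in range(gens):
                scores = np.array([fitness(w, returns) for w in P])
                elite = P[np.argsort(scores)[-pop // 2:]]     # selection: keep best half
                pairs = elite[rng.integers(0, len(elite), size=(pop, 2))]
                kids = pairs.mean(axis=1)                     # crossover: blend parents
                kids = np.clip(kids + rng.normal(0, mut, kids.shape), 0, None)  # mutation
                P = kids / (kids.sum(axis=1, keepdims=True) + 1e-12)  # re-normalise to sum 1
            return P[np.argmax([fitness(w, returns) for w in P])]

        # Hypothetical daily returns: 250 days x 5 assets.
        returns = rng.normal(0.0005, 0.01, size=(250, 5))
        print(evolve(returns).round(3))                       # best weight vector found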

  16. SLA for E-Learning System Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Doaa Elmatary

    2015-10-01

    Full Text Available The Service Level Agreement (SLA) becomes an important issue, especially for Cloud Computing and online services that are based on the 'pay-as-you-use' fashion. Establishing Service Level Agreements (SLAs), which can be defined as a negotiation between the service provider and the user, is needed for many types of current applications such as E-Learning systems. The work in this paper presents an idea of optimizing the SLA parameters to serve any E-Learning system over a Cloud Computing platform, defining the negotiation process, a suitable framework, and the sequence diagram to accommodate E-Learning systems.

  17. 2.5D dictionary learning based computed tomography reconstruction

    Science.gov (United States)

    Luo, Jiajia; Haneda, Eri; Can, Ali; Ramani, Sathish; Fu, Lin; De Man, Bruno

    2016-05-01

    A computationally efficient 2.5D dictionary learning (DL) algorithm is proposed and implemented in the model-based iterative reconstruction (MBIR) framework for low-dose CT reconstruction. MBIR is based on the minimization of a cost function containing data-fitting and regularization terms to control the trade-off between data fidelity and image noise. Due to the strong denoising performance of DL, it has previously been considered as a regularizer in MBIR, and both 2D and 3D DL implementations are possible. Compared to the 2D case, 3D DL keeps more spatial information and generates images with better quality, although it requires more computation. We propose a novel 2.5D DL scheme, which leverages the computational advantage of 2D DL while attempting to maintain reconstruction quality similar to 3D DL. We demonstrate the effectiveness of this new 2.5D DL scheme for MBIR in low-dose CT. By applying the 2D DL method in three different orthogonal planes and calculating the sparse coefficients accordingly, much of the 3D spatial information can be preserved without incurring the computational penalty of the 3D DL method. For performance evaluation, we use baggage phantoms with different numbers of projection views. In order to quantitatively compare the performance of different algorithms, we use PSNR, SSIM and region-based standard deviation to measure the noise level, and use the edge response to calculate the resolution. Experimental results with full-view datasets show that the different DL-based algorithms have similar performance and 2.5D DL has the best resolution. Results with sparse-view datasets show that 2.5D DL outperforms both 2D and 3D DL in terms of noise reduction. We also compare the computational costs, and 2.5D DL shows a strong advantage over 3D DL in both full-view and sparse-view cases.
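
    One plausible reading of the 2.5D data access, sketched in NumPy: 2-D dictionaries are applied to patches taken in the three orthogonal planes through each voxel and their sparse codes combined, retaining much of the 3-D context at close to 2-D cost. The dictionary-learning and sparse-coding stages themselves are omitted.

        import numpy as np

        def orthogonal_patches(vol, i, j, k, size=8):
            # Axial, coronal and sagittal patches through voxel (i, j, k); each
            # would be sparse-coded against its own learned 2-D dictionary.
            h = size // 2
            axial    = vol[i, j - h:j + h, k - h:k + h]
            coronal  = vol[i - h:i + h, j, k - h:k + h]
            sagittal = vol[i - h:i + h, j - h:j + h, k]
            return axial, coronal, sagittal

        vol = np.random.rand(64, 64, 64)          # stand-in reconstruction volume
        ax, co, sa = orthogonal_patches(vol, 32, 32, 32)
        assert ax.shape == co.shape == sa.shape == (8, 8)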

  18. Analog Optical Computing Based on Dielectric Meta-reflect-array

    CERN Document Server

    Chizari, Ata; Jamali, Mohammad Vahid; Salehi, Jawad A

    2016-01-01

    In this paper, we realize the concept of analog computing using an array of engineered gradient dielectric meta-reflect-arrays. The proposed configuration consists of individual subwavelength silicon nanobricks in combination with a fused silica spacer and a silver ground plane, realizing a reflection beam with full $2\pi$ phase coverage as well as an amplitude range of $0$ to $1$. By spectrally overlapping electric and magnetic dipole resonances, such high-index dielectric metasurfaces can locally and independently manipulate the amplitude and phase of the incident electromagnetic wave. This practically feasible structure overcomes substantial limitations imposed by plasmonic metasurfaces, such as absorption losses and low polarization conversion efficiency in the visible range. Using such CMOS-compatible and easily integrable platforms promises highly efficient ultrathin planar wave-based computing systems which circumvent the drawbacks of conventional bulky lens-based signal processors. Based on these key properties…

  19. Multi-pattern Matching Methods Based on Numerical Computation

    Directory of Open Access Journals (Sweden)

    Lu Jun

    2013-01-01

    Full Text Available Multi-pattern matching methods based on numerical computation are advanced in this paper. First, a multiple-pattern matching algorithm based on added information is advanced. In the process of accumulating information, the choice of the byte-accumulate operation affects the collision odds, which means that the methods or bytes involved in the different matching steps should differ as much as possible. In addition, a balanced binary tree can be used to manage the index so as to reduce the average number of searches, and the characteristics of a given pattern set can be exploited, by setting a collision field, to further eliminate collisions. In order to reduce the collision odds in the initial step, an information splicing method is advanced, which has a larger value space than the added-information method, thus greatly reducing the initial collision odds. Multi-pattern matching methods based on numerical computation are suitable for large-scale multi-pattern matching.
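
    A sketch of the general numerical-matching idea using Rabin-Karp-style accumulated hashes: each pattern is reduced to one number, and the text window's number is updated in constant time per shift and looked up in a table. The paper's added-information, balanced-tree and collision-field refinements are not reproduced here.

        def multi_match(text, patterns, base=257, mod=(1 << 61) - 1):
            data = text.encode()
            table = {}                                   # (length, hash) -> patterns
            for p in patterns:
                h = 0
                for ch in p.encode():
                    h = (h * base + ch) % mod            # accumulate bytes into one number
                table.setdefault((len(p.encode()), h), []).append(p)
            hits = []
            for m in {len(p.encode()) for p in patterns}:
                if m > len(data):
                    continue
                lead = pow(base, m - 1, mod)
                h = 0
                for ch in data[:m]:
                    h = (h * base + ch) % mod
                for i in range(len(data) - m + 1):
                    for p in table.get((m, h), []):
                        if data[i:i + m] == p.encode():  # verify to rule out collisions
                            hits.append((i, p))
                    if i + m < len(data):
                        h = ((h - data[i] * lead) * base + data[i + m]) % mod
            return hits

        print(multi_match("abracadabra", ["abra", "cad"]))
        # e.g. [(0, 'abra'), (7, 'abra'), (4, 'cad')]; order depends on pattern lengths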

  20. An Evaluation of Computer-Based Instruction in Microbiology

    Directory of Open Access Journals (Sweden)

    Susan M. Merkel

    2009-12-01

    Full Text Available There has been a tremendous increase in the availability of computer-based instructional (CBI materials. Some studies have shown an improvement in learning when CBI is used. However, many researchers believe the current studies are inadequate. While CBI software should be thoroughly tested by developers, as educators, we should be concerned about whether or not the CBI materials we use are improving learning in our classrooms with our students. We present an evaluation of a computer-based hypermedia tutorial that was delivered over our General Microbiology website. We found that CBI was at least as effective as text-based material. However, of all students who used CBI, only those who explored most of the site benefited from using the site. Tracking each student's use of the CBI was critical for understanding who was learning and why.

  2. A PRESSURE-BASED ALGORITHM FOR CAVITATING FLOW COMPUTATIONS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ling-xin; ZHAO Wei-guo; SHAO Xue-ming

    2011-01-01

    A pressure-based algorithm for the prediction of cavitating flows is presented. The algorithm employs a set of equations including the Navier-Stokes equations and a cavitation model describing the phase change between liquid and vapor. A pressure-based method is used to construct the algorithm, and the coupling between pressure and velocity is considered. The pressure correction equation is derived from a new continuity equation which employs a source term related to the phase change rate instead of the material derivative of density Dρ/Dt. This pressure-based algorithm allows for the computation of steady or unsteady, 2-D or 3-D cavitating flows. Two 2-D cases, the flow around a flat-nosed cylinder and the flow around a NACA0015 hydrofoil, are simulated respectively, and the periodic cavitation behaviors associated with the re-entrant jets are captured. The algorithm shows good capability of computing time-dependent cavitating flows.

  3. Computer-based Training in Medicine and Learning Theories.

    Science.gov (United States)

    Haag, Martin; Bauch, Matthias; Garde, Sebastian; Heid, Jörn; Weires, Thorsten; Leven, Franz-Josef

    2005-01-01

    Computer-based training (CBT) systems can efficiently support modern teaching and learning environments. In this paper, we demonstrate, on the basis of the case-based CBT system CAMPUS, that current learning theories and design principles (Bloom's Taxonomy and practice fields) are (i) relevant to CBT and (ii) feasible to implement using computer-based training and adequate learning environments. Not all design principles can be fulfilled by the system alone; the integration of the system into adequate teaching and learning environments is therefore essential. Adequately integrated, CBT programs become valuable means to build or support practice fields for learners that build domain knowledge and problem-solving skills. Learning theories and their design principles can support designing these systems as well as assessing their value.

  4. Template based parallel checkpointing in a massively parallel computer system

    Science.gov (United States)

    Archer, Charles Jens; Inglett, Todd Alan

    2009-01-13

    A method and apparatus for a template-based parallel checkpoint save for a massively parallel supercomputer system using a parallel variation of the rsync protocol and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored, for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high-speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional non-lossy data compression algorithms to further reduce the overall checkpoint size.
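
    A single-node sketch of the template comparison, assuming fixed-size blocks, SHA-1 digests for the template blocks and zlib compression of changed payloads; the patent's parallel broadcast across cluster nodes is not modelled.

        import hashlib, zlib

        BLOCK = 4096

        def block_digests(data):
            # Checksums of fixed-size blocks, as stored in the template checkpoint.
            return [hashlib.sha1(data[i:i + BLOCK]).digest()
                    for i in range(0, len(data), BLOCK)]

        def delta_checkpoint(state, template_digests):
            # Keep only blocks whose checksum differs from the template
            # (an rsync-flavoured comparison); unchanged blocks become references.
            delta = []
            for idx in range(0, len(state), BLOCK):
                i = idx // BLOCK
                block = state[idx:idx + BLOCK]
                if i < len(template_digests) and hashlib.sha1(block).digest() == template_digests[i]:
                    delta.append(("same", i))                 # reference, no payload
                else:
                    delta.append(("data", i, zlib.compress(block)))
            return delta

        template = b"\x00" * (4 * BLOCK)
        current = bytearray(template); current[5000] = 1      # dirty one block
        print([op[:2] for op in delta_checkpoint(bytes(current), block_digests(template))])
        # -> [('same', 0), ('data', 1), ('same', 2), ('same', 3)]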

  5. Dynamic detection for computer virus based on immune system

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2008-01-01

    Inspired by the biological immune system, a new dynamic detection model for computer viruses based on the immune system is proposed, and a quantitative description of the model is given. The problem of dynamically describing self and nonself in a computer virus immune system is solved, which reduces the size of the self set. The new concept of dynamic tolerance, as well as new mechanisms of gene evolution and gene coding for immature detectors, is presented, improving the generation efficiency of mature detectors and reducing the false-negative and false-positive rates. Therefore, the difficult problem of traditional computer immune systems, in which the detector training cost is exponentially related to the size of the self set, is overcome. Theoretical analysis and experimental results show that the proposed model has better time efficiency and detection ability than the classic model ARTIS.

  6. Description of the computer-based patient record and computer-based patient record system. CPRI Work Group on CPR Description.

    Science.gov (United States)

    1996-01-01

    Computer-based patient records and computer-based patient record systems support health care effectiveness and efficiency with appropriate safeguards for confidentiality. Achieving a health information infrastructure with computer-based patient records supported by fully integrated computer-based patient record systems is obviously a process of incremental steps. However, CPRI believes significant benefits in health care delivery are certain to be realized over the full course of this process.

  7. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, the reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers whose cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time-series prediction and nonlinear channel equalization. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, mitigating possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained even in the presence of noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
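
    Photonic specifics aside, the reservoir principle the abstract relies on (fixed random dynamics plus a trained linear readout) can be sketched in a few lines; the reservoir size, spectral radius and toy prediction task below are assumptions, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(1)

        def make_reservoir(n=200, rho=0.9):
            W = rng.normal(size=(n, n))
            W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius
            w_in = rng.uniform(-1.0, 1.0, size=n)
            return W, w_in

        def run(W, w_in, u):
            x = np.zeros(W.shape[0])
            states = np.empty((len(u), W.shape[0]))
            for t, ut in enumerate(u):
                x = np.tanh(W @ x + w_in * ut)                # fixed, untrained dynamics
                states[t] = x
            return states

        # Toy task: one-step-ahead prediction of a quasi-periodic signal.
        u = np.sin(0.2 * np.arange(2000)) * np.cos(0.0331 * np.arange(2000))
        W, w_in = make_reservoir()
        X, y = run(W, w_in, u[:-1]), u[1:]
        w_out = np.linalg.lstsq(X[200:], y[200:], rcond=None)[0]  # train readout only
        print(np.mean((X[200:] @ w_out - y[200:]) ** 2))          # small prediction MSE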

  8. FPGA Based Quadruple Precision Floating Point Arithmetic for Scientific Computations

    Directory of Open Access Journals (Sweden)

    Mamidi Nagaraju

    2012-09-01

    Full Text Available In this project we explore the capability and flexibility of FPGA solutions to accelerate scientific computing applications which require very high precision arithmetic, based on the IEEE 754 standard 128-bit floating-point number representation. Field Programmable Gate Arrays (FPGAs) are increasingly being used to design high-end, computationally intense microprocessors capable of handling floating-point mathematical operations. Quadruple-precision floating-point arithmetic is important in computational fluid dynamics and physical modelling, which require accurate numerical computations. However, modern computers perform binary arithmetic, which has flaws in representing and rounding numbers. As the demand for quadruple-precision floating-point arithmetic is predicted to grow, the IEEE 754 Standard for Floating-Point Arithmetic includes specifications for quadruple-precision floating-point arithmetic. We implement a quadruple-precision floating-point arithmetic unit for all the common operations, i.e. addition, subtraction, multiplication and division. While previous work has considered circuits for low-precision floating-point formats, we consider the implementation of 128-bit quadruple-precision circuits. The project provides the arithmetic operations, simulation results and hardware design, with input via a PS/2 keyboard interface and results displayed on an LCD, using a Xilinx Virtex-5 (XC5VLX110TFF1136) FPGA device.

  9. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    Science.gov (United States)

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  10. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has… to delays in recruitment. This delay has not had any impact on the results obtained; on the contrary. From a research management point of view, the project has taught us several lessons, which are being incorporated into the management of current research projects at ITU. The research on the ABC concepts…

  11. Real-time computing without stable states: a new framework for neural computation based on perturbations.

    Science.gov (United States)

    Maass, Wolfgang; Natschläger, Thomas; Markram, Henry

    2002-11-01

    A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.

  12. A survey of GPU-based medical image computing techniques.

    Science.gov (United States)

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout the entire clinical applications from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performances of graphics processors, improved programming support, and excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for the starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  13. A Knowledge-Based Analysis of Global Function Computation

    CERN Document Server

    Halpern, Joseph Y

    2007-01-01

    Consider a distributed system N in which each agent has an input value and each communication link has a weight. Given a global function, that is, a function f whose value depends on the whole network, the goal is for every agent to eventually compute the value f(N). We call this problem global function computation. Various solutions for instances of this problem, such as Boolean function computation, leader election, (minimum) spanning tree construction, and network determination, have been proposed, each under particular assumptions about what processors know about the system and how this knowledge can be acquired. We give a necessary and sufficient condition for the problem to be solvable that generalizes a number of well-known results. We then provide a knowledge-based (kb) program (like those of Fagin, Halpern, Moses, and Vardi) that solves global function computation whenever possible. Finally, we improve the message overhead inherent in our initial kb program by giving a counterfactual belief-based pro...

  14. Developing Educational Computer Animation Based on Human Personality Types

    Directory of Open Access Journals (Sweden)

    Sajid Musa

    2015-03-01

    Full Text Available Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered one of the most elegant ways of preparing teaching materials, and its importance in assisting learners to process, understand, and remember information efficiently has grown vastly since the advent of powerful graphics-oriented computers. Based on theories and facts of psychology, colour science, computer animation, geometric modelling, and technical aesthetics, this study intends to establish an inter-disciplinary area of research towards greater educational effectiveness. With today's high educational demands as well as the lack of time provided for certain courses, classical educational methods have shown deficiencies in keeping up with the drastic changes observed in the digital era. Generally speaking, without taking into account significant factors such as gender, age, level of interest, and memory level, educational animations may turn out to be insufficient for learners or fail to meet their needs. However, we have noticed that the application of animation to education has been given only inadequate attention, and students' personality types and temperaments (sanguine, choleric, melancholic, phlegmatic, etc.) have never been taken into account. We suggest there is an interesting relationship here, and propose essential factors in creating educational animations based on students' personality types. In particular, we study how information in computer animation may be presented in a more preferable way based on font types and their families, colours and colour schemes, emphasizing texts, shapes of characters designed by planar quadratic Bernstein-Bézier curves

  15. A Hybrid Brain-Computer Interface-Based Mail Client

    Directory of Open Access Journals (Sweden)

    Tianyou Yu

    2013-01-01

    Full Text Available Brain-computer interface-based communication plays an important role in brain-computer interface (BCI) applications; electronic mail is one of the most common communication tools. In this study, we propose a hybrid BCI-based mail client that implements electronic mail communication by means of real-time classification of multimodal features extracted from scalp electroencephalography (EEG). With this BCI mail client, users can receive, read, write, and attach files to their mail. Using a BCI mouse that utilizes hybrid brain signals, that is, motor imagery and P300 potential, the user can select and activate the function keys and links on the mail client graphical user interface (GUI). An adaptive P300 speller is employed for text input. The system has been tested with 6 subjects, and the experimental results validate the efficacy of the proposed method.

  16. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.

  17. Could one make a diamond-based quantum computer?

    Science.gov (United States)

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-09-09

    We assess routes to a diamond-based quantum computer, where we specifically look towards scalable devices, with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  18. The effects of format in computer-based procedure displays

    Science.gov (United States)

    Desaulniers, David R.; Gillan, Douglas J.; Rudisill, Marianne

    1988-01-01

    Two experiments were conducted to investigate display variables likely to influence the effectiveness of computer-based procedure displays. In experiment 1, procedures were presented in three formats, text, extended-text, and flowchart. Text and extended-text are structured prose formats which differ in the spatial density of presentation. The flowchart format differs from the text format in both syntax and spatial representation. Subjects were required to use the procedures to diagnose a hypothetical system anomaly. The results indicate that performance was most accurate with the flowchart format. In experiment 2, procedure window size was varied (6-line, 12-line, and 24-line) in addition to procedure format. In the six line window condition, experiment 2 replicated the findings of experiment 1. As predicted, completion times for flowchart procedures decreased with increasing window size; however, accuracy of performance decreased substantially. Implications for the design of computer-based procedure displays are discussed.

  19. Distance Based Asynchronous Recovery Approach In Mobile Computing Environment

    Directory of Open Access Journals (Sweden)

    Yogita Khatri,

    2012-06-01

    Full Text Available A mobile computing system is a distributed system in which at least one of the processes is mobile. Such systems are constrained by a lack of stable storage, low network bandwidth, mobility, frequent disconnection, and limited battery life. Checkpointing is one of the commonly used techniques to provide fault tolerance in mobile computing environments. To suit the mobile environment, a distance-based recovery scheme is proposed which is based on checkpointing and message logging. After the system recovers from failures, only the failed processes roll back and restart from their respective recent checkpoints, independently of the others. The salient feature of this scheme is that it reduces the transfer and recovery cost. While the mobile host moves within a specific range, recovery information is not moved; it is transferred to a nearby location only if the mobile host moves out of that range.

  20. Using Case-Based Reasoning for detecting computer virus

    Directory of Open Access Journals (Sweden)

    Abdellatif Berkat

    2011-07-01

    Full Text Available The typical antivirus approach consists of waiting for a number of computers to be infected, detecting the virus, designing a solution, and delivering and deploying that solution. In such a situation, it is very difficult to prevent every machine from being compromised by viruses. In this paper, we propose a new method for detecting computer viruses that is based on the technique of Case-Based Reasoning (CBR). In this method: (1) even new viruses that do not exist in the database can be detected; (2) the updating of the virus database is done automatically without connecting to the Internet. Whenever a new virus is detected, it is automatically added to the database used by our application. This presents a major advantage
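
    The abstract does not specify the case representation or similarity measure, so the following is only a minimal sketch of the retrieve/reuse/retain cycle it describes, assuming byte 2-gram frequency vectors as cases and cosine similarity for retrieval; the retain step is what lets the case base grow without an online signature update.

    ```python
    # Hypothetical CBR sketch for virus detection: retrieve the nearest
    # known case, reuse its label, retain the new sample as a case.
    import math
    from collections import Counter

    def features(data: bytes) -> Counter:
        """Byte 2-gram frequencies as a simple case representation (assumed)."""
        return Counter(data[i:i + 2] for i in range(len(data) - 1))

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(v * b[k] for k, v in a.items())
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    case_base = {"virus_a": features(b"\x90\x90\xeb\xfe payload")}

    def classify(sample: bytes, threshold: float = 0.8) -> bool:
        """Retrieve the most similar case; retain the sample on a match,
        so the case base grows without downloading new signatures."""
        f = features(sample)
        name, sim = max(((n, cosine(f, c)) for n, c in case_base.items()),
                        key=lambda t: t[1])
        if sim >= threshold:
            case_base[f"variant_of_{name}_{len(case_base)}"] = f   # retain step
            return True
        return False

    print(classify(b"\x90\x90\xeb\xfe payload v2"))
    ```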

  1. Agent-Based Service Composition in Cloud Computing

    Science.gov (United States)

    Gutierrez-Garcia, J. Octavio; Sim, Kwang-Mong

    In a Cloud-computing environment, consumers, brokers, and service providers interact to achieve their individual purposes. In this regard, service providers offer a pool of resources wrapped as web services, which should be composed by broker agents to provide a single virtualized service to Cloud consumers. In this study, an agent-based test bed for simulating Cloud-computing environments is developed. Each Cloud participant is represented by an agent, whose behavior is defined by means of colored Petri nets. The relationship between web services and service providers is modeled using object Petri nets. Both Petri net formalisms are combined to support a design methodology for defining concurrent and parallel service choreographies. This results in the creation of a dynamic agent-based service composition algorithm. The simulation results indicate that service composition is achieved with a linear time complexity despite dealing with interleaving choreographies and synchronization of heterogeneous services.

  2. An Improved Multiple Faults Reassignment based Recovery in Cluster Computing

    CERN Document Server

    Bansal, Sanjay

    2011-01-01

    In the case of multiple node failures, performance becomes very low compared to a single node failure. Failures of nodes in cluster computing can be tolerated by multiple-fault-tolerant computing. Existing recovery schemes are efficient for a single fault but not for multiple faults. The recovery scheme proposed in this paper has two phases: a sequential phase and a concurrent phase. In the sequential phase, the loads of all working nodes are distributed uniformly and evenly by the proposed dynamic rank-based load distribution algorithm. In the concurrent phase, the loads of all failed nodes, as well as newly arriving jobs, are assigned to all available nodes by finding the least loaded node among them with a failed-node job allocation algorithm. Sequential and concurrent execution of the algorithms improves performance as well as resource utilization. The dynamic rank-based algorithm for load redistribution works as a sequential restoration algorithm, and the reassignment algorithm for distribution of failure nodes to l...
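
    As an illustration of the concurrent phase (not the paper's exact algorithm), the sketch below reassigns the jobs of failed nodes by repeatedly picking the least-loaded surviving node from a min-heap; the node names, job costs, and largest-job-first ordering are assumptions.

    ```python
    # Reassign recovered jobs to survivors, always choosing the node
    # with the smallest current load (min-heap keyed on load).
    import heapq

    def reassign(jobs, node_loads):
        """jobs: list of (job_id, cost); node_loads: dict node -> current load."""
        heap = [(load, node) for node, load in node_loads.items()]
        heapq.heapify(heap)
        placement = {}
        for job_id, cost in sorted(jobs, key=lambda j: -j[1]):  # big jobs first
            load, node = heapq.heappop(heap)        # least-loaded node
            placement[job_id] = node
            heapq.heappush(heap, (load + cost, node))
        return placement

    # Jobs recovered from two failed nodes, redistributed over survivors.
    print(reassign([("j1", 5), ("j2", 3), ("j3", 2)], {"n1": 4, "n2": 1, "n3": 6}))
    ```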

  3. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    This report describes the results of the Activity-Based Computing (ABC) project granted by the Danish Strategic Research Council, grant no. #2106-04-0019. In summary, we conclude that the ABC project has been highly successful. Not only has it met all of its objectives and expected results......, but it has also been able to pull in additional resources and move beyond what was originally planned in the project. From a research perspective, all of the original research objectives of the project have been met and published in 4 journal articles, 13 peer-reviewed conference papers, and two book chapters......, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been...

  4. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  5. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  6. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  7. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst...... and at the same time less expensive. This was previously verified experimentally for pure CO hydrogenation. In this study, the improved activity is also verified for CO2 hydrogenation as well as for simultaneous CO and CO2 hydrogenation....

  8. PEA: Polymorphic Encryption Algorithm based on quantum computation

    OpenAIRE

    Komninos, N.; Mantas, G.

    2011-01-01

    In this paper, a polymorphic encryption algorithm (PEA), based on basic quantum computations, is proposed for the encryption of binary bits. PEA is a symmetric key encryption algorithm that applies different combinations of quantum gates to encrypt binary bits. PEA is also polymorphic since the states of the shared secret key control the different combinations of the ciphertext. It is shown that PEA achieves perfect secrecy and is resilient to eavesdropping and Trojan horse attacks. A securit...

  9. The Unknown Computer Viruses Detection Based on Similarity

    OpenAIRE

    Liu, Zhongda; NAKAYA, Naoshi; KOUI, Yuuji

    2009-01-01

    New computer viruses are continually being generated and they cause damage all over the world. In general, current anti-virus software detects viruses by matching a pattern based on the signature; thus, unknown viruses without any signature cannot be detected. Although there are some static analysis technologies that do not depend on signatures, virus writers often use code obfuscation techniques, which make it difficult to execute a code analysis. As is generally known, unknown viruses and k...

  10. CONSTRUCTION COST INTEGRATED CONTROL BASED ON COMPUTER SIMULATION

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Construction cost control is a complex system engineering problem. The traditional control methods cannot control construction cost dynamically and in advance because of their hysteresis. This paper proposes a computer-simulation-based construction cost integrated control method, which systematically combines cost with PERT, so that the construction cost can be predicted and optimized systematically and effectively. The new method overcomes the hysteresis of the traditional systems and is a distinct improvement over them in effect and practicality.
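
    One plausible way to combine cost with PERT by computer simulation, as the abstract suggests, is a Monte Carlo sweep over activity durations; the activity network, triangular distributions, and cost rates below are purely illustrative assumptions.

    ```python
    # Toy Monte Carlo PERT: sample activity durations, propagate finish
    # times through the precedence network, and accumulate duration cost.
    import random

    # activity -> (predecessors, (optimistic, most likely, pessimistic), cost per day)
    ACTIVITIES = {
        "design":     ([],             (8, 10, 15),  2.0),
        "foundation": (["design"],     (5,  7, 12),  3.5),
        "structure":  (["foundation"], (20, 25, 35), 5.0),
        "finishing":  (["structure"],  (10, 14, 20), 2.5),
    }   # listed in topological order, so a single pass suffices

    def simulate_once():
        finish, total_cost = {}, 0.0
        for name, (preds, (a, m, b), rate) in ACTIVITIES.items():
            d = random.triangular(a, b, m)                  # sampled duration
            start = max((finish[p] for p in preds), default=0.0)
            finish[name] = start + d
            total_cost += d * rate
        return max(finish.values()), total_cost

    runs = [simulate_once() for _ in range(10000)]
    durations, costs = zip(*runs)
    print("mean duration:", sum(durations) / len(runs))
    print("mean cost:", sum(costs) / len(runs))
    print("P(cost > 240):", sum(c > 240 for c in costs) / len(runs))
    ```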

  11. A scalable PC-based parallel computer for lattice QCD

    CERN Document Server

    Fodor, Z; Papp, G

    2002-01-01

    A PC-based parallel computer for medium/large scale lattice QCD simulations is suggested. The Eotvos Univ., Inst. Theor. Phys. cluster consists of 137 Intel P4-1.7GHz nodes. Gigabit Ethernet cards are used for nearest-neighbor communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop.

  12. 3D measurement system based on computer-generated gratings

    Science.gov (United States)

    Zhu, Yongjian; Pan, Weiqing; Luo, Yanliang

    2010-08-01

    A new kind of 3D measurement system has been developed to capture the 3D profile of complex objects. The measurement principle is based on triangulation with digital fringe projection, where the fringes are generated entirely by computer. The four computer-generated fringes form the data source for phase-shifting 3D profilometry. The system hardware includes a computer, video camera, projector, image grabber, and VGA board with two ports (one port links to the screen, the other to the projector). The system software consists of a grating projection module, an image grabbing module, a phase reconstruction module, and a 3D display module. A software-based method for synchronizing grating projection and image capture is proposed. To address the nonlinear error of the captured fringes, a compensation method based on pixel-to-pixel gray-level correction is introduced. A least-squares phase unwrapping method is used to solve the phase reconstruction problem, using the combination of Log Modulation Amplitude and Phase Derivative Variance (LMAPDV) as the weight. The system adopts an algorithm from the MATLAB toolbox for camera calibration. The 3D measurement system has an accuracy of 0.05 mm, and the execution time is 3~5 s for a single measurement.
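
    The core of four-fringe phase-shifting profilometry can be stated compactly: with phase shifts of 0, 90, 180, and 270 degrees, the wrapped phase follows from phi = atan2(I4 - I2, I1 - I3). The sketch below demonstrates this on synthetic fringes standing in for camera captures; the unwrapping and calibration steps of the paper are omitted.

    ```python
    # Four-step phase-shifting on synthetic computer-generated fringes.
    import numpy as np

    h, w, period = 480, 640, 32
    xs = np.tile(np.arange(w) * (2 * np.pi / period), (h, 1))     # carrier phase
    bump = 0.8 * np.exp(-((np.arange(w) - 320) ** 2) / 5000.0)    # toy object phase
    obj = np.tile(bump, (h, 1))

    # Four fringes with 0/90/180/270-degree shifts.
    I1, I2, I3, I4 = (128 + 100 * np.cos(xs + obj + k * np.pi / 2) for k in range(4))

    # Classic four-step formula: I4 - I2 = 2B sin(phi), I1 - I3 = 2B cos(phi).
    wrapped = np.arctan2(I4 - I2, I1 - I3)    # wrapped phase in (-pi, pi]
    print(wrapped.shape, float(wrapped[0, 0]))
    # A phase-unwrapping step (the paper uses LMAPDV-weighted least squares)
    # would follow to recover the continuous phase map.
    ```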

  13. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing, with over 61 million Euros, more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl deemed for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely its large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with our own heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the two-dimensional nacre thickness profile can be calculated. A certainty measure that accounts for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.

  14. [Forensic evidence-based medicine in computer communication networks].

    Science.gov (United States)

    Qiu, Yun-Liang; Peng, Ming-Qi

    2013-12-01

    As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase in information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, way, method, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.

  15. Wearable Computing System with Input-Output Devices Based on Eye-Based Human Computer Interaction Allowing Location Based Web Services

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-08-01

    Full Text Available A wearable computing system with input-output devices based on Eye-Based Human Computer Interaction (EBHCI), which allows location-based web services including navigation and location/attitude/health-condition monitoring, is proposed. Through implementation of the proposed wearable computing system, all of its functionality is confirmed. The system is found to work well; it is easy to use and inexpensive. Experimental results for EBHCI show excellent performance in terms of key-in accuracy as well as input speed. The system has internet access and search-engine capability.

  16. Development of a personal-computer-based intelligent tutoring system

    Science.gov (United States)

    Mueller, Stephen J.

    1988-01-01

    A large number of Intelligent Tutoring Systems (ITSs) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. A prototype ITS for tutoring students in the use of CLIPS language: CLIPSIT (CLIPS Intelligent Tutor) was developed. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. While most ITSs have been developed on powerful workstations, CLIPSIT is designed for use on the IBM PC/XT/AT personal computer family (and their clones). There are many issues to consider when developing an ITS on a personal computer such as the teaching strategy, user interface, knowledge representation, and program design methodology. Based on experiences in developing CLIPSIT, results on how to address some of these issues are reported and approaches are suggested for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.

  17. Image Interpretation Instruction Via A Computer-Based-Training System

    Science.gov (United States)

    Weisman, Melanie

    1988-02-01

    As newer and more sophisticated imagery collection systems rapidly increase the volume of imagery requiring thorough exploitation, the need for imagery analysts to acquire and maintain expertise increases accordingly. In response, Loral Systems Group (Arizona) has produced a computer-based-training (CBT) system that presents a series of lessons on radar imaging principles and their application to the various orders of battle. The training system is composed of two host computers, four student/instructor workstations, a printer, and lesson material. The computers control the imagery presentation, deliver twenty-eight interactive lessons of computer-assisted instruction, and generate reports. Each dual-screen workstation presents lessons consisting of instructional text coupled with representative imagery annotated with color graphics. Although the system is designed for the unique characteristics of radar interpretation, alternative courseware could instruct interpretation techniques for other imagery (photographic, electro-optical, infrared). Regardless of the sensor type and amount of available imagery, both commercial and military segments of the interpretation community will benefit only if the interpreter/analyst is successfully trained to translate image information into useful terms.

  18. The computer-based control system of the NAC accelerator

    Science.gov (United States)

    Burdzik, G. F.; Bouckaert, R. F. A.; Cloete, I.; Dutoit, J. S.; Kohler, I. H.; Truter, J. N. J.; Visser, K.; Wikner, V. C. S. J.

    The National Accelerator Center (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerators, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are being provided, but these instruments are universally-linkable to any appropriate machine variable.

  19. Computer Crime Forensics Based on Improved Decision Tree Algorithm

    Directory of Open Access Journals (Sweden)

    Ying Wang

    2014-04-01

    Full Text Available To find crime-related evidence and association rules among massive data, classic decision tree algorithms such as ID3 have appeared in related classification-analysis prototype systems, so how to make them more suitable for computer forensics in variable environments has become a hot issue. When selecting classification attributes, ID3 relies on the computation of information entropy, so attributes with more values tend to be selected as classification nodes of the decision tree. Such classification is unrealistic in many cases. Moreover, the ID3 algorithm involves many logarithm computations, which makes it complicated to handle datasets with many classification attributes. Therefore, addressing the special demands of computer crime forensics, the ID3 algorithm is improved and a novel classification attribute selection method based on a Maclaurin-Priority Value First method is proposed. It adopts the change-of-base formula and infinitesimal substitution to simplify the logarithms in ID3. For the errors generated in this process, an appropriate constant is introduced and multiplied by the simplified formulas as compensation. The idea of Priority Value First is introduced to solve the problem of value deviation. The performance of the improved method is strictly proved in theory. Finally, experiments verify that our scheme has advantages in computation time and classification accuracy, compared to ID3 and two existing algorithms
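
    For reference, the sketch below shows the standard ID3 attribute selection that the paper sets out to improve: entropy and information gain computed directly with logarithms. The paper's refinement (change-of-base plus series simplification and Priority Value First re-ranking) is not reproduced, and the toy forensic attributes are assumptions.

    ```python
    # Baseline ID3 attribute selection: pick the attribute with the
    # largest information gain, computed from Shannon entropy.
    import math
    from collections import Counter

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

    def information_gain(rows, labels, attr):
        """rows: list of dicts; attr: attribute name to evaluate."""
        n = len(labels)
        split = {}
        for row, y in zip(rows, labels):
            split.setdefault(row[attr], []).append(y)
        remainder = sum(len(part) / n * entropy(part) for part in split.values())
        return entropy(labels) - remainder

    rows = [{"proc": "a", "net": 1}, {"proc": "a", "net": 0},
            {"proc": "b", "net": 1}, {"proc": "b", "net": 1}]
    labels = ["mal", "benign", "mal", "mal"]
    best = max(["proc", "net"], key=lambda a: information_gain(rows, labels, a))
    print(best, information_gain(rows, labels, best))   # "net" wins here
    ```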

  20. Research on Cloud Computing Resources Provisioning Based on Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Zhiping Peng

    2015-01-01

    Full Text Available As one of the core issues of cloud computing, resource management adopts virtualization technology to shield the underlying resource heterogeneity and complexity, so that the massive distributed resources form a unified giant resource pool. Efficient resource provisioning can be achieved by rationally implementing resource management methods and techniques. Therefore, how to manage cloud computing resources effectively becomes a challenging research topic. By analyzing the execution of a user job in the cloud computing environment, we propose a novel resource provisioning scheme based on reinforcement learning and queuing theory. With the introduction of the concepts of Segmentation Service Level Agreement (SSLA) and Utilization Unit Time Cost (UUTC), we view the resource provisioning problem in cloud computing as a sequential decision problem, and we design a novel optimization objective function and employ reinforcement learning to solve it. Experimental results not only demonstrated the effectiveness of the proposed scheme but also showed that it outperforms common methods in resource utilization rate, SLA violation avoidance, and user costs.
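
    A toy sketch of casting provisioning as a sequential decision problem is given below. The states, actions, workload dynamics, and reward shape (an SLA penalty plus a per-VM unit-time cost, loosely echoing the SSLA and UUTC ideas) are illustrative assumptions, not the paper's definitions.

    ```python
    # Tabular Q-learning over discretized load levels: learn how many
    # VMs to provision so that SLA penalties and VM costs balance.
    import random

    LOADS = range(5)                 # discretized workload levels (states)
    ACTIONS = [1, 2, 3, 4]           # number of VMs to provision
    Q = {(s, a): 0.0 for s in LOADS for a in ACTIONS}
    alpha, gamma, eps = 0.1, 0.9, 0.1

    def reward(load, vms):
        sla_penalty = 5.0 * max(0, load - vms)   # unserved demand violates the SLA
        cost = 1.0 * vms                         # per-VM unit-time cost
        return -(sla_penalty + cost)

    state = 2
    for step in range(20000):
        a = (random.choice(ACTIONS) if random.random() < eps
             else max(ACTIONS, key=lambda x: Q[(state, x)]))
        next_state = random.choice(LOADS)        # toy workload dynamics
        r = reward(state, a)
        Q[(state, a)] += alpha * (r + gamma * max(Q[(next_state, b)] for b in ACTIONS)
                                  - Q[(state, a)])
        state = next_state

    policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in LOADS}
    print(policy)   # expected: provision roughly in proportion to load
    ```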

  1. Rough K-means Outlier Factor Based on Entropy Computation

    Directory of Open Access Journals (Sweden)

    Djoko Budiyanto Setyohadi

    2014-07-01

    Full Text Available Many studies of outlier detection have been developed based on the cluster-based outlier detection approach, since it does not need any prior knowledge of the dataset. However, previous studies only regard the outlier factor computation with respect to a single point or a small cluster, reflecting how much it deviates from a common cluster. Furthermore, all objects within an outlier cluster are assumed to be similar. Intuitively, outlier objects can be grouped into outlier clusters, and the outlier factors of the objects within an outlier cluster should differ gradually; it is not natural for the outlierness of every object within an outlier cluster to be the same. This study proposes a new outlier detection method based on a hybrid of the Rough K-Means clustering algorithm and entropy computation. We introduce an outlier degree measure, namely the entropy outlier factor, for cluster-based outlier detection. The proposed algorithm sequentially finds the outlier cluster and calculates the outlier factor degree of the objects within the outlier cluster. Each object within the outlier cluster is evaluated using cluster-based entropy with respect to the whole cluster. The performance of the algorithm has been tested on four UCI benchmark data sets and shows superior results, especially in detection rate.
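
    The abstract does not give the entropy outlier factor formula, so the sketch below uses a common stand-in: an object's score is the drop in attribute-wise Shannon entropy of the dataset when the object is removed, which grades objects by how rare their attribute values are, so outlierness varies gradually rather than being uniform.

    ```python
    # Illustrative leave-one-out entropy scoring (a stand-in for the
    # paper's entropy outlier factor, whose exact definition differs).
    import math
    from collections import Counter

    def dataset_entropy(rows):
        total = 0.0
        n = len(rows)
        for j in range(len(rows[0])):
            counts = Counter(r[j] for r in rows)
            total += -sum(c / n * math.log2(c / n) for c in counts.values())
        return total

    def entropy_outlier_factor(rows, i):
        rest = rows[:i] + rows[i + 1:]
        return dataset_entropy(rows) - dataset_entropy(rest)

    rows = [("low", "tcp"), ("low", "tcp"), ("low", "udp"),
            ("low", "tcp"), ("high", "icmp")]            # last row is the oddball
    scores = sorted(((entropy_outlier_factor(rows, i), i) for i in range(len(rows))),
                    reverse=True)
    print(scores[0])   # the ("high", "icmp") row gets the largest factor
    ```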

  2. Some Computational Aspects of the Brain Computer Interfaces Based on Inner Music

    Science.gov (United States)

    Klonowski, Wlodzimierz; Duch, Wlodzisław; Perovic, Aleksandar; Jovanovic, Aleksandar

    2009-01-01

    We discuss a BCI based on inner tones and inner music. We have had some success in the detection of inner tones, the imagined tones which are not sung aloud. Rather easily imagined and controlled, they offer a set of states usable for BCI, with high information capacity and high transfer rates. Imagination of sounds or musical tunes could provide a multicommand language for BCI, as if using natural language. Moreover, this approach could be used to test musical abilities. Such a BCI interface could be superior when there is a need for a broader command language. Some computational estimates and unresolved difficulties are presented. PMID:19503802

  3. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  4. Demonstration of optical computing logics based on binary decision diagram.

    Science.gov (United States)

    Lin, Shiyun; Ishikawa, Yasuhiko; Wada, Kazumi

    2012-01-16

    Optical circuits are a low-power, fast alternative to current information processing based on transistor circuits. However, because no transistor-like function is available in optics, an architecture should be chosen that suits optics. One such architecture is the Binary Decision Diagram (BDD), where a signal is processed by sending it from the root through a series of switching nodes to the leaf (terminal). The speed of optical computing is limited by either the transmission time of optical signals from the root to the leaf or the switching time of a node. We have designed and experimentally demonstrated 1-bit and 2-bit adders based on the BDD architecture. The switching nodes are silicon ring resonators with a modulation depth of 10 dB, and their states are changed by the plasma dispersion effect. The quality factor Q of the designed rings is 1500, which allows fast signal transmission, e.g., 1.3 ps as calculated from the photon escape time. The total processing time is thus analyzed to be ~9 ps for a 2-bit adder and would scale linearly with the number of bits. This is two orders of magnitude faster than conventional CMOS circuitry, with its ~ns scale of delay. The presented results show the potential of fast optical computing circuits.
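
    The root-to-leaf signal routing is easy to mimic in software. In the sketch below, node tables encode a 1-bit half adder (sum = a XOR b, carry = a AND b) and an input "token" is switched left or right at each node by one input bit; in the demonstrated device the switching is done by ring resonators rather than dictionary lookups.

    ```python
    # BDD evaluation by path following: each node is (variable,
    # edge_if_0, edge_if_1); leaves are the 0/1 terminals.
    SUM_BDD = {
        "root": ("a", "n0", "n1"),
        "n0":   ("b", 0, 1),        # a=0: sum = b
        "n1":   ("b", 1, 0),        # a=1: sum = not b
    }
    CARRY_BDD = {
        "root": ("a", 0, "n1"),     # a=0: carry = 0
        "n1":   ("b", 0, 1),        # a=1: carry = b
    }

    def evaluate(bdd, inputs):
        node = "root"
        while node not in (0, 1):               # follow the switched path
            var, low, high = bdd[node]
            node = high if inputs[var] else low
        return node

    for a in (0, 1):
        for b in (0, 1):
            bits = {"a": a, "b": b}
            print(a, b, "-> sum", evaluate(SUM_BDD, bits),
                  "carry", evaluate(CARRY_BDD, bits))
    ```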

  5. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Schlessman Jason

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF, which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  6. Energy based Efficient Resource Scheduling in Green Computing

    Directory of Open Access Journals (Sweden)

    B.Vasumathi,

    2015-11-01

    Full Text Available Cloud computing is an evolving area of efficient utilization of computing resources. Data centers accommodating cloud applications consume massive quantities of energy, contributing to high operating expenditures and carbon footprints. Hence, green cloud computing solutions are required not only to save energy for the environment but also to reduce operating costs. In this paper, we focus on the development of an energy-based resource scheduling framework and present an algorithm that considers the synergy between various data center infrastructures (i.e., software, hardware, etc.) and performance. Specifically, this paper proposes (a) architectural principles for energy-efficient management of clouds and (b) energy-efficient resource allocation strategies and a scheduling algorithm considering Quality of Service (QoS) requirements. The performance of the proposed algorithm has been evaluated against existing energy-based scheduling algorithms. The experimental results demonstrate that this approach is effective in minimizing the cost and energy consumption of cloud applications, thus moving towards the achievement of green clouds.

  7. [Computer work and De Quervain's tenosynovitis: an evidence based approach].

    Science.gov (United States)

    Gigante, M R; Martinotti, I; Cirla, P E

    2012-01-01

    The debate about the role of personal computer work as a cause of De Quervain's tenosynovitis has developed only partially, without considering the available multidisciplinary data. A systematic review of the literature, using an evidence-based approach, was performed. Among the disorders associated with the use of VDUs, we must distinguish those of the upper limbs and, among them, those related to overload. Experimental studies on the occurrence of De Quervain's tenosynovitis are quite limited, and clinically it is quite difficult to prove an occupational etiology, considering the interference due to other activities of daily living or to biological susceptibility (i.e., anatomical variability, sex, age, exercise). At present there is no evidence of any connection between De Quervain syndrome and time spent using a personal computer or keyboard; limited evidence of a correlation is found with time using a mouse. No data are available regarding the exclusive or predominant use of laptops or mobile smartphones.

  8. Optical image hiding based on computational ghost imaging

    Science.gov (United States)

    Wang, Le; Zhao, Shengmei; Cheng, Weiwen; Gong, Longyan; Chen, Hanwu

    2016-05-01

    Image hiding schemes play an important role in today's big data era, providing copyright protection for digital images. In this paper, we propose a novel image hiding scheme based on computational ghost imaging that offers strong robustness and high security. The watermark is encrypted with the configuration of a computational ghost imaging system, and the random speckle patterns compose a secret key. A least significant bit (LSB) algorithm is adopted to embed the watermark, and both a second-order correlation algorithm and a compressed sensing (CS) algorithm are used to extract it. The experimental and simulation results show that authorized users can retrieve the watermark with the secret key. The watermark image cannot be retrieved when the eavesdropping ratio is less than 45% with the second-order correlation algorithm, or less than 20% with the TVAL3 CS reconstruction algorithm. In addition, the proposed scheme is robust against 'salt and pepper' noise and image cropping degradations.
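
    The least-significant-bit embedding step of the scheme is straightforward to sketch (the ghost-imaging encryption that precedes it, and the correlation/CS extraction, are omitted): watermark bits replace the LSB plane of the host image, changing each pixel by at most one gray level.

    ```python
    # LSB-plane embedding and extraction of a binary watermark.
    import numpy as np

    rng = np.random.default_rng(1)
    host = rng.integers(0, 256, (64, 64), dtype=np.uint8)     # host image
    mark = rng.integers(0, 2, (64, 64), dtype=np.uint8)       # encrypted watermark bits

    stego = (host & 0xFE) | mark        # clear the LSB plane, then write the bits
    extracted = stego & 1               # extraction reads the LSB plane back

    assert np.array_equal(extracted, mark)
    print("max pixel change:", int(np.abs(stego.astype(int) - host.astype(int)).max()))
    ```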

  9. Parallel computing-based sclera recognition for human identification

    Science.gov (United States)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

    Compared to iris recognition, sclera recognition using a line descriptor can achieve comparable recognition accuracy in visible wavelengths. However, this method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce the sclera recognition time. We define a new descriptor to which the information of the KD-tree structure and the sclera edge is added. The registration and matching task is divided into subtasks of various sizes according to their computational complexity. Affine transform parameters are generated by searching the KD-tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can improve the sclera matching speed by hundreds of times without decreasing accuracy.

  10. A MODEL BASED ALGORITHM FOR FAST DPIV COMPUTING

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Traditional DPIV (Digital Particle Image Velocimetry) methods are mostly based on area correlation (Willert, C.E., 1991). Though proven to be very time-consuming and error prone, they are widely adopted because they are conceptually simple and easily implemented, and also because there are few alternatives. This paper proposes a non-correlation, conceptually new, fast and efficient approach to DPIV which takes the nature of the flow into consideration. An Incompressible Affined Flow Model (IAFM) is introduced to describe a flow in a way that incorporates rational constraints into the computation. This IAFM, combined with a modified optical flow method named Total Optical Flow Computation (TOFC), provides a linear system solution to DPIV. Experimental results on real images showed our method to be a very promising approach for DPIV.
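
    In the spirit of the paper's linear-system formulation (though without the exact IAFM/TOFC details, e.g., the incompressibility constraint), one can fit a global affine flow (u, v) to the optical-flow constraint Ix·u + Iy·v + It = 0 by least squares; the sketch below does this on a synthetic translated frame pair.

    ```python
    # Least-squares fit of a global affine flow field to two frames.
    import numpy as np

    def affine_flow(f0, f1):
        Iy, Ix = np.gradient(f0)                  # spatial gradients (y-axis first)
        It = f1 - f0                              # temporal difference
        h, w = f0.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Unknowns p = (a11, a12, b1, a21, a22, b2) with u = a11*x + a12*y + b1
        # and v = a21*x + a22*y + b2; IAFM would additionally impose a11 + a22 = 0.
        A = np.column_stack([
            (Ix * xs).ravel(), (Ix * ys).ravel(), Ix.ravel(),
            (Iy * xs).ravel(), (Iy * ys).ravel(), Iy.ravel(),
        ])
        p, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
        return p

    # Synthetic pair: a smooth blob translated by one pixel along x.
    ys, xs = np.mgrid[0:64, 0:64]
    f0 = np.exp(-((xs - 32.0) ** 2 + (ys - 32.0) ** 2) / 60.0)
    f1 = np.roll(f0, 1, axis=1)
    print(affine_flow(f0, f1).round(3))   # b1 (index 2) near +1, the rest near 0
    ```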

  11. A Method for Weight Multiplicity Computation Based on Berezin Quantization

    Directory of Open Access Journals (Sweden)

    David Bar-Moshe

    2009-09-01

    Full Text Available Let G be a compact semisimple Lie group and T be a maximal torus of G. We describe a method for weight multiplicity computation in unitary irreducible representations of G, based on the theory of Berezin quantization on G/T. Let Γ_{hol}(L^λ) be the reproducing kernel Hilbert space of holomorphic sections of the homogeneous line bundle L^λ over G/T associated with the highest weight λ of the irreducible representation π_λ of G. The multiplicity of a weight m in π_λ is computed from the functional analytic structure of the Berezin symbol of the projector in Γ_{hol}(L^λ) onto the subspace of weight m. We describe a method for the construction of this symbol and the evaluation of the weight multiplicity as the rank of a Hermitian form. The application of this method is described in a number of examples.

  12. Indoor scene classification of robot vision based on cloud computing

    Science.gov (United States)

    Hu, Tao; Qi, Yuxiao; Li, Shipeng

    2016-07-01

    For intelligent service robots, indoor scene classification is an important issue. To overcome the weak real-time performance of conventional algorithms, a new method based on cloud computing is proposed for global image features in indoor scene classification. With the MapReduce method, the global PHOG feature of an indoor scene image is extracted in parallel, and the feature vectors are used to train the decision classifier through SVM concurrently. The indoor scene is then classified by the decision classifier. To verify the algorithm's performance, we carried out an experiment with 350 typical indoor scene images from the MIT LabelMe image library. Experimental results show that the proposed algorithm attains better real-time performance: generally, it is 1.4-2.1 times faster than traditional classification methods that rely on a single machine, while keeping a stable classification accuracy of 70%.

  13. Computer Vision-Based Image Analysis of Bacteria.

    Science.gov (United States)

    Danielsen, Jonas; Nordenfelt, Pontus

    2017-01-01

    Microscopy is an essential tool for studying bacteria, but it is today mostly used in a qualitative or, at best, semi-quantitative manner, often involving time-consuming manual analysis. This also makes it difficult to assess the importance of individual bacterial phenotypes, especially when there are only subtle differences in features such as shape, size, or signal intensity, which are typically very difficult for the human eye to discern. With computer vision-based image analysis - where computer algorithms interpret image data - it is possible to achieve objective and reproducible quantification of images in an automated fashion. Besides being a much more efficient and consistent way to analyze images, this can also reveal important information that was previously hard to extract with traditional methods. Here, we present basic concepts of automated image processing, segmentation, and analysis that can be relatively easily implemented for use in bacterial research.

  14. Efficient Model for Distributed Computing based on Smart Embedded Agent

    Directory of Open Access Journals (Sweden)

    Hassna Bensag

    2017-02-01

    Full Text Available Technological advances in embedded computing have exposed humans to an increasing intrusion of computing into their day-to-day lives (e.g., smart devices). Cooperation, autonomy, and mobility have made the agent a promising mechanism for embedded devices. This work presents a new model of an embedded agent designed to be implemented in smart devices in order to achieve parallel tasks in a distributed environment. To validate the proposed model, a case study was developed for medical image segmentation using cardiac Magnetic Resonance Imaging (MRI). In the first part of this paper, we focus on implementing the parallel classification algorithm using the C-means method in embedded systems. We then propose a new concept of distributed classification using multi-agent systems based on JADE and Raspberry Pi 2 devices.
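
    For reference, the sketch below is a compact serial implementation of the fuzzy C-means update rules that the embedded agents parallelize (each agent would handle a slice of the data); the JADE messaging and MRI pipeline are omitted, and the toy one-dimensional intensities are assumptions.

    ```python
    # Fuzzy C-means: alternate membership and center updates.
    import numpy as np

    def fuzzy_cmeans(X, c=3, m=2.0, iters=50, seed=0):
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)           # fuzzy memberships
        for _ in range(iters):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2 / (m - 1)))          # standard FCM membership rule
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Toy 1-D "intensity" data with three modes, mimicking tissue classes.
    X = np.concatenate([np.random.normal(mu, 0.05, 100) for mu in (0.2, 0.5, 0.8)])
    centers, U = fuzzy_cmeans(X[:, None])
    print(np.sort(centers.ravel()))   # approximately [0.2, 0.5, 0.8]
    ```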

  15. A Computer-based Tutorial on Double-Focusing Spectrometers

    Science.gov (United States)

    Silbar, Richard R.; Browman, Andrew A.; Mead, William C.; Williams, Robert A.

    1998-10-01

    WhistleSoft is developing a set of computer-based, self-paced tutorials on particle accelerators that targets a broad audience, including undergraduate science majors and industrial technicians. (See http://www.whistlesoft.com/s~ilbar/.) We use multimedia techniques to enhance the student's rate of learning and retention of the material. The tutorials feature interactive On-Screen Laboratories and use hypertext, colored graphics, two- and three-dimensional animations, video, and sound. Parts of our Dipoles module deal with the double-focusing spectrometer and occur throughout the piece. Radial focusing occurs in the section on uniform magnets, while vertical focusing is in the non-uniform magnets section. The student can even understand the √2π bend angle on working through the (intermediate-level) discussion on the Kerst-Serber equations. This talk will present our discussion of this spectrometer, direct to you from the computer screen.

  16. Nanotube devices based crossbar architecture: toward neuromorphic computing.

    Science.gov (United States)

    Zhao, W S; Agnus, G; Derycke, V; Filoramo, A; Bourgoin, J-P; Gamrat, C

    2010-04-30

    Nanoscale devices such as carbon nanotube and nanowires based transistors, memristors and molecular devices are expected to play an important role in the development of new computing architectures. While their size represents a decisive advantage in terms of integration density, it also raises the critical question of how to efficiently address large numbers of densely integrated nanodevices without the need for complex multi-layer interconnection topologies similar to those used in CMOS technology. Two-terminal programmable devices in crossbar geometry seem particularly attractive, but suffer from severe addressing difficulties due to cross-talk, which implies complex programming procedures. Three-terminal devices can be easily addressed individually, but with limited gain in terms of interconnect integration. We show how optically gated carbon nanotube devices enable efficient individual addressing when arranged in a crossbar geometry with shared gate electrodes. This topology is particularly well suited for parallel programming or learning in the context of neuromorphic computing architectures.

  17. Microstereolithography-based computer-aided manufacturing for tissue engineering.

    Science.gov (United States)

    Cho, Dong-Woo; Kang, Hyun-Wook

    2012-01-01

    Various solid freeform fabrication technologies have been introduced for constructing three-dimensional (3-D) freeform structures. Of these, microstereolithography (MSTL) technology performs the best in 3-D space because it not only has high resolution, but also fast fabrication speed. Using this technology, 3-D structures with mesoscale size and microscale resolution are achievable. Many researchers have been trying to apply this technology to tissue engineering to construct medically applicable scaffolds, which require a 3-D shape that fits a defect with a mesoscale size and microscale inner architecture for efficient regeneration of artificial tissue. This chapter introduces the principles of MSTL technology and representative systems. It includes fabrication and computer-aided design/computer-aided manufacturing (CAD/CAM) processes to show the automation process by which measurements from medical images are used to fabricate the required 3-D shape. Then, various tissue engineering applications based on MSTL are summarized.

  18. Computer Based Collaborative Problem Solving for Introductory Courses in Physics

    Science.gov (United States)

    Ilie, Carolina; Lee, Kevin

    2010-03-01

    We discuss computer-based collaborative problem solving in a recitation style. The course is designed by Lee [1], and the idea was proposed earlier by Christian, Belloni, and Titus [2,3]. The students find the problems on a web page containing simulations (physlets) and write their solutions on an accompanying worksheet after discussing them with a classmate. Physlets have the advantage of being much more like real-world problems than textbook problems. We also compare two protocols for web-based instruction using simulations in an introductory physics class [1]. The inquiry protocol allowed students to control input parameters while the worked example protocol did not. We will discuss which of the two methods is more efficient in relation to Scientific Discovery Learning and Cognitive Load Theory. 1. Lee, Kevin M., Nicoll, Gayle, and Brooks, Dave W. (2004). ``A Comparison of Inquiry and Worked Example Web-Based Instruction Using Physlets'', Journal of Science Education and Technology 13, No. 1: 81-88. 2. Christian, W., and Belloni, M. (2001). Physlets: Teaching Physics With Interactive Curricular Material, Prentice Hall, Englewood Cliffs, NJ. 3. Christian, W., and Titus, A. (1998). ``Developing web-based curricula using Java Physlets.'' Computers in Physics 12: 227-232.

  19. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  20. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving, or planning to move, their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams, and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and, last but not least, test security. Security aspects may include, but are not limited to, the identification and authentication of the examinee, the risks associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper investigates the security considerations associated with CBT and provides some recommendations for the security of these kinds of tests. We also propose a palm-based biometric authentication system, incorporated with a basic authentication system (username/password), to check the identity and authenticity of the examinee.

  1. Resistive content addressable memory based in-memory computation architecture

    KAUST Repository

    Salama, Khaled N.

    2016-12-08

    Various examples are provided related to resistive content addressable memory (RCAM) based in-memory computation architectures. In one example, a system includes a content addressable memory (CAM) comprising an array of cells having a memristor-based crossbar, and an interconnection switch matrix having a gateless memristor array coupled to the output of the CAM. In another example, a method includes comparing activated bit values stored in a key register with corresponding bit values in a row of the CAM, setting a tag bit value to indicate that the activated bit values match the corresponding bit values, and writing masked key bit values to corresponding bit locations in the row of the CAM based on the tag bit value.
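
    The compare-then-write behavior described in the second example can be modeled at the bit level with plain integers, as below; the hardware performs the masked comparison across all rows in parallel, whereas this sketch loops over them.

    ```python
    # Masked CAM search with tag-gated write-back, rows modeled as ints.
    def cam_search_and_update(rows, key, mask):
        """Compare only the bits activated by `mask`; where a row matches,
        set its tag and write the masked key bits back into the row."""
        tags = []
        for i, row in enumerate(rows):
            match = (row & mask) == (key & mask)
            tags.append(int(match))
            if match:
                rows[i] = (row & ~mask) | (key & mask)   # masked write-back
        return tags

    rows = [0b1010, 0b1110, 0b0011]
    tags = cam_search_and_update(rows, key=0b1011, mask=0b1010)  # compare bits 3 and 1
    print(tags, [bin(r) for r in rows])   # -> [1, 1, 0]
    ```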

  2. Implementing security in computer based patient records clinical experiences.

    Science.gov (United States)

    Iversen, K R; Heimly, V; Lundgren, T I

    1995-01-01

    In Norway, organizational changes in hospitals and a stronger focus on patient safety have changed the way paper-based patient records are organized and managed. Hospital-wide patient records tend to replace department-based records. Since not only clinicians but also other non-medical staff have access to the paper records, they also have easy access to all the information available on a specific patient; such a system has obvious 'side effects' on privacy and security. Computer-based patient records (CPRs) can provide the solution to this apparent paradox if the complex aspects of security, privacy, effectiveness, and user-friendliness are addressed jointly from the outset in designing such systems. Clinical experiences in Norway show that it is possible to design patient record systems that provide a very useful tool for clinicians and other health care personnel (HCP) while fully complying with comprehensive security and privacy requirements.

  3. Computerbasiert prüfen [Computer-based Assessment]

    Directory of Open Access Journals (Sweden)

    Frey, Peter

    2006-08-01

    [english] Computer-based testing in medical education offers new perspectives. Advantages are sequential or adaptive testing, integration of video or sound, rapid feedback to candidates, and management of web-based question banks. Computer-based testing can also be implemented in an OSCE examination. In e-learning environments, formative self-assessments are often implemented and give helpful feedback to learners. Disadvantages in high-stakes exams are the high requirements both for the quality of testing (e.g., standard setting) and for the information technology, especially regarding security. [german, translated] Computer-based examinations in medical studies open up new possibilities. Advantages of such examinations lie in sequential or adaptive testing, the integration of moving images or sound, rapid scoring, and central management of examination questions via the Internet. An application area with reasonable effort is examinations with several stations, such as the OSCE. Computer-based formative self-tests are frequently offered in e-learning; this helps learners assess their state of knowledge better or compare themselves with the performance of others. Limits appear for summative examinations with respect to the examination location, since cheating is possible at home. Higher clinical competencies, such as examination technique or communication, are hardly suitable for computer-based testing.

  4. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance on FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  5. Social psychology: new directions in computer-based learning

    Directory of Open Access Journals (Sweden)

    Lesley J. Allinson

    1996-12-01

    Perhaps surprisingly, psychology has been a discipline eager to capitalize on the application of computers to teaching. Traditionally, this has been for statistical calculations, the presentation of experimental stimuli, and the automatic collection of timed events (e.g., reaction times, choice-decision times). Here, the traditional capabilities of computers are being exploited - namely, their accurate temporal sequencing, graphical performance, and, above all, their number crunching. As such, they have been powerful and essential tools for those involved in the more psychophysical or cognitive areas of psychology. Computer-based learning (CBL) remains very much a preserve of these more formal domains. The arrival of hypermedia has opened the way for CBL to be exploited within the less formal domains of psychology; but the level of interactivity is usually very restricted, and the constrained presentational styles mean that even this technological progression fails to meet the contextual richness needed in teaching much of the behavioural sciences. The advent of multimedia has for the first time provided the potential to explore, within the normal undergraduate learning environment, real behaviour using the observational techniques that form the basic methodology of the practising social psychologist.

  6. Integrating Reconfigurable Hardware-Based Grid for High Performance Computing

    Directory of Open Access Journals (Sweden)

    Julio Dondo Gazzano

    2015-01-01

    FPGAs have shown several characteristics that make them very attractive for high performance computing (HPC): the impressive speed-up factors they are able to achieve, the reduced power consumption, and the ease and flexibility of the design process, with fast iterations between consecutive versions. However, there are still some difficulties when using reconfigurable platforms as accelerators that need to be addressed: the need for an in-depth application study to identify potential acceleration, the lack of tools for deploying computational problems on distributed hardware platforms, and the low portability of components, among others. This work proposes a complete grid infrastructure for distributed high performance computing based on dynamically reconfigurable FPGAs. In addition, a set of services designed to facilitate application deployment is described. An example application and a comparison with other hardware and software implementations are shown. Experimental results show that the proposed architecture offers encouraging advantages for the deployment of high performance distributed applications, simplifying the development process.

  7. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient, since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. Context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S) concept - smart pull, smart prospect, smart content, and smart push - for cloud services, making smart learning services possible. The E4S focuses on meeting users' needs by collecting and analyzing user behavior, prospecting future services, building corresponding content, and delivering the content through the cloud computing environment. User behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  8. Hand Gesture and Neural Network Based Human Computer Interface

    Directory of Open Access Journals (Sweden)

    Aekta Patel

    2014-06-01

    Computers are used by almost everyone, whether at work or at home. Our aim is to make computers that can understand human language and to develop user-friendly human-computer interfaces (HCI). Human gestures are perceived by vision, and this research is about determining human gestures to create an HCI. Coding these gestures into machine language demands a complex programming algorithm. In this project, we first detect, recognize, and pre-process the hand gestures using a general recognition method. We then extract the recognized image's properties and use them to control mouse movement, clicks, and the VLC media player. Finally, we implement the same functions using a neural network technique and compare it with the general recognition method, from which we conclude that the neural network technique is better. Results based on the neural network technique and a comparison between the neural network and general methods are presented.

  9. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  10. The neural and computational bases of semantic cognition.

    Science.gov (United States)

    Ralph, Matthew A Lambon; Jefferies, Elizabeth; Patterson, Karalyn; Rogers, Timothy T

    2017-01-01

    Semantic cognition refers to our ability to use, manipulate and generalize knowledge that is acquired over the lifespan to support innumerable verbal and non-verbal behaviours. This Review summarizes key findings and issues arising from a decade of research into the neurocognitive and neurocomputational underpinnings of this ability, leading to a new framework that we term controlled semantic cognition (CSC). CSC offers solutions to long-standing queries in philosophy and cognitive science, and yields a convergent framework for understanding the neural and computational bases of healthy semantic cognition and its dysfunction in brain disorders.

  11. The extended RBAC model based on grid computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian-gang; WANG Ru-chuan; WANG Hai-yan

    2006-01-01

    This article proposes an extended role-based access control (RBAC) model for solving dynamic and multidomain problems in grid computing. A formal description of the model is provided. The introduction of context and the mapping relations of context-to-role and context-to-permission help the model adapt to the dynamic properties of the grid environment. The multidomain role inheritance relation, established by the authorization agent service, realizes multidomain authorization among autonomous domains. A function is proposed for resolving role inheritance conflicts during the establishment of the multidomain role inheritance relation.
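
    A minimal sketch of what the context-to-role and context-to-permission mappings could look like, assuming plain Python dictionaries; the actual model is formal and grid-specific, so this only illustrates the access check.

      from dataclasses import dataclass, field

      @dataclass
      class ContextRBAC:
          # context -> set of roles that may be activated in that context
          context_roles: dict = field(default_factory=dict)
          # (role, context) -> set of permissions the role carries there
          context_perms: dict = field(default_factory=dict)
          # user -> set of assigned roles
          user_roles: dict = field(default_factory=dict)

          def check(self, user: str, perm: str, context: str) -> bool:
              # Grant access only if the user holds a role that is both
              # activated by the current context and mapped to the
              # permission in that context.
              for role in self.user_roles.get(user, set()):
                  if (role in self.context_roles.get(context, set())
                          and perm in self.context_perms.get((role, context), set())):
                      return True
              return False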

  12. Matrix-based, finite-difference algorithms for computational acoustics

    Science.gov (United States)

    Davis, Sanford

    1990-01-01

    A compact numerical algorithm is introduced for simulating multidimensional acoustic waves. The algorithm is expressed in terms of a set of matrix coefficients on a three-point spatial grid that approximates the acoustic wave equation with a discretization error of O(h^5). The method is based on tracking a local phase variable, and its implementation suggests a convenient coordinate splitting along with natural intermediate boundary conditions. Results are presented for oblique plane waves and compared with other procedures. Preliminary computations of acoustic diffraction are also considered.
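
    The paper's compact O(h^5) scheme is not reproduced in the abstract; as a baseline for comparison, a standard second-order three-point finite-difference update for the 1D acoustic wave equation u_tt = c^2 u_xx looks like this (parameters are illustrative):

      import numpy as np

      c, L, nx, nt = 1.0, 1.0, 201, 400
      h = L / (nx - 1)
      dt = 0.5 * h / c                    # CFL number 0.5 for stability
      r2 = (c * dt / h) ** 2

      x = np.linspace(0.0, L, nx)
      u_prev = np.exp(-200.0 * (x - 0.5) ** 2)   # initial Gaussian pulse
      u = u_prev.copy()                          # zero initial velocity

      for _ in range(nt):
          u_next = np.empty_like(u)
          # standard three-point stencil (O(h^2), unlike the paper's scheme)
          u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                          + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
          u_next[0] = u_next[-1] = 0.0           # fixed (reflecting) ends
          u_prev, u = u, u_next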

  13. Hybrid slime mould-based system for unconventional computing

    Science.gov (United States)

    Berzina, T.; Dimonte, A.; Cifarelli, A.; Erokhin, V.

    2015-04-01

    Physarum polycephalum is considered promising for the realization of unconventional computational systems. In this work, we present the results of three slime mould-based systems. We demonstrate the possibility of transporting biocompatible microparticles using attractors, repellents, and a deflector, the latter being an external tool that directs Physarum motion. We also present interactions between the slime mould and conducting polymers, which result in a variation of their colour and conductivity. Finally, incorporating Physarum into an organic memristive device resulted in a variation of its electrical characteristics due to the slime mould's internal activity.

  14. Intelligent Cost Modeling Based on Soft Computing for Avionics Systems

    Institute of Scientific and Technical Information of China (English)

    ZHU Li-li; LI Zhuang-sheng; XU Zong-ze

    2006-01-01

    In parametric cost estimating, objections to using statistical Cost Estimating Relationships (CERs) and parametric models include problems of low statistical significance due to limited data points, biases in the underlying data, and lack of robustness. Soft Computing (SC) technologies are used for building intelligent cost models. The SC models are systematically evaluated based on their training on, and prediction of, the historical cost data of airborne avionics systems. Results indicating the strengths and weaknesses of each model are presented. In general, the intelligent cost models have higher prediction precision, better data adaptability, and stronger self-learning capability than the regression CERs.

  15. Naval Computer-Based Instruction: Cost, Implementation and Effectiveness Issues.

    Science.gov (United States)

    1988-03-01

    ...that precipitated change in the way we do computer-based instruction will be pointed out. Perhaps the most well known of all the CBI projects... General Electric (GE) has been showing a newer technology than CD-I called Digital Video Interactive (DVI). It uses custom chips to compress high quality... been used to great advantage. The Navy will need to look at CD-I and DVI, interactive extensions of the CD-ROM technology, and decide how we can

  16. Building Computer-Based Experiments in Psychology without Programming Skills.

    Science.gov (United States)

    Ruisoto, Pablo; Bellido, Alberto; Ruiz, Javier; Juanes, Juan A

    2016-06-01

    Research in psychology usually requires building and running experiments. Although this task has traditionally required scripting, recent computer tools based on graphical interfaces offer new opportunities in this field to researchers without programming skills. The purpose of this study is to illustrate and provide a comparative overview of two of the main free, open-source "point and click" software packages for building and running experiments in psychology: PsychoPy and OpenSesame. Recommendations for their potential use are further discussed.
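
    For orientation, a minimal PsychoPy script for a single keyboard-response trial might look as follows; both packages also offer the graphical builders that the paper focuses on, so this scripting layer is optional.

      from psychopy import visual, core, event

      win = visual.Window(size=(800, 600), color="black")
      stim = visual.TextStim(win, text="Press any key", color="white")

      stim.draw()
      win.flip()                 # the stimulus appears on this frame
      clock = core.Clock()       # time reactions from (approximately) the flip
      keys = event.waitKeys(timeStamped=clock)
      print("response:", keys)   # [(key_name, reaction_time_in_seconds)]

      win.close()
      core.quit()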

  17. HiFi-MBQC: High Fidelity Measurement-Based Quantum Computing using Superconducting Detectors

    Science.gov (United States)

    2016-04-04

    We exploit the conceptual framework of measurement-based quantum computation, which enables a client to delegate a computation to a quantum computer. (Philip Walther; report AFRL-AFOSR-UK-TR-2016-0006; contract FA8655-11-1-3004.)

  18. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, quantifying the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantifying the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of the proposed measurements with the density-based CT measurements currently used in the literature, namely the relative area measure and the D measure. Results: Volumetric CT data, acquired at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. The proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressed against the spirometry measures, with p < 0.05. For the previously used density-based CT measurements, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements correlates better with spirometric measures than previously used density-based CT measurements. It shows potential as a sensitive tool for quantitatively assessing the severity of LAM.
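
    A Python sketch of a threshold-plus-watershed pipeline with scikit-image follows; the HU threshold and the distance-transform markers are assumptions (the paper separates cysts on subtle internal edges), so this only conveys the overall structure.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed
      from skimage.feature import peak_local_max

      def extract_cysts(ct_slice: np.ndarray, hu_threshold: float = -950.0):
          # Threshold the CT slice to a cystic region (HU cut-off is assumed),
          # then split touching cysts with a distance-transform watershed.
          cystic = ct_slice < hu_threshold
          distance = ndi.distance_transform_edt(cystic)
          peaks = peak_local_max(distance, labels=cystic, min_distance=3)
          markers = np.zeros(ct_slice.shape, dtype=int)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          labels = watershed(-distance, markers, mask=cystic)
          return labels   # one integer label per individual cyst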

  19. GPSIM: A Personal Computer-Based GPS Simulator System

    Science.gov (United States)

    Ibrahim, D.

    Global Positioning Systems (GPS) are now in use in many applications, ranging from GIS to route guidance, automatic vehicle location (AVL), air, land, and marine navigation, and many other transportation and geographically based applications. In many applications, the GPS receiver is connected to some form of intelligent electronic system which receives the positional data from the GPS unit and then performs the required operation. When developing and testing GPS-based systems, one of the problems is that it is usually necessary to create GPS-compatible geographical data to simulate GPS operation in real time. This paper provides the details of a Personal Computer (PC)-based GPS simulator system called GPSIM. The system receives user waypoints and routes from Windows-based screen forms and then simulates GPS operation in real time by generating most of the commonly used GPS sentences. The user-specified waypoints are divided into a number of small segments, each segment specifying a small distance in the direction of the original waypoint. The GPS sentence corresponding to the geographical coordinates of each segment is then sent out of the PC serial port. The system described is an invaluable testing tool for GPS-based system developers and also for people training to use GPS-based products.
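
    The "GPS sentences" mentioned here are standard NMEA 0183 sentences, which carry a simple XOR checksum over the characters between '$' and '*'. A small Python sketch of sentence assembly (field values are illustrative, not GPSIM's):

      from functools import reduce

      def nmea_checksum(body: str) -> str:
          # XOR of all characters between '$' and '*', as two hex digits.
          return f"{reduce(lambda a, c: a ^ ord(c), body, 0):02X}"

      def gga_sentence(lat: float, lon: float, utc: str = "120000.00") -> str:
          # Build a minimal GPGGA sentence; fixed fields (fix quality,
          # satellites, HDOP, altitude) are placeholder values.
          lat_h = "N" if lat >= 0 else "S"
          lon_h = "E" if lon >= 0 else "W"
          lat, lon = abs(lat), abs(lon)
          lat_s = f"{int(lat) * 100 + (lat % 1) * 60:09.4f}"    # ddmm.mmmm
          lon_s = f"{int(lon) * 100 + (lon % 1) * 60:010.4f}"   # dddmm.mmmm
          body = (f"GPGGA,{utc},{lat_s},{lat_h},{lon_s},{lon_h},"
                  f"1,08,0.9,50.0,M,0.0,M,,")
          return f"${body}*{nmea_checksum(body)}"

      print(gga_sentence(51.5074, -0.1278))   # a point in London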

  20. Evaluating Emulation-based Models of Distributed Computing Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T.

    2017-10-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  1. Computer-based Training (CBT) in der Humanmedizin [Computer-based training (CBT) in an undergraduate medical curriculum]

    Directory of Open Access Journals (Sweden)

    Smolle, Josef

    2007-05-01

    [english] Background: Computer-based training (CBT) is based on chunking of the learning content and follows a highly structured instructional design. The question is addressed whether verbally expressible knowledge can be acquired by CBT, using an example learning content from human medicine. Methods: 43 students of human medicine (31 female, 12 male) studied a CBT learning object on general tumour pathology comprising 32 frames three times and wrote short essays on the topic. The main goal was to assess the increase in the frequency of terms and concepts of general tumour pathology between the essays, as determined by content analysis. Results: Frequency analysis showed 28 ± 25 hits in the first, 40 ± 19 hits in the second, and - after a two-week intermission - 35 ± 17 hits in the third essay, with the increase being highly significant (p < 0.01). The hits in the essays correlated significantly with performance in the CBT learning object. In the qualitative feedback, positive remarks prevailed (p < 0.01). Conclusion: Computer-based training does not simply drill performance within predefined frames, but facilitates the achievement of knowledge that can be verbally expressed. Performance of the individual student in the CBT learning object is significantly correlated with the quality of the short essay on the same topic. [german, translated] Objective: Computer-based training (CBT), as a tutorial e-learning concept, is based on splitting the learning material into small units (chunking) and a highly structured instructional process. The study addresses the question of whether explicit (verbally expressible) knowledge on a topic in human medicine can be actively acquired in this way. Methods: 43 students of human medicine (31 women, 12 men) worked through a CBT learning object on general tumour pathology with 32 frames three times and produced short essays. The main outcome measure was the difference in the content frequency analysis

  2. IURead: a new computer-based reading test.

    Science.gov (United States)

    Xu, Renfeng; Bradley, Arthur

    2015-09-01

    To develop a computer-based single-sentence reading test especially designed for clinical research, enabling multiple repeat trials without reusing the same sentences. We initially developed 422 sentences, with an average of 60 characters and 12 words. Presentation control was improved by employing computer-based testing, and oral reading was timed by visual inspection of digital audio recordings. Variability in the reading speed of normally sighted adults between sentences, between charts, between subjects, between formats, and between display devices was quantified. The impact of display size and pixel resolution on test geometry was assessed, and the impacts of reduced retinal image quality and retinal illuminance were compared for reading and standard letter acuities. Eleven visually normal subjects (age: 18-60 years) participated in this study. Stopwatch timing of sentences reliably underestimated reading times by about 0.3 s and exhibited coefficients of repeatability 17 times larger than those estimated from visual inspection of digital recordings. A slight relaxing of the lexical content constraints had no effect on reading speed; neither did sentence format (single vs. three lines), display size, or distance. Within-subject standard deviations of reading speed for different sentences were small (between 6% and 9% of the mean speed), requiring only small sample sizes to achieve typical statistical reliability and power when comparing conditions within individual subjects. The greater variability associated with stopwatch timing necessitates larger sample sizes. As defocus and light level were varied, reading acuity and standard letter acuity were highly correlated (r² = 0.99), and reading acuity was slightly better. The computer-based IURead reading test provides a useful reading speed and reading acuity tool for clinical research involving multiple conditions and repeat testing of individual subjects. Ready to use IURead files for use with a

  3. A wireless computational platform for distributed computing based traffic monitoring involving mixed Eulerian-Lagrangian sensing

    KAUST Repository

    Jiang, Jiming

    2013-06-01

    This paper presents a new wireless platform designed for an integrated traffic monitoring system based on combined Lagrangian (mobile) and Eulerian (fixed) sensing. The sensor platform is built around a 32-bit ARM Cortex M4 micro-controller and a 2.4 GHz 802.15.4 ISM compliant radio module, and can be interfaced with fixed traffic sensors or receive data from vehicle transponders. The platform is specially designed and optimized to be integrated into a solar-powered wireless sensor network in which traffic flow maps are computed by the nodes directly using distributed computing. An MPPT circuit is proposed to increase the power output of the attached solar panel. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. An ongoing implementation is briefly discussed and compared with existing platforms used in wireless sensor networks. © 2013 IEEE.
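
    The abstract does not describe the MPPT algorithm used; a common choice on such solar-powered nodes is perturb-and-observe, sketched below in Python. The class name and the idea of software control are assumptions (the paper's MPPT is circuitry).

      class PerturbAndObserve:
          # Classic P&O hill-climbing toward the maximum power point.
          def __init__(self, v_ref: float = 12.0, step: float = 0.05):
              self.v_ref, self.step = v_ref, step
              self.last_p, self.last_v = 0.0, 0.0

          def update(self, v: float, i: float) -> float:
              p = v * i
              if (p > self.last_p) == (v > self.last_v):
                  self.v_ref += self.step   # power rose: keep perturbing this way
              else:
                  self.v_ref -= self.step   # power fell: reverse direction
              self.last_p, self.last_v = p, v
              return self.v_ref             # new operating-voltage setpoint

      # Each control tick (hardware callbacks are hypothetical):
      #   v_ref = tracker.update(read_panel_voltage(), read_panel_current())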

  4. Comparison of Computed Tomography Scout Based Reference Point Localization to Conventional Film and Axial Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Lan; Templeton, Alistair; Turian, Julius; Kirk, Michael; Zusag, Thomas; Chu, James C.H., E-mail: jchu@rush.edu

    2011-01-01

    Identification of source positions after implantation is an important step in brachytherapy planning. Reconstruction is traditionally performed from films taken by conventional simulators, but these are gradually being replaced in the clinic by computed tomography (CT) simulators. The present study explored the use of a scout image-based reconstruction algorithm that replaces the use of traditional film while exhibiting low sensitivity to the metal-induced artifacts that can appear in 3D CT methods. In addition, the accuracy of an in-house graphical software implementation of scout-based reconstruction was compared with seed location reconstructions for two phantoms by conventional simulator and CT measurements. One phantom was constructed using a planar fixed grid of 1.5-mm diameter ball bearings (BBs) with 40-mm spacing. The second was a Fletcher-Suit applicator embedded in Styrofoam (Dow Chemical Co., Midland, MI) with one 3.2-mm diameter BB inserted into each of six surrounding holes. Conventional simulator, kilovoltage CT (kVCT), megavoltage CT, and scout-based methods were evaluated by their ability to calculate the distance between seeds (40 mm for the fixed grid, 30-120 mm in the Fletcher-Suit phantom). All methods were able to reconstruct the fixed grid distances with an average deviation of <1%. The worst single deviations (approximately 6%) were exhibited by the two volumetric CT methods. In the Fletcher-Suit phantom, the intermodality agreement was within approximately 3%, with the conventional simulator measuring marginally larger distances and kVCT the smallest. All of the established reconstruction methods exhibited similar abilities to detect the distances between BBs. The 3D CT-based methods, with lower axial resolution, showed more variation, particularly with the smaller BBs. With a software implementation, scout-based reconstruction is an appealing approach because it simplifies data acquisition over film-based reconstruction without requiring any specialized equipment.

  5. Computer-based control on mathematical education of future engineers

    Directory of Open Access Journals (Sweden)

    Катерина Володимирівна Власенко

    2016-04-01

    Computer-based management of the learning and cognitive activity of future engineers, and the organization of control over their mathematical education using modern information technologies, is proposed. The e-textbooks developed by the higher mathematics teachers of the Donbass State Academy of Mechanical Engineering and Donetsk National Technical University are analyzed. The expediency of using these textbooks during lectures, in independent student work, and in controlling the education of future engineering specialists is substantiated; such control is considered a management category and a relatively independent final element of the managerial cycle of the educational process. Methodological recommendations for the use of e-learning technologies in mathematics classes are offered. It is shown how the introduction of educational materials from the internet resource at http://vmdbi.net.ua/ supports computer-based control of education: oral questioning of students, written control works, tests, and so on. It is demonstrated that the use of the offered means for controlling mathematical education makes it possible to establish feedback in the study of mathematics and to comprehensively examine students' levels of knowledge and skills in the studied discipline.

  6. COLLABORATION IN WEB BASED LEARNING: A SOCIAL COMPUTING PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    C.Pooranachandran

    2011-02-01

    The rapid advance of information and communication technology (ICT) has enabled higher education institutions to reach out and educate students, transcending the barriers of time and space. This technology supports structured, web-based learning activities and provides diverse multilingual and multicultural settings, as well as facilities for self-assessment. Structured collaboration has proven itself a successful and powerful learning method in the conventional education system. Online learners, however, do not enjoy the same collaborative benefits as face-to-face learners because the technology provides no guidance or direction during online discussion sessions. This paper presents a web-based learning environment (WBLE) from the perspective of social computing to bring collaborative learning benefits to online learners. The paper also highlights how the deployment of social computing tools can support the creation of an open and socially shared information space for better collaboration among learners. Using social network analysis (SNA) techniques, collaboration among twenty learners is explored, and metrics such as in-degree, out-degree, and betweenness, together with a collaborative network mapping, are presented.
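
    With the usual caveat that the paper computed its metrics on its own forum data, the named SNA measures are straightforward to reproduce with networkx on a toy "who replied to whom" graph (the edges below are invented):

      import networkx as nx

      G = nx.DiGraph()
      G.add_edges_from([("ann", "bob"), ("bob", "carol"), ("carol", "ann"),
                        ("dave", "ann"), ("bob", "ann")])

      in_deg = dict(G.in_degree())                  # replies received
      out_deg = dict(G.out_degree())                # replies sent
      betweenness = nx.betweenness_centrality(G)    # brokerage in the network

      for node in G:
          print(node, in_deg[node], out_deg[node], round(betweenness[node], 3))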

  7. Scenario-based Participatory Design for Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LIU Zhi-qiang; DING Peng; SHENG Huan-ye

    2007-01-01

    Many pervasive computing researchers are working on how to realize the user-centered intelligent pervasive computing environment that Mark Weiser envisioned. Task abstraction is the foundation of configuration for pervasive applications. Based on the task-oriented and descriptive properties of scenarios, a scenario-based participatory design model is proposed to realize task abstraction. The design model provides users and domain experts a useful mechanism for building customized applications by separating the system model into a domain model and a design model. In this design model, domain experts, together with users and stakeholders, focus on the logical rules (domain model), and programmers work on the implementation (design model). In order to formalize the model description, a human-agent interaction language that transforms users' goals and domain rules into executable scenarios is also discussed. An agent platform called Describer, used to link the design and implementation of scenarios, was developed to realize the configuration of applications according to different requirements. A demand bus application illustrates the design process and the usability of this model.

  8. A cloud computing based 12-lead ECG telemedicine service

    Science.gov (United States)

    2012-01-01

    Background: Due to the great variability of 12-lead ECG instruments and medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision-making support in emergency telecardiology. Methods: We create a new cloud and pervasive computing based 12-lead electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results: The developed service enables ECGs to be transmitted and interpreted via mobile phones; that is, tele-consultation can take place while the patient is in the ambulance, between onsite clinicians and off-site senior cardiologists, or among hospitals. Most importantly, the developed service is convenient, efficient, and inexpensive. Conclusions: This cloud computing based ECG tele-consultation service expands traditional 12-lead ECG applications to the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. The service has been evaluated and proved useful by cardiologists in Taiwan. PMID:22838382

  9. Knowledge-Based Systems in Biomedicine and Computational Life Science

    CERN Document Server

    Jain, Lakhmi

    2013-01-01

    This book presents a sample of research on knowledge-based systems in biomedicine and computational life science. The contributions include: a personalized stress diagnosis system; an image analysis system for breast cancer diagnosis; analysis of neuronal cell images; structure prediction of proteins; the relationship between two mental disorders; detection of cardiac abnormalities; holistic medicine based treatment; and analysis of life-science data.

  10. Computational approaches to substrate-based cell motility

    Science.gov (United States)

    Ziebert, Falko; Aranson, Igor S.

    2016-07-01

    Substrate-based crawling motility of eukaryotic cells is essential for many biological functions, both in developing and mature organisms. Motility dysfunctions are involved in several life-threatening pathologies such as cancer and metastasis. Motile cells are also a natural realisation of active, self-propelled 'particles', a popular research topic in nonequilibrium physics. Finally, from the materials perspective, assemblies of motile cells and evolving tissues constitute a class of adaptive self-healing materials that respond to the topography, elasticity and surface chemistry of the environment and react to external stimuli. Although a comprehensive understanding of substrate-based cell motility remains elusive, progress has been achieved recently in its modelling on the whole-cell level. Here we survey the most recent advances in computational approaches to cell movement and demonstrate how these models improve our understanding of complex self-organised systems such as living cells.

  11. The Unknown Computer Viruses Detection Based on Similarity

    Science.gov (United States)

    Liu, Zhongda; Nakaya, Naoshi; Koui, Yuuji

    New computer viruses are continually being generated, and they cause damage all over the world. In general, current anti-virus software detects viruses by signature-based pattern matching; thus, unknown viruses without any signature cannot be detected. Although there are some static analysis technologies that do not depend on signatures, virus writers often use code obfuscation techniques, which make it difficult to perform code analysis. As is generally known, unknown viruses and known viruses share common features. In this paper we propose a new static analysis technology that can circumvent code obfuscation to extract these common features and detect unknown viruses based on similarity. The results of evaluation experiments demonstrate that this technique is able to detect unknown viruses without false positives.
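
    The paper's obfuscation-resistant features are not detailed in the abstract; the Python sketch below shows the general similarity-based detection idea, using byte n-gram histograms and cosine similarity as stand-ins for the actual features.

      from collections import Counter
      import math

      def ngrams(code: bytes, n: int = 4) -> Counter:
          # Histogram of byte n-grams; a crude stand-in for the paper's
          # obfuscation-resistant features.
          return Counter(code[i:i + n] for i in range(len(code) - n + 1))

      def cosine(a: Counter, b: Counter) -> float:
          dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
          na = math.sqrt(sum(v * v for v in a.values()))
          nb = math.sqrt(sum(v * v for v in b.values()))
          return dot / (na * nb) if na and nb else 0.0

      def looks_malicious(sample: bytes, known_viruses: list,
                          threshold: float = 0.8) -> bool:
          # Flag the sample if it is sufficiently similar to any known virus;
          # the 0.8 threshold is an illustrative assumption.
          s = ngrams(sample)
          return any(cosine(s, ngrams(v)) >= threshold for v in known_viruses)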

  12. Parallel Implementation of Classification Algorithms Based on Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wenbo Wang

    2012-09-01

    As an important task of data mining, classification has received considerable attention in many applications, such as information retrieval and web search. The growing volume of information produced by technological progress, together with the growing individual need for data mining, makes classifying very large datasets a challenging task. In order to deal with this problem, many researchers have tried to design efficient parallel classification algorithms. This paper briefly introduces classification algorithms and cloud computing, analyses the shortcomings of existing parallel classification algorithms, and then proposes a new model of parallel classification algorithms. It mainly introduces a parallel Naïve Bayes classification algorithm based on MapReduce, a simple yet powerful parallel programming technique. The experimental results demonstrate that the proposed algorithm improves on the original algorithm's performance and can process large datasets efficiently on commodity hardware.
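
    A minimal pure-Python imitation of the MapReduce counting step behind Naive Bayes training; a real deployment would run the map and reduce functions on a Hadoop-style cluster, and the record layout and key names here are assumptions.

      from collections import defaultdict

      def map_phase(record):
          # record = (label, features); emit one count per class and per
          # (class, feature-position, feature-value) combination.
          label, features = record
          yield (("class", label), 1)
          for idx, value in enumerate(features):
              yield ((label, idx, value), 1)

      def reduce_phase(pairs):
          # Sum the emitted counts; these are the sufficient statistics
          # for a Naive Bayes model.
          counts = defaultdict(int)
          for key, n in pairs:
              counts[key] += n
          return counts

      # Toy usage: records mapped "in parallel", then reduced.
      records = [("spam", ["buy", "now"]), ("ham", ["see", "now"])]
      pairs = [kv for r in records for kv in map_phase(r)]
      model_counts = reduce_phase(pairs)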

  13. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  14. Commentary on: "Toward Computer-Based Support of Metacognitive Skills: A Computational Framework to Coach Self Explanation"

    Science.gov (United States)

    Conati, Cristina

    2016-01-01

    This paper is a commentary on "Toward Computer-Based Support of Meta-Cognitive Skills: a Computational Framework to Coach Self-Explanation", by Cristina Conati and Kurt Vanlehn, published in the "IJAED" in 2000 (Conati and VanLehn 2010). This work was one of the first examples of Intelligent Learning Environments (ILE) that…

  15. Protecting User Privacy for Cloud Computing by Bivariate Polynomial Based Secret Sharing

    OpenAIRE

    Yang, Ching-Nung; Lai, Jia-Bin; Fu, Zhangjie

    2015-01-01

    Cloud computing is Internet-based computing in which the service is fully provided by the provider; users need nothing but personal devices and Internet access. Computing services, such as data, storage, software, computation, and applications, can be delivered to local devices through the Internet. The major security issue of cloud computing is that cloud providers must ensure that their infrastructure is secure and prevent illegal data accesses from outsiders, other clients, or even ...
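
    The paper builds on bivariate polynomial based secret sharing; as a simpler, related illustration of the threshold idea, here is classic univariate Shamir sharing over a prime field in Python (this is not the authors' construction):

      import random

      P = 2**127 - 1   # a Mersenne prime, used as the field modulus

      def make_shares(secret: int, k: int, n: int):
          # Split `secret` into n shares; any k of them recover it.
          # Note: use the `secrets` module for real cryptographic use.
          coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
          def f(x):
              return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
          return [(x, f(x)) for x in range(1, n + 1)]

      def recover(shares):
          # Lagrange interpolation at x = 0 yields the constant term.
          secret = 0
          for j, (xj, yj) in enumerate(shares):
              num = den = 1
              for m, (xm, _) in enumerate(shares):
                  if m != j:
                      num = num * (-xm) % P
                      den = den * (xj - xm) % P
              secret = (secret + yj * num * pow(den, P - 2, P)) % P
          return secret

      shares = make_shares(123456789, k=3, n=5)
      assert recover(shares[:3]) == 123456789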

  16. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  18. Computer Based Training: Field Deployable Trainer and Shared Virtual Reality

    Science.gov (United States)

    Mullen, Terence J.

    1997-01-01

    Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects. Field Deployable Trainer NASA has used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CD's, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the launch of the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/MIR Long Duration Mission 3, set to launch in mid September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers. Shared Virtual Reality Training NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station. It is hoped that astronauts and instructors will ultimately be able to share

  19. Using Postfeedback Delays to Improve Retention of Computer-Based Instruction

    Science.gov (United States)

    Johnson, Douglas A.; Dickinson, Alyce M.

    2012-01-01

    Self-pacing, although often seen as one of the primary benefits of computer-based instruction (CBI), can also result in an important problem, namely, computer-based racing. Computer-based racing is when learners respond so quickly within CBI that mistakes are made, even on well-known material. This study compared traditional CBI with two forms of…

  20. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Science.gov (United States)

    2010-04-01

    18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems (Conservation of Power and Water Resources, Collection of Records). (a) The administrative and physical controls to protect the information in the manual and computer-based record...

  1. Coronary revascularization treatment based on dual-source computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Dikkers, R.; Willems, T.P.; Jonge, G.J. de; Zaag-Loonen, H.J. van der; Ooijen, P.M.A. van; Oudkerk, M. [University of Groningen, Department of Radiology, Groningen (Netherlands); University Medical Center, Groningen (Netherlands); Piers, L.H.; Tio, R.A.; Zijlstra, F. [University of Groningen, Department of Cardiology, Groningen (Netherlands); University Medical Center, Groningen (Netherlands)

    2008-09-15

    Therapy advice based on dual-source computed tomography (DSCT) was investigated in comparison with coronary angiography (CAG), and the results were evaluated after 1-year follow-up. Thirty-three consecutive patients (mean age 61.9 years) underwent DSCT and CAG, which were evaluated independently. In an expert reading (the "gold standard"), CAG and DSCT examinations were evaluated simultaneously by an experienced radiologist and cardiologist. Based on the presence of significant stenosis and current guidelines, therapy advice was given by all readers, blinded to the results of the other readings and to clinical information. Patients were treated based on a multidisciplinary team evaluation including all clinical information. In comparison with the gold standard, CAG had a higher specificity (91%) and positive predictive value (PPV) (95%) than DSCT (82% and 91%, respectively). DSCT had a higher sensitivity (96%) and negative predictive value (NPV) (89%) than CAG (91% and 83%, respectively). The DSCT-based therapy advice did not lead to any patient being denied the revascularization they needed according to the multidisciplinary team evaluation. During follow-up, two patients needed additional revascularization. The high NPV of DSCT for revascularization assessment indicates that DSCT could safely be used to select patients who would benefit from medical therapy only. (orig.)
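
    For reference, the four reported figures all derive from a 2x2 confusion matrix; a small Python helper (the function name is ours) makes the definitions explicit:

      def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
          # Sensitivity, specificity, PPV, and NPV, the figures compared
          # between DSCT and CAG in this study.
          return {
              "sensitivity": tp / (tp + fn),   # true positive rate
              "specificity": tn / (tn + fp),   # true negative rate
              "ppv": tp / (tp + fp),           # positive predictive value
              "npv": tn / (tn + fn),           # negative predictive value
          }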

  2. An MEG-based brain-computer interface (BCI).

    Science.gov (United States)

    Mellinger, Jürgen; Schalk, Gerwin; Braun, Christoph; Preissl, Hubert; Rosenstiel, Wolfgang; Birbaumer, Niels; Kübler, Andrea

    2007-07-01

    Brain-computer interfaces (BCIs) allow for communicating intentions by mere brain activity, not involving muscles. Thus, BCIs may offer patients who have lost all voluntary muscle control the only possible way to communicate. Many recent studies have demonstrated that BCIs based on electroencephalography (EEG) can allow healthy and severely paralyzed individuals to communicate. While this approach is safe and inexpensive, communication is slow. Magnetoencephalography (MEG) provides signals with higher spatiotemporal resolution than EEG and could thus be used to explore whether these improved signal properties translate into increased BCI communication speed. In this study, we investigated the utility of an MEG-based BCI that uses voluntary amplitude modulation of sensorimotor mu and beta rhythms. To increase the signal-to-noise ratio, we present a simple spatial filtering method that takes the geometric properties of signal propagation in MEG into account, and we present methods that can process artifacts specifically encountered in an MEG-based BCI. As an example, six participants were successfully trained to communicate binary decisions by imagery of limb movements using a feedback paradigm. Participants achieved significant mu rhythm self-control within 32 min of feedback training. For a subgroup of three participants, we localized the origin of the amplitude-modulated signal to the motor cortex. Our results suggest that an MEG-based BCI is feasible and efficient in terms of user training.
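
    The BCI's control signal is the amplitude of the mu/beta rhythms; a conventional way to compute such a band-power feature in Python is sketched below. The band edges and filter order are standard choices, not the paper's exact pipeline, and the spatial filtering step is omitted.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def mu_band_power(signal: np.ndarray, fs: float,
                        band=(8.0, 13.0), order: int = 4) -> float:
          # Band-pass the sensor signal to the mu band, then return its
          # mean power, the quantity the user learns to modulate.
          nyq = fs / 2.0
          b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
          filtered = filtfilt(b, a, signal)
          return float(np.mean(filtered ** 2))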

  3. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Directory of Open Access Journals (Sweden)

    Xiao Sun

    2015-01-01

    Most popular clustering methods make some strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions that have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions may no longer be valid. In order to overcome this weakness, we propose a new clustering algorithm named the localized ambient solidity separation (LASS) algorithm, which uses a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, the proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. An experiment on a designed two-dimensional benchmark dataset shows that the proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method in separating naturally isolated clusters but can also identify clusters that are adjacent, overlapping, and under background noise. Finally, we compared the LASS algorithm with the dissimilarity increments clustering method on a massive computer-user dataset with over two million records containing demographic and behavioral information. The results show that the LASS algorithm works extremely well on this dataset and can extract more knowledge from it.

  5. A Spread Willingness Computing-Based Information Dissemination Model

    Directory of Open Access Journals (Sweden)

    Haojing Huang

    2014-01-01

    Full Text Available This paper constructs a kind of spread willingness computing based on information dissemination model for social network. The model takes into account the impact of node degree and dissemination mechanism, combined with the complex network theory and dynamics of infectious diseases, and further establishes the dynamical evolution equations. Equations characterize the evolutionary relationship between different types of nodes with time. The spread willingness computing contains three factors which have impact on user’s spread behavior: strength of the relationship between the nodes, views identity, and frequency of contact. Simulation results show that different degrees of nodes show the same trend in the network, and even if the degree of node is very small, there is likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher probability of views selection and the higher the frequency of contact with information so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination is also changing accordingly. The studies meet social networking features and can help to master the behavior of users and understand and analyze characteristics of information dissemination in social network.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview: In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  8. Computing environmental life of electronic products based on failure physics

    Institute of Scientific and Technical Information of China (English)

    Yongqiang Zhang; Zongchang Xu; Chunyang Hu

    2016-01-01

    In some situations, the accelerated life test on environmental stress for electronic products is not easily implemented due to various restrictions, and thus engineers lack product life test data. Concerning this problem, the environmental life of the printed circuit board (PCB) is calculated by way of physics of failure. The influences of thermal cycling and vibration on the PCB and its components are studied. Based on the analysis of force and stress between components and the PCB in thermal cycle events and vibration events, four life computing models of pins and soldered dots are established. The Miner damage ratio is used to calculate the accumulated damage of a pin or a soldered dot, and the environmental life of the PCB can then be determined by the first failed one. Finally, an example is used to illustrate the models and their calculations.
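
    The Miner (Palmgren-Miner) damage ratio the abstract relies on sums, over stress types, the fraction of fatigue life each consumes; failure is predicted when the sum reaches 1. A minimal sketch with made-up numbers, not the paper's PCB data:

    # Palmgren-Miner cumulative damage: D = sum(n_i / N_i), fail at D >= 1.
    def miner_damage(cycles_applied, cycles_to_failure):
        return sum(n / N for n, N in zip(cycles_applied, cycles_to_failure))

    # Illustrative damage accrued per year by thermal-cycle and vibration events:
    n_per_year = [300, 1.2e6]   # thermal cycles, vibration cycles (assumed)
    N_failure  = [9000, 6.0e8]  # cycles to failure per stress type (assumed)

    d = miner_damage(n_per_year, N_failure)
    print(f"damage per year = {d:.4f}")
    print(f"predicted life  = {1.0 / d:.1f} years")  # life until damage reaches 1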

  9. A Study of Electromyogram Based on Human-Computer Interface

    Institute of Scientific and Technical Information of China (English)

    Jun-Ru Ren; Tie-Jun Liu; Yu Huang; De-Zhong Yao

    2009-01-01

    In this paper, a new control system based on forearm electromyogram (EMG) is proposed for computer peripheral control and artificial prosthesis control. This control system is intended to realize the commands of six pre-defined hand poses: up, down, left, right, yes, and no. In order to investigate the possibility of using a unified amplifier for both electroencephalogram (EEG) and EMG, the surface forearm EMG data are acquired by a 4-channel EEG measurement system. A Bayesian classifier is used to classify the power spectral density (PSD) of the signal. The experimental results verify that this control system can supply a high command recognition rate (average 48%) even though the EMG data are collected with an EEG system using only single-electrode measurement.
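
    The abstract names the pipeline (PSD features, Bayesian classifier) without code; below is a minimal sketch under stated assumptions: band-power features from a periodogram feeding a small Gaussian naive-Bayes classifier, exercised on synthetic two-class signals. The sampling rate, band edges, and toy signals are all assumptions, not the paper's setup.

    import numpy as np

    FS = 256  # Hz, assumed sampling rate

    def band_power(x, lo, hi):
        f = np.fft.rfftfreq(len(x), 1.0 / FS)
        p = np.abs(np.fft.rfft(x)) ** 2  # periodogram
        return p[(f >= lo) & (f < hi)].sum()

    def features(x):
        return np.array([band_power(x, lo, hi)
                         for lo, hi in [(10, 50), (50, 100), (100, 128)]])

    class GaussianNB:  # minimal Gaussian naive Bayes
        def fit(self, X, y):
            self.classes = np.unique(y)
            self.mu = {c: X[y == c].mean(0) for c in self.classes}
            self.var = {c: X[y == c].var(0) + 1e-9 for c in self.classes}
        def predict(self, X):
            ll = [(-0.5 * ((X - self.mu[c]) ** 2 / self.var[c]
                   + np.log(2 * np.pi * self.var[c]))).sum(1)
                  for c in self.classes]
            return self.classes[np.argmax(ll, axis=0)]

    rng = np.random.default_rng(0)
    def trial(freq):  # synthetic "EMG" burst with a dominant frequency
        t = np.arange(FS) / FS
        return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(FS)

    X = np.array([features(trial(f)) for f in [20] * 30 + [70] * 30])
    y = np.array([0] * 30 + [1] * 30)
    clf = GaussianNB(); clf.fit(X, y)
    print((clf.predict(X) == y).mean())  # training accuracy on toy data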

  10. Computing Dialogue Acts from Features with Transformation-Based Learning

    CERN Document Server

    Samuel, K B; Vijay-Shanker, K; Samuel, Ken; Carberry, Sandra

    1998-01-01

    To interpret natural language at the discourse level, it is very useful to accurately recognize dialogue acts, such as SUGGEST, in identifying speaker intentions. Our research explores the utility of a machine learning method called Transformation-Based Learning (TBL) in computing dialogue acts, because TBL has a number of advantages over alternative approaches for this application. We have identified some extensions to TBL that are necessary in order to address the limitations of the original algorithm and the particular demands of discourse processing. We use a Monte Carlo strategy to increase the applicability of the TBL method, and we select features of utterances that can be used as input to improve the performance of TBL. Our system is currently being tested on the VerbMobil corpora of spoken dialogues, producing promising preliminary results.
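
    As a concrete illustration of the transformation-based learning loop (label everything with a baseline act, then greedily adopt rewrite rules while they reduce training error), here is a toy sketch. The cue-word rule template and the six utterances are invented for illustration; the actual system uses richer features and the VerbMobil corpora.

    # Hedged TBL sketch: rules of the form "if utterance contains word w,
    # relabel it as act a" are adopted greedily while they reduce error.
    data = [("could we meet on monday", "SUGGEST"),
            ("monday is fine", "ACCEPT"),
            ("no that does not work", "REJECT"),
            ("could you repeat that", "REQUEST"),
            ("no meeting on friday please", "REJECT"),
            ("we could do friday instead", "SUGGEST")]

    acts = [a for _, a in data]
    baseline = max(set(acts), key=acts.count)  # majority-class baseline
    labels = [baseline] * len(data)

    def errors(labels):
        return sum(l != a for l, (_, a) in zip(labels, data))

    rules = []
    while True:
        best = None
        for w in {w for s, _ in data for w in s.split()}:
            for a in set(acts):
                trial = [a if w in s.split() else l
                         for l, (s, _) in zip(labels, data)]
                gain = errors(labels) - errors(trial)
                if gain > 0 and (best is None or gain > best[0]):
                    best = (gain, w, a, trial)
        if best is None:
            break  # no rule reduces error any further
        _, w, a, labels = best
        rules.append((w, a))

    print(rules, errors(labels))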

  11. Overlapped flowers yield detection using computer-based interface

    Directory of Open Access Journals (Sweden)

    Anuradha Sharma

    2016-09-01

    Full Text Available Precision agriculture has always dealt with accurate and timely information about agricultural products. With the help of computer hardware and software technology, designing a decision support system that can generate flower yield information and serve as a base for the management and planning of flower marketing is made easy. Despite such technologies, some problems still arise; for example, the colour homogeneity of a specimen cannot be made to match the actual colour of the image, and images overlap. In this paper, the implementation of a new 'counting algorithm' for overlapped flowers is discussed. To implement this algorithm, techniques and operations such as colour image segmentation using the HSV colour space and morphological operations have been used. Two of the most popular colour spaces are used in this paper: RGB and HSV. The HSV colour space decouples brightness from the chromatic component of the image, by which it provides better results in cases of occlusion and overlapping.
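
    A minimal sketch of such a pipeline, assuming OpenCV is available: threshold in HSV, clean the mask morphologically, then split merged blobs with distance-transform peaks so overlapped flowers are counted separately. The HSV range (a yellow band) and the input filename are placeholders, and the split step is a standard technique, not necessarily the paper's exact algorithm.

    import cv2
    import numpy as np

    img = cv2.imread("field.jpg")  # placeholder filename
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    mask = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))  # assumed yellow band
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove speckle

    # Split touching/overlapping blobs: peaks of the distance transform
    # approximate one marker per flower even when outlines merge.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, peaks = cv2.threshold(dist, 0.6 * dist.max(), 255, cv2.THRESH_BINARY)
    n, _ = cv2.connectedComponents(peaks.astype(np.uint8))
    print("estimated flower count:", n - 1)  # minus the background label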

  12. PACS-Based Computer-Aided Detection and Diagnosis

    Science.gov (United States)

    Huang, H. K. (Bernie); Liu, Brent J.; Le, Anh HongTu; Documet, Jorge

    The ultimate goal of Picture Archiving and Communication System (PACS)-based Computer-Aided Detection and Diagnosis (CAD) is to integrate CAD results into daily clinical practice so that it becomes a second reader to aid the radiologist's diagnosis. Integration of CAD and Hospital Information System (HIS), Radiology Information System (RIS) or PACS requires certain basic ingredients from Health Level 7 (HL7) standard for textual data, Digital Imaging and Communications in Medicine (DICOM) standard for images, and Integrating the Healthcare Enterprise (IHE) workflow profiles in order to comply with the Health Insurance Portability and Accountability Act (HIPAA) requirements to be a healthcare information system. Among the DICOM standards and IHE workflow profiles, DICOM Structured Reporting (DICOM-SR); and IHE Key Image Note (KIN), Simple Image and Numeric Report (SINR) and Post-processing Work Flow (PWF) are utilized in CAD-HIS/RIS/PACS integration. These topics with examples are presented in this chapter.

  13. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information, or arbitrary sorting of results. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages' content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of a Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  14. A shape representation for computer vision based on differential topology.

    Science.gov (United States)

    Blicher, A P

    1995-01-01

    We describe a shape representation for use in computer vision, after a brief review of shape representation and object recognition in general. Our shape representation is based on graph structures derived from level sets whose characteristics are understood from differential topology, particularly singularity theory. This leads to a representation which is both stable and whose changes under deformation are simple. The latter allows smoothing in the representation domain ('symbolic smoothing'), which in turn can be used for coarse-to-fine strategies, or as a discrete analog of scale space. Essentially the same representation applies to an object embedded in 3-dimensional space as to one in the plane, and likewise for a 3D object and its silhouette. We suggest how this can be used for recognition.

  15. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
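
    The interpolation step the abstract describes (a convolution with an adaptive Gaussian window) can be sketched as follows. Here the window width is fixed rather than adaptive, and the shear-flow test data are invented, so this is an illustration of the technique rather than the paper's implementation.

    import numpy as np

    # Gaussian-window interpolation: each grid point takes a Gaussian-weighted
    # mean of nearby scattered streak velocities.
    def gaussian_grid(xy, uv, grid_x, grid_y, sigma=0.1):
        gx, gy = np.meshgrid(grid_x, grid_y)
        out = np.zeros(gx.shape + (2,))
        for i in range(gx.shape[0]):
            for j in range(gx.shape[1]):
                d2 = (xy[:, 0] - gx[i, j]) ** 2 + (xy[:, 1] - gy[i, j]) ** 2
                w = np.exp(-d2 / (2 * sigma ** 2))
                out[i, j] = (w[:, None] * uv).sum(0) / max(w.sum(), 1e-12)
        return out

    rng = np.random.default_rng(0)
    xy = rng.random((500, 2))  # random particle positions in the unit square
    uv = np.stack([np.sin(2 * np.pi * xy[:, 1]), np.zeros(500)], axis=1)  # shear
    grid = gaussian_grid(xy, uv, np.linspace(0, 1, 16), np.linspace(0, 1, 16))
    print(grid.shape)  # (16, 16, 2): velocity field on a uniform grid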

  16. Computer Based Procedures for Field Workers - FY16 Research Activities

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing time spent by the field worker to evaluate plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessed applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of a design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study for how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  17. Comparison of computed tomography scout based reference point localization to conventional film and axial computed tomography.

    Science.gov (United States)

    Jiang, Lan; Templeton, Alistair; Turian, Julius; Kirk, Michael; Zusag, Thomas; Chu, James C H

    2011-01-01

    Identification of source positions after implantation is an important step in brachytherapy planning. Reconstruction is traditionally performed from films taken by conventional simulators, but these are gradually being replaced in the clinic by computed tomography (CT) simulators. The present study explored the use of a scout image-based reconstruction algorithm that replaces the use of traditional film, while exhibiting low sensitivity to metal-induced artifacts that can appear in 3D CT methods. In addition, the accuracy of an in-house graphical software implementation of scout-based reconstruction was compared with seed location reconstructions for 2 phantoms by conventional simulator and CT measurements. One phantom was constructed using a planar fixed grid of 1.5-mm diameter ball bearings (BBs) with 40-mm spacing. The second was a Fletcher-Suit applicator embedded in Styrofoam (Dow Chemical Co., Midland, MI) with one 3.2-mm-diameter BB inserted into each of 6 surrounding holes. Conventional simulator, kilovoltage CT (kVCT), megavoltage CT, and scout-based methods were evaluated by their ability to calculate the distance between seeds (40 mm for the fixed grid, 30-120 mm in Fletcher-Suit). All methods were able to reconstruct the fixed grid distances with an average deviation of <1%. The worst single deviations (approximately 6%) were exhibited in the 2 volumetric CT methods. In the Fletcher-Suit phantom, the intermodality agreement was within approximately 3%, with the conventional simulator measuring marginally larger distances and kVCT the smallest. All of the established reconstruction methods exhibited similar abilities to detect the distances between BBs. The 3D CT-based methods, with lower axial resolution, showed more variation, particularly with the smaller BBs. With a software implementation, scout-based reconstruction is an appealing approach because it simplifies data acquisition over film-based reconstruction without requiring any specialized equipment.

  18. From Teaching Machines to Microcomputers: Some Milestones in the History of Computer-Based Instruction.

    Science.gov (United States)

    Niemiec, Richard P.; Walberg, Herbert J.

    1989-01-01

    Examines the history of computer-based education within the context of psychological theorists of instruction, including Pressey, Thorndike, Skinner, and Crowder. Topics discussed include computer-managed instruction; computer-assisted instruction; the Computer Curriculum Corporation; PLATO; TICCIT; microcomputers; effects on students; and cost…

  19. Computer-Based Instruction: Roots, Origins, Applications, Benefits, Features, Systems, Trends and Issues.

    Science.gov (United States)

    Hofstetter, Fred T.

    Dealing exclusively with instructional computing, this paper describes how computers are delivering instruction in a wide variety of subjects to students of all ages and explains why computer-based education is currently having a profound impact on education. After a discussion of roots and origins, computer applications are described for…

  20. Windows and Fieldbus Based Software Computer Numerical Control System

    Institute of Scientific and Technical Information of China (English)

    WU Hongen; ZHANG Chengrui; LI Guili; WANG Baoren

    2006-01-01

    Computer numerical control (CNC) systems are the basis of modern digital and intelligent manufacturing technology, and an open architecture built on a PC and the Windows operating system (OS) is the main trend in CNC systems. However, even if the highest system priority is used in user mode, the real-time capability of Windows (2000, NT, XP) for applications is not guaranteed. By using a device driver running in kernel mode, the real-time performance of Windows can be enhanced greatly. The acknowledgment performance of Windows to peripheral interrupts was evaluated. Harmonized with an intelligent real-time serial communication bus (RTSB), strict real-time performance can be achieved on the Windows platform. An open-architecture software CNC system that is hardware independent is proposed based on a PC and RTSB. A numerical control real-time kernel (NCRTK), implemented as a device driver on Windows, is used to perform the NC tasks. Tasks are divided into real-time and non-real-time: real-time tasks run in kernel mode and non-real-time tasks run in user mode. Data are exchanged between kernel and user mode by DMA and Windows Messages.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  3. Hardware Considerations for Computer Based Education in the 1980's.

    Science.gov (United States)

    Hirschbuhl, John J.

    1980-01-01

    In the future, computers will be needed to sift through the vast proliferation of available information. Among new developments in computer technology are the videodisc, microcomputers, and holography. Predictions for future developments include laser libraries for the visually handicapped and Computer Assisted Dialogue. (JN)

  4. Reconfigurable computing the theory and practice of FPGA-based computation

    CERN Document Server

    Hauck, Scott

    2010-01-01

    Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design- the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardwa

  5. Job shop scheduling problem based on DNA computing

    Institute of Scientific and Technical Information of China (English)

    Yin Zhixiang; Cui Jianzhong; Yang Yan; Ma Ying

    2006-01-01

    To solve the job shop scheduling problem, a new approach, DNA computing, is used. The approach to solving job shop scheduling with DNA computing is divided into three stages. Finally, optimum solutions are obtained by sequencing. A small job shop scheduling problem is solved by DNA computing, and the "operations" of the computation were performed with standard protocols such as ligation, synthesis, and electrophoresis. This work represents further evidence of the ability of DNA computing to solve NP-complete search problems.

  6. Memory Benchmarks for SMP-Based High Performance Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, A B; de Supinski, B; Mueller, F; Mckee, S A

    2001-11-20

    As the speed gap between CPU and main memory continues to grow, memory accesses increasingly dominates the performance of many applications. The problem is particularly acute for symmetric multiprocessor (SMP) systems, where the shared memory may be accessed concurrently by a group of threads running on separate CPUs. Unfortunately, several key issues governing memory system performance in current systems are not well understood. Complex interactions between the levels of the memory hierarchy, buses or switches, DRAM back-ends, system software, and application access patterns can make it difficult to pinpoint bottlenecks and determine appropriate optimizations, and the situation is even more complex for SMP systems. To partially address this problem, we formulated a set of multi-threaded microbenchmarks for characterizing and measuring the performance of the underlying memory system in SMP-based high-performance computers. We report our use of these microbenchmarks on two important SMP-based machines. This paper has four primary contributions. First, we introduce a microbenchmark suite to systematically assess and compare the performance of different levels in SMP memory hierarchies. Second, we present a new tool based on hardware performance monitors to determine a wide array of memory system characteristics, such as cache sizes, quickly and easily; by using this tool, memory performance studies can be targeted to the full spectrum of performance regimes with many fewer data points than is otherwise required. Third, we present experimental results indicating that the performance of applications with large memory footprints remains largely constrained by memory. Fourth, we demonstrate that thread-level parallelism further degrades memory performance, even for the latest SMPs with hardware prefetching and switch-based memory interconnects.
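
    Python cannot reproduce the paper's multi-threaded suite, but the core idea of a memory microbenchmark can be sketched: time strided reads over an array large enough to defeat the caches and watch the per-element cost grow with stride. The array size and strides are arbitrary assumptions, and the absolute numbers will be noisy and machine dependent.

    import time
    import numpy as np

    N = 1 << 24                      # ~16M elements, well past typical caches
    a = np.arange(N, dtype=np.int64)

    for stride in (1, 4, 16, 64):
        idx = np.arange(0, N, stride)
        t0 = time.perf_counter()
        checksum = int(a[idx].sum())  # gathered read with the given stride
        dt = time.perf_counter() - t0
        # Larger strides defeat spatial locality, so ns/element rises as
        # accesses spill past cache lines.
        print(f"stride {stride:3d}: {dt / len(idx) * 1e9:6.2f} ns/element")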

  7. Novel photonic bandgap based architectures for quantum computers and networks

    Science.gov (United States)

    Guney, Durdu

    All of the approaches for quantum information processing have their own advantages, but unfortunately also their own drawbacks. Ideally, one would merge the most attractive features of those different approaches in a single technology. We envision that large-scale photonic crystal (PC) integrated circuits and fibers could be the basis for robust and compact quantum circuits and processors of the next generation quantum computers and networking devices. Cavity QED, solid-state, and (non)linear optical models for computing, and optical fiber approach for communications are the most promising candidates to be improved through this novel technology. In our work, we consider both digital and analog quantum computing. In the digital domain, we first perform gate-level analysis. To achieve this task, we solve the Jaynes-Cummings Hamiltonian with time-dependent coupling parameters under the dipole and rotating-wave approximations for a 3D PC single-mode cavity with a sufficiently high Q-factor. We then exploit the results to show how to create a maximally entangled state of two atoms and how to implement several quantum logic gates: a dual-rail Hadamard gate, a dual-rail NOT gate, and a SWAP gate. In all of these operations, we synchronize atoms, as opposed to previous studies with PCs. The method has the potential for extension to N-atom entanglement, universal quantum logic operations, and the implementation of other useful, cavity QED-based quantum information processing tasks. In the next part of the digital domain, we study circuit-level implementations. We design and simulate an integrated teleportation and readout circuit on a single PC chip. The readout part of our device can not only be used on its own but can also be integrated with other compatible optical circuits to achieve atomic state detection. Further improvement of the device in terms of compactness and robustness is possible by integrating with sources and detectors in the optical regime. In the analog
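
    For reference, the Jaynes-Cummings Hamiltonian under the dipole and rotating-wave approximations, with the time-dependent coupling the abstract mentions, is conventionally written as follows (the notation is the textbook convention, not taken from the paper):

    H(t) = \hbar\,\omega_c\, a^{\dagger} a
         + \tfrac{1}{2}\,\hbar\,\omega_a\, \sigma_z
         + \hbar\, g(t)\,\bigl(a^{\dagger}\sigma^{-} + a\,\sigma^{+}\bigr)

    where a and a† are the cavity-mode operators, σ± and σz act on the two-level atom, ω_c and ω_a are the cavity and atomic transition frequencies, and g(t) is the atom-field coupling, modulated here by the atom's position in the photonic-crystal cavity.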

  8. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTIs programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  9. ARCHITECTURE OF WEB BASED COMPUTER-AIDED MANUFACTURING SYSTEM

    Directory of Open Access Journals (Sweden)

    N. E. Filyukov

    2014-09-01

    Full Text Available The paper deals with the design of a web-based system for Computer-Aided Manufacturing (CAM). Remote applications and databases located in a "private cloud" are proposed as the basis of such a system. The suggested approach comprises a service-oriented architecture, the use of web applications and web services as modules, multi-agent technologies for implementing information exchange between the components of the system, and the use of a PDM system for managing technology projects within the CAM. The proposed architecture turns the CAM into a corporate information system that provides coordinated functioning of subsystems based on a common information space, parallelizes collective work on technology projects, and provides effective control of production planning. A system has been developed within this architecture which makes it rather simple to connect technological subsystems and implement their interaction. The system makes it possible to produce a CAM configuration for a particular company from the set of developed subsystems and databases, specifying appropriate access rights for the company's employees. The proposed approach simplifies the maintenance of software and information support for CAM subsystems due to their central location in the data center. The results can be used as a basis for CAM design and testing within the learning process, for development and modernization of the system algorithms, and can then be tested in the extended enterprise.

  10. A computer vision based candidate for functional balance test.

    Science.gov (United States)

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing, and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries, and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments, followed by the subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable, and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long-term care, and assisted living facilities. Our long-term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.

  11. Computer-Based Procedures for Field Workers - Identified Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The Idaho National Laboratory (INL) computer-based procedure (CBP) research team is exploring how best to design a CBP system that will deliver the intended benefits of increased efficiency and improved human performance. It is important to note that no "off-the-shelf" technology exists for the type of CBP system that is investigated and developed by the INL researchers. As more technology is integrated into the procedure process, the importance of an appropriate and methodical approach to the design of the procedure system increases. Technological advancements offer great opportunities for efficiency and safety gains; however, if the system is not designed correctly, there is a large risk of unintentionally introducing new opportunities for human error. The INL research team is breaking new ground in the area of CBPs with the prototype they have developed. Current electronic procedure systems are most commonly electronic versions of paper-based procedures with hyperlinks to other procedures, limited user input functionality, and the ability to mark steps completed. These systems do not fully exploit the advantages of digital technology. It is part of the INL researchers' role to develop and validate new CBP technologies that greatly increase the benefits of a CBP system to the nuclear industry.

  12. Computer Vision-Based Portable System for Nitroaromatics Discrimination

    Directory of Open Access Journals (Sweden)

    Nuria López-Ruiz

    2016-01-01

    Full Text Available A computer vision-based portable measurement system is presented in this report. The system is based on a compact reader unit composed of a microcamera and a Raspberry Pi board as the control unit. This reader can acquire and process images of a sensor array formed by four nonselective sensing chemistries. By processing these array images it is possible to identify and quantify eight different nitroaromatic compounds (both explosives and related compounds) using the chromatic coordinates of a color space. The system is also capable of sending the information obtained after processing over a WiFi link to a smartphone in order to present the analysis result to the final user. The identification and quantification algorithm programmed on the Raspberry board is easy and quick enough to allow real-time analysis. The nitroaromatic compounds analyzed in the mg/L range were picric acid, 2,4-dinitrotoluene (2,4-DNT), 1,3-dinitrobenzene (1,3-DNB), 3,5-dinitrobenzonitrile (3,5-DNBN), 2-chloro-3,5-dinitrobenzotrifluoride (2-C-3,5-DNBF), 1,3,5-trinitrobenzene (TNB), 2,4,6-trinitrotoluene (TNT), and tetryl (TT).
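
    A minimal sketch of the colour read-out step: chromatic coordinates r = R/(R+G+B) and g = G/(R+G+B), averaged over a sensing-spot region, decouple chroma from illumination level. The region handling and any calibration against concentration are assumptions, not the published procedure.

    import numpy as np

    def chromatic_coords(rgb_patch):
        rgb = rgb_patch.astype(float).reshape(-1, 3)
        s = rgb.sum(axis=1, keepdims=True)
        s[s == 0] = 1.0                    # guard against black pixels
        rg = (rgb / s)[:, :2].mean(axis=0)  # per-pixel (r, g), then averaged
        return rg                           # note b = 1 - r - g

    patch = np.random.default_rng(0).integers(0, 256, (20, 20, 3))
    print(chromatic_coords(patch))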

  13. Computer Based Porosity Design by Multi Phase Topology Optimization

    Science.gov (United States)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO), based on the finite element method, has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of the optimization is to minimize the component's elastic energy. Conventional topology optimization methods, which simulate adaptive bone mineralization, have the disadvantage that mass changes continuously through growth processes. MPTO keeps all initial material concentrations and uses methods adapted from molecular dynamics to find the energy minimum. When MPTO is applied to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. It is now possible to design the macro- and microstructure of a mechanical component in one step. Computer-based porosity design structures can be manufactured by new rapid prototyping technologies. Fraunhofer IFAM has successfully applied 3D printing and selective laser sintering methods to produce very stiff, lightweight components with graded porosities calculated by MPTO.

  14. Identifying Nursing Computer Training Requirements using Web-based Assessment

    Directory of Open Access Journals (Sweden)

    Naser Ghazi

    2011-12-01

    Full Text Available Our work addresses inefficiency and ineffectiveness in the training of nurses in computer literacy by developing an adaptive questionnaire system. The system identifies the most effective training modules by evaluating applicants before and after training. Our system, the Systems Knowledge Assessment Tool (SKAT), aims to increase training proficiency, decrease training time, and reduce the costs associated with training by identifying the areas in which training is required, and those in which it is not, for each individual. Based on the project's requirements, a number of HTML documents were designed to be used as templates in the implementation stage. During this stage, the milestone principle was used, in which a series of coding and testing was performed to generate an error-free product. The decision-making process and its components, together with the priority of each attribute in the application, are responsible for determining the required training for each applicant. Thus, the decision-making process is an essential aspect of system design and greatly affects the applicant's training results. The SKAT system has been evaluated to ensure that it meets the project's requirements. The evaluation stage was an important part of the project and required a number of nurses with different roles to evaluate the system. Based on their feedback, changes were made.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. (Figure 3: number of events per month, data.) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  16. fNIRS-based brain-computer interfaces: a review.

    Science.gov (United States)

    Naseer, Noman; Hong, Keum-Shik

    2015-01-01

    A brain-computer interface (BCI) is a communication system that allows the use of brain activity to control computers or other external devices. It can, by bypassing the peripheral nervous system, provide a means of communication for people suffering from severe motor disabilities or in a persistent vegetative state. In this paper, brain-signal generation tasks, noise removal methods, feature extraction/selection schemes, and classification techniques for fNIRS-based BCI are reviewed. The most common brain areas for fNIRS BCI are the primary motor cortex and the prefrontal cortex. In relation to the motor cortex, motor imagery tasks were preferred to motor execution tasks since possible proprioceptive feedback could be avoided. In relation to the prefrontal cortex, fNIRS showed a significant advantage in detecting cognitive tasks like mental arithmetic, music imagery, emotion induction, etc., since the region has no hair to interfere with the optodes. In removing physiological noise in fNIRS data, band-pass filtering was mostly used. However, more advanced techniques like adaptive filtering, independent component analysis (ICA), multi-optode arrangements, etc. are being pursued to overcome the problem that a band-pass filter cannot be used when both brain and physiological signals occur within a close band. In extracting features related to the desired brain signal, the mean, variance, peak value, slope, skewness, and kurtosis of the noise-removed hemodynamic response were used. For classification, the linear discriminant analysis method provided simple but good performance among others: support vector machine (SVM), hidden Markov model (HMM), artificial neural network, etc. fNIRS will be more widely used to monitor the occurrence of neuro-plasticity after neuro-rehabilitation and neuro-stimulation. Technical breakthroughs in the future are expected via bundled-type probes, hybrid EEG-fNIRS BCI, and through the detection of initial dips.
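
    The feature set the review lists (mean, variance, peak value, slope, skewness, kurtosis of the noise-removed hemodynamic response) and the LDA classifier it favours can be sketched as below, assuming scipy and scikit-learn are available. The synthetic "responses" are illustrative only, not fNIRS data.

    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def hrf_features(x):
        t = np.arange(len(x))
        slope = np.polyfit(t, x, 1)[0]  # linear-trend slope of the response
        return [x.mean(), x.var(), x.max(), slope, skew(x), kurtosis(x)]

    rng = np.random.default_rng(0)
    def response(active):  # toy hemodynamic response: a slow rise when active
        t = np.linspace(0, 10, 100)
        return (0.8 * t / 10 if active else 0.0) + 0.1 * rng.standard_normal(100)

    X = np.array([hrf_features(response(a)) for a in [1] * 40 + [0] * 40])
    y = np.array([1] * 40 + [0] * 40)
    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.score(X, y))  # training accuracy on toy data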

  17. fNIRS-based brain-computer interfaces: a review

    Directory of Open Access Journals (Sweden)

    Noman eNaseer

    2015-01-01

    Full Text Available A brain-computer interface (BCI) is a communication system that allows the use of brain activity to control computers or other external devices. It can, by bypassing the peripheral nervous system, provide a means of communication for people suffering from severe motor disabilities or in a persistent vegetative state. In this paper, brain-signal generation tasks, noise removal methods, feature extraction/selection schemes, and classification techniques for fNIRS-based BCI are reviewed. The most common brain areas for fNIRS BCI are the primary motor cortex and the prefrontal cortex. In relation to the motor cortex, motor imagery tasks were preferred to motor execution tasks since possible proprioceptive feedback could be avoided. In relation to the prefrontal cortex, fNIRS showed a significant advantage in detecting cognitive tasks like mental arithmetic, music imagery, emotion induction, etc., since the region has no hair to interfere with the optodes. In removing physiological noise in fNIRS data, band-pass filtering was mostly used. However, more advanced techniques like adaptive filtering, independent component analysis, multi-optode arrangements, etc. are being pursued to overcome the problem that a band-pass filter cannot be used when both brain and physiological signals occur within a close band. In extracting features related to the desired brain signal, the mean, variance, peak value, slope, skewness, and kurtosis of the noise-removed hemodynamic response were used. For classification, the linear discriminant analysis method provided simple but good performance among others: support vector machine, hidden Markov model, artificial neural network, etc. fNIRS will be more widely used to monitor the occurrence of neuro-plasticity after neuro-rehabilitation and neuro-stimulation. Technical breakthroughs in the future are expected via bundled-type probes, hybrid EEG-fNIRS BCI, and through the detection of initial dips.

  18. A demonstrative model of a lunar base simulation on a personal computer

    Science.gov (United States)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop and defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.

  19. Computer-Based GED Testing: Implications for Students, Programs, and Practitioners

    Science.gov (United States)

    Brinkley-Etzkorn, Karen E.; Ishitani, Terry T.

    2016-01-01

    The purpose of this study was to understand the process of transitioning from the 2002 version of the GED test to the new 2014 computer-based version. Specifically, this research sought to identify: (1) stakeholder attitudes regarding the new computer-based test; (2) the relationship between students' computer access/comfort and their perceptions…

  20. Computer-Based Algorithmic Determination of Muscle Movement Onset Using M-Mode Ultrasonography

    Science.gov (United States)

    2017-04-01

    from baseline. Computed muscle movement onset (MO) was determined by three separate classes of algorithms using RStudio: (i) a novel standard... (ARL-RP-0596, April 2017, US Army Research Laboratory)

  1. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in an anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as a computer story and then compared with the paper-based version of the same story…

  2. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concern of cloud computing and then propose an integrated data access scheme for cloud computing, the procedure of the proposed scheme include parameter setup, key...

  3. Evaluating Computer-Based Assessment in a Risk-Based Model

    Science.gov (United States)

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision making process, evaluation for understanding to further enhance enlightenment and evaluation for control to ensure compliance to standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA) cyclic…

  4. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer-the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure only to be accessed by a few inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little-more noted than a toaster. These dramati

  5. Validation of computer-based training in ureterorenoscopy.

    Science.gov (United States)

    Knoll, Thomas; Trojan, Lutz; Haecker, Axel; Alken, Peter; Michel, Maurice Stephan

    2005-06-01

    To evaluate the outcome of training both urological novices and experts using the recently developed UroMentor (Simbionix Ltd, Israel) trainer, which provides a realistic simulation of rigid and flexible ureterorenoscopy (URS). Twenty experienced urologists (total number of previous flexible URSs 21-153) were monitored during simulated flexible URS for treating a lower calyceal stone, and the outcome was correlated with individual experience. A score was compiled based on the variables recorded, including total operation time, stone contact time, complications such as bleeding or perforation, and treatment success. A further five urological residents with no endourological experience were trained on the UroMentor in rigid URS for ureteric stone treatment. Their acquired clinical skills were subsequently compared to those of five urological residents who received no simulator training. All 20 experienced urologists disintegrated the stone on the simulator, and the score achieved was related to their personal experience; there was a significant difference in performance in those with 80 previous flexible URSs. For the five urological residents with no endourological experience, simulator training improved their skills, and comparison with urological residents who had received no simulator training showed advantages for the trained residents. After being trained on the simulator, the group performed better in the first four URSs on patients. Individual experience correlates with individual performance on the simulator. Simulator training was helpful in improving clinical skills. Although the distribution of computer-based simulators is limited by high prices, virtual reality-based training has the potential to become an important tool for clinical education.

  6. Resource Provisioning in SLA-Based Cluster Computing

    Science.gov (United States)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
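
    One classical way to make this sizing concrete is to model the cluster as an M/M/c queue and add servers until a percentile waiting-time target is met. Treating the cluster as M/M/c, and using waiting time rather than full response time, are modelling assumptions for the sketch, not the paper's exact formulation.

    import math

    def erlang_c(c, a):  # probability an arriving job must wait (a = lam/mu)
        num = a ** c / math.factorial(c) * c / (c - a)
        den = sum(a ** k / math.factorial(k) for k in range(c)) + num
        return num / den

    def wait_percentile(c, lam, mu, p=0.95):
        # For M/M/c FCFS, P(W > t) = ErlangC * exp(-(c*mu - lam) * t).
        a = lam / mu
        pc = erlang_c(c, a)
        if pc <= 1 - p:
            return 0.0  # enough idle capacity: the pth-percentile wait is zero
        return math.log(pc / (1 - p)) / (c * mu - lam)

    def provision(lam, mu, target_wait, p=0.95):
        c = int(lam / mu) + 1  # minimum server count for stability
        while wait_percentile(c, lam, mu, p) > target_wait:
            c += 1
        return c

    # 90 jobs/s, 10 jobs/s per server, 95th-percentile wait under 50 ms:
    print(provision(lam=90.0, mu=10.0, target_wait=0.05))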

  7. Memristor-based nanoelectronic computing circuits and architectures

    CERN Document Server

    Vourkas, Ioannis

    2016-01-01

    This book considers the design and development of nanoelectronic computing circuits, systems and architectures focusing particularly on memristors, which represent one of today’s latest technology breakthroughs in nanoelectronics. The book studies, explores, and addresses the related challenges and proposes solutions for the smooth transition from conventional circuit technologies to emerging computing memristive nanotechnologies. Its content spans from fundamental device modeling to emerging storage system architectures and novel circuit design methodologies, targeting advanced non-conventional analog/digital massively parallel computational structures. Several new results on memristor modeling, memristive interconnections, logic circuit design, memory circuit architectures, computer arithmetic systems, simulation software tools, and applications of memristors in computing are presented. High-density memristive data storage combined with memristive circuit-design paradigms and computational tools applied t...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with computer request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  11. Design Guidance for Computer-Based Procedures for Field Workers

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Nearly all activities that involve human interaction with nuclear power plant systems are guided by procedures, instructions, or checklists. Paper-based procedures (PBPs) currently used by most utilities have a demonstrated history of ensuring safety; however, improving procedure use could yield significant savings in increased efficiency, as well as improved safety through human performance gains. The nuclear industry is constantly trying to find ways to decrease human error rates, especially human error rates associated with procedure use. As a step toward the goal of improving field workers’ procedure use and adherence and hence improve human performance and overall system reliability, the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program researchers, together with the nuclear industry, have been investigating the possibility and feasibility of replacing current paper-based procedures with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing, depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner that is intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information relevant for the task and situation at hand, which has potential consequences of taking up valuable time when operators must be responding to the situation, and potentially leading operators down an incorrect response path. Other challenges related to use of PBPs are management of multiple procedures, place-keeping, finding the correct procedure for a task, and relying

  12. Bio-inspired computational techniques based on advanced condition monitoring

    Institute of Scientific and Technical Information of China (English)

    Su Liangcheng; He Shan; Li Xiaoli; Li Xinglin

    2011-01-01

    The application of bio-inspired computational techniques to the field of condition monitoring is addressed. First, the bio-inspired computational techniques are briefly introduced, and the advantages and disadvantages of these methods are made clear. Then, the roles of condition monitoring in predictive maintenance and failure prediction, and the development trends of condition monitoring, are discussed. Finally, a case study on the condition monitoring of a grinding machine is described, which shows the application of a bio-inspired computational technique to a practical condition monitoring system.

  13. 12 CFR 614.4351 - Computation of lending and leasing limit base.

    Science.gov (United States)

    2010-01-01

    POLICIES AND OPERATIONS, Lending and Leasing Limits. § 614.4351 Computation of lending and leasing limit base. (a) Lending and leasing limit base. An institution's lending and leasing limit base is composed...

  14. Bringing Vision-Based Measurements into our Daily Life: A Grand Challenge for Computer Vision Systems

    OpenAIRE

    Scharcanski, Jacob

    2016-01-01

    Bringing computer vision into our daily life has been challenging researchers in industry and in academia over the past decades. However, the continuous development of cameras and computing systems turned computer vision-based measurements into a viable option, allowing new solutions to known problems. In this context, computer vision is a generic tool that can be used to measure and monitor phenomena in wide range of fields. The idea of using vision-based measurements is appealing, since the...

  15. Integrating structure-based and ligand-based approaches for computational drug design.

    Science.gov (United States)

    Wilson, Gregory L; Lill, Markus A

    2011-04-01

    Methods utilized in computer-aided drug design can be classified into two major categories: structure based and ligand based, using information on the structure of the protein or on the biological and physicochemical properties of bound ligands, respectively. In recent years there has been a trend towards integrating these two methods in order to enhance the reliability and efficiency of computer-aided drug-design approaches by combining information from both the ligand and the protein. This trend resulted in a variety of methods that include: pseudoreceptor methods, pharmacophore methods, fingerprint methods and approaches integrating docking with similarity-based methods. In this article, we will describe the concepts behind each method and selected applications.
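
    As a concrete illustration of the ligand-based ingredient, fingerprint methods typically rank candidate molecules by Tanimoto similarity between binary structural fingerprints. The bit vectors below are toy data, not real fingerprints:

    # Tanimoto similarity between fingerprints represented as sets of "on" bits.
    def tanimoto(a, b):
        return len(a & b) / len(a | b)

    active = {1, 5, 9, 12, 40, 77}                 # fingerprint of a known active
    candidates = {"mol_A": {1, 5, 9, 13, 40, 80},  # hypothetical candidates
                  "mol_B": {2, 6, 20, 33, 41, 90}}
    for name, fp in sorted(candidates.items(),
                           key=lambda kv: -tanimoto(active, kv[1])):
        print(name, round(tanimoto(active, fp), 2))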

  16. Brain-computer interface based on generation of visual images.

    Directory of Open Access Journals (Sweden)

    Pavel Bobrov

    This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and imagining two types of pictures: faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset is becoming widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using the ActiCap. The control experiment showed that using high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields classification accuracy similar to that of the more sophisticated Multi-class Common Spatial Patterns (MCSP) classifier.
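
    To make the covariance-based classification idea concrete, here is a minimal sketch (ours, not the authors' code) of a Bayesian classifier that models each mental task as a zero-mean multivariate Gaussian over EEG channels; the function and variable names are hypothetical.

    ```python
    import numpy as np

    def fit_class_covariances(epochs, labels):
        """Estimate one spatial covariance matrix per mental task.

        epochs: (n_trials, n_channels, n_samples); labels: (n_trials,).
        """
        return {c: np.mean([np.cov(x) for x in epochs[labels == c]], axis=0)
                for c in np.unique(labels)}

    def log_likelihood(epoch, cov):
        """Zero-mean Gaussian log-likelihood of an epoch under a class covariance."""
        n_ch, n_s = epoch.shape
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        quad = np.einsum('it,ij,jt->', epoch, inv, epoch)  # sum over samples
        return -0.5 * (quad + n_s * (logdet + n_ch * np.log(2 * np.pi)))

    def classify(epoch, class_covs):
        """Pick the task whose covariance model best explains the epoch."""
        return max(class_covs, key=lambda c: log_likelihood(epoch, class_covs[c]))
    ```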

  17. Computer-based mechanical design of overhead lines

    Science.gov (United States)

    Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.

    2016-02-01

    Besides performance, a safety level according to the current standards is a compulsory condition for distribution grids’ operation. Some of the measures leading to improvement of overhead line reliability call for modernization of installations. The constraints imposed on the new line components refer to technical aspects such as thermal stress or voltage drop, and aim at economic efficiency, too. The mechanical sizing of overhead lines is, after all, an optimization problem: the task in designing an overhead line profile is to size poles, cross-arms and stays, and to locate poles along a line route, so that the total cost of the line's structure is minimized and the technical and safety constraints are fulfilled. The authors present in this paper an application for the computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adjusted to distribution lines. The constraints of the optimization problem are adjusted to the weather and loading conditions found in Romania. The outputs of the software application are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm and stay under different weather conditions; and the line profile drawings. The main features of the mechanical overhead line design software are interactivity, a local optimization function and a high-level user interface.

  18. Computer Aided Detection of SARS Based on Radiographs Data Mining.

    Science.gov (United States)

    Xuanyang, Xie; Yuchang, Gong; Shouhong, Wan; Xi, Li

    2005-01-01

    This paper introduces our work on using image mining techniques to detect SARS, the severe acute respiratory syndrome, automatically, as a prototype of a computer-aided detection/diagnosis (CAD) system. The data used in this paper are digitized PA (posterior-anterior) X-ray images stored in the real-life picture archiving and communication system (PACS) of the 2nd Affiliation Hospital of Guangzhou Medical College. Association rule mining was applied first, but the results showed no significant difference between the locations of the lesions or infiltrates. Classification based on image textures was then performed. A sample set containing both pneumonia and SARS X-ray images was built first. After modeling each sample by a feature vector, the sample set was partitioned to match the detection purpose: classification. Three methods were used: C4.5, neural network (NN) and CART. The final result shows that 70.94% of SARS cases can be detected by CART. Data preparation, segmentation, feature extraction and data mining steps, with the corresponding techniques, are covered in this paper. ROC charts and confusion matrices for all three methods are given and analyzed.
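
    By way of illustration, the CART stage could be reproduced along the following lines with scikit-learn's DecisionTreeClassifier (a CART-style implementation); the texture feature vectors here are random placeholders, not the paper's radiograph data.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix, recall_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))          # stand-in texture feature vectors
    y = rng.integers(0, 2, size=200)        # 1 = SARS, 0 = ordinary pneumonia

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    cart = DecisionTreeClassifier(criterion="gini", random_state=0).fit(X_tr, y_tr)

    pred = cart.predict(X_te)
    print(confusion_matrix(y_te, pred))              # per-method matrices, as in the paper
    print("SARS recall:", recall_score(y_te, pred))  # cf. the 70.94% CART figure
    ```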

  19. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

    While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al., 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from the subset appropriate to more traditional environments, and the use of a machine may elicit behaviours different from those normally arising in a classroom context.

  20. An Autonomous Underwater Recorder Based on a Single Board Computer.

    Science.gov (United States)

    Caldas-Morgan, Manuel; Alvarez-Rosario, Alexander; Rodrigues Padovese, Linilson

    2015-01-01

    As industrial activities continue to grow on the Brazilian coast, underwater sound measurements are becoming of great scientific importance, as they are essential to evaluate the impact of these activities on local ecosystems. In this context, the use of commercial underwater recorders is not always the most feasible alternative, due to their high cost and lack of flexibility. Designing and building more affordable alternatives from scratch can become complex, because it requires profound knowledge in areas such as electronics and low-level programming. With the aim of providing a solution, a working model of a highly flexible, low-cost alternative to commercial recorders was built based on a Raspberry Pi single-board computer. A properly working prototype was assembled, and it demonstrated adequate performance levels in all tested situations. The prototype was equipped with a power-management module, which was thoroughly evaluated; it is estimated that it will allow for great battery savings on long-term scheduled recordings. The underwater recording device was successfully deployed at selected locations along the Brazilian coast, where it adequately recorded animal and man-made acoustic events, among others. Although its power consumption may not be as efficient as that of commercial and/or micro-processed solutions, the advantages offered by the proposed device are its high customizability, lower development time and, inherently, its lower cost.
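
    A duty-cycled recording loop of the kind such a device might run can be sketched in a few lines; this is our illustration (using the third-party sounddevice and soundfile packages), not the authors' firmware, and the rates, durations and paths are made-up assumptions.

    ```python
    import time
    import sounddevice as sd   # pip install sounddevice
    import soundfile as sf     # pip install soundfile

    FS = 48000        # sample rate in Hz (hydrophone front-end assumed)
    CHUNK_S = 60      # seconds recorded per chunk
    SLEEP_S = 540     # idle seconds between chunks (duty cycling saves battery)

    def record_chunk(path):
        audio = sd.rec(int(CHUNK_S * FS), samplerate=FS, channels=1)
        sd.wait()                 # block until the buffer is full
        sf.write(path, audio, FS)

    while True:
        stamp = time.strftime("%Y%m%d-%H%M%S")
        record_chunk(f"/home/pi/recordings/{stamp}.wav")
        time.sleep(SLEEP_S)       # a power-management module could cut power here
    ```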

  2. A web based Publish-Subscribe framework for mobile computing

    Directory of Open Access Journals (Sweden)

    Cosmina Ivan

    2014-05-01

    The growing popularity of mobile devices is permanently changing the Internet user’s computing experience. Smartphones and tablets are beginning to replace the desktop as the primary means of interacting with various information technology and web resources. While mobile devices facilitate the consumption of web resources in the form of web services, the growing demand for consuming services on mobile devices is introducing a complex ecosystem in the mobile environment. This research addresses the communication challenges involved in mobile distributed networks and proposes an event-driven communication approach for information dissemination. It investigates different communication techniques, such as polling, long-polling and server-side push, as client-server interaction mechanisms, and the latest web standard, WebSocket, as the communication protocol within a publish/subscribe paradigm. Finally, this paper introduces and evaluates the proposed framework, which is a hybrid approach combining WebSocket and event-based publish/subscribe for operating in mobile environments.
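
    A minimal WebSocket publish/subscribe broker in the spirit of the framework described could look like the sketch below; the message format and names are our own assumptions, and the third-party websockets package is used, not the paper's implementation.

    ```python
    import asyncio
    import json
    import websockets  # pip install websockets

    subscribers = {}   # topic -> set of live connections

    async def handle(ws, path=None):   # 'path' kept for older websockets versions
        async for raw in ws:
            msg = json.loads(raw)      # assumed: {"op": "sub"|"pub", "topic", "data"}
            if msg["op"] == "sub":
                subscribers.setdefault(msg["topic"], set()).add(ws)
            elif msg["op"] == "pub":
                for peer in subscribers.get(msg["topic"], set()).copy():
                    try:
                        await peer.send(json.dumps(msg["data"]))   # server push
                    except websockets.ConnectionClosed:
                        subscribers[msg["topic"]].discard(peer)

    async def main():
        async with websockets.serve(handle, "0.0.0.0", 8765):
            await asyncio.Future()     # run until cancelled

    asyncio.run(main())
    ```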

  3. Conversational Awareness in Text-Based Computer Mediated Communication

    Science.gov (United States)

    Tran, Minh Hong; Yang, Yun; Raikundalia, Gitesh K.

    Text-based computer-mediated communication (TxtCMC) supports the instant exchange of messages among geographically distributed people. TxtCMC tools, such as Instant Messaging and chat tools, have become increasingly widespread and popular at home and at work. Supporting conversational awareness is an important aspect of TxtCMC: it provides a user with information about the presence and activity of others, and therefore helps to establish a context for the user’s own activity. Unfortunately, current interface designs of TxtCMC provide inadequate support for conversational awareness, especially awareness of turn-taking, conversational context and multiple concurrent conversations. This research addresses these three issues by (1) conducting an empirical study to identify the user need for conversational awareness and (2) designing an interface to support this type of awareness. This chapter presents two innovative prototypes, namely Relaxed Instant Messenger (RIM) and Conversational Dock (ConDock). RIM integrates a sequential interface with an adaptive threaded interface to support awareness of turn-taking and conversational context. ConDock adopts a focus + context visualisation technique to support awareness of multiple conversations. Evaluations of the two prototypes show that they meet their design objectives and are useful in enhancing group communication.

  4. WaveJava: Wavelet-based network computing

    Science.gov (United States)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelet is a powerful theory, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At the distributed sites around the net, these data packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also suits other net-based multimedia information processing, such as network libraries, remote teaching and filmless picture archiving and communications.

  5. Implementing Computer-Based Procedures: Thinking Outside the Paper Margins

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; Bly, Aaron

    2017-06-01

    In the past year there has been increased interest from the nuclear industry in adopting electronic work packages and computer-based procedures (CBPs) in the field. The goal is to incorporate technology to meet the Nuclear Promise requirements of reducing costs, improving efficiency and decreasing human error rates in plant operations. Researchers, together with the nuclear industry, have been investigating the benefits an electronic work package system, and specifically CBPs, would have over current paper-based procedure practices. There are several classifications of CBPs, ranging from a straight copy of the paper-based procedure in PDF format to a more intelligent, dynamic CBP. A CBP system offers a wide variety of improvements, such as context-driven job aids, integrated human performance tools (e.g., placekeeping and correct component verification), and dynamic step presentation. The latter means that the CBP system displays only the steps relevant to the current operating mode, plant status and task at hand. These improvements can reduce the worker’s workload and human error by allowing the worker to focus more on the task at hand. A team of human factors researchers at the Idaho National Laboratory studied and developed design concepts for CBPs for field workers between 2012 and 2016. The focus of the research was to present information in a procedure in a manner that leveraged the dynamic and computational capabilities of a handheld device, allowing the worker to focus more on the task at hand than on the administrative processes currently applied when conducting work in the plant. As part of the research, the team identified types of work, instructions and scenarios where the transition to a dynamic CBP system might not be as beneficial as for other types of work in the plant. In most cases the decision to use a dynamic CBP system and utilize the dynamic capabilities gained will be beneficial to the worker.

  6. Prior-based artifact correction (PBAC) in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heußer, Thorsten, E-mail: thorsten.heusser@dkfz-heidelberg.de; Brehm, Marcus [Medical Physics in Radiology, German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Ritschl, Ludwig [Ziehm Imaging GmbH, Donaustraße 31, 90451 Nürnberg (Germany); Sawall, Stefan; Kachelrieß, Marc [Medical Physics in Radiology, German Cancer Research Center (DKFZ), Im Neuenheimer Feld 280, 69120 Heidelberg, Germany and Institute of Medical Physics, Friedrich–Alexander–University (FAU) of Erlangen–Nürnberg, Henkestraße 91, 91052 Erlangen (Germany)

    2014-02-15

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in form of a planning CT of the same patient or in form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.
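
    To make the data-completion idea tangible, here is a deliberately simplified sketch (ours, not the authors' method: registration is assumed already done, and the smooth sinogram inpainting is reduced to hard substitution) built on scikit-image's radon/iradon.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon  # pip install scikit-image

    def pbac_toy(patient_sino, prior_img, theta, bad_mask):
        """Toy prior-based data completion for CT.

        patient_sino: measured sinogram (detector bins x n_angles)
        prior_img:    prior slice, assumed registered to the patient
        bad_mask:     boolean mask marking missing/corrupt sinogram entries
        """
        prior_sino = radon(prior_img, theta=theta)                # forward-project the prior
        completed = np.where(bad_mask, prior_sino, patient_sino)  # fill corrupt bins
        return iradon(completed, theta=theta)                     # reconstruct corrected image
    ```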

  7. Algorithmic support for commodity-based parallel computing systems.

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Vitus Joseph; Bender, Michael A. (State University of New York, Stony Brook, NY); Bunde, David P. (University of Illinois, Urbana, IL); Phillips, Cynthia Ann

    2003-10-01

    The Computational Plant, or Cplant, is a commodity-based distributed-memory supercomputer under development at Sandia National Laboratories. Distributed-memory supercomputers run many parallel programs simultaneously. Users submit their programs to a job queue. When a job is scheduled to run, it is assigned to a set of available processors. Job runtime depends not only on the number of processors but also on the particular set of processors assigned to it. Jobs should be allocated to localized clusters of processors to minimize communication costs and to avoid bandwidth contention caused by overlapping jobs. This report introduces new allocation strategies and performance metrics based on space-filling curves and one-dimensional allocation strategies. These algorithms are general and simple. Preliminary simulations and Cplant experiments indicate that both space-filling curves and one-dimensional packing improve processor locality compared to the sorted free list strategy previously used on Cplant. These new allocation strategies are implemented in Release 2.0 of the Cplant System Software that was phased into the Cplant systems at Sandia by May 2002. Experimental results then demonstrated that the average number of communication hops between the processors allocated to a job strongly correlates with the job's completion time. This report also gives processor-allocation algorithms for minimizing the average number of communication hops between the assigned processors for grid architectures. The associated clustering problem is as follows: given n points in R^d, find k points that minimize their average pairwise L1 distance. Exact and approximate algorithms are given for these optimization problems. One of these algorithms has been implemented on Cplant and will be included in Cplant System Software, Version 2.1, to be released. In more preliminary work, we suggest improvements to the scheduler separate from the allocator.
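
    The report's allocation metric, the average number of communication hops between a job's processors, is just the mean pairwise L1 distance, which is easy to compute. The toy comparison below (our construction, not the report's code) shows why compact allocations win.

    ```python
    import itertools
    import numpy as np

    def avg_pairwise_l1(coords):
        """Mean L1 (hop) distance over all pairs of allocated processors."""
        pairs = itertools.combinations(coords, 2)
        return float(np.mean([np.abs(a - b).sum() for a, b in pairs]))

    # Two candidate 4-processor allocations on a 2-D mesh:
    compact   = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    scattered = np.array([[0, 0], [0, 5], [7, 0], [7, 5]])
    print(avg_pairwise_l1(compact))    # ~1.33 hops
    print(avg_pairwise_l1(scattered))  # 8.0 hops
    ```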

  8. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Computer technology has provided language-testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g., mode of test delivery, familiarity with computers, etc.), the question is whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. In addition, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between the high- and low-computer-familiarity groups on computer-based writing. The researchers concluded that the two modes comparably measure the same construct.
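
    The statistical test used here is standard; for the record, an independent-samples t-test of the two score sets can be run in a few lines (the scores below are simulated placeholders, not the study's data).

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    paper_scores    = rng.normal(14.2, 2.5, size=40)   # stand-in PBT writing scores
    computer_scores = rng.normal(14.0, 2.5, size=40)   # stand-in CBT writing scores

    t, p = stats.ttest_ind(computer_scores, paper_scores)
    print(f"t = {t:.3f}, p = {p:.3f}")   # p > .05 would mirror the study's finding

    # The abstract's paired-samples comparison would use
    # stats.ttest_rel(scores_a, scores_b) on matched score vectors instead.
    ```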

  9. Computer-based training for safety: comparing methods with older and younger workers.

    Science.gov (United States)

    Wallen, Erik S; Mulloy, Karen B

    2006-01-01

    Computer-based safety training is becoming more common and is being delivered to an increasingly aging workforce. Aging results in a number of changes that make it more difficult to learn from certain types of computer-based training. Instructional designs derived from cognitive learning theories may overcome some of these difficulties. Three versions of computer-based respiratory safety training were shown to older and younger workers who then took a high and a low level learning test. Younger workers did better overall. Both older and younger workers did best with the version containing text with pictures and audio narration. Computer-based training with pictures and audio narration may be beneficial for workers over 45 years of age. Computer-based safety training has advantages but workers of different ages may benefit differently. Computer-based safety programs should be designed and selected based on their ability to effectively train older as well as younger learners.

  10. Evaluation of Computer-Based Procedure System Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Johanna Oxstrand; Katya Le Blanc; Seth Hays

    2012-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by Department of Energy (DOE), performed in close collaboration with industry R&D programs, to provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The introduction of advanced technology in existing nuclear power plants may help to manage the effects of aging systems, structures, and components. In addition, the incorporation of advanced technology in the existing LWR fleet may entice the future workforce, who will be familiar with advanced technology, to work for these utilities rather than more newly built nuclear power plants. Advantages are being sought by developing and deploying technologies that will increase safety and efficiency. One significant opportunity for existing plants to increase efficiency is to phase out the paper-based procedures (PBPs) currently used at most nuclear power plants and replace them, where feasible, with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner that is intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information

  11. The Potential for Computer Based Systems in Modular Engineering

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth

    1998-01-01

    The paper elaborates on knowledge management and the possibility of computer support for the design process of pharmaceutical production plants in relation to the Ph.D. project Modular Engineering.

  12. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    The cloud computing paradigm has gained tremendous momentum and generated intense interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  15. A Mobile-Based Computer Controller via Android Technology

    Directory of Open Access Journals (Sweden)

    Siew-Chin Chong

    2013-02-01

    The evolution of mobile devices, especially in these modern days, has drastically changed the face of business. A mobile phone is often expected to offer computer-like functionality. Yet most mobile phone users find it somewhat inconvenient to do some tasks at their computers: most individuals prefer to change positions, stretch, and sit a bit more comfortably while browsing, and it can be very impractical to be confined to the keyboard and mouse while sitting 5 or 10 feet from the computer. Hence, the proposed application is meant to turn the phone into a wireless keyboard and mouse with a touch-pad, over the wireless network. The prototype proved able to perform most of the actions a normal computer keyboard and mouse can perform.
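
    The desktop side of such a controller reduces to listening for movement and click commands and injecting input events. A hypothetical receiver (our sketch under an assumed wire format, not the paper's implementation) using the pynput package:

    ```python
    import socket
    from pynput.mouse import Button, Controller  # pip install pynput

    mouse = Controller()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9999))   # the phone app would send datagrams here

    while True:
        data, _ = sock.recvfrom(64)
        cmd, *args = data.decode().split()   # assumed wire format: "MOVE 12 -3"
        if cmd == "MOVE":
            dx, dy = map(int, args)
            mouse.move(dx, dy)               # relative move, like a touch-pad
        elif cmd == "CLICK":
            mouse.click(Button.left, 1)
    ```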

  16. Computational Chemistry Data Management Platform Based on the Semantic Web.

    Science.gov (United States)

    Wang, Bing; Dobosh, Paul A; Chalk, Stuart; Sopek, Mirek; Ostlund, Neil S

    2017-01-12

    This paper presents a formal data publishing platform for computational chemistry using semantic web technologies. This platform encapsulates computational chemistry data from a variety of packages in an Extensible Markup Language (XML) file called CSX (Common Standard for eXchange). On the basis of a Gainesville Core (GC) ontology for computational chemistry, a CSX XML file is converted into the JavaScript Object Notation for Linked Data (JSON-LD) format using an XML Stylesheet Language Transformation (XSLT) file. Ultimately the JSON-LD file is converted to subject-predicate-object triples in a Turtle (TTL) file and published on the web portal. By leveraging semantic web technologies, we are able to place computational chemistry data onto web portals as a component of a Giant Global Graph (GGG) such that computer agents, as well as individual chemists, can access the data.
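
    The final JSON-LD-to-Turtle step can be reproduced generically with rdflib (version 6 and later bundles a JSON-LD parser); the fragment and vocabulary URI below are illustrative stand-ins, not the actual CSX output or the Gainesville Core namespace.

    ```python
    from rdflib import Graph

    # Stand-in JSON-LD fragment; "gc" here is a placeholder namespace.
    jsonld = """
    {
      "@context": {"gc": "http://example.org/gainesville-core#"},
      "@id": "http://example.org/calc/1",
      "gc:package": "NWChem",
      "gc:basisSet": "6-31G*",
      "gc:scfEnergy": -76.0267
    }
    """

    g = Graph().parse(data=jsonld, format="json-ld")
    print(g.serialize(format="turtle"))  # subject-predicate-object triples (TTL)
    ```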

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing-model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier-0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  4. Simulation of Neurocomputing Based on Photophobic Reactions of Euglena: Toward Microbe-Based Neural Network Computing

    Science.gov (United States)

    Ozasa, Kazunari; Aono, Masashi; Maeda, Mizuo; Hara, Masahiko

    In order to develop an adaptive computing system, we investigate microscopic optical feedback to a group of microbes (Euglena gracilis in this study) with a neural network algorithm, expecting that the unique characteristics of microbes, especially their strategies to survive and adapt under unfavorable environmental stimuli, will explicitly determine the temporal evolution of the microbe-based feedback system. The photophobic reactions of Euglena are extracted from experiments and built into a Monte Carlo simulation of microbe-based neurocomputing. The simulation revealed good performance of Euglena-based neurocomputing. Dynamic transitions among the solutions are discussed from the viewpoint of feedback instability.

  5. Attraction-Based Computation of Hyperbolic Lagrangian Coherent Structures

    CERN Document Server

    Karrasch, Daniel; Haller, George

    2014-01-01

    Recent advances enable the simultaneous computation of both attracting and repelling families of Lagrangian Coherent Structures (LCS) at the same initial or final time of interest. Obtaining LCS positions at intermediate times, however, has been problematic, because either the repelling or the attracting family is unstable with respect to numerical advection in a given time direction. Here we develop a new approach to compute arbitrary positions of hyperbolic LCS in a numerically robust fashion. Our approach only involves the advection of attracting material surfaces, thereby providing accurate LCS tracking at low computational cost. We illustrate the advantages of this approach on a simple model and on a turbulent velocity data set.

  6. Computer-based facial expression analysis for assessing user experience

    OpenAIRE

    Branco, Pedro

    2006-01-01

    Doctoral thesis in Information Technologies and Systems, specialization area: Programming and Computer Systems Engineering. For the majority of users, computers are difficult and frustrating to use. The proliferation of computers in daily life, in all sorts of shapes and forms, becomes a significant factor that can aggravate and thus degrade users’ acceptance of the technology. Traditional user observation methods, aiming at improving human-computer i...

  7. Computing-Based Modelling in VRML Targeting Computer-Based Teaching (CBT)

    Institute of Scientific and Technical Information of China (English)

    张军; 李艳; 周晓军

    2006-01-01

    VR (Virtual Reality) technology has become one of the key technologies influencing CBT (Computer-Based Teaching). VRML (Virtual Reality Modeling Language)-based VR technology has been designated an important component of the Ministry of Education's modern educational technology [1], yet it is rarely applied in actual CBT courseware. One important reason is that the teachers concerned, including some developers, find complex VRML modelling too difficult. This paper first analyzes the scenario-simulation teaching features of VRML, then gives a comprehensive introduction to, and comparative analysis of, VRML modelling techniques based on computation with geometry, functions and the like. The use of such models in VRML-based scenario teaching is briefly described. It is hoped that the multi-strategy VRML modelling methods analyzed and introduced in this paper will promote the application of VRML technology in CBT.

  8. Silicon-based spin and charge quantum computation

    Directory of Open Access Journals (Sweden)

    Belita Koiller

    2005-06-01

    Silicon-based quantum-computer architectures have attracted attention because of their promise for scalability and their potential for synergetically utilizing the available resources associated with the existing Si technology infrastructure. Electronic and nuclear spins of shallow donors (e.g., phosphorus) in Si are ideal candidates for qubits in such proposals due to their relatively long spin coherence times. For these spin qubits, donor electron charge manipulation by external gates is a key ingredient for control and read-out of single-qubit operations, while shallow donor exchange gates are frequently invoked to perform two-qubit operations. More recently, charge qubits based on tunnel coupling in P2+ substitutional molecular ions in Si have also been proposed. We discuss the feasibility of the building blocks involved in shallow donor quantum computation in silicon, taking into account the peculiarities of silicon electronic structure, in particular the six degenerate states at the conduction band edge. We show that quantum interference among these states does not significantly affect operations involving a single donor, but leads to fast oscillations in electron exchange coupling and in tunnel-coupling strength when the donor pair's relative position is changed on a lattice-parameter scale. These studies illustrate the considerable potential, as well as the tremendous challenges, posed by donor spin and charge as candidates for qubits in silicon.

  9. Interactive, Computer-Based Training Program for Radiological Workers

    Energy Technology Data Exchange (ETDEWEB)

    Trinoskey, P.A.; Camacho, P.I.; Wells, L.

    2000-01-18

    Lawrence Livermore National Laboratory (LLNL) is redesigning its Computer-Based Training (CBT) program for radiological workers. The redesign represents a major effort to produce a single, highly interactive and flexible CBT program that will meet the training needs of a wide range of radiological workers--from researchers and x-ray operators to individuals working in tritium, uranium, plutonium, and accelerator facilities. The new CBT program addresses the broad diversity of backgrounds found at a national laboratory. When a training audience is homogeneous in terms of education level and type of work performed, it is difficult to duplicate the effectiveness of a flexible, technically competent instructor who can tailor a course to the express needs and concerns of a course's participants. Unfortunately, such homogeneity is rare. At LLNL, they have a diverse workforce engaged in a wide range of radiological activities, from the fairly common to the quite exotic. As a result, the Laboratory must offer a wide variety of radiological worker courses. These include a general contamination-control course in addition to radioactive-material-handling courses for both low-level laboratory (i.e., bench-top) activities as well as high-level work in tritium, uranium, and plutonium facilities. They also offer training courses for employees who work with radiation-generating devices--x-ray, accelerator, and E-beam operators, for instance. However, even with the number and variety of courses the Laboratory offers, they are constrained by the diversity of backgrounds (i.e., knowledge and experience) of those to be trained. Moreover, time constraints often preclude in-depth coverage of site- and/or task-specific details. In response to this situation, several years ago LLNL began moving toward computer-based training for radiological workers. Today, that CBT effort includes a general radiological safety course developed by the Department of Energy's Hanford facility and

  10. Dictionary-based image denoising for dual energy computed tomography

    Science.gov (United States)

    Mechlem, Korbinian; Allner, Sebastian; Mei, Kai; Pfeiffer, Franz; Noël, Peter B.

    2016-03-01

    Compared to conventional computed tomography (CT), dual-energy CT allows for improved material decomposition by conducting measurements at two distinct energy spectra. Since radiation exposure is a major concern in clinical CT, there is a need for tools to reduce the noise level in images while preserving diagnostic information. One way to achieve this goal is the application of image-based denoising algorithms after an analytical reconstruction has been performed. We have developed a modified dictionary denoising algorithm for dual-energy CT aimed at exploiting the high spatial correlation between images obtained from different energy spectra. Both the low- and high-energy images are partitioned into small patches, which are subsequently normalized. Combined patches with improved signal-to-noise ratio are formed by a weighted addition of corresponding normalized patches from both images. Assuming that corresponding low- and high-energy image patches are related by a linear transformation, the signal in both patches is added coherently while noise is neglected. Conventional dictionary denoising is then performed on the combined patches. Compared to conventional dictionary denoising and bilateral filtering, our algorithm achieved superior performance in terms of qualitative and quantitative image-quality measures. We demonstrate, in simulation studies, that this approach can produce 2D histograms of the high- and low-energy reconstruction which are characterized by significantly improved material features and separation. Moreover, in comparison to other approaches that attempt denoising without simultaneously using both energy signals, superior similarity to the ground truth can be found with our proposed algorithm.
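
    The patch-combination step can be sketched directly from the description: normalize corresponding low- and high-energy patches, then add them with weights so the (linearly related) signal adds coherently while independent noise partially cancels. This is our reading of the abstract, with hypothetical names, not the authors' code.

    ```python
    import numpy as np

    def combine_patches(low, high, w=0.5):
        """Weighted combination of corresponding normalized dual-energy patches.

        low, high: (n_patches, patch_size) arrays sampled at identical positions.
        """
        def normalize(p):
            mu = p.mean(axis=1, keepdims=True)
            sd = p.std(axis=1, keepdims=True) + 1e-8
            return (p - mu) / sd, mu, sd

        nl, mu_l, sd_l = normalize(low)
        nh, _, _ = normalize(high)
        combined = w * nl + (1.0 - w) * nh   # improved-SNR patches
        # Conventional dictionary denoising (sparse coding over a learned
        # dictionary) would be applied to `combined` before de-normalizing.
        return combined * sd_l + mu_l
    ```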

  11. Semi-supervised adaptation in ssvep-based brain-computer interface using tri-training

    DEFF Research Database (Denmark)

    Bender, Thomas; Kjaer, Troels W.; Thomsen, Carsten E.;

    2013-01-01

    This paper presents a novel and computationally simple tri-training-based semi-supervised steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI). It is implemented with autocorrelation-based features and a Naïve Bayes classifier (NBC). The system uses nine characters...
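
    The feature pipeline the abstract names, autocorrelation features fed to a Naive Bayes classifier, can be outlined as follows (random stand-in epochs and hypothetical parameters, not the paper's data or code).

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    def autocorr_features(epoch, max_lag=40):
        """Normalized autocorrelation of one EEG channel at lags 1..max_lag."""
        x = epoch - epoch.mean()
        full = np.correlate(x, x, mode="full")
        ac = full[full.size // 2:]    # keep non-negative lags
        ac /= ac[0]                   # normalize by zero-lag energy
        return ac[1:max_lag + 1]

    rng = np.random.default_rng(0)
    epochs = rng.normal(size=(60, 512))       # stand-in single-channel segments
    targets = rng.integers(0, 3, size=60)     # stand-in SSVEP target labels

    X = np.array([autocorr_features(e) for e in epochs])
    nbc = GaussianNB().fit(X, targets)        # the NBC from the abstract
    print(nbc.predict(X[:5]))
    ```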

  12. Computer-Based Mathematics Instructions for Engineering Students

    Science.gov (United States)

    Khan, Mustaq A.; Wall, Curtiss E.

    1996-01-01

    Almost every engineering course involves mathematics in one form or another. The analytical process of developing mathematical models is very important for engineering students. However, the computational process involved in the solution of some mathematical problems may be very tedious and time consuming. There is a significant amount of mathematical software such as Mathematica, Mathcad, and Maple designed to aid in the solution of these instructional problems. The use of these packages in classroom teaching can greatly enhance understanding, and save time. Integration of computer technology in mathematics classes, without de-emphasizing the traditional analytical aspects of teaching, has proven very successful and is becoming almost essential. Sample computer laboratory modules are developed for presentation in the classroom setting. This is accomplished through the use of overhead projectors linked to graphing calculators and computers. Model problems are carefully selected from different areas.

  13. Computer networks. Citations from the NTIS data base

    Science.gov (United States)

    Jones, J. E.

    1980-08-01

    Research reports on aspects of computer networks, including hardware, software, data transmission, time sharing, and applicable theory to network design are cited. Specific studies on the ARPA networks, and other such systems are listed.

  14. Graph-based knowledge representation computational foundations of conceptual graphs

    CERN Document Server

    Chein, Michel

    2008-01-01

    In addressing the question of how far it is possible to go in knowledge representation and reasoning through graphs, the authors cover basic conceptual graphs, computational aspects, and kernel extensions. The basic mathematical notions are summarized.

  15. A Computer-Based Atlas of a Rat Dissection.

    Science.gov (United States)

    Quentin-Baxter, Megan; Dewhurst, David

    1990-01-01

    A hypermedia computer program that uses text, graphics, sound, and animation with associative information linking techniques to teach the functional anatomy of a rat is described. The program includes a nonintimidating tutor, to which the student may turn. (KR)

  16. Current Trends in Computer-Based Education in Medicine

    Science.gov (United States)

    Farquhar, Barbara B.; Votaw, Robert G.

    1978-01-01

    Important current trends in the use of computer technology to enhance medical education are reported in the areas of simulation and assessment of clinical competence, curriculum integration, financial support, and means of exchanging views and scientific information. (RAO)

  17. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    Science.gov (United States)

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  18. Comparison of the learning effectiveness of computer-based and ...

    African Journals Online (AJOL)

    Erna Kinsey

  19. Primary School Students' Anxiety and Attitudes toward Computer-Based Learning.

    Science.gov (United States)

    Seng, SeokHoon; Choo, Mooi Lee

    The introduction and implementation of computer-based learning (CBL) in primary schools in Singapore has created both benefits and problems. This study examined the attitudes and level of anxiety of 77 students toward CBL through two scales, the Computer Programming Anxiety Scale and the Liking for Computer-Related Activities Scale. Results showed…

  20. Computer Game-Based Learning: Perceptions and Experiences of Senior Chinese Adults

    Science.gov (United States)

    Wang, Feihong; Lockee, Barbara B.; Burton, John K.

    2012-01-01

    The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…

  1. Design Process of a Goal-Based Scenario on Computing Fundamentals

    Science.gov (United States)

    Beriswill, Joanne Elizabeth

    2014-01-01

    In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…

  2. Asynchronous Distributed Execution of Fixpoint-Based Computational Fields

    DEFF Research Database (Denmark)

    Lluch Lafuente, Alberto; Loreti, Michele; Montanari, Ugo

    2016-01-01

    Coordination is essential for dynamic distributed systems whose components exhibit interactive and autonomous behaviors. Spatially distributed, locally interacting, propagating computational fields are particularly appealing for allowing components to join and leave with little or no overhead. Computational fields are a key ingredient of aggregate programming, a promising software engineering methodology particularly relevant for the Internet of Things. In our approach, space topology is represented by a fixed graph-shaped field, namely a network with attributes on both nodes and arcs, where arcs...
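
    As a concrete, deliberately simplified illustration of a fixpoint-based field (synchronous and centralized here, whereas the paper treats the asynchronous distributed case), the classic hop-count gradient can be computed by iterating a per-node update until nothing changes.

    ```python
    INF = float("inf")

    def gradient_field(nodes, edges, sources):
        """Iterate the gradient (hop-count) field to its least fixpoint.

        nodes: set of node ids; edges: iterable of (a, b) pairs; sources: set.
        """
        nbrs = {n: set() for n in nodes}
        for a, b in edges:                  # undirected network, unit-cost arcs
            nbrs[a].add(b)
            nbrs[b].add(a)
        value = {n: (0 if n in sources else INF) for n in nodes}
        changed = True
        while changed:
            changed = False
            for n in nodes - sources:
                best = min((value[m] + 1 for m in nbrs[n]), default=INF)
                if best != value[n]:
                    value[n], changed = best, True
        return value

    print(gradient_field({1, 2, 3, 4}, [(1, 2), (2, 3), (3, 4)], {1}))
    # -> {1: 0, 2: 1, 3: 2, 4: 3}
    ```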

  3. OpenRS-Cloud:A remote sensing image processing platform based on cloud computing environment

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper explores the use of cloud computing for remote sensing image processing. The main contribution of our work is to develop a remote sensing image processing platform based on cloud computing technology (OpenRS-Cloud). This paper focuses on enabling methodical investigations into the development pattern, computational model, data management and service model, exploring this novel distributed computing model. The experimental InSAR processing flow is implemented to verify the efficiency and feasibility of the OpenRS-Cloud platform. The results show that cloud computing is well suited for computationally intensive and data-intensive remote sensing services.

  4. CLOUD COMPUTING BASED INFORMATION SYSTEMS -PRESENT AND FUTURE

    Directory of Open Access Journals (Sweden)

    Maximilian ROBU

    2012-12-01

    The current economic crisis and the global recession have affected the IT market as well. A solution came from the cloud computing area by optimizing IT budgets and eliminating different types of expenses (servers, licenses, and so on). Cloud computing is an exciting and interesting phenomenon because of its relative novelty and exploding growth. Because of its rise in popularity and usage, cloud computing has established its role as a research topic. However, the tendency is to focus on the technical aspects of cloud computing, thus leaving unexplored the potential that this technology offers. With the help of this technology, new market players arise and manage to break the traditional value chain of service provision. The main focus of this paper is the business aspects of the cloud. In particular, we discuss the economic aspects of using cloud computing (when, why and how to use it) and its impacts on infrastructure, the legal issues that come with using cloud computing, and scalability under partially unclear legislation.

  5. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    Science.gov (United States)

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing, and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms, ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed-sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high-temporal-resolution, real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed-computing-enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed-sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  6. Dynamic tracking of elementary preservice teachers' experiences with computer-based mathematics learning environments

    Science.gov (United States)

    Campbell, Stephen R.

    2003-05-01

    A challenging task in educational research today is to understand the implications of recent developments in computer-based learning environments. At the same time, questions regarding learning and mathematical cognition have long been a central focus of research in mathematics education. Adding technology compounds an already complex problem. Fortunately, computer-based technology also provides researchers with new ways of studying cognition and instruction. This paper introduces a new method for dynamically tracking learners' experiences in computer-based learning environments. Dynamic tracking is illustrated in both a classroom and a clinical setting by drawing on two studies with elementary preservice teachers working in computer-based mathematics learning environments.

  7. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the role of a Computing Run Coordinator and regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely as anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility of resource use. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. Tape utilisation was a focus for the operations teams, with frequent deletion campaigns, from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  13. Computationally Efficient Implementation of Convolution-based Locally Adaptive Binarization Techniques

    OpenAIRE

    Mollah, Ayatullah Faruk; Basu, Subhadip; Nasipuri, Mita

    2012-01-01

    One of the most important steps of document image processing is binarization. The computational requirements of locally adaptive binarization techniques make them unsuitable for devices with limited computing facilities. In this paper, we have presented a computationally efficient implementation of convolution-based locally adaptive binarization techniques keeping the performance comparable to the original implementation. The computational complexity has been reduced from O(W²N²) to O(WN²) wh...
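
    The record does not spell out the optimization, but a standard way to achieve exactly this kind of complexity reduction (assumed here purely for illustration) is to exploit the separability of the local window, so that a W x W local mean costs two 1-D passes of O(WN²) instead of one 2-D pass of O(W²N²):

        import numpy as np

        def box_mean_separable(img, w):
            """Local mean over a (2w+1) x (2w+1) window via two 1-D convolutions;
            cost per pixel is O(W) rather than O(W^2) for the naive 2-D window."""
            k = np.ones(2 * w + 1) / (2 * w + 1)
            rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
            return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

        def adaptive_binarize(img, w=7, bias=0.95):
            """Niblack-style local thresholding (parameters are illustrative)."""
            mean = box_mean_separable(img.astype(float), w)
            return (img > bias * mean).astype(np.uint8)

        page = np.random.randint(0, 256, (128, 128))
        print(adaptive_binarize(page).shape)  # (128, 128)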

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  15. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    Science.gov (United States)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software acquired through a grant from Schlumberger are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross-training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and enabling them to manipulate larger integrated data sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to simultaneously work with the software to visually interrogate a 3-D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found MS-Windows-based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults

  16. A General Theory of Computational Scalability Based on Rational Functions

    CERN Document Server

    Gunther, Neil J

    2008-01-01

    The universal scalability law (USL) of computational capacity is a rational function C_p = P(p)/Q(p), with P(p) a linear polynomial and Q(p) a second-degree polynomial in the number of physical processors p, which has long been used for statistical modeling and prediction of computer system performance. We prove that C_p is equivalent to the synchronous throughput bound for a machine-repairman model with state-dependent service rate. Simpler rational functions, such as Amdahl's law and Gustafson's speedup, are corollaries of this queue-theoretic bound. C_p is both necessary and sufficient for modeling all practical characteristics of computational scalability.
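
    As an illustration, the USL is commonly written in the parametric form C(p) = p / (1 + σ(p-1) + κp(p-1)), where the linear term models contention and the quadratic term models coherency delay; the numerator is linear and the denominator quadratic in p, matching the description above. The sketch below (parameter names and values are illustrative, not taken from the paper) evaluates this form and locates the processor count at which capacity peaks:

        import numpy as np

        def usl_capacity(p, sigma, kappa):
            """Universal Scalability Law: C(p) = p / Q(p), where
            Q(p) = 1 + sigma*(p - 1) + kappa*p*(p - 1) is quadratic in p."""
            p = np.asarray(p, dtype=float)
            return p / (1.0 + sigma * (p - 1.0) + kappa * p * (p - 1.0))

        p = np.arange(1, 257)
        c = usl_capacity(p, sigma=0.05, kappa=0.0005)  # illustrative contention/coherency values
        print(int(p[np.argmax(c)]), round(float(c.max()), 1))  # processor count at peak capacity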

  17. Resource Optimization Based on Demand in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ramakrishnan Ramanathan

    2014-10-01

    Full Text Available Cloud computing gives applications the opportunity to scale computing resources dynamically. A cloud consists of a large number of resources, collectively called a resource pool. These resources are shared among cloud consumers using virtualization technology; the virtualization technologies employed in the cloud environment provide resource consolidation and management. The cloud consists of physical and virtual resources. From the cloud provider's perspective, performance depends on predicting the dynamic nature of users, user demands, and application demands; from the cloud consumer's perspective, a job should be completed on time, with minimum cost and limited resources. Finding an optimal resource allocation is difficult in huge systems such as clusters, data centres, and grids. In this study we present two types of resource allocation schemes, Commitment Allocation (CA) and Over-Commitment Allocation (OCA), at the physical and virtual resource levels. These resource allocation schemes help to identify virtual resource utilization and physical resource availability.
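
    The record does not define the two schemes formally; as a rough sketch of the distinction (class and parameter names below are invented for illustration), commitment allocation caps the virtual resources handed out at physical capacity, while over-commitment lets the sum of reservations exceed it on the expectation that not all consumers peak simultaneously:

        class Host:
            """Toy physical host handing out virtual CPUs under two policies."""
            def __init__(self, physical_cpus, overcommit_ratio=1.0):
                # ratio == 1.0 models Commitment Allocation (CA);
                # ratio  > 1.0 models Over-Commitment Allocation (OCA).
                self.capacity = physical_cpus * overcommit_ratio
                self.allocated = 0.0

            def try_allocate(self, vcpus):
                if self.allocated + vcpus <= self.capacity:
                    self.allocated += vcpus
                    return True
                return False

        ca = Host(physical_cpus=16)
        oca = Host(physical_cpus=16, overcommit_ratio=2.0)
        print(ca.try_allocate(12), ca.try_allocate(12))    # True False
        print(oca.try_allocate(12), oca.try_allocate(12))  # True True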

  18. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet specific needs of distance students, namely: students' inability to sit for the scheduled test; conflicting test schedules; and students' need for flexibility in taking examinations to improve their grades. In 2004, UT initiated a pilot project to develop a system and program for the computer-based testing method. In 2005 and 2006, tryouts of the computer-based testing methods were conducted in 7 Regional Offices that were considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the test method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by the Regional Office staff. The development of the computer-based testing began with tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of a test involves the generation and selection of test items from the item bank collection of the UT Examination Center, such that the combination of selected items satisfies the test specification. Currently UT offers 250 courses involving the use of computer-based testing. Students expect that more courses will be offered with computer-based testing in Regional Offices within easy access by students.

  19. Computer-Based Training of Cannon Fire Direction Specialists

    Science.gov (United States)

    1993-01-01

    OCR-garbled fragment of the report's reference list and task-rating tables; the recoverable content includes: Training Programs, John D. Winkler, Stephen J. Kirin, and John S. Uebersax, 1992, N-3527-A; "The Army Military Occupational Specialty Database," Stephen J. Kirin and...; and rated training tasks such as "Determine and announce fire commands for prearranged fires" (-1.196), "Compute firing data manually for toxic chemical projectile" (-1.458), "Perform operator's PMCS on SB-22 PT switchboards" (-1.4547), and "Determine firing...

  20. Digital image processing using parallel computing based on CUDA technology

    Science.gov (United States)

    Skirnevskiy, I. P.; Pustovit, A. V.; Abdrashitova, M. O.

    2017-01-01

    This article describes the expediency of using a graphics processing unit (GPU) for big data processing in the context of digital image processing. It provides a short description of parallel computing technology and its usage in different areas, a definition of image noise, and a brief overview of some noise removal algorithms. It also describes some basic requirements that a noise removal algorithm should meet for application to computed tomography projections. It compares performance with and without the GPU, as well as with different percentages of the work split between CPU and GPU.
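
    CUDA code itself is not reproduced in this record; as a CPU reference sketch of the kind of noise-removal kernel being discussed (a median filter, chosen here as an assumption), note that every output pixel is computed independently, which is exactly what lets such kernels map naturally onto GPU threads:

        import numpy as np

        def median_filter(img, k=1):
            """(2k+1) x (2k+1) median filter; each output pixel is independent,
            so on a GPU one thread per pixel can do this work in parallel."""
            padded = np.pad(img, k, mode="edge")
            out = np.empty_like(img)
            h, w = img.shape
            for y in range(h):
                for x in range(w):
                    out[y, x] = np.median(padded[y:y + 2 * k + 1, x:x + 2 * k + 1])
            return out

        noisy = np.random.randint(0, 256, (64, 64)).astype(np.uint8)
        print(median_filter(noisy).shape)  # (64, 64)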

  1. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route... We propose two algorithms, shown empirically to be scalable, that adopt different approaches to computing the query: algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings.

  2. Trajectory Based Optimal Segment Computation in Road Network Databases

    DEFF Research Database (Denmark)

    Li, Xiaohui; Ceikute, Vaida; Jensen, Christian S.

    2013-01-01

    Given a road network, a set of existing facilities, and a collection of customer route traversals, an optimal segment query returns the optimal road network segment(s) for a new facility. We propose a practical framework for computing this query, where each route... We propose two algorithms, shown empirically to be scalable, that adopt different approaches to computing the query: algorithm AUG uses graph augmentation, and ITE uses iterative road-network partitioning. Empirical studies with real data sets demonstrate that the algorithms are capable of offering high performance in realistic settings.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  4. Computer code for double beta decay QRPA based calculations

    Energy Technology Data Exchange (ETDEWEB)

    Barbero, C. A.; Mariano, A. [Departamento de Física, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, La Plata, Argentina and Instituto de Física La Plata, CONICET, La Plata (Argentina); Krmpotić, F. [Instituto de Física La Plata, CONICET, La Plata, Argentina and Instituto de Física Teórica, Universidade Estadual Paulista, São Paulo (Brazil); Samana, A. R.; Ferreira, V. dos Santos [Departamento de Ciências Exatas e Tecnológicas, Universidade Estadual de Santa Cruz, BA (Brazil); Bertulani, C. A. [Department of Physics, Texas A and M University-Commerce, Commerce, TX (United States)

    2014-11-11

    The computer code developed by our group some years ago for the evaluation of nuclear matrix elements, within the QRPA and PQRPA nuclear structure models, involved in neutrino-nucleus reactions, muon capture and β± processes, is extended to also include nuclear double beta decay.

  5. Distriblets: Java-Based Distributed Computing on the Web.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Brennan, Brian; Brennan, Chris

    1999-01-01

    Describes a system, written in the Java programming language, that uses the World Wide Web to distribute computational tasks to multiple hosts on the Web. Describes the programs written to carry out the load distribution, the structure of a "distriblet" class, and experiences in using this system. (Author/LRW)

  6. Traditional Host based Intrusion Detection Systems’ Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Masoudeh Keshavarzi

    Full Text Available Cloud computing is one of the hottest topics in IT today. It can help enterprises improve the creation and delivery of IT solutions by allowing them to access services more flexibly and cost-effectively. Security concerns in the cloud environment are the ...

  7. A spline-based approach for computing spatial impulse responses.

    Science.gov (United States)

    Ellis, Michael A; Guenther, Drake; Walker, William F

    2007-05-01

    Computer simulations are an essential tool for the design of phased-array ultrasonic imaging systems. FIELD II, which determines the two-way temporal response of a transducer at a point in space, is the current de facto standard for ultrasound simulation tools. However, the need often arises to obtain two-way spatial responses at a single point in time, a set of dimensions for which FIELD II is not well optimized. This paper describes an analytical approach for computing the two-way, far-field, spatial impulse response from rectangular transducer elements under arbitrary excitation. The described approach determines the response as the sum of polynomial functions, making computational implementation quite straightforward. The proposed algorithm, named DELFI, was implemented as a C routine under Matlab and results were compared to those obtained under similar conditions from the well-established FIELD II program. Under the specific conditions tested here, the proposed algorithm was approximately 142 times faster than FIELD II for computing spatial sensitivity functions with similar amounts of error. For temporal sensitivity functions with similar amounts of error, the proposed algorithm was about 1.7 times slower than FIELD II using rectangular elements and 19.2 times faster than FIELD II using triangular elements. DELFI is shown to be an attractive complement to FIELD II, especially when spatial responses are needed at a specific point in time.

  8. Computer-based learning: games as an instructional strategy.

    Science.gov (United States)

    Blake, J; Goodman, J

    1999-01-01

    Games are a creative teaching strategy that enhances learning and problem solving. Gaming strategies are being used by the authors to make learning interesting, stimulating and fun. This article focuses on the development and implementation of computer games as an instructional strategy. Positive outcomes have resulted from the use of games in the classroom.

  9. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    Science.gov (United States)

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…

  10. A Puzzle-Based Seminar for Computer Engineering Freshmen

    Science.gov (United States)

    Parhami, Behrooz

    2008-01-01

    We observe that recruitment efforts aimed at alleviating the shortage of skilled workforce in computer engineering must be augmented with strategies for retaining and motivating the students after they have enrolled in our educational programmes. At the University of California, Santa Barbara, we have taken a first step in this direction by…

  11. Computer Based Learning and Training in Colour Harmony.

    Science.gov (United States)

    Nikov, Alexander S.; Georgiev, Georgi G.

    1992-01-01

    Describes an interactive computer program, COLOR-TEST, that was developed at the Technical University of Sofia (Bulgaria) to teach color harmony and color coordination to industrial design students. A brief evaluation of the program is presented, and further developments for the program are suggested. (nine references) (LRW)

  12. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    Science.gov (United States)

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…

  13. Large Scale Development of Computer-Based Instructional Systems.

    Science.gov (United States)

    Olivier, William P.; Scott, G.F.

    The Individualization Project at the Ontario Institute for Studies in Education (OISE) was organized on a cooperative basis with a federal agency and several community colleges to move smoothly from R&D to a production mode of operation, and finally to emphasize dissemination of computer courseware and systems. The key to the successful…

  14. Intel-Based Mac Computers Improve Teaching and Learning

    Science.gov (United States)

    Technology & Learning, 2007

    2007-01-01

    Today, Mac computers offer schools an easy and powerful way to engage students in learning, foster 21st century skills and leverage existing software assets. Innovative software and hardware built into the Mac allows students to demonstrate their individual strengths--empowering them to be creators of content, rather than just consumers. Judging…

  15. Intel-Based Mac Computers Improve Teaching and Learning

    Science.gov (United States)

    Technology & Learning, 2007

    2007-01-01

    Today, Mac computers offer schools an easy and powerful way to engage students in learning, foster 21st century skills and leverage existing software assets. Innovative software and hardware built into the Mac allows students to demonstrate their individual strengths--empowering them to be creators of content, rather than just consumers. Judging…

  16. An Architectural Design System Based on Computer Graphics.

    Science.gov (United States)

    MacDonald, Stephen L.; Wehrli, Robert

    The recent developments in computer hardware and software are presented to inform architects of this design tool. Technical advancements in equipment include--(1) cathode ray tube displays, (2) light pens, (3) print-out and photo copying attachments, (4) controls for comparison and selection of images, (5) chording keyboards, (6) plotters, and (7)…

  17. Computers and Resource-Based History Teaching: A UK Perspective.

    Science.gov (United States)

    Spaeth, Donald A.; Cameron, Sonja

    2000-01-01

    Presents an overview of developments in computer-aided history teaching for higher education in the United Kingdom and the United States. Explains that these developments have focused on providing students with access to primary sources to enhance their understanding of historical methods and content. (CMK)

  18. Gender and Software Effects in Computer-Based Problem Solving.

    Science.gov (United States)

    Littleton, Karen; And Others

    Whether gender differences in performance using computer software are due to sex stereotyping or gender differentiation in the programs was investigated in two studies. An adventure game, "King and Crown," with all male characters, and a gender neutral game, "Honeybears," were played by 26 female and 26 male 11- and 12-year-olds in Milton Keynes…

  19. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  20. Computer-Based Grammar Instruction in an EFL Context: Improving the Effectiveness of Teaching Adverbial Clauses

    Science.gov (United States)

    Kiliçkaya, Ferit

    2015-01-01

    This study aims to find out whether there are any statistically significant differences in participants' achievements on three different types of instruction: computer-based instruction, teacher-driven instruction, and teacher-driven grammar supported by computer-based instruction. Each type of instruction follows the deductive approach. The…

  1. Improving Student Performance through Computer-Based Assessment: Insights from Recent Research.

    Science.gov (United States)

    Ricketts, C.; Wilks, S. J.

    2002-01-01

    Compared student performance on computer-based assessment to machine-graded multiple choice tests. Found that performance improved dramatically on the computer-based assessment when students were not required to scroll through the question paper. Concluded that students may be disadvantaged by the introduction of online assessment unless care is…

  2. Computer versus Paper-Based Reading: A Case Study in English Language Teaching Context

    Science.gov (United States)

    Solak, Ekrem

    2014-01-01

    This research aims to determine the preference of prospective English teachers in performing computer and paper-based reading tasks and to what extent computer and paper-based reading influence their reading speed, accuracy and comprehension. The research was conducted at a State run University, English Language Teaching Department in Turkey. The…

  3. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    Science.gov (United States)

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  4. Computer-Based Grammar Instruction in an EFL Context: Improving the Effectiveness of Teaching Adverbial Clauses

    Science.gov (United States)

    Kiliçkaya, Ferit

    2015-01-01

    This study aims to find out whether there are any statistically significant differences in participants' achievements on three different types of instruction: computer-based instruction, teacher-driven instruction, and teacher-driven grammar supported by computer-based instruction. Each type of instruction follows the deductive approach. The…

  5. Effects of an Interactive Computer-Based Reading Strategy on Student Comprehension

    Science.gov (United States)

    Worrell, Jamie L.

    2011-01-01

    The computer-based testing mode has received limited research as a task condition for elementary students as it relates to comprehension for both narrative and expository text. The majority of schools now use computer-based testing to measure students' progress for end of the year exams. Additionally, schools are also delivering state-wide…

  6. Effects of Computer-Based Programs on Mathematical Achievement Scores for Fourth-Grade Students

    Science.gov (United States)

    Ravenel, Jessica; Lambeth, Dawn T.; Spires, Bob

    2014-01-01

    The purpose of the research study was to identify the effects of computer-based programs on mathematical achievement, perceptions, and engagement of fourth-grade students. The 31 student participants were divided into two intervention groups, as a hands-on group and a computer-based group. Student achievement was measured by comparing the pretest…

  7. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    Science.gov (United States)

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…

  8. Neurological rehabilitation of stroke patients via motor imagery-based brain-computer interface technology

    Institute of Scientific and Technical Information of China (English)

    Hongyu Sun; Yang Xiang; Mingdao Yang

    2011-01-01

    The present study utilized motor imagery-based brain-computer interface technology combined with rehabilitation training in 20 stroke patients. Scores on the Berg Balance Scale and the Holden Walking Classification were significantly higher at 4 weeks after treatment (P < 0.01), which suggested that motor imagery-based brain-computer interface technology improved balance and walking in stroke patients.

  9. A Pilot Meta-Analysis of Computer-Based Scaffolding in STEM Education

    Science.gov (United States)

    Belland, Brian R.; Walker, Andrew E.; Olsen, Megan Whitney; Leary, Heather

    2015-01-01

    This paper employs meta-analysis to determine the influence of computer-based scaffolding characteristics and study and test score quality on cognitive outcomes in science, technology, engineering, and mathematics education at the secondary, college, graduate, and adult levels. Results indicate that (a) computer-based scaffolding positively…

  10. Prospective Mathematics Teachers' Views about Using Computer-Based Instructional Materials in Constructing Mathematical Concepts

    Science.gov (United States)

    Bukova-Guzel, Esra; Canturk-Gunhan, Berna

    2011-01-01

    The purpose of the study is to determine prospective mathematics teachers' views about using computer-based instructional materials in constructing mathematical concepts and to reveal how the sample computer-based instructional materials for different mathematical concepts altered their views. This is a qualitative study involving twelve…

  11. Overview of Design, Lifecycle, and Safety for Computer-Based Systems

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2015-01-01

    This document describes the need and justification for the development of a design guide for safety-relevant computer-based systems. This document also makes a contribution toward the design guide by presenting an overview of computer-based systems design, lifecycle, and safety.

  12. Providing Feedback on Computer-Based Algebra Homework in Middle-School Classrooms

    Science.gov (United States)

    Fyfe, Emily R.

    2016-01-01

    Homework is transforming at a rapid rate with continuous advances in educational technology. Computer-based homework, in particular, is gaining popularity across a range of schools, with little empirical evidence on how to optimize student learning. The current aim was to test the effects of different types of feedback on computer-based homework.…

  13. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    Science.gov (United States)

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  14. The Tenth Summative Report of the Office of Computer-Based Instruction.

    Science.gov (United States)

    Hofstetter, Fred T.

    The University of Delaware's work with computer-based instruction since 1974 is summarized with attention to the history and development of the Office of Computer-Based Instruction, university applications, outside user applications, and research and evaluation. PLATO was the system that met the university's criteria, which included support for…

  15. The Ninth Summative Report of the Office of Computer-Based Instruction.

    Science.gov (United States)

    Hofstetter, Fred T.

    The University of Delaware's work with computer-based instruction since 1974 is summarized, with attention to the history and development of the Office of Computer-Based Instruction, university applications, outside user applications, and research and evaluation. PLATO was the system that met the university's criteria, which included: supporting…

  16. The Eleventh Summative Report of the Office of Computer-Based Instruction.

    Science.gov (United States)

    Hofstetter, Fred T.

    The University of Delaware's work with computer-based instruction since 1974 is summarized with attention to the history and development of the Office of Computer-Based Instruction, university applications, outside user applications, and research and evaluation. PLATO was the system that met the university's criteria, which included support for…

  17. Development and validation of a computer-based learning module for wrist arthroscopy.

    Science.gov (United States)

    Obdeijn, M C; Alewijnse, J V; Mathoulin, C; Liverneaux, P; Tuijthof, G J M; Schijven, M P

    2014-04-01

    The objective of this study was to develop and validate a computer-based module for wrist arthroscopy to which a group of experts could consent. The need for such a module was assessed with members of the European Wrist Arthroscopy Society (EWAS). The computer-based module was developed through several rounds of consulting experts on the content. The module's learning enhancement was tested in a randomized controlled trial with 28 medical students who were assigned to either the computer-based module group or the lecture group. The design process led to a useful tool, which is supported by a panel of experts. Although the computer-based module did not enhance learning, the participants did find the module more pleasant to use. Developing learning tools such as this computer-based module can improve the teaching of wrist arthroscopy skills.

  18. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    Science.gov (United States)

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, accurately measuring nodule volume with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using phantoms with artificial nodules. Such phantom studies are limited, however, in their ability to reproduce nodules both accurately and in the variety of sizes and densities required. We therefore propose a new approach that uses computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement between computer-simulated nodules and phantom nodules in the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within an error of 20 % for nodules >5 mm when the CT-value difference between nodule density and the lung background was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We conclude that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
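
    The measured PSF from the paper is not available here; as a minimal sketch of the idea, assuming an isotropic Gaussian as a stand-in for the scanner's point spread function, a simulated nodule is simply an ideal spherical density profile blurred by that PSF:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def simulate_nodule(shape=(64, 64, 64), radius_mm=4.0, voxel_mm=0.5,
                            contrast_hu=500.0, psf_sigma_mm=0.8):
            """Ideal spherical nodule blurred by a Gaussian stand-in for the CT PSF."""
            z, y, x = np.indices(shape)
            c = (np.array(shape) - 1) / 2.0
            r_mm = np.sqrt((z - c[0])**2 + (y - c[1])**2 + (x - c[2])**2) * voxel_mm
            ideal = np.where(r_mm <= radius_mm, contrast_hu, 0.0)  # nodule vs. lung (HU offset)
            return gaussian_filter(ideal, sigma=psf_sigma_mm / voxel_mm)

        nodule = simulate_nodule()
        # Half-maximum segmentation, then voxel count -> volume in millilitres.
        volume_ml = float((nodule > 250.0).sum()) * 0.5**3 / 1000.0
        print(round(volume_ml, 3))  # close to the 0.268 mL of an ideal 4 mm-radius sphere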

  19. A new approach to computer-aided spine surgery: fluoroscopy-based surgical navigation

    OpenAIRE

    Nolte, L.-P.; Slomczykowski, M. A.; Berlemann, U.; Strauss, M. J.; Hofstetter, R; Schlenzka, D.; Laine, T.; Lund, T

    2000-01-01

    A new computer-based navigation system for spinal surgery has been designed. This was achieved by combining intraoperative fluoroscopy-based imaging using conventional C-arm technology with freehand surgical navigation principles. Modules were developed to automate digital X-ray image registration. This is in contrast to existing computed tomography- (CT) based spinal navigation systems, which require a vertebra-based registration procedure. Cross-referencing of the image intensifier with the...

  20. A new similarity computing method based on concept similarity in Chinese text processing

    Institute of Scientific and Technical Information of China (English)

    PENG Jing; YANG DongQing; TANG ShiWei; WANG TengJiao; GAO Jun

    2008-01-01

    The paper proposes a new text similarity computing method based on concept similarity in Chinese text processing. The new method first converts text to a word vector space model, and then splits words into a set of concepts. By computing the inner products between concepts, it obtains the similarity between words. Finally, it computes the similarity of texts based on the similarity of words. The contributions of the paper include: 1) a new similarity computing formula between words; 2) a new text similarity computing method based on word similarity; 3) the successful use of the method in the application of similarity computing of web news; and 4) validation of the method through extensive experiments.
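
    The record gives only the outline of the method; the sketch below follows that outline with invented data (the tiny concept-weight vectors are purely illustrative): word similarity is an inner product of concept vectors, and text similarity is aggregated from word similarities:

        import numpy as np

        # Hypothetical word -> concept-weight vectors (three concepts).
        CONCEPTS = {
            "car":   np.array([0.9, 0.1, 0.0]),
            "auto":  np.array([0.8, 0.2, 0.0]),
            "train": np.array([0.3, 0.7, 0.0]),
            "stock": np.array([0.0, 0.1, 0.9]),
        }

        def word_sim(a, b):
            """Word similarity as the inner product of concept vectors."""
            return float(np.dot(CONCEPTS[a], CONCEPTS[b]))

        def text_sim(t1, t2):
            """Text similarity: mean best-match word similarity, symmetrised."""
            s12 = np.mean([max(word_sim(w, v) for v in t2) for w in t1])
            s21 = np.mean([max(word_sim(w, v) for v in t1) for w in t2])
            return (s12 + s21) / 2.0

        print(round(text_sim(["car", "train"], ["auto", "stock"]), 3))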