WorldWideScience

Sample records for software symbolic computation

  1. Analytical exploration of the thermodynamic potentials by using symbolic computation software

    International Nuclear Information System (INIS)

    Hantsaridou, Anastasia P; Polatoglou, Hariton M

    2005-01-01

    Thermodynamics is a very general theory, based on fundamental symmetries. It generalizes classical mechanics and incorporates theoretical concepts such as field and field equations. Although all these ingredients are of the highest importance for a scientist, they are not given the attention they perhaps deserve in most undergraduate courses. Nowadays, powerful computers in conjunction with equally powerful software can ease the exploration of the crucial ideas of thermodynamics. The purpose of the present work is to show how the utilization of symbolic computation software can lead to a complementary understanding of thermodynamics. The method was applied to first- and second-year physics students at the Aristotle University of Thessaloniki (Greece) during the 2002-2003 academic year. The results indicate that symbolic computation software is appropriate not only for enhancing the teaching of the fundamental principles of thermodynamics and their applications, but also for increasing students' motivation for learning.
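
    The record does not reproduce the authors' worksheets, so the following is only a hedged sketch of the kind of exploration described: deriving a Maxwell relation for an ideal gas symbolically. SymPy is used here; the course may have used a different computer algebra system, and the ideal-gas free energy is an illustrative choice.

      # Illustrative sketch (not the authors' original worksheets): a Maxwell
      # relation derived from the Helmholtz free energy of an ideal gas.
      import sympy as sp

      T, V, n, R = sp.symbols('T V n R', positive=True)

      # Volume-dependent part of the Helmholtz free energy F(T, V) of an ideal gas;
      # entropy S = -dF/dT and pressure P = -dF/dV follow by differentiation.
      F = -n*R*T*sp.log(V)
      S = -sp.diff(F, T)
      P = -sp.diff(F, V)                                  # recovers P = n*R*T/V

      # Maxwell relation from F: (dS/dV)_T = (dP/dT)_V
      print(sp.simplify(sp.diff(S, V) - sp.diff(P, T)))   # 0, confirming the relation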

  2. Symbolic math for computation of radiation shielding

    International Nuclear Information System (INIS)

    Suman, Vitisha; Datta, D.; Sarkar, P.K.; Kushwaha, H.S.

    2010-01-01

    Radiation transport calculations for shielding studies in the field of accelerator technology often involve intensive numerical computations. Traditionally, the radiation transport equation is solved using a finite difference scheme or an advanced finite element method with respect to specific initial and boundary conditions suitable for the geometry of the problem. All these computations need CPU-intensive computer codes for accurate calculation of scalar and angular fluxes. Computation using the symbols of the analytical expression representing the transport equation as objects is an enhanced numerical technique in which the computation is completely algorithm- and data-oriented. An algorithm based on this symbolic math architecture is developed using the Symbolic Math Toolbox of the MATLAB software. The present paper describes the symbolic math algorithm and its application in a case study in which the shielding calculation for a rectangular slab geometry is studied for a line source of specific activity. The application of symbolic math in this domain opens a new paradigm compared to existing computer codes such as DORT. (author)
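
    The paper's MATLAB implementation is not reproduced in the record; the sketch below illustrates the same idea in SymPy under textbook assumptions: the uncollided flux from a line source behind a slab shield is kept as a symbolic object (a Sievert-type integral) and evaluated numerically only once sample data are supplied. The symmetric geometry, source strength and sample numbers are illustrative, not taken from the paper.

      # Hedged sketch: symbolic set-up of a line-source slab-shield calculation,
      # evaluated numerically at the end (SymPy standing in for the MATLAB toolbox).
      import sympy as sp

      theta, theta_max = sp.symbols('theta theta_max', positive=True)
      b = sp.symbols('b', positive=True)          # slab thickness in mean free paths
      S_L, h = sp.symbols('S_L h', positive=True)

      # Sievert-type integral: attenuation kernel for a line source behind a slab
      sievert = sp.Integral(sp.exp(-b/sp.cos(theta)), (theta, 0, theta_max))

      # Uncollided flux at perpendicular distance h (symmetric geometry assumed)
      phi = S_L/(4*sp.pi*h) * 2*sievert

      # The expression stays symbolic until sample data are substituted
      sample = phi.subs({S_L: 1.0e6, h: 100.0, b: 2.0, theta_max: sp.pi/3})
      print(sp.N(sample, 6))                      # numerical quadrature of the integral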

  3. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interplay of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
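
    The MAPLE worksheets themselves are not included in the record; the fragment below sketches, in SymPy, how the lowest-order stochastic perturbation equations mentioned above can be derived symbolically. The generic response function u and the Gaussian input moments are placeholders, not quantities from the paper.

      # Minimal sketch of the second-order stochastic perturbation technique:
      # expand the response, then insert the moments E[db] = 0, E[db^2] = sigma^2.
      import sympy as sp

      b0, db, sigma = sp.symbols('b0 Delta_b sigma', real=True)
      u = sp.Function('u')

      # Second-order expansion of the response about the mean input b0
      expansion = u(b0) + u(b0).diff(b0)*db + sp.Rational(1, 2)*u(b0).diff(b0, 2)*db**2

      # Expectation: substitute the central moments of the zero-mean perturbation
      E_u = expansion.subs(db**2, sigma**2).subs(db, 0)

      # Leading-order (first-order second-moment) variance from the linear term
      Var_u = ((u(b0).diff(b0)*db)**2).subs(db**2, sigma**2)

      print(E_u)      # u(b0) + sigma**2 * u''(b0) / 2
      print(Var_u)    # sigma**2 * u'(b0)**2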

  4. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)
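
    As a small illustration of the symbolic/numerical interface emphasized above (not taken from the paper), the sketch below has a symbolic phase that produces a closed form and a numerical phase that evaluates it on an array; SymPy and NumPy are used, with SciPy supplying the gamma function.

      # Sketch of a computation with both symbolic and numerical phases.
      import numpy as np
      import sympy as sp

      x, n = sp.symbols('x n', positive=True)

      # Symbolic phase: derive a closed form (here a simple parametric integral)
      closed_form = sp.integrate(x**n * sp.exp(-x), (x, 0, sp.oo))   # gamma(n + 1)

      # Numerical phase: compile the symbolic result into a fast numerical function
      f = sp.lambdify(n, closed_form, modules='scipy')               # needs SciPy
      print(closed_form)
      print(f(np.arange(1.0, 6.0)))                                  # [1. 2. 6. 24. 120.]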

  5. Exact computation of the 9-j symbols

    International Nuclear Information System (INIS)

    Lai Shantao; Chiu Jingnan

    1992-01-01

    A useful algebraic formula for the 9-j symbol has been rewritten for convenient use on a computer. A simple FORTRAN program for the exact computation of 9-j symbols has been written for the VAX with VMS version V5.4-1 according to this formula. The results agree with the approximate values in the existing literature. Some specific values of 9-j symbols needed for the intensity and alignments of three-photon nonresonant transitions are tabulated. Approximate 9-j symbol values beyond the limitations of the computer can also be computed by this program. The computer codes for the exact computation of 3-j, 6-j and 9-j symbols are available through electronic mail upon request. (orig.)
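
    The FORTRAN code referred to above is available only by request; as a present-day cross-check (not part of the original paper), the same quantities can be computed exactly with SymPy's Wigner-symbol routines, which return algebraic numbers rather than floats.

      # Exact 3-j, 6-j and 9-j symbols via sympy.physics.wigner.
      from sympy import S
      from sympy.physics.wigner import wigner_3j, wigner_6j, wigner_9j

      print(wigner_3j(2, 6, 4, 0, 0, 0))            # sqrt(715)/143
      print(wigner_6j(1, 1, 1, 1, 1, 1))            # 1/6
      print(wigner_9j(1, 1, 1, 1, 1, 1, 1, 1, 0))   # 1/18

      # Half-integer angular momenta are passed as exact rationals
      print(wigner_9j(S(1)/2, S(1)/2, 1, S(1)/2, S(1)/2, 1, 1, 1, 2))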

  6. Numerical and symbolic scientific computing

    CERN Document Server

    Langer, Ulrich

    2011-01-01

    The book presents the state of the art and results and also includes articles pointing to future developments. Most of the articles center around the theme of linear partial differential equations. Major aspects are fast solvers in elastoplasticity, symbolic analysis for boundary problems, symbolic treatment of operators, computer algebra, and finite element methods, a symbolic approach to finite difference schemes, cylindrical algebraic decomposition and local Fourier analysis, and white noise analysis for stochastic partial differential equations. Further numerical-symbolic topics range from…

  7. Computer-Aided Authoring of Programmed Instruction for Teaching Symbol Recognition. Final Report.

    Science.gov (United States)

    Braby, Richard; And Others

    This description of AUTHOR, a computer program for the automated authoring of programmed texts designed to teach symbol recognition, includes discussions of the learning strategies incorporated in the design of the instructional materials, a description of the hardware and of the algorithm for the software, and current and future developments. Appendices…

  8. Symbolic initiative and its application to computers

    Energy Technology Data Exchange (ETDEWEB)

    Hellerman, L

    1982-01-01

    The author reviews the role of symbolic initiative in mathematics and then defines a sense in which computers compute mathematical functions. This allows a clarification of the semantics of computer and communication data. Turing's view of machine intelligence is examined in terms of its use of symbolic initiative. 12 references.

  9. Scientific applications of symbolic computation

    International Nuclear Information System (INIS)

    Hearn, A.C.

    1976-02-01

    The use of symbolic computation systems for problem solving in scientific research is reviewed. The nature of the field is described, and particular examples are considered from celestial mechanics, quantum electrodynamics and general relativity. Symbolic integration and some more recent applications of algebra systems are also discussed.

  10. Exact computation of the 3-j and 6-j symbols

    International Nuclear Information System (INIS)

    Lai Shantao; Chiu Yingnan

    1990-01-01

    A simple FORTRAN program for the exact computation of 3-j and 6-j symbols has been written for the VAX with VMS version v5.1 in our university's computing center. It goes beyond and contains all of the 3-j and 6-j symbols evaluated in the book by M. Rotenberg, R. Bivins, N. Metropolis and J.K. Wooten Jr. The 3-j symbols up to (30 30 30; m1 m2 m3) and the 6-j symbols up to {20 20 20; 20 20 20} can be computed exactly by this program. Approximate values for larger j's, up to (200 200 200; m1 m2 m3) and {200 200 200; 200 200 200}, can also be computed by this program. (orig.)

  11. Expressions of manipulator kinematic equations via symbolic computation

    International Nuclear Information System (INIS)

    Sasaki, Shinobu

    1993-09-01

    While it is simple in principle to determine the position and orientation of the manipulator hand, the computational process has been regarded as extremely laborious, since trigonometric functions must be evaluated many times for revolute (rotational) joint operations. With the development of a general class of kinematic algorithms based on iterative methods, however, this problem has reached a satisfactory settlement. In the present article, we construct symbolic kinematic equations in an automatic fashion by making use of the algorithm. To this end, the recursive expressions are implemented in the symbolic computation system REDUCE. As a concrete result, a complete kinematic model for a six-jointed arm having all kinematic attributes is provided. Together with work space analysis, the computer-aided generation of kinematic equations in symbolic form will serve to liberate us from their cumbersome derivations. (author)
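
    The REDUCE scripts from the report are not reproduced in the record; the sketch below shows the same automatic-generation idea in SymPy for a planar two-revolute-joint arm, a deliberately smaller example than the six-jointed arm treated in the paper.

      # Symbolic forward kinematics by chaining homogeneous joint transforms.
      import sympy as sp

      theta1, theta2, l1, l2 = sp.symbols('theta1 theta2 l1 l2', real=True)

      def joint_transform(theta, length):
          """Rotation about z by theta followed by translation along the rotated x-axis."""
          return sp.Matrix([
              [sp.cos(theta), -sp.sin(theta), 0, length*sp.cos(theta)],
              [sp.sin(theta),  sp.cos(theta), 0, length*sp.sin(theta)],
              [0,              0,             1, 0],
              [0,              0,             0, 1],
          ])

      T = sp.trigsimp(joint_transform(theta1, l1) * joint_transform(theta2, l2))

      # Hand position in closed symbolic form
      print(T[0, 3])   # l1*cos(theta1) + l2*cos(theta1 + theta2)
      print(T[1, 3])   # l1*sin(theta1) + l2*sin(theta1 + theta2)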

  12. Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics

    CERN Document Server

    Ismail, Mourad

    2001-01-01

    These are the proceedings of the conference "Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics" held at the Department of Mathematics, University of Florida, Gainesville, from November 11 to 13, 1999. The main emphasis of the conference was Computer Algebra (i.e. symbolic computation) and how it related to the fields of Number Theory, Special Functions, Physics and Combinatorics. A subject that is common to all of these fields is q-series. We brought together those who do symbolic computation with q-series and those who need q-series, including workers in Physics and Combinatorics. The goal of the conference was to inform mathematicians and physicists who use q-series of the latest developments in the field of q-series and especially how symbolic computation has aided these developments. Over 60 people were invited to participate in the conference. We ended up having 45 participants at the conference, including six one hour plenary speakers and 28 half hour speakers. T...

  13. Symbolic computation of nonlinear wave interactions on MACSYMA

    International Nuclear Information System (INIS)

    Bers, A.; Kulp, J.L.; Karney, C.F.F.

    1976-01-01

    In this paper the use of a large symbolic computation system - MACSYMA - in determining approximate analytic expressions for the nonlinear coupling of waves in an anisotropic plasma is described. MACSYMA was used to implement the solution of the nonlinear partial differential equations of a fluid plasma model by perturbation expansions and subsequent iterative analytic computations. By interacting with the details of the symbolic computation, the physical processes responsible for particular nonlinear wave interactions could be uncovered and appropriate approximations introduced so as to simplify the final analytic result. Details of the MACSYMA system and its use are discussed and illustrated. (Auth.)

  14. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Second International workshop Geometry and Symbolic Computation

    CERN Document Server

    Walczak, Paweł; Geometry and its Applications

    2014-01-01

    This volume has been divided into two parts: Geometry and Applications. The geometry portion of the book relates primarily to geometric flows, laminations, integral formulae, geometry of vector fields on Lie groups, and osculation; the articles in the applications portion concern some particular problems of the theory of dynamical systems, including mathematical problems of liquid flows and a study of cycles for non-dynamical systems. This work is based on the second international workshop, entitled "Geometry and Symbolic Computations," held on May 15-18, 2013 at the University of Haifa, and is dedicated to modeling (using symbolic calculations) in differential geometry and its applications in fields such as computer science, tomography, and mechanics. It is intended to create a forum for students and researchers in pure and applied geometry to promote discussion of modern state-of-the-art in geometric modeling using symbolic programs such as Maple™ and Mathematica®, as well as presentation of new results. ...

  16. Genomecmp: computer software to detect genomic rearrangements using markers

    Science.gov (United States)

    Kulawik, Maciej; Nowak, Robert M.

    2017-08-01

    Detection of genomic rearrangements is a tough task because of the size of the data to be processed. As genome sequences may consist of hundreds of millions of symbols, it is not only practically impossible to compare them by hand, but it is also a complex problem for computer software. A way to significantly accelerate the process is to use a rearrangement detection algorithm based on unique short sequences called markers. The algorithm described in this paper derives markers from a base genome and finds the marker positions in another genome. The algorithm has been extended with support for ambiguity symbols. A web application with a graphical user interface has been created using a three-layer architecture, in which users can run tasks simultaneously. The accuracy and efficiency of the proposed solution have been studied using generated and real data.
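
    The genomecmp implementation itself is not shown in the record; the simplified Python sketch below illustrates the marker idea described above: markers are short substrings that occur exactly once in the base genome, and their positions are then looked up in the other genome. The toy sequences and the marker length are invented for illustration.

      # Simplified marker-based comparison (not the genomecmp code).
      def unique_markers(genome, k):
          """Return k-mers that occur exactly once in the base genome, with positions."""
          counts, first_pos = {}, {}
          for i in range(len(genome) - k + 1):
              kmer = genome[i:i + k]
              counts[kmer] = counts.get(kmer, 0) + 1
              first_pos.setdefault(kmer, i)
          return {kmer: pos for kmer, pos in first_pos.items() if counts[kmer] == 1}

      def locate_markers(markers, other):
          """Find each marker's position in the other genome (-1 if absent)."""
          return {kmer: other.find(kmer) for kmer in markers}

      base = "ACGTACGT" + "TGCAACGGT"
      other = "TGCAACGGT" + "ACGTACGT"       # the two halves swapped: a toy rearrangement
      markers = unique_markers(base, k=5)
      print(locate_markers(markers, other))  # the changed marker ordering reveals the swap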

  17. Symbolic mathematical computing: orbital dynamics and application to accelerators

    International Nuclear Information System (INIS)

    Fateman, R.

    1986-01-01

    Computer-assisted symbolic mathematical computation has become increasingly useful in applied mathematics. A brief introduction to such capabilities and some examples related to orbital dynamics and accelerator physics are presented. (author)

  18. Use of Writing with Symbols 2000 Software to Facilitate Emergent Literacy Development

    Science.gov (United States)

    Parette, Howard P.; Boeckmann, Nichole M.; Hourcade, Jack J.

    2008-01-01

    This paper outlines the use of the "Writing with Symbols 2000" software to facilitate emergent literacy development. The program's use of pictures incorporated with text has great potential to help young children with and without disabilities acquire fundamental literacy concepts about print, phonemic awareness, alphabetic principle, vocabulary…

  19. An approach to first principles electronic structure calculation by symbolic-numeric computation

    Directory of Open Access Journals (Sweden)

    Akihito Kikuchi

    2013-04-01

    Full Text Available There is a wide variety of electronic structure calculations cooperating with symbolic computation. The main purpose of the latter is to play an auxiliary role (but not without importance) to the former. In the field of quantum physics [1-9], researchers sometimes have to handle complicated mathematical expressions whose derivation seems almost beyond human power. Thus one resorts to the intensive use of computers, namely, symbolic computation [10-16]. Examples of this can be seen in various topics: atomic energy levels, molecular dynamics, molecular energy and spectra, collision and scattering, lattice spin models and so on [16]. How to obtain molecular integrals analytically, or how to manipulate complex formulas in many-body interactions, is one such problem. In the former, when one uses a special atomic basis for a specific purpose, it may sometimes be very difficult to express the integrals as combinations of already known analytic functions. In the latter, one must rearrange a number of creation and annihilation operators in a suitable order and calculate the analytical expectation value. It is usual that a quantitative and massive computation follows a symbolic one; for the convenience of the numerical computation, it is necessary to reduce a complicated analytic expression into a tractable and computable form. This is the main motive for the introduction of symbolic computation as a forerunner of the numerical one, and their collaboration has achieved considerable success. The present work should be classified as one such trial. Meanwhile, the use of symbolic computation in the present work is not limited to an indirect and auxiliary role to the numerical computation. The present work is applicable to a direct and quantitative estimation of the electronic structure, skipping conventional computational methods.
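
    As a hedged, self-contained example of the "molecular integrals in analytic form" theme above (SymPy here, not the symbolic system used in the paper), the one-dimensional factor of the overlap between two s-type Gaussian basis functions is evaluated symbolically.

      # Overlap of two Gaussian basis functions, evaluated in closed form.
      import sympy as sp

      x, A, B = sp.symbols('x A B', real=True)
      a, b = sp.symbols('a b', positive=True)

      S_1d = sp.integrate(sp.exp(-a*(x - A)**2) * sp.exp(-b*(x - B)**2), (x, -sp.oo, sp.oo))
      print(sp.simplify(S_1d))
      # equivalent to sqrt(pi/(a + b)) * exp(-a*b*(A - B)**2/(a + b)),
      # the Gaussian product theorem used in molecular integral evaluation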

  20. Production of graphic symbol sentences by individuals with aphasia: efficacy of a computer-based augmentative and alternative communication intervention.

    Science.gov (United States)

    Koul, Rajinder; Corwin, Melinda; Hayes, Summer

    2005-01-01

    The study employed a single-subject multiple baseline design to examine the ability of 9 individuals with severe Broca's aphasia or global aphasia to produce graphic symbol sentences of varying syntactical complexity using a software program that turns a computer into a speech output communication device. The sentences ranged in complexity from simple two-word phrases to those with morphological inflections, transformations, and relative clauses. Overall, results indicated that individuals with aphasia are able to access, manipulate, and combine graphic symbols to produce phrases and sentences of varying degrees of syntactical complexity. The findings are discussed in terms of the clinical and public policy implications.

  1. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  2. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

    A modern symbolic algebra computer program, MAPLE, is used to compute solutions to the well-known analytical discrete ordinates (SN) equations in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional slab-geometry SN methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution and serve as a complete tool capable of computing analytical SN solutions for mono-energetic, one-dimensional transport problems.

  3. The asymptotic expansion method via symbolic computation

    OpenAIRE

    Navarro, Juan F.

    2012-01-01

    This paper describes an algorithm for implementing a perturbation method based on an asymptotic expansion of the solution to a second-order differential equation. We also introduce a new symbolic computation system which works with the so-called modified quasipolynomials, as well as an implementation of the algorithm on it.

  4. The Asymptotic Expansion Method via Symbolic Computation

    Directory of Open Access Journals (Sweden)

    Juan F. Navarro

    2012-01-01

    Full Text Available This paper describes an algorithm for implementing a perturbation method based on an asymptotic expansion of the solution to a second-order differential equation. We also introduce a new symbolic computation system which works with the so-called modified quasipolynomials, as well as an implementation of the algorithm on it.
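
    The authors' quasipolynomial system is not shown in the record; the fragment below sketches the general procedure in SymPy for a weakly nonlinear second-order equation (a Duffing-type oscillator chosen for illustration): substitute a two-term expansion, separate the orders of the small parameter, and solve order by order.

      # Sketch of a straightforward perturbation expansion of a second-order ODE.
      import sympy as sp

      t, eps = sp.symbols('t epsilon')
      u0, u1 = sp.Function('u0'), sp.Function('u1')

      u = u0(t) + eps*u1(t)
      equation = sp.expand(u.diff(t, 2) + u + eps*u**3)     # u'' + u + eps*u**3 = 0

      order0 = equation.coeff(eps, 0)     # u0'' + u0
      order1 = equation.coeff(eps, 1)     # u1'' + u1 + u0**3
      print(order0)
      print(order1)

      # Feed a leading-order solution into the first-order equation and solve it
      forced = order1.subs(u0(t), sp.cos(t))
      print(sp.dsolve(sp.Eq(forced, 0), u1(t)))
      # the particular solution contains a secular t*sin(t) term, which is exactly
      # what resummation-type asymptotic methods are designed to remove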

  5. Application of symbolic and algebraic manipulation software in solving applied mechanics problems

    Science.gov (United States)

    Tsai, Wen-Lang; Kikuchi, Noboru

    1993-01-01

    As its name implies, symbolic and algebraic manipulation is an operational tool which not only can retain symbols throughout computations but also can express results in terms of symbols. This report starts with a history of symbolic and algebraic manipulators and a review of the literature. With the help of selected examples, the capabilities of symbolic and algebraic manipulators are demonstrated. Their applications to problems of applied mechanics are then presented: the application of automatic formulation to applied mechanics problems, the application to a materially nonlinear problem (rigid-plastic ring compression) by the finite element method (FEM), and the application to plate problems by FEM. The advantages and difficulties, contributions, education, and perspectives of symbolic and algebraic manipulation are discussed. It is well known that there exist some fundamental difficulties in symbolic and algebraic manipulation, such as intermediate expression swell and mathematical limitations. A remedy for these difficulties is proposed, and the three applications mentioned are solved successfully. For example, the closed-form solution of the stiffness matrix of the four-node isoparametric quadrilateral element for the 2-D elasticity problem was not available before; owing to the work presented, its automatic construction becomes feasible. In addition, a new advantage of the application of symbolic and algebraic manipulation was found, which is believed to be crucial in improving the efficiency of program execution in the future. This will substantially shorten the response time of a system, which is very significant for certain systems, such as missile and high-speed aircraft systems, in which time plays an important role.
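
    To make the automatic-formulation idea concrete, the sketch below derives an element stiffness matrix in closed form. The report treats, among others, the four-node quadrilateral; a two-node bar element is used here for brevity, and SymPy stands in for the manipulators discussed in the report.

      # Closed-form element stiffness matrix K = integral of B^T * E*A * B over the element.
      import sympy as sp

      xi, E, A, L = sp.symbols('xi E A L', positive=True)

      N = sp.Matrix([[(1 - xi)/2, (1 + xi)/2]])   # linear shape functions on [-1, 1]
      J = L/2                                      # Jacobian of the mapping to length L
      B = N.diff(xi) / J                           # strain-displacement matrix du/dx

      K = (B.T * E * A * B * J).integrate((xi, -1, 1))
      print(K)                                     # (E*A/L) * [[1, -1], [-1, 1]]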

  6. Symbolic Processing Combined with Model-Based Reasoning

    Science.gov (United States)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  7. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  8. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  9. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  10. Symbolic computation in waste management

    International Nuclear Information System (INIS)

    Grant, M.W.

    1989-01-01

    This paper reports a prototype environment for decision analysis implemented on the PC using an object-oriented LISP. A unique feature of this environment is the extensive use of symbolic computation and object-oriented programming. Models, e.g. contaminant transport and uptake, are built within the environment, which itself recognizes new parameters, uncertainties and functions as they first appear; models can be modified at any time, with the appropriate changes propagated throughout the environment. The environment automatically generates and executes a complete interactive input sequence during model construction. In addition to a primary decision criterion, secondary criteria or constraints may be used to define any decision.
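
    The LISP environment itself is not included in the record; the short Python sketch below illustrates one of the features described above: new parameters are recognized automatically from the model expression, and an input sequence is generated for whatever is still undefined. The decay model and parameter names are invented for illustration.

      # Auto-recognition of model parameters from a symbolic expression.
      import sympy as sp

      C0, k, t = sp.symbols('C0 k t', positive=True)
      model = C0 * sp.exp(-k*t)                       # toy contaminant-decay model

      def input_sequence(expression, known=()):
          """Parameters the environment would prompt for, in sorted order."""
          return sorted(s.name for s in expression.free_symbols if s.name not in known)

      print(input_sequence(model))                    # ['C0', 'k', 't']
      print(input_sequence(model, known=('t',)))      # ['C0', 'k']

      # A newly introduced parameter (an uptake factor, say) is picked up automatically
      U = sp.Symbol('U', positive=True)
      print(input_sequence(model * U))                # ['C0', 'U', 'k']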

  11. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... operation of the software to display a restrictive rights legend or other license notice; and (2) Requires a... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and...

  12. Symbolic computation and abundant travelling wave solutions to ...

    Indian Academy of Sciences (India)

    2016-12-09

    Dec 9, 2016 ... Abstract. In this article, the novel (G′/G)-expansion method is successfully applied to construct abundant travelling wave solutions to the KdV–mKdV equation with the aid of symbolic computation. This equation is one of the most popular equations in soliton physics and appears in many practical scenarios ...
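
    The paper treats the combined KdV–mKdV equation; as a smaller, hedged illustration of the role symbolic computation plays in such work, the fragment below verifies the classical travelling wave solution of the plain KdV equation, u_t + 6*u*u_x + u_xxx = 0, in SymPy (a stand-in for the authors' actual computations, not their method).

      # Symbolic verification of a sech^2 travelling wave for the KdV equation.
      import sympy as sp

      x, t = sp.symbols('x t', real=True)
      c = sp.symbols('c', positive=True)

      u = c/2 * sp.sech(sp.sqrt(c)/2 * (x - c*t))**2     # travelling wave ansatz
      residual = u.diff(t) + 6*u*u.diff(x) + u.diff(x, 3)

      print(sp.simplify(residual.rewrite(sp.exp)))       # 0: the ansatz solves the PDE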

  13. SymPy: symbolic computing in Python

    Directory of Open Access Journals (Sweden)

    Aaron Meurer

    2017-01-01

    Full Text Available SymPy is an open source computer algebra system written in pure Python. It is built with a focus on extensibility and ease of use, through both interactive and programmatic applications. These characteristics have led SymPy to become a popular symbolic library for the scientific Python ecosystem. This paper presents the architecture of SymPy, a description of its features, and a discussion of select submodules. The supplementary material provides additional examples and further outlines details of the architecture and features of SymPy.
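
    A few standard calls (ordinary SymPy usage, not taken from the paper's supplementary material) give the flavour of the interactive and programmatic use described above.

      # Basic SymPy usage: expansion, integration, limits, summation, differentiation.
      import sympy as sp

      x, n = sp.symbols('x n')

      print(sp.expand((x + 1)**4))                               # x**4 + 4*x**3 + 6*x**2 + 4*x + 1
      print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))     # sqrt(pi)
      print(sp.limit(sp.sin(x)/x, x, 0))                         # 1
      print(sp.summation(1/n**2, (n, 1, sp.oo)))                 # pi**2/6
      print(sp.diff(sp.sin(x)*sp.exp(x), x))                     # exp(x)*sin(x) + exp(x)*cos(x)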

  14. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ...) Restricted rights in computer software, limited rights in technical data, or government purpose license... necessary to perfect a license or licenses in the deliverable software or documentation of the appropriate... the license rights obtained. (e) Identification and delivery of computer software and computer...

  15. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Conformity, acceptance... Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...) Conformity and acceptance. Solicitations and contracts requiring the delivery of computer software shall...

  16. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Sabry, Amr

    This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including...... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 and that received the EATCS...

  17. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte…

  18. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine…

  19. Symbolic computer vector analysis

    Science.gov (United States)

    Stoutemyer, D. R.

    1977-01-01

    A MACSYMA program is described which performs symbolic vector algebra and vector calculus. The program can combine and simplify symbolic expressions including dot products and cross products, together with the gradient, divergence, curl, and Laplacian operators. The distribution of these operators over sums or products is under user control, as are various other expansions, including expansion into components in any specific orthogonal coordinate system. There is also a capability for deriving the scalar or vector potential of a vector field. Examples include derivation of the partial differential equations describing fluid flow and magnetohydrodynamics, for 12 different classic orthogonal curvilinear coordinate systems.
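
    The MACSYMA program itself is not reproduced in the record; an analogous computation in SymPy's vector module (a substitution, not the original code) shows the same kind of operations: gradient, curl and divergence of symbolic fields.

      # Symbolic vector calculus: gradient, curl and divergence of a scalar field.
      from sympy.vector import CoordSys3D, gradient, divergence, curl

      N = CoordSys3D('N')
      phi = N.x**2 * N.y * N.z               # a scalar field

      F = gradient(phi)                      # (2xyz, x**2*z, x**2*y) in the N basis
      print(F)
      print(curl(F))                         # zero vector: curl of a gradient vanishes
      print(divergence(F))                   # Laplacian of phi: 2*N.y*N.z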

  20. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  1. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  2. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  3. Exact and approximate probabilistic symbolic execution for nondeterministic programs

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also ... Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  4. Brute force meets Bruno force in parameter optimisation: introduction of novel constraints for parameter accuracy improvement by symbolic computation.

    Science.gov (United States)

    Nakatsui, M; Horimoto, K; Lemaire, F; Ürgüplü, A; Sedoglavic, A; Boulier, F

    2011-09-01

    Recent remarkable advances in computer performance have enabled us to estimate parameter values by the huge power of numerical computation, the so-called 'Brute force', resulting in the high-speed simultaneous estimation of a large number of parameter values. However, these advancements have not been fully utilised to improve the accuracy of parameter estimation. Here the authors review a novel method for parameter estimation using symbolic computation power, 'Bruno force', named after Bruno Buchberger, who found the Gröbner base. In this method, objective functions combining symbolic computation techniques are formulated. First, the authors utilise a symbolic computation technique, differential elimination, which symbolically reduces the system of differential equations of a given model to an equivalent system. Second, since this equivalent system is frequently composed of large equations, the system is further simplified by another symbolic computation. The performance of the authors' method for parameter accuracy improvement is illustrated by two representative models in biology, a simple cascade model and a negative feedback model, in comparison with previous numerical methods. Finally, the limits and extensions of the authors' method are discussed, in terms of the possible power of 'Bruno force' for the development of a new horizon in parameter estimation.

  5. The use of symbolic computation in radiative, energy, and neutron transport calculations

    Science.gov (United States)

    Frankel, J. I.

    This investigation uses symbolic computation in developing analytical methods and general computational strategies for solving both linear and nonlinear, regular and singular, integral and integro-differential equations which appear in radiative and combined mode energy transport. This technical report summarizes the research conducted during the first nine months of the present investigation. The use of Chebyshev polynomials augmented with symbolic computation has clearly been demonstrated in problems involving radiative (or neutron) transport, and mixed-mode energy transport. Theoretical issues related to convergence, errors, and accuracy have also been pursued. Three manuscripts have resulted from the funded research. These manuscripts have been submitted to archival journals. At the present time, an investigation involving a conductive and radiative medium is underway. The mathematical formulation leads to a system of nonlinear, weakly-singular integral equations involving the unknown temperature and various Legendre moments of the radiative intensity in a participating medium. Some preliminary results are presented illustrating the direction of the proposed research.
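
    The report's own formulations are not reproduced here; as a hedged sketch of the Chebyshev-expansion theme under much simpler assumptions (a degenerate kernel, a linear Fredholm equation, SymPy instead of the report's tools), the unknown is expanded in Chebyshev polynomials, the integral is evaluated symbolically, and collocation yields the coefficients.

      # Chebyshev expansion plus symbolic integration for a toy Fredholm equation
      # u(x) - lam * Int_{-1}^{1} x*t*u(t) dt = x, whose exact solution is x/(1 - 2*lam/3).
      import sympy as sp

      x, t = sp.symbols('x t', real=True)
      lam = sp.Rational(1, 4)

      a0, a1, a2 = sp.symbols('a0 a1 a2')
      u = a0*sp.chebyshevt(0, x) + a1*sp.chebyshevt(1, x) + a2*sp.chebyshevt(2, x)

      residual = u - lam*sp.integrate(x*t*u.subs(x, t), (t, -1, 1)) - x

      # Collocate at three points and solve the resulting linear system symbolically
      equations = [sp.Eq(residual.subs(x, p), 0) for p in (-1, 0, 1)]
      coeffs = sp.solve(equations, [a0, a1, a2])
      print(coeffs)                           # {a0: 0, a1: 6/5, a2: 0}
      print(sp.simplify(u.subs(coeffs)))      # 6*x/5, the exact solution for lam = 1/4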

  6. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  7. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  8. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students face many cognitive problems in the current music classroom, and teachers have not yet found a reasonable way to address them. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students' initiative in music learning but also enhance their ability to learn music. Therefore, it is concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  9. Methods in Symbolic Computation and p-Adic Valuations of Polynomials

    Science.gov (United States)

    Guan, Xiao

    Symbolic computation has appeared widely in many mathematical fields such as combinatorics, number theory and stochastic processes. The techniques created in the area of experimental mathematics provide us with efficient ways of symbolic computing and of verifying complicated relations. Part I consists of three problems. The first one focuses on a unimodal sequence derived from a quartic integral; many of its properties are explored with the help of hypergeometric representations and automatic proofs. The second problem tackles the generating function of the reciprocals of the Catalan numbers. It springs from the closed form given by Mathematica. Furthermore, three methods in special functions are used to justify this result. The third problem addresses closed-form solutions for the moments of products of generalized elliptic integrals, combining experimental mathematics and classical analysis. Part II concentrates on the p-adic valuations of polynomials from the perspective of trees. For a given polynomial f(n) indexed by positive integers, the package developed in Mathematica creates a certain tree structure following a couple of rules. The evolution of such trees is studied both rigorously and experimentally from the viewpoint of field extensions, nonparametric statistics and random matrices.
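
    The Mathematica package from the thesis is not included in the record; the Python fragment below sketches the underlying object in a hedged way: the p-adic valuation of a polynomial sequence and the residue classes that branch at the first two levels of the tree. The polynomial and prime are illustrative choices, not those studied in the thesis.

      # p-adic valuations of a polynomial sequence and the first tree levels.
      from sympy import Poly, symbols

      def vp(m, p):
          """p-adic valuation of a nonzero integer m."""
          v = 0
          while m % p == 0:
              m //= p
              v += 1
          return v

      n = symbols('n')
      f = Poly(n**2 + 1, n)
      p = 5

      level1 = [r for r in range(p) if f.eval(r) % p == 0]          # residues mod p
      level2 = [r for r in range(p**2) if f.eval(r) % p**2 == 0]    # residues mod p**2
      print(level1)     # [2, 3]:  5 divides n**2 + 1 exactly when n = 2, 3 (mod 5)
      print(level2)     # [7, 18]: the classes mod 25 where 25 divides n**2 + 1

      print([vp(f.eval(2 + 5*k), 5) for k in range(6)])             # e.g. [1, 2, 1, 1, 1, 1]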

  10. Symbolic derivation of high-order Rayleigh-Schroedinger perturbation energies using computer algebra: Application to vibrational-rotational analysis of diatomic molecules

    Energy Technology Data Exchange (ETDEWEB)

    Herbert, John M. [Kansas State Univ., Manhattan, KS (United States). Dept. of Chemistry

    1997-01-01

    Rayleigh-Schroedinger perturbation theory is an effective and popular tool for describing low-lying vibrational and rotational states of molecules. This method, in conjunction with ab initio techniques for computation of electronic potential energy surfaces, can be used to calculate first-principles molecular vibrational-rotational energies to successive orders of approximation. Because of mathematical complexities, however, such perturbation calculations are rarely extended beyond the second order of approximation, although recent work by Herbert has provided a formula for the nth-order energy correction. This report extends that work and furnishes the remaining theoretical details (including a general formula for the Rayleigh-Schroedinger expansion coefficients) necessary for calculation of energy corrections to arbitrary order. The commercial computer algebra software Mathematica is employed to perform the prohibitively tedious symbolic manipulations necessary for derivation of generalized energy formulae in terms of universal constants, molecular constants, and quantum numbers. As a pedagogical example, a Hamiltonian operator tailored specifically to diatomic molecules is derived, and the perturbation formulae obtained from this Hamiltonian are evaluated for a number of such molecules. This work provides a foundation for future analyses of polyatomic molecules, since it demonstrates that arbitrary-order perturbation theory can successfully be applied with the aid of commercially available computer algebra software.
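
    The report's vibrational-rotational Hamiltonian is not reproduced here; as a compact, hedged check of the low-order Rayleigh-Schroedinger formulas it relies on (SymPy rather than Mathematica, and a two-level model rather than a molecular one), the exact eigenvalue is expanded in the perturbation parameter and compared with the standard second-order correction.

      # Perturbation series of an exact eigenvalue versus the RS second-order formula.
      import sympy as sp

      lam = sp.symbols('lambda')
      E0, v = sp.symbols('E0 v', real=True)
      Delta = sp.symbols('Delta', positive=True)       # level spacing E1 - E0 > 0

      H = sp.Matrix([[E0,    lam*v],
                     [lam*v, E0 + Delta]])             # H = H0 + lambda*V

      # Eigenvalue branch connected to E0, expanded through fourth order in lambda
      E = sp.Symbol('E')
      roots = sp.solve(H.charpoly(E).as_expr(), E)
      exact = [r for r in roots if sp.limit(r, lam, 0) == E0][0]
      series = sp.expand(sp.series(exact, lam, 0, 5).removeO())
      print(series)            # E0 - lambda**2*v**2/Delta + lambda**4*v**4/Delta**3

      # Second-order RS correction |<1|V|0>|**2 / (E0 - E1) = -v**2/Delta
      print(sp.simplify(series.coeff(lam, 2) + v**2/Delta))   # 0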

  11. On the Computational Complexity of the Languages of General Symbolic Dynamical Systems and Beta-Shifts

    DEFF Research Database (Denmark)

    Simonsen, Jakob Grue

    2009-01-01

    We consider the computational complexity of languages of symbolic dynamical systems. In particular, we study complexity hierarchies and membership of the non-uniform class P/poly. We prove: 1. For every time-constructible, non-decreasing function t(n) = ω(n), there is a symbolic dynamical system with language decidable in deterministic time O(n^2 t(n)), but not in deterministic time o(t(n)). 2. For every space-constructible, non-decreasing function s(n) = ω(n), there is a symbolic dynamical system with language decidable in deterministic space O(s(n)), but not in deterministic space o(s(n)). 3. There are symbolic dynamical systems having hard and complete languages under ≤_m^logs- and ≤_m^p-reduction for every complexity class above LOGSPACE in the backbone hierarchy (hence, P-complete, NP-complete, coNP-complete, PSPACE-complete, and EXPTIME-complete sets). 4. There are decidable languages of symbolic...

  12. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to put out graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  13. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  14. 14 CFR 415.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  15. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teach...

  16. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants," and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations," for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors, to ensure such errors are not propagated again in the future.

  17. Symbolic PathFinder v7

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Păsăreanu, Corina

    2014-01-01

    We describe Symbolic PathFinder v7 in terms of its updated design addressing the changes of Java PathFinder v7 and of its new optimization when computing path conditions. Furthermore, we describe the Symbolic Execution Tree Extension, a newly added feature that allows for outputting the symbolic execution tree that characterizes the execution paths covered during symbolic execution. The new extension can be tailored to the needs of subsequent analyses/processing facilities, and we demonstrate this by presenting SPF-Visualizer, which is a tool for customizable visualization of the symbolic execution…

  18. Software For Computing Selected Functions

    Science.gov (United States)

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  19. SYMBOL LEVEL DECODING FOR DUO-BINARY TURBO CODES

    Directory of Open Access Journals (Sweden)

    Yogesh Beeharry

    2017-05-01

    Full Text Available This paper investigates the performance of three different symbol level decoding algorithms for Duo-Binary Turbo codes. Explicit details of the computations involved in the three decoding techniques, and a computational complexity analysis are given. Simulation results with different couple lengths, code-rates, and QPSK modulation reveal that the symbol level decoding with bit-level information outperforms the symbol level decoding by 0.1 dB on average in the error floor region. Moreover, a complexity analysis reveals that symbol level decoding with bit-level information reduces the decoding complexity by 19.6 % in terms of the total number of computations required for each half-iteration as compared to symbol level decoding.

  20. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  1. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  2. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    Science.gov (United States)

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
    • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms.
    • The modular approach, along with the lookup tables implemented, helps avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform.
    • The concept also helps prevent unnecessary computation of already known transforms, thereby saving memory and processing time.
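
    The toolbox itself is not reproduced in the record; the fragment below sketches, in SymPy, the core identity such a toolbox builds on: for a circularly symmetric function f(r), the 2D Fourier transform reduces to a zeroth-order Hankel-type integral. The 2*pi scaling follows one common convention and may differ from the toolbox's; the Gaussian test function is an illustrative choice.

      # 2D Fourier transform of a radially symmetric function via a Hankel-type integral.
      import sympy as sp

      r, k, a = sp.symbols('r k a', positive=True)

      f = sp.exp(-a*r**2)                                    # circularly symmetric Gaussian
      F = 2*sp.pi*sp.integrate(f*sp.besselj(0, k*r)*r, (r, 0, sp.oo))
      print(sp.simplify(F))
      # expected closed form: pi*exp(-k**2/(4*a))/a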

  3. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  4. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Traditional computational models for enterprise software are still, to a great extent, centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resources-based framework arises, which after first cases of use appears useful and worthy of further research.

  5. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  6. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  7. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  8. 48 CFR 212.7003 - Technical data and computer software.

    Science.gov (United States)

    2010-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  9. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  10. 9th and 10th Asian Symposium on Computer Mathematics

    CERN Document Server

    Lee, Wen-shin; Sato, Yosuke

    2014-01-01

    This book covers original research and the latest advances in symbolic, algebraic and geometric computation; computational methods for differential and difference equations; symbolic-numerical computation; mathematics software design and implementation; and scientific and engineering applications, based on invited talks, special sessions and contributed papers presented at the 9th (in Fukuoka, Japan in 2009) and 10th (in Beijing, China in 2012) Asian Symposium on Computer Mathematics (ASCM). Thirty selected and refereed articles in the book present the conference participants' ideas and views on researching mathematics using computers.

  11. Assembly processor program converts symbolic programming language to machine language

    Science.gov (United States)

    Pelto, E. V.

    1967-01-01

    Assembly processor program converts symbolic programming language to machine language. This program translates symbolic codes into computer understandable instructions, assigns locations in storage for successive instructions, and computes locations from symbolic addresses.
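
    A toy two-pass sketch of what such an assembler does (a hypothetical Python illustration, not the program described above):

        # Pass 1 assigns a storage location to each labelled instruction;
        # pass 2 resolves symbolic addresses against the resulting symbol table.
        program = [
            ("START", "LOAD", "COUNT"),
            (None,    "ADD",  "ONE"),
            (None,    "JUMP", "START"),
            ("COUNT", "DATA", "0"),
            ("ONE",   "DATA", "1"),
        ]

        # Pass 1: build the symbol table (label -> address).
        symbol_table = {}
        for address, (label, _op, _operand) in enumerate(program):
            if label is not None:
                symbol_table[label] = address

        # Pass 2: emit instructions with numeric addresses.
        for address, (_label, op, operand) in enumerate(program):
            resolved = symbol_table.get(operand, operand)
            print(f"{address:04d}: {op} {resolved}")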

  12. Software engineering frameworks for the cloud computing paradigm

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents the latest research on Software Engineering Frameworks for the Cloud Computing Paradigm, drawn from an international selection of researchers and practitioners. The book offers both a discussion of relevant software engineering approaches and practical guidance on enterprise-wide software deployment in the cloud environment, together with real-world case studies. Features: presents the state of the art in software engineering approaches for developing cloud-suitable applications; discusses the impact of the cloud computing paradigm on software engineering; offers guidance an

  13. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  14. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  15. Symbolic signal processing

    International Nuclear Information System (INIS)

    Rechester, A.B.; White, R.B.

    1993-01-01

    Complex dynamic processes exhibit many complicated patterns of evolution. How can all these patterns be recognized using only output (observational, experimental) data, without prior knowledge of the equations of motion? A powerful method for doing this is based on symbolic dynamics: (1) Present the output data in symbolic form (a trial language). (2) Construct topological and metric entropies. (3) Develop algorithms for computer optimization of the entropies. (4) By maximizing the entropies, find the most appropriate symbolic language for the purpose of pattern recognition. (5) Test the method using a variety of dynamical models from nonlinear science. The authors are in the process of applying this method to the analysis of MHD fluctuations in tokamaks.
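
    Steps (1)-(3) can be sketched as follows, assuming a simple two-symbol threshold partition and block entropy; the optimization over partitions in steps (3)-(5) is omitted:

        # Minimal sketch of symbolic signal processing: encode a series with a
        # two-symbol threshold partition and estimate its block (metric) entropy.
        # The threshold and block length are assumptions, not optimized values.
        import math
        from collections import Counter

        def symbolize(series, threshold):
            return "".join("1" if x > threshold else "0" for x in series)

        def block_entropy(symbols, block_len):
            blocks = [symbols[i:i + block_len]
                      for i in range(len(symbols) - block_len + 1)]
            counts = Counter(blocks)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        # Logistic-map output stands in for experimental data.
        x, series = 0.4, []
        for _ in range(10_000):
            x = 3.9 * x * (1.0 - x)
            series.append(x)

        symbols = symbolize(series, threshold=0.5)
        print(block_entropy(symbols, block_len=3) / 3)   # entropy rate, bits/symbol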

  16. Exact computation and large angular momentum asymptotics of 3nj symbols: Semiclassical disentangling of spin networks

    International Nuclear Information System (INIS)

    Anderson, Roger W.; Aquilanti, Vincenzo; Silva Ferreira, Cristiane da

    2008-01-01

    Spin networks, namely, the 3nj symbols of quantum angular momentum theory and their generalizations to groups other than SU(2) and to quantum groups, permeate many areas of pure and applied science. The issues of their computation and characterization for large values of their entries are a challenge for diverse fields, such as spectroscopy and quantum chemistry, molecular and condensed matter physics, quantum computing, and the geometry of space time. Here we record progress both in their efficient calculation and in the study of the large j asymptotics. For the 9j symbol, a prototypical entangled network, we present and extensively check numerically formulas that illustrate the passage to the semiclassical limit, manifesting both the occurrence of disentangling and the discrete-continuum transition.
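
    For exact evaluation at modest quantum numbers (the large-j asymptotic regime studied in the paper requires dedicated code), a general-purpose symbolic system already suffices; for example, with SymPy:

        # Exact 9j symbol with SymPy's angular momentum routines (a generic tool,
        # not the authors' optimized algorithms). All rows and columns of the nine
        # angular momenta must satisfy the triangle conditions.
        from sympy.physics.wigner import wigner_9j

        value = wigner_9j(1, 1, 1,
                          1, 1, 1,
                          1, 1, 1)
        print(value)   # exact symbolic value of this entangled spin network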

  17. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  18. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of a comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  19. 48 CFR 52.227-19 - Commercial Computer Software License.

    Science.gov (United States)

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  20. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  1. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  2. Application of symbolic computations to the constitutive modeling of structural materials

    Science.gov (United States)

    Arnold, Steven M.; Tan, H. Q.; Dong, X.

    1990-01-01

    In applications involving elevated temperatures, the derivation of mathematical expressions (constitutive equations) describing the material behavior can be quite time consuming, involved and error-prone. Therefore, intelligent application of symbolic systems to facilitate this tedious process can be of significant benefit. Presented here is a problem-oriented, self-contained symbolic expert system, named SDICE, which is capable of efficiently deriving potential based constitutive models in analytical form. This package, running under DOE MACSYMA, has the following features: (1) potential differentiation (chain rule); (2) tensor computations (utilizing index notation), including both algebra and calculus; (3) efficient solution of sparse systems of equations; (4) automatic expression substitution and simplification; (5) back substitution of invariant and tensorial relations; (6) the ability to form the Jacobian and Hessian matrix; and (7) a relational data base. Limited aspects of invariant theory were also incorporated into SDICE due to the utilization of potentials as a starting point and the desire for these potentials to be frame invariant (objective). The uniqueness of SDICE resides in its ability to manipulate expressions in a general yet pre-defined order and simplify expressions so as to limit expression growth. Results are displayed, when applicable, utilizing index notation. SDICE was designed to aid and complement the human constitutive model developer. A number of examples are utilized to illustrate the various features contained within SDICE. It is expected that this symbolic package can and will provide a significant incentive to the development of new constitutive theories.
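
    The potential-based derivation step that SDICE automates can be sketched with a present-day symbolic system; the quadratic potential below is only an assumed illustration, not one of the constitutive models treated by SDICE:

        # Illustrative only: derive a 1D constitutive relation from an assumed
        # free-energy potential by symbolic differentiation (the chain-rule and
        # potential-differentiation step that SDICE performs in tensor form).
        import sympy as sp

        eps, eps_in, E, H = sp.symbols("epsilon epsilon_in E H", real=True)

        # Assumed potential: elastic part plus hardening in the inelastic strain.
        psi = (sp.Rational(1, 2) * E * (eps - eps_in)**2
               + sp.Rational(1, 2) * H * eps_in**2)

        stress = sp.diff(psi, eps)        # stress conjugate to the total strain
        force = -sp.diff(psi, eps_in)     # driving force on the inelastic strain
        print(sp.simplify(stress), sp.simplify(force))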

  3. Signal- and Symbol-based Representations in Computer Vision

    DEFF Research Database (Denmark)

    Krüger, Norbert; Felsberg, Michael

    We discuss problems of signal- and symbol-based representations in terms of three dilemmas which are faced in the design of each vision system. Signal- and symbol-based representations are opposite ends of a spectrum of conceivable design decisions caught at opposite sides of the dilemmas. We make the inherent problems explicit and describe potential design decisions for artificial visual systems to deal with the dilemmas.

  4. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  5. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  6. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  7. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  8. Teaching cloud computing: a software engineering perspective

    OpenAIRE

    Sommerville, Ian

    2012-01-01

    This short paper discusses the issues of teaching cloud computing from a software engineering rather than a business perspective. It discusses what topics might be covered in a senior course on cloud software engineering.

  9. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  10. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  11. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures from each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  12. Symbolic computation of exact solutions expressible in rational formal hyperbolic and elliptic functions for nonlinear partial differential equations

    International Nuclear Information System (INIS)

    Wang Qi; Chen Yong

    2007-01-01

    With the aid of symbolic computation, some algorithms are presented for the rational expansion methods, which lead to closed-form solutions of nonlinear partial differential equations (PDEs). The new algorithms are given to find exact rational formal polynomial solutions of PDEs in terms of Jacobi elliptic functions, solutions of the Riccati equation and solutions of the generalized Riccati equation. They can be implemented in the symbolic computation system Maple. As applications of the methods, we choose some nonlinear PDEs to illustrate the methods. As a result, we can not only successfully obtain the solutions found by most existing Jacobi elliptic function methods and tanh methods, but also find other new and more general solutions at the same time.
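
    The verification step common to such methods, substituting a candidate closed-form solution into the PDE and letting the computer algebra system reduce the residual to zero, can be sketched in SymPy (shown for the classical KdV soliton rather than the rational formal solutions of the paper):

        # Verify that u = (c/2)*sech(sqrt(c)/2*(x - c*t))**2 solves the KdV
        # equation u_t + 6*u*u_x + u_xxx = 0; SymPy stands in for Maple here.
        import sympy as sp

        x, t = sp.symbols("x t", real=True)
        c = sp.symbols("c", positive=True)

        u = c / (2 * sp.cosh(sp.sqrt(c) / 2 * (x - c * t))**2)
        residual = sp.diff(u, t) + 6 * u * sp.diff(u, x) + sp.diff(u, x, 3)

        print(sp.simplify(residual.rewrite(sp.exp)))   # expected to print 0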

  13. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  14. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31-November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  15. Modular system for the control of complex accelerators using portable software

    International Nuclear Information System (INIS)

    von der Schmitt, H.; Aufhaus, H.

    1982-01-01

    When designing the Mainz Microtron control system, care was taken to achieve an expandable system with long-lived application software. A multi-processor system was built from the beginning. The software is split into modules, according to function and position in hierarchy, which are distributed over the computers. The decoupling which results from modularity eases software development and maintenance. RATFOR was chosen as the implementation language. With a message system for communication between the modules, several aims were reached at once: (1) symbolic addressing of the accelerator components throughout the software layers, (2) transparent access to I/O devices (CAMAC) at remote computers, (3) multitasking in FORTRAN (and RATFOR) programs, (4) a separating layer for adaptation to different operating systems - essential points for software portability. The system has been in operation since April 1979 for the control of MAMI stage I.

  16. The method of covariant symbols in curved space-time

    International Nuclear Information System (INIS)

    Salcedo, L.L.

    2007-01-01

    Diagonal matrix elements of pseudodifferential operators are needed in order to compute effective Lagrangians and currents. For this purpose the method of symbols is often used, which however lacks manifest covariance. In this work the method of covariant symbols, introduced by Pletnev and Banin, is extended to curved space-time with arbitrary gauge and coordinate connections. For the Riemannian connection we compute the covariant symbols corresponding to external fields, the covariant derivative and the Laplacian, to fourth order in a covariant derivative expansion. This allows one to obtain the covariant symbol of general operators to the same order. The procedure is illustrated by computing the diagonal matrix element of a nontrivial operator to second order. Applications of the method are discussed. (orig.)

  17. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System.

  18. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  19. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  20. Combining metric episodes with semantic event concepts within the Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS)

    Science.gov (United States)

    Kelley, Troy D.; McGhee, S.

    2013-05-01

    This paper describes the ongoing development of a robotic control architecture that is inspired by computational cognitive architectures from the discipline of cognitive psychology. The Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS) combines symbolic and sub-symbolic representations of knowledge into a unified control architecture. The new architecture leverages previous work in cognitive architectures, specifically the development of the Adaptive Character of Thought-Rational (ACT-R) and Soar. This paper details current work on learning from episodes or events. The use of episodic memory as a learning mechanism has, until recently, been largely ignored by computational cognitive architectures. This paper details work on metric-level episodic memory streams and methods for translating episodes into abstract schemas. The presentation will include research on learning through novelty and self-generated feedback mechanisms for autonomous systems.

  1. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, a comparison of three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patients Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), has been made in terms of their calculation algorithms and the resulting calculated doses. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed some differences in the final dose reports provided by these software packages, with deviations in the computed effective doses. The coefficient of variation ranges from 3.3 to 23.4 % in SSCT and from 10.6 to 43.8 % in MSCT. It is important that researchers state the name of the software that is used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
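
    For reference, the quoted coefficient of variation is simply the standard deviation of the per-package estimates divided by their mean; with hypothetical effective-dose values:

        # Coefficient of variation across dose-reporting packages.
        # The effective-dose values below are hypothetical, not the study's data.
        import statistics

        doses_mSv = {"CT-Expo": 2.1, "ImPACT": 2.4, "WinDose": 2.0}

        values = list(doses_mSv.values())
        cov = statistics.pstdev(values) / statistics.mean(values) * 100
        print(f"coefficient of variation: {cov:.1f} %")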

  2. Seeding the cloud: Financial bootstrapping in the computer software sector

    OpenAIRE

    Mac An Bhaird, Ciarán; Lynn, Theo

    2015-01-01

    This study investigates resourcing of computer software companies that have adopted cloud computing for the development and delivery of application software. Use of this innovative technology potentially impacts firm financing because the initial infrastructure investment requirement is much lower than for packaged software, lead time to market is shorter, and cloud computing supports instant scalability. We test these predictions by conducting in-depth interviews with founders of 18 independ...

  3. Nonlinear analysis of flexible beams undergoing large rotations Via symbolic computations

    Directory of Open Access Journals (Sweden)

    Yuan Xiaofeng

    2001-01-01

    In this paper, a two-stage approach is presented for analyzing flexible beams undergoing large rotations. In the first stage, the symbolic forms of the equations of motion and the Jacobian matrix are generated by means of MATLAB and written into a MATLAB script file automatically, where the flexible beams are described by the unified formulation presented in our previous paper. In the second stage, the derived equations of motion are solved by means of implicit numerical methods. Several comparison computations are performed. The two-stage approach proves to be much more efficient than a purely numerical one.
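
    The same two-stage pattern, deriving the equations of motion symbolically and then handing them to a numerical integrator, can be sketched for a much simpler system (a single pendulum standing in for the flexible beams, with SymPy/SciPy in place of MATLAB):

        # Stage 1: symbolic Euler-Lagrange equation; Stage 2: numerical integration.
        import sympy as sp
        from scipy.integrate import solve_ivp

        q, qd, qdd = sp.symbols("q qd qdd")        # angle, velocity, acceleration
        g, L = sp.symbols("g L", positive=True)

        # Lagrangian of a unit-mass pendulum; d/dt is applied via the chain rule.
        Lag = sp.Rational(1, 2) * L**2 * qd**2 + g * L * sp.cos(q)
        eom = (sp.diff(Lag, qd, q) * qd + sp.diff(Lag, qd, qd) * qdd
               - sp.diff(Lag, q))
        qdd_expr = sp.solve(sp.Eq(eom, 0), qdd)[0]

        # Stage 2: compile the symbolic result and integrate it numerically.
        f = sp.lambdify((q, g, L), qdd_expr, "numpy")
        rhs = lambda _t, y: [y[1], f(y[0], 9.81, 1.0)]
        sol = solve_ivp(rhs, (0.0, 10.0), [0.5, 0.0], max_step=0.01)
        print(sol.y[0][-1])   # pendulum angle at t = 10 s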

  4. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  5. Three Alternative Symbol-Lock Detectors

    Science.gov (United States)

    Shihabi, Mazen M.; Hinedi, Sami M.; Shah, Biren N.

    1993-01-01

    Three symbol-lock detectors proposed as alternatives in advanced receivers processing non-return-to-zero binary data signals. Two perform operations similar to those of older square-law and absolute-value types. However, integrals computed during nonoverlapping symbol periods and, therefore, only one integrator needed in each such detector. Proposed detectors simpler, but performances worse because noises in overlapping samples correlated, whereas noises in nonoverlapping samples not correlated. Third detector is signal-power-estimator type. Signal integrated during successive half symbol cycles, and therefore only one integrator needed. Half-cycle integrals multiplied to eliminate effect of symbol polarity, and products accumulated during M-cycle observation period to smooth out estimate of signal power. If estimated signal power exceeds threshold, delta, then lock declared.
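
    A rough numerical sketch of the third (signal-power-estimator) detector, with assumed signal, noise, and threshold parameters:

        # Sketch of the signal-power-estimator symbol-lock detector described above.
        # Symbol count M, noise level, and threshold delta are assumed values.
        import numpy as np

        rng = np.random.default_rng(0)
        M, samples_per_symbol = 200, 16

        # NRZ symbol stream plus additive noise.
        symbols = rng.choice([-1.0, 1.0], size=M)
        signal = np.repeat(symbols, samples_per_symbol)
        noisy = signal + rng.normal(0.0, 0.5, size=signal.size)

        # Integrate over successive half-symbol periods, multiply the two halves of
        # each symbol to remove polarity, and accumulate over the M-symbol window.
        half = samples_per_symbol // 2
        halves = noisy.reshape(M, 2, half).sum(axis=2)
        estimate = np.sum(halves[:, 0] * halves[:, 1]) / M

        delta = 0.0                                   # assumed lock threshold
        print("lock declared" if estimate > delta else "no lock")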

  6. Self-organisation of symbolic information

    Science.gov (United States)

    Feistel, R.

    2017-01-01

    Information is encountered in two different appearances, in native form by arbitrary physical structures, or in symbolic form by coded sequences of letters or the like. The self-organised emergence of symbolic information from structural information is referred to as a ritualisation transition. Occurring at some stage in evolutionary history, ritualisation transitions have in common that after the crossover, arbitrary symbols are issued and recognised by information-processing devices, by transmitters and receivers in the sense of Shannon's communication theory. Symbolic information-processing systems exhibit the fundamental code symmetry whose key features, such as largely lossless copying or persistence under hostile conditions, may elucidate the reasons for the repeated successful occurrence of ritualisation phenomena in evolution history. Ritualisation examples are briefly reviewed such as the origin of life, the appearance of human languages, the establishment of emergent social categories such as money, or the development of digital computers. In addition to their role as carriers of symbolic information, symbols are physical structures which also represent structural information. For a thermodynamic description of symbols and their arrangements, it appears reasonable to distinguish between Boltzmann entropy, Clausius entropy and Pauling entropy. Thermodynamic properties of symbols imply that their lifetimes are limited by the 2nd law.

  7. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  8. TMS communications software. Volume 1: Computer interfaces

    Science.gov (United States)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  9. 14 CFR 417.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  10. Symbols of a cosmic order

    Science.gov (United States)

    Madjid, F. Hadi; Myers, John M.

    2016-10-01

    The world runs on networks over which signals communicate sequences of symbols, e.g. numerals. Examining both engineered and natural communications networks reveals an unsuspected order that depends on contact with an unpredictable entity. This order has three roots. The first is a proof within quantum theory that no evidence can ever determine its explanation, so that an agent choosing an explanation must do so unpredictably. The second root is the showing that clocks that step computers do not "tell time" but serve as self-adjusting symbol-handling agents that regulate "logically synchronized" motion in response to unpredictable disturbances. Such a clock-agent has a certain independence as well as the capacity to communicate via unpredictable symbols with other clock-agents and to adjust its own tick rate in response to that communication. The third root is the noticing of unpredictable symbol exchange in natural systems, including the transmission of symbols found in molecular biology. We introduce a symbol-handling agent as a role played in some cases by a person, for example a physicist who chooses an explanation of given experimental outcomes, and in other cases by some other biological entity, and in still other cases by an inanimate device, such as a computer-based detector used in physical measurements. While we forbear to try to explain the propensity of agents at all levels from cells to civilizations to form and operate networks of logically synchronized symbol-handling agents, we point to this propensity as an overlooked cosmic order, an order structured by the unpredictability ensuing from the proof. Appreciating the cosmic order leads to a conception of agency that replaces volition by unpredictability and reconceives the notion of objectivity in a way that makes a place for agency in the world as described by physics. Some specific implications for physics are outlined.

  11. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  12. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory-level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  13. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  14. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations especially estimating and tender analysis. The objectives of this research is to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  15. 48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...

  16. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of technological improvements that are provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work has aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to new technological trends of ''Software Factories''. (author)

  17. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  18. Symbolic Computations and Exact and Explicit Solutions of Some Nonlinear Evolution Equations in Mathematical Physics

    International Nuclear Information System (INIS)

    Oezis, Turgut; Aslan, Imail

    2009-01-01

    With the aid of the symbolic computation system Mathematica, several explicit solutions for Fisher's equation and the CKdV equation are constructed by utilizing an auxiliary equation method, the so-called G'/G-expansion method, where new and more general forms of solutions are also constructed. When the parameters are taken as special values, the previously known solutions are recovered. (general)
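
    The end products of such expansion methods can be checked mechanically; the snippet below verifies (in SymPy rather than Mathematica) a well-known closed-form travelling-wave solution of Fisher's equation, which is of the type these expansions recover:

        # Check a known travelling-wave solution of Fisher's equation
        # u_t = u_xx + u*(1 - u); SymPy stands in for the Mathematica system
        # used in the paper.
        import sympy as sp

        x, t = sp.symbols("x t", real=True)
        u = 1 / (1 + sp.exp(x / sp.sqrt(6) - sp.Rational(5, 6) * t))**2

        residual = sp.diff(u, t) - sp.diff(u, x, 2) - u * (1 - u)
        print(sp.simplify(residual))   # expected to print 0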

  19. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  20. CMS software and computing for LHC Run 2

    CERN Document Server

    INSPIRE-00067576

    2016-11-09

    The CMS offline software and computing system has successfully met the challenge of LHC Run 2. In this presentation, we will discuss how the entire system was improved in anticipation of increased trigger output rate, increased rate of pileup interactions and the evolution of computing technology. The primary goals behind these changes were to increase the flexibility of computing facilities wherever possible, to increase our operational efficiency, and to decrease the computing resources needed to accomplish the primary offline computing workflows. These changes have resulted in a new approach to distributed computing in CMS for Run 2 and for the future as the LHC luminosity should continue to increase. We will discuss changes and plans to our data federation, which was one of the key changes towards a more flexible computing model for Run 2. Our software framework and algorithms also underwent significant changes. We will summarize our experience with a new multi-threaded framework as deployed on ou...

  1. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Science.gov (United States)

    2010-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  2. Symbolics in control design: prospects and research issues

    DEFF Research Database (Denmark)

    Christensen, Anders

    1994-01-01

    The symbolic processor is targeted as a novel basic service in computer-aided control system design. Basic symbolic tools are exemplified. A design process model is formulated for control design, with the subsets manipulator, tools, target and goals. It is argued that symbolic processing will give...... substantial contributions to future design environments, as it provides flexibility of representation not possible with traditional numerics. Based on the design process, views on research issues in the incorporation of symbolic processing into traditional numerical design environments are given...

  3. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    .../contractor proposes its standard commercial software license, those applicable portions thereof consistent... its standard commercial software license until after this purchase order/contract has been issued, or at or after the time the computer software is delivered, such license shall nevertheless be deemed...

  4. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  5. A briefing to verification and validation of computer software

    International Nuclear Information System (INIS)

    Zhang Aisen; Xie Yalian

    2012-01-01

    Nowadays, computer equipment and information processing technology are entering the engineering of instrumentation and process control. Owing to their convenience and other advantages, more and more utilities are glad to use them. After initial use in basic functional control, computer equipment and information processing technology are now widely used in safety-critical control. Consequently, people pay more attention to the quality of computer software, and how to assess and ensure that quality is a major concern. Verification and validation of computer software are important steps in quality assurance. (authors)

  6. Computers and Young Children. Storyboard Software: Flannel Boards in the Computer Age.

    Science.gov (United States)

    Shade, Daniel D.

    1995-01-01

    Describes storyboard software as computer programs with which children can build a story using visuals. Notes the importance of such programs for preliterate or nonreading children. Describes a new storyboard program, "Wiggins in Storyland," and its features. Lists recommended storyboard software programs, with publishers and compatible…

  7. Symbolic computation on cylindrical-modified dust-ion-acoustic nebulons in dusty plasmas

    International Nuclear Information System (INIS)

    Tian Bo; Gao Yitian

    2007-01-01

    In this Letter, for the dust-ion-acoustic waves with azimuthal perturbation in a dusty plasma, a cylindrical modified Kadomtsev-Petviashvili (CMKP) model is constructed by virtue of symbolic computation, with three families of exact analytic solutions obtained as well. Dark and bright CMKP nebulons are investigated with pictures and related to such dusty-plasma environments as supernova shells and Saturn's F-ring. Differences of the CMKP nebulons from other known nebulons are also analyzed, and possibly observable CMKP-nebulonic effects for future plasma experiments are proposed, especially those concerning the possible notch/slot and dark-bright bi-existence

  8. Symbolic behavior in regular classrooms. A specification of symbolic and non-symbolic behavior

    Directory of Open Access Journals (Sweden)

    Stefan eBillinger

    2011-06-01

    Full Text Available Students' capabilities to use symbolic information in the classroom setting could be expected to influence their possibilities to be active and participating. The development of strategies for teachers to compensate for reduced capability needs a specific operational definition of symbolic behavior. Fifty-three students, aged 11 to 13 years, 29 boys and 24 girls, from three classes in the same Swedish compulsory regular school participated in the current study. After a short training sequence, 25 students (47%) were defined as showing symbolic behavior (symbolic), and 28 students (53%) were not (non-symbolic), based on their follow-up test performances. Symbolic and non-symbolic differed significantly on post-test performances (p < .05). Surprisingly, the non-symbolic group deteriorated in their performance, while the symbolic group enhanced their performance (p < .05). The results indicate that the operational definition used in the present study may be useful in further studies relating the capability to show symbolic behavior to students' activity and participation in classroom settings.

  9. NMR-CT image and symbol phantoms

    International Nuclear Information System (INIS)

    Hongo, Syozo; Yamaguchi, Hiroshi; Takeshita, Hiroshi

    1990-01-01

    We have developed Japanese phantoms by two procedures. One is described as a mathematical expression. The other is the 'symbol phantom', built from three-dimensional picture elements, each of which symbolizes an organ name. The concept and algorithm of the symbol phantom enable us to make a phantom for an individual from all of his transversal section images. We obtained 85 transversal section images of the head and trunk, and 40 of the legs, using NMR-CT, and made the individual phantom for the computation of organ doses. The transversal section images were not clear enough to identify all the organs needed for dose estimation, so we had to hand-edit the organ shapes while viewing typical section images; we could not yet build the symbol phantom by automatic editing. Symbols were coded as visible ASCII characters. Once the first-stage symbol phantom is obtained, it can easily be edited with a word processor. A symbol phantom can describe organ shapes more freely than a mathematical phantom. The symbol phantom has several advantages as an individual phantom; the only difficult point is how to determine its end-point as a reference man when we apply the method to building a reference man. (author)
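
    A minimal sketch of the voxel representation described above, with made-up organ codes and voxel dimensions: because every picture element holds a visible character, organ volumes follow from simply counting symbols, and the phantom can be edited as plain text.

        # Sketch of a "symbol phantom": each voxel holds an ASCII code naming an organ.
        # Organ codes, array size and voxel dimensions below are hypothetical.
        import numpy as np

        VOXEL_VOLUME_CM3 = 0.5 * 0.5 * 1.0          # assumed voxel size in cm^3
        codes = {'L': 'lung', 'H': 'heart', 'V': 'liver', '.': 'outside body'}

        # A tiny phantom of 2 slices, 4 rows, 6 columns; a real phantom would have one
        # slice per transversal section image.
        phantom = np.array([
            list("......"), list(".LLHH."), list(".LLHH."), list("......"),
            list("......"), list(".LVVH."), list(".LVVH."), list("......"),
        ], dtype='<U1').reshape(2, 4, 6)

        # Organ volumes are obtained by counting symbols.
        for code, organ in codes.items():
            n = int(np.count_nonzero(phantom == code))
            print(f"{organ:12s} {n:3d} voxels  {n * VOXEL_VOLUME_CM3:6.2f} cm^3")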

  10. Symbol Recognition using Spatial Relations

    OpenAIRE

    K.C., Santosh; Lamiroy, Bart; Wendling, Laurent

    2012-01-01

    International audience; In this paper, we present a method for symbol recognition based on the spatio-structural description of a 'vocabulary' of extracted visual elementary parts. It is applied to symbols in electrical wiring diagrams. The method consists of first grouping vocabulary elements by their types (e.g., circle, corner). We then compute spatial relations between the possible pairs of labelled vocabulary types, which are further used as a basis for bui...

  11. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  12. General Symbol Machines: The First Stage in the Evolution of Symbolic Communication

    Directory of Open Access Journals (Sweden)

    Thomas E. Dickins

    2003-01-01

    Full Text Available Humans uniquely form stimulus equivalence (SE) classes of abstract and unrelated stimuli, i.e. if taught to match A with B and B with C, they will spontaneously match B with A, and C with B (the relation of symmetry), and A with C (transitivity). Other species do not do this. The SE ability is possibly the consequence of a specific selection event in the Homo lineage. SE is of interest because it appears to demonstrate a facility that is core to symbolic behavior. Linguistic symbols, for example, are arbitrarily and symmetrically related to their referent, such that the term banana has no resemblance to bananas but when processed can be used to discriminate bananas. Equally, when bananas are perceived the term banana is readily produced. This relation is arguably the defining mark of symbolic representation. In this paper I shall detail the SE phenomenon and argue that it is evidence for a cognitive device that I term a General Symbol Machine (GSM). The GSM not only sets the background condition for subsequent linguistic evolution but also for other symbolic behaviors such as mathematical reasoning. In so doing, the GSM is not particularly domain-specific. The apparent domain-specificity of, for example, natural language is a consequence of other computational developments. This introduces complexity to evolutionary arguments about cognitive architecture.
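
    As a toy illustration of the derived relations described above (a sketch of the behavioral pattern, not of the proposed General Symbol Machine), the symmetric and transitive pairs can be generated by closing the trained relation:

        # Stimulus equivalence as relation closure: from the trained matches A-B and B-C,
        # derive the symmetric (B-A, C-B) and transitive (A-C) pairs.
        def equivalence_closure(trained):
            rel = set(trained)
            changed = True
            while changed:
                changed = False
                for (x, y) in list(rel):
                    if (y, x) not in rel:                  # symmetry
                        rel.add((y, x)); changed = True
                for (x, y) in list(rel):
                    for (y2, z) in list(rel):
                        if y == y2 and (x, z) not in rel:  # transitivity
                            rel.add((x, z)); changed = True
            return rel

        trained = {('A', 'B'), ('B', 'C')}
        derived = equivalence_closure(trained) - trained
        print(sorted(derived))   # includes ('B', 'A'), ('C', 'B'), ('A', 'C'), plus reflexive pairs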

  13. Active vision and image/video understanding with decision structures based on the network-symbolic models

    Science.gov (United States)

    Kuvich, Gary

    2003-08-01

    Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding, i.e. an interpretation of visual information in terms of such knowledge models. The human brain has been found to emulate knowledge structures in the form of network-symbolic models, which implies an important paradigm shift in our knowledge about the brain: from neural networks to "cortical software". Symbols, predicates and grammars naturally emerge in such active multilevel hierarchical networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type decision structure created via multilevel hierarchical compression of visual information. Mid-level vision processes such as clustering, perceptual grouping and separation of figure from ground are special kinds of graph/network transformations. They convert the low-level image structure into a set of more abstract structures that represent objects and the visual scene, making them easier to analyze with higher-level knowledge structures; higher-level vision phenomena are the results of such analysis. Composition of network-symbolic models works similarly to frames and agents, combining learning, classification and analogy with higher-level model-based reasoning in a single framework. Such models do not require supercomputers. Based on these principles, and using methods of computational intelligence, an image understanding system can convert images into network-symbolic knowledge models and effectively resolve uncertainty and ambiguity, providing a unifying representation for perception and cognition. This allows the creation of new intelligent computer vision systems for the robotics and defense industries.

  14. Titration Calculations with Computer Algebra Software

    Science.gov (United States)

    Lachance, Russ; Biaglow, Andrew

    2012-01-01

    This article examines the symbolic algebraic solution of the titration equations for a diprotic acid, as obtained using "Mathematica," "Maple," and "Mathcad." The equilibrium and conservation equations are solved symbolically by the programs to eliminate the approximations that normally would be performed by the student. Of the three programs,…
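
    A sketch of the same exact treatment in SymPy rather than the three packages named above; the diprotic acid H2A, its constants and the titrant concentration are generic symbols, and no approximations are introduced.

        # Exact titration equations for a diprotic acid H2A titrated with NaOH (SymPy).
        import sympy as sp

        h, Ka1, Ka2, Kw, Ca, Cb = sp.symbols('h Ka1 Ka2 Kw Ca Cb', positive=True)
        # Ca: total analytical concentration of H2A after dilution; Cb: added Na+ concentration.

        oh = Kw / h                                   # water autoionization
        den = h**2 + Ka1*h + Ka1*Ka2                  # from the mass balance on H2A
        h2a = Ca * h**2 / den
        ha  = Ca * Ka1 * h / den
        a2  = Ca * Ka1 * Ka2 / den

        # Charge balance: [Na+] + [H+] = [OH-] + [HA-] + 2[A2-]
        charge = sp.Eq(Cb + h, oh + ha + 2 * a2)

        # Clearing denominators gives a quartic polynomial in [H+], handled exactly by the CAS.
        poly = sp.Poly(sp.together(charge.lhs - charge.rhs).as_numer_denom()[0], h)
        print(poly.degree(), poly.as_expr())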

  15. Computer software to assess weld thickness loss in offshore pipelines: PEDS

    Energy Technology Data Exchange (ETDEWEB)

    Germano, Andre Luiz Silva; Correa, Samanda Cristine Arruda [Centro Universitario Estadual da Zona Oeste (CCMAT/UEZO), Rio de Janeiro, RJ (Brazil)], e-mail: scorrea@nuclear.ufrj.br; Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo Tadeu [Programa de Engenharia Nuclear, COPPE, Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil)], e-mails: emonteiro@nuclear.ufrj.br, ademir@nuclear.ufrj.br, ricardo@lin.ufrj.br

    2010-07-01

    The purpose of this work is to present an initial overview of a computer program named PEDS that assesses weld thickness loss in offshore pipelines through digital radiography. The software calculates the thickness loss using a data bank obtained from computational modeling based on the Monte Carlo MCNPX code. In order to give users more flexibility, the software was written in Java, which allows it to run on Linux, Mac OS X and Windows. Furthermore, tools are provided for image display, for selecting and analyzing specific areas of the image (measuring the average and the area of a selection) and for generating profile plots. Applications of this software in the offshore area are presented. (author)

  16. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  17. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  18. Quality assurance of nuclear medicine computer software

    International Nuclear Information System (INIS)

    Cradduck, T.D.

    1986-01-01

    Although quality assurance activities have become well established for the hardware found in nuclear medicine, little attention has been paid to computer software. This paper outlines some of the problems that exist and indicates some of the solutions presently under development. The major thrust has been towards the establishment of programming standards and comprehensive documentation. Some manufacturers have developed installation verification procedures which programmers are urged to use as models for their own programs. Items that tend to cause erroneous results are discussed, with the emphasis for error detection and correction being placed on proper education and training of the computer operator. The concept of interchangeable data files, or 'software phantoms', for purposes of quality assurance is discussed. (Author)

  19. Certified symbolic management of financial multi-party contracts

    DEFF Research Database (Denmark)

    Bahr, Patrick; Berthold, Jost; Elsman, Martin

    2015-01-01

    Domain-specific languages (DSLs) for complex financial contracts are in practical use in many banks and financial institutions today. Given the level of automation and pervasiveness of software in the sector, the financial domain is immensely sensitive to software bugs. At the same time...... automatically extract a Haskell implementation of an embedded contract DSL along with the formally verified contract management functionality. This approach opens a road map towards more reliable contract management software, including the possibility of analysing contracts based on symbolic instead of numeric...

  20. New exact travelling wave solutions for two potential coupled KdV equations with symbolic computation

    International Nuclear Information System (INIS)

    Yang Zonghang

    2007-01-01

    We find new exact travelling wave solutions for two potential KdV equations introduced by Foursov [Foursov MV. J Math Phys 2000;41:6173-85]. Compared with the extended tanh-function method, the algorithm used in our paper can obtain some new kinds of exact travelling wave solutions. With the aid of symbolic computation, some novel exact travelling wave solutions of the potential KdV equations are constructed

  1. Symbolic computation and solitons of the nonlinear Schroedinger equation in inhomogeneous optical fiber media

    International Nuclear Information System (INIS)

    Li Biao; Chen Yong

    2007-01-01

    In this paper, the inhomogeneous nonlinear Schroedinger equation with loss/gain and frequency chirping is investigated. With the help of symbolic computation, three families of exact analytical solutions are presented by employing the extended projective Riccati equation method. From our results, many previously known results for the nonlinear Schroedinger equation obtained by other authors can be recovered by suitable selections of the arbitrary functions and arbitrary constants. Of optical and physical interest, soliton propagation and soliton interaction are discussed and simulated by computer, including snake-soliton propagation and snake-soliton interaction, boomerang-like soliton propagation and boomerang-like soliton interaction, and dispersion-managed (DM) bright (dark) soliton propagation and DM soliton interaction

  2. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  3. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
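
    A toy sketch of the statistical idea (not Symbolic PathFinder itself): sample inputs, count how often the target event is reached, and summarize the reach probability with a Beta posterior. The example program, the uniform input distribution and the Beta(1, 1) prior are assumptions made for the illustration.

        # Monte Carlo estimation of the probability of reaching a target event, with a
        # Bayesian (Beta) posterior, as in statistical symbolic execution.
        import random

        def program(x, y):
            # Target event: the condition under which an assert would fail.
            return x > 0.7 and y < 0.2

        random.seed(0)
        N, hits = 10_000, 0
        for _ in range(N):
            if program(random.random(), random.random()):
                hits += 1

        # Posterior with a uniform Beta(1, 1) prior.
        alpha, beta = 1 + hits, 1 + (N - hits)
        posterior_mean = alpha / (alpha + beta)
        print(f"hits={hits}, posterior mean reach probability = {posterior_mean:.4f}")
        # The true probability under uniform inputs is 0.3 * 0.2 = 0.06.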

  4. The use of symbolic computation in radiative, energy, and neutron transport calculations. Technical report, 15 August 1992--14 August 1994

    International Nuclear Information System (INIS)

    Frankel, J.I.

    1995-01-01

    This investigation uses symbolic computation in developing analytical methods and general computational strategies for solving both linear and nonlinear, regular and singular, integral and integro-differential equations which appear in radiative and combined mode energy transport. This technical report summarizes the research conducted during the first nine months of the present investigation. The use of Chebyshev polynomials augmented with symbolic computation has clearly been demonstrated in problems involving radiative (or neutron) transport, and mixed-mode energy transport. Theoretical issues related to convergence, errors, and accuracy have also been pursued. Three manuscripts have resulted from the funded research. These manuscripts have been submitted to archival journals. At the present time, an investigation involving a conductive and radiative medium is underway. The mathematical formulation leads to a system of nonlinear, weakly-singular integral equations involving the unknown temperature and various Legendre moments of the radiative intensity in a participating medium. Some preliminary results are presented illustrating the direction of the proposed research
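
    A minimal SymPy sketch of the kind of technique referred to above: a Chebyshev expansion whose coefficients are fixed by collocation, with all integrals carried out symbolically. The kernel, right-hand side and expansion order are chosen purely for illustration and are not taken from the report.

        # Solve u(x) = x + (1/2) * Integral_{-1}^{1} x*t*u(t) dt with a Chebyshev expansion.
        import sympy as sp

        x, t = sp.symbols('x t')
        N = 3
        c = sp.symbols(f'c0:{N + 1}')                      # expansion coefficients
        u = sum(c[k] * sp.chebyshevt(k, x) for k in range(N + 1))

        residual = u - x - sp.Rational(1, 2) * sp.integrate(x * t * u.subs(x, t), (t, -1, 1))

        # Collocate the residual at Chebyshev extreme points to obtain a linear system.
        nodes = [sp.cos(sp.pi * sp.Rational(j, N)) for j in range(N + 1)]
        eqs = [sp.Eq(residual.subs(x, xj), 0) for xj in nodes]
        sol = sp.solve(eqs, c)
        print(sp.simplify(u.subs(sol)))                    # exact solution: 3*x/2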

  5. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined

  6. A Symbolic Computation Approach to Parameterizing Controller for Polynomial Hamiltonian Systems

    Directory of Open Access Journals (Sweden)

    Zhong Cao

    2014-01-01

    Full Text Available This paper considers a controller parameterization method for H∞ control of polynomial Hamiltonian systems (PHSs), which involves internal stability and external disturbance attenuation. The aims of this paper are to design a controller with parameters that ensures the systems are H∞ stable, and to propose an algorithm for solving the parameters of the controller with symbolic computation. The proposed parameterization method avoids solving Hamilton-Jacobi-Isaacs equations, and thus the obtained controllers with parameters are relatively simple in form and easy to apply. Simulation with a numerical example shows that the controller is effective, as it can optimize H∞ control by adjusting parameters. All these results are expected to be of use in the study of H∞ control for nonlinear systems with perturbations.

  7. 48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...

  8. Computer software summaries. Numbers 1 through 423

    International Nuclear Information System (INIS)

    1979-09-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the US Department of Energy and the Nuclear Regulatory Commission. A major activity of the Center is the preparation and publication of two reports issued periodically - the Center's compilation of program abstracts, ANL-7411, and this software summaries report, ANL-8040. The abstracts describe the software packages available in the software exchange library maintained and distributed by the Center. The summaries describe agency-sponsored software that is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. Summaries describe software that is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. The purpose of the summaries report is to keep agency and contractor personnel informed as to the existence, status, and availability of computer programs within the agency, and thereby minimize duplication costs and maximize the value of agency software development efforts

  9. Comparison of Pilot Symbol Embedded Channel Estimation Algorithms

    Directory of Open Access Journals (Sweden)

    P. Kadlec

    2009-12-01

    Full Text Available In this paper, algorithms for pilot-symbol-embedded channel estimation are compared. Attention is turned to the Least Square (LS) channel estimation and the Sliding Correlator (SC) algorithm. Both algorithms are implemented in Matlab to estimate the Channel Impulse Response (CIR) of a channel exhibiting multi-path propagation. The algorithms are compared with respect to their computational demands and the influence of Additive White Gaussian Noise (AWGN), the embedded pilot symbol and the computed CIR on the estimation error.
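
    A compact sketch of the LS branch (NumPy here rather than the paper's Matlab): the received samples are modelled as the known pilot sequence convolved with the CIR plus AWGN, and the CIR is recovered as the least-squares solution. Pilot length, tap count and noise level are arbitrary assumptions.

        # Least Square (LS) estimation of a channel impulse response from embedded pilots.
        import numpy as np

        rng = np.random.default_rng(1)
        L = 4                                      # number of CIR taps (assumed known)
        pilot = rng.choice([-1.0, 1.0], size=32)   # known BPSK pilot symbols
        h_true = np.array([1.0, 0.5, 0.0, -0.2])   # hypothetical multipath CIR

        # Convolution matrix: row n contains pilot[n], pilot[n-1], ..., pilot[n-L+1].
        P = np.array([[pilot[n - k] if n - k >= 0 else 0.0 for k in range(L)]
                      for n in range(len(pilot))])

        y = P @ h_true + 0.05 * rng.standard_normal(len(pilot))   # received samples (AWGN)

        h_ls, *_ = np.linalg.lstsq(P, y, rcond=None)              # LS estimate of the CIR
        print(np.round(h_ls, 3))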

  10. Computer software summaries. Numbers 325 through 423

    Energy Technology Data Exchange (ETDEWEB)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data. (RWR)

  11. Computer software summaries. Numbers 325 through 423

    International Nuclear Information System (INIS)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data

  12. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  13. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2012-08-22

    ... review of applications for permits and licenses. The DG entitled ``Developing Software Life Cycle... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission...

  14. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    Technological obsolescence is an on-going challenge for all computer use. By design, and to some extent good fortune, AECL has had a good track record with respect to the march of obsolescence in CANDU digital control computer technology. Recognizing obsolescence as a fact of life, AECL has undertaken a program of supporting the digital control technology of existing CANDU plants. Other AECL groups are developing complete replacement systems for the digital control computers, and more advanced systems for the digital control computers of future CANDU reactors. This paper presents the results of the efforts of AECL's DCC service support group to replace obsolete digital control computer and related components and to provide friendlier software technology related to the maintenance and use of digital control computers in CANDU. These efforts are expected to extend the current lifespan of existing digital control computers through their mandated life. This group applied two simple rules: the product, whether new or a replacement, should have a generic basis, and the products should be applicable both to existing CANDU plants and to 'repeat' plant designs built using current design guidelines. While some exceptions do apply, the rules have been met. The generic requirement dictates that the product should not be dependent on any brand technology, and should back-fit to and interface with any such technology which remains in the control design. The application requirement dictates that the product should have universal use and be user friendly to the greatest extent possible. Furthermore, both requirements were designed to anticipate user involvement, modifications and alternate user-defined applications. The replacements for hardware components such as the paper tape reader/punch, moving arm disk, contact scanner and Ramtek are discussed. The development of these hardware replacements coincides with the development of a gateway system for selected CANDU digital control

  15. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  16. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    , algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single to multi and many-core architectures, require software developers to identify and properly implement methods that both exploit...... makes parallel software design applicable, but also a challenge for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDEs solvers based on flexible-order finite difference approximations on structured regular grids. The library...... is designed with a high abstraction interface to improve developer productivity. The library is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). The library utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...
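
    The flexible-order finite difference approximations mentioned above can be illustrated with a short sketch (Python here; the library itself is a C++ template library): weights for an arbitrary stencil are obtained by matching Taylor terms, i.e. by solving a small Vandermonde system. The helper below is hypothetical and not part of the library.

        # Finite-difference weights of arbitrary order: approximate the m-th derivative at
        # x0 from an arbitrary stencil by matching Taylor terms (Vandermonde system).
        import numpy as np
        from math import factorial

        def fd_weights(x0, stencil, m):
            stencil = np.asarray(stencil, dtype=float)
            n = len(stencil)
            A = np.vander(stencil - x0, n, increasing=True).T   # row p: (x_j - x0)**p
            b = np.zeros(n)
            b[m] = factorial(m)
            return np.linalg.solve(A, b)

        # 4th-order central approximation of the first derivative on a regular grid (h = 1):
        print(fd_weights(0.0, [-2, -1, 0, 1, 2], 1))   # -> [1/12, -2/3, 0, 2/3, -1/12]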

  17. A directory of computer software applications: energy. Report for 1974--1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    The computer programs or the computer program documentation cited in this directory have been developed for a variety of applications in the field of energy. The cited computer software includes applications in solar energy, petroleum resources, batteries, electrohydrodynamic generators, magnetohydrodynamic generators, natural gas, nuclear fission, nuclear fusion, hydroelectric power production, and geothermal energy. The computer software cited has been used for simulation and modeling, calculations of future energy requirements, calculations of energy conservation measures, and computations of economic considerations of energy systems

  18. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  19. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as ''The family of SHMS.'' When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  20. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges f...

  1. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges...

  2. Low complexity symbol-wise beamforming for MIMO-OFDM systems

    KAUST Repository

    Lee, Hyun Ho

    2011-12-01

    In this paper, we consider a low complexity symbol-wise beamforming for MIMO-OFDM systems. We propose a non-iterative algorithm for the symbol-wise beamforming, which can provide the performance approaching that of the conventional symbol-wise beamforming based on the iterative algorithm. We demonstrate that our proposed scheme can reduce the computational complexity significantly. From our simulation results, it is evident that our proposed scheme leads to a negligible performance loss compared to the conventional symbol-wise beamforming regardless of spatial correlation or presence of co-channel interference. © 2011 IEEE.
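
    A hedged sketch of the general symbol-wise beamforming idea, not necessarily the authors' non-iterative algorithm: a single transmit weight vector is shared by all subcarriers of one OFDM symbol, here taken as the dominant eigenvector of the subcarrier-averaged channel covariance, and compared with the per-subcarrier optimum. Array dimensions and channel statistics are made up for the example.

        # Symbol-wise transmit beamforming for MIMO-OFDM: one weight vector for all subcarriers.
        import numpy as np

        rng = np.random.default_rng(0)
        K, nr, nt = 64, 2, 4                       # subcarriers, rx antennas, tx antennas
        H = (rng.standard_normal((K, nr, nt)) + 1j * rng.standard_normal((K, nr, nt))) / np.sqrt(2)

        # Dominant eigenvector of (1/K) * sum_k H_k^H H_k maximizes the average received power.
        R = np.mean(np.conj(np.transpose(H, (0, 2, 1))) @ H, axis=0)
        w = np.linalg.eigh(R)[1][:, -1]            # unit-norm symbol-wise beamforming vector

        gain_symbolwise = np.mean(np.linalg.norm(H @ w, axis=1) ** 2)
        gain_per_subcarrier = np.mean(np.linalg.svd(H, compute_uv=False)[:, 0] ** 2)
        print(f"average gain, symbol-wise: {gain_symbolwise:.2f}, per-subcarrier optimum: {gain_per_subcarrier:.2f}")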

  3. The December 2006 ATLAS Computing & Software Workshop

    CERN Multimedia

    Fred Luehring

    The 29th ATLAS Computing & Software Workshop was held on December 11-15 at CERN. With the rapidly approaching onset of data taking, the workshop participants had an air of urgency about them. There was considerable discussion on hot topics such as physics validation of the software, data analysis, actual software production on the GRID, and the schedule of work for 2007 including the Final Dress Rehearsal (FDR). However don't be fooled, the workshop was not all work - there were also two social events which were greatly enjoyed by the attendees. The workshop welcomed Wouter Verkerke as the new Physics Validation Coordinator (replacing Davide Costanzo). Most recent validation work has centered on the 12.0.X release series that will be used for the Computing System Commissioning (CSC) exercise. The validation is now a big job because it needs to be done over a variety of conditions (magnetic field on/off, aligned/misaligned geometry) for every candidate release. Luckily there have been a large number of pe...

  4. Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.

    Science.gov (United States)

    Reed, Mary Hutchings

    This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…

  5. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation, as they currently stand, is presented along with a description of work in progress and areas of future work.

  6. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  7. Symbol phantoms

    International Nuclear Information System (INIS)

    Yamaguchi, Hiroshi; Hongo, Syozo; Takeshita, Hiroshi

    1990-01-01

    We have developed Japanese phantoms by two procedures for the computation of organ doses from internal and/or external radiation sources. One method is to make mathematical phantoms on the basis of the ORNL mathematical phantoms. The parameters that specify the organs of the Japanese mathematical phantom are determined by interpolation of the ORNL data, which define the organs of Caucasian males and females of various ages (newborn, 1, 5, 10, 15 years and adult), combined with survey data on Japanese physiques. The other procedure is to build 'symbol phantoms' for the Japanese public. The concept and method of the symbol phantom enable us to make a phantom for an individual when we have all of his transversal section images obtained by a medical imaging device such as MRI, and thus we may achieve more realistic phantoms for the Japanese public than the mathematical phantoms. Both studies are in progress at NIRS. (author)

  8. Functional requirements for gas characterization system computer software

    International Nuclear Information System (INIS)

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gasses in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel

  9. Copyright Protection for Computer Software: Is There a Need for More Protection?

    Science.gov (United States)

    Ku, Linlin

    Because the computer industry's expansion has been much faster than the development of laws protecting computer software, and since the practice of software piracy seems to be alive and well, the issue of whether existing laws can provide effective protection for software needs further discussion. Three bodies of law have been used to protect…

  10. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  11. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  12. Software for simulation of a computed tomography imaging spectrometer using optical design software

    Science.gov (United States)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.

  13. What makes computational open source software libraries successful?

    International Nuclear Information System (INIS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects. (paper)

  14. What makes computational open source software libraries successful?

    Science.gov (United States)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  15. The mathematica guidebook for symbolics

    CERN Document Server

    Trott, Michael

    2006-01-01

    Mathematica is today's most advanced technical computing system. It features a rich programming environment, two- and three-dimensional graphics capabilities and hundreds of sophisticated, powerful programming and mathematical functions using state-of-the-art algorithms. Combined with a user-friendly interface and a complete mathematical typesetting system, Mathematica offers an intuitive, easy-to-handle environment of great power and utility. "The Mathematica GuideBook for Symbolics" (code and text fully tailored for Mathematica 5.1) deals with Mathematica's symbolic mathematical capabilities. Structural and mathematical operations on single polynomials and systems of polynomials are fundamental to many symbolic calculations and they are covered in considerable detail. The solution of equations and differential equations, as well as the classical calculus operations (differentiation, integration, summation, series expansion, limits), are exhaustively treated. Generalized functions and their uses are discussed. In addition...
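
    The classical symbolic operations listed above, shown here with the open-source SymPy system for readers without Mathematica; the expressions are arbitrary examples, not taken from the book.

        # Classical symbolic calculus operations, illustrated with SymPy.
        import sympy as sp

        x, n = sp.symbols('x n')

        print(sp.diff(sp.sin(x) * sp.exp(x), x))                  # differentiation
        print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))    # integration -> sqrt(pi)
        print(sp.summation(1 / n**2, (n, 1, sp.oo)))              # summation -> pi**2/6
        print(sp.series(sp.cos(x), x, 0, 6))                      # series expansion
        print(sp.limit(sp.sin(x) / x, x, 0))                      # limit -> 1
        print(sp.solve(x**2 - 2, x))                              # equation solving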

  16. From On-Premise Software to Cloud Services: The Impact of Cloud Computing on Enterprise Software Vendors' Business Models

    OpenAIRE

    Boillat, Thomas; Legner, Christine

    2013-01-01

    Cloud computing is an emerging paradigm that allows users to conveniently access computing resources as pay-per-use services. Whereas cloud offerings such as Amazon's Elastic Compute Cloud and Google Apps are rapidly gaining a large user base, enterprise software's migration towards the cloud is still in its infancy. For software vendors the move towards cloud solutions implies profound changes in their value-creation logic. Not only are they forced to deliver fully web-enabled solutions and t...

  17. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211

  18. Carrier tracking by smoothing filter improves symbol SNR

    Science.gov (United States)

    Pomalaza-Raez, Carlos A.; Hurd, William J.

    1986-01-01

    The potential benefit of using a smoothing filter to estimate carrier phase over use of phase locked loops (PLL) is determined. Numerical results are presented for the performance of three possible configurations of the deep space network advanced receiver. These are residual carrier PLL, sideband aided residual carrier PLL, and finally sideband aiding with a Kalman smoother. The average symbol signal to noise ratio (SNR) after losses due to carrier phase estimation error is computed for different total power SNRs, symbol rates and symbol SNRs. It is found that smoothing is most beneficial for low symbol SNRs and low symbol rates. Smoothing gains up to 0.4 dB over a sideband aided residual carrier PLL, and the combined benefit of smoothing and sideband aiding relative to a residual carrier loop is often in excess of 1 dB.
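
    A rough sense of the numbers quoted above (e.g., the 0.4 dB smoothing gain) can be had from the common first-order model in which residual carrier-phase error phi degrades the effective symbol SNR of a BPSK-like signal by the factor E[cos(phi)]^2. A minimal sketch of that bookkeeping follows (Python); the phase-error standard deviations are illustrative placeholders, not values from the article, and the receiver configurations analyzed there are not modeled.

        import numpy as np

        def radio_loss_db(sigma_phi, n=200_000, seed=0):
            """Monte-Carlo estimate of the symbol SNR loss (dB) when the carrier
            phase estimate has zero-mean Gaussian error with std sigma_phi (rad).
            Loss is modeled as E[cos(phi)]^2, a common first-order approximation
            for coherent detection with a noisy phase reference."""
            rng = np.random.default_rng(seed)
            phi = rng.normal(0.0, sigma_phi, n)
            return -10.0 * np.log10(np.mean(np.cos(phi)) ** 2)

        # Illustrative (assumed) phase-error levels for two hypothetical trackers.
        for name, sigma in [("residual-carrier PLL", 0.30), ("PLL + smoother", 0.22)]:
            print(f"{name:20s} sigma_phi = {sigma:.2f} rad, loss = {radio_loss_db(sigma):.2f} dB")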

  19. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  20. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    Air Command and Staff College student report (scanned copy; only fragments of the text are recoverable): "...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  1. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  2. Dynamic power scaling of an intermediate symbol buffer associated with covariance computations

    NARCIS (Netherlands)

    2014-01-01

    An intermediate symbol buffer (ISB) configuration and method is provided such that the ISB memory comprises 15 portions, one for each HSDPA spreading code. Symbols associated with a spreading code are written to the memory portion associated with the same spreading code. When a covariance

  3. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  4. Fun and software exploring pleasure, paradox and pain in computing

    CERN Document Server

    Goriunova, Olga

    2014-01-01

    Fun and Software offers the untold story of fun as constitutive of the culture and aesthetics of computing. Fun in computing is a mode of thinking, making and experiencing. It invokes and convolutes the question of rationalism and logical reason, addresses the sensibilities and experience of computation and attests to its creative drives. By exploring topics as diverse as the pleasure and pain of the programmer, geek wit, affects of play and coding as a bodily pursuit of the unique in recursive structures, Fun and Software helps construct a different point of entry to the understanding of soft

  5. SOFTWARE FOR COMPUTER-AIDED DESIGN OF CROSS-WEDGE ROLLING

    OpenAIRE

    A. A. Abramov; S. V. Medvedev

    2013-01-01

    The issues of computer technology creation of 3D-design and engineering analysis of metal forming processes using cross wedge rolling methods (CWR) are considered. The developed software for computer-aided design and simulation of cross-wedge rolling is described.

  6. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Science.gov (United States)

    2010-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  7. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and has had a profound impact on the software engineering process itse...

  8. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    The safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis, and in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)

  9. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  10. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Full Text Available Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption as a way to derive four new best practices for software development and incorporates the identified best practices into currently-in-use processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS cloud oriented development process through analyzing the feedback data collected from actual application to the development of a SaaS cloud service Astation.

  11. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clarified differentiation for assessing and evaluating the risk, using the principles of software risk management. Problem statement: Prevailing software quality models and standards have not been adequate in addressing software safety ...

  12. Software For Computer-Aided Design Of Control Systems

    Science.gov (United States)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  13. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  14. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  15. Linux software for large topology optimization problems

    DEFF Research Database (Denmark)

    evolving product, which allows a parallel solution of the PDE, it lacks the important feature that the matrix-generation part of the computations is localized to each processor. This is well-known to be critical for obtaining a useful speedup on a Linux cluster and it motivates the search for a COMSOL......-like package for large topology optimization problems. One candidate for such software is developed for Linux by Sandia Nat’l Lab in the USA being the Sundance system. Sundance also uses a symbolic representation of the PDE and a scalable numerical solution is achieved by employing the underlying Trilinos...

  16. The ''NAIRI-2'' computer plotter software

    International Nuclear Information System (INIS)

    Aksenova, E.K.; Kol'ga, V.V.; Trejbal, Z.

    1977-01-01

    The software for the plotter of the ''Nairi-2'' computer is described. The ''Plot'' system of subprograms, written in the machine language of ''Nairi-2'', allows information obtained with the ''Nairi-2'' computer and with the base computers (BESM-6, CDC-6500) through the information processing system to be presented graphically. A graphic dependence can be drawn on a pre-selected scale either as a continuous line, with programmed linear interpolation between the points and plotting of the x, y coordinate axes, or as separate points with the x, y coordinate axes constructed in any prescribed direction. The system of subprograms is operated in an autoprogramming language with a number of new operators introduced into the ''Nairi-2'' translator

  17. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. These reliability models are restricted to particular types of methodologies and to a limited number of parameters, although a number of techniques and methodologies may be used for reliability prediction. There is a need to focus on parameter selection when estimating reliability, since the reliability of a system may increase or decrease depending on the parameters used; it is therefore necessary to identify the factors that most heavily affect the reliability of the system. Nowadays, reusability is widely used in many areas of research. Reusability is the basis of Component-Based Systems (CBS); cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to medical problems: clinical medicine uses fuzzy logic and neural network methodologies significantly, while basic medical science most frequently and preferably uses neural-network-genetic-algorithm approaches. There is unavoidable interest among medical scientists in applying the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing

  18. The Implementation of Computer Data Processing Software for EAST NBI

    International Nuclear Information System (INIS)

    Zhang Xiaodan; Hu Chundong; Sheng Peng; Zhao Yuanzhe; Wu Deyun; Cui Qinglong

    2014-01-01

    One of the most important project missions of the neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) at high power into the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for handling the experimental data: data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is written in C and runs on the Linux operating system, using the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server and PXI DAQ cards. This software has been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well. (fusion engineering)

  19. Carrier tracking by smoothing filter can improve symbol SNR

    Science.gov (United States)

    Hurd, W. J.; Pomalaza-Raez, C. A.

    1985-01-01

    The potential benefit of using a smoothing filter to estimate carrier phase over use of phase locked loops (PLL) is determined. Numerical results are presented for the performance of three possible configurations of the deep space network advanced receiver. These are residual carrier PLL, sideband aided residual carrier PLL, and finally sideband aiding with a Kalman smoother. The average symbol signal to noise ratio (SNR) after losses due to carrier phase estimation error is computed for different total power SNRs, symbol rates and symbol SNRs. It is found that smoothing is most beneficial for low symbol SNRs and low symbol rates. Smoothing gains up to 0.4 dB over a sideband aided residual carrier PLL, and the combined benefit of smoothing and sideband aiding relative to a residual carrier loop is often in excess of 1 dB.

  20. Influence of colour on acquisition and generalisation of graphic symbols.

    Science.gov (United States)

    Hetzroni, O E; Ne'eman, A

    2013-07-01

    Children with autism may benefit from using graphic symbols for their communication, language and literacy development. The purpose of this study was to investigate the influence of colour versus grey-scale displays on the identification of graphic symbols using a computer-based intervention. An alternating treatment design was employed to examine the learning and generalisation of 58 colour and grey-scale symbols by four preschool children with autism. The graphic symbols were taught via a meaning-based intervention using stories and educational games. Results demonstrate that all of the children were able to learn and maintain symbol identification over time for both symbol displays with no apparent differences. Differences were apparent for two of the children, who exhibited better generalisation when learning grey-scale symbols first. The other two showed no noticeable difference between displays when generalising from one display to the other. Implications and further research are discussed. © 2012 The Authors. Journal of Intellectual Disability Research © 2012 John Wiley & Sons Ltd, MENCAP & IASSID.

  1. Object Oriented and Functional Programming for Symbolic Manipulation

    OpenAIRE

    Vlasov, Alexander Yu.

    1999-01-01

    The advantages of a mixed approach, using different kinds of programming techniques for symbolic manipulation, are discussed. The main purpose of the approach offered is to merge the methods of object-oriented programming, which are convenient for presenting data and algorithms to the user, with the advantages of functional languages for data manipulation, internal representation, and portability of the software.

  2. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
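
    The "a posteriori adjustment" these programs perform is essentially a maximum a posteriori (MAP) Bayesian step: combine a population pharmacokinetic prior with the patient's measured concentration, then rescale the dose. A toy sketch of that step for a one-compartment IV bolus model is given below (Python); the drug, prior values, error magnitudes and target concentration are all invented for illustration and do not correspond to any of the programs reviewed.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Toy one-compartment IV bolus model: C(t) = (dose / V) * exp(-(CL / V) * t)
        V = 20.0                    # assumed volume of distribution (L)
        CL_pop, omega = 5.0, 0.3    # population clearance (L/h) and log-normal SD (illustrative)
        sigma = 1.0                 # residual (assay) error SD, mg/L (illustrative)

        dose, t_obs, c_obs = 500.0, 6.0, 9.5   # given dose (mg), sampling time (h), measured conc (mg/L)

        def neg_log_posterior(log_cl):
            cl = np.exp(log_cl)
            c_pred = dose / V * np.exp(-cl / V * t_obs)
            prior = (log_cl - np.log(CL_pop)) ** 2 / (2 * omega ** 2)   # population prior
            lik = (c_obs - c_pred) ** 2 / (2 * sigma ** 2)              # measurement likelihood
            return prior + lik

        res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(50.0)), method="bounded")
        cl_map = float(np.exp(res.x))

        # Dose proposal: scale the dose so the predicted concentration hits a target.
        c_target = 12.0
        c_pred_map = dose / V * np.exp(-cl_map / V * t_obs)
        print(f"MAP clearance {cl_map:.2f} L/h; suggested dose {dose * c_target / c_pred_map:.0f} mg")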

  3. Lisbon Symbol Database (LSD): Subjective norms for 600 symbols.

    Science.gov (United States)

    Prada, Marília; Rodrigues, David; Silva, Rita R; Garrido, Margarida V

    2016-12-01

    This article presents subjective rating norms for a new set of 600 symbols, depicting various contents (e.g., transportation, technology, and leisure activities) that can be used by researchers in different fields. Symbols were evaluated for aesthetic appeal, familiarity, visual complexity, concreteness, valence, arousal, and meaningfulness. The normative data were obtained from 388 participants, and no gender differences were found. Descriptive results (means, standard deviations, and confidence intervals) for each symbol in each dimension are presented. Overall, the dimensions were highly correlated. Additionally, participants were asked to briefly describe the meaning of each symbol. The results indicate that the present symbol set is varied, allowing for the selection of exemplars with different levels on the seven examined dimensions. This set of symbols constitutes a tool with potential for research in different areas. The database with all of the symbols is available as supplemental materials.

  4. Influence of Colour on Acquisition and Generalisation of Graphic Symbols

    Science.gov (United States)

    Hetzroni, O. E.; Ne'eman, A.

    2013-01-01

    Background: Children with autism may benefit from using graphic symbols for their communication, language and literacy development. The purpose of this study was to investigate the influence of colour versus grey-scale displays on the identification of graphic symbols using a computer-based intervention. Method: An alternating treatment design was…

  5. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    Science.gov (United States)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtably involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  6. Application of software technology to a future spacecraft computer design

    Science.gov (United States)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  7. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.

  8. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  9. Software Reviews: "Pow! Zap! Ker-plunk! The Comic Book Maker" (Pelican Software).

    Science.gov (United States)

    Porter, Bernajean

    1990-01-01

    Reviews the newest addition to Pelican's Creative Writing Series of instructional software, which uses the comic book format to provide a unique writing environment for satire, symbolism, sequencing, and combining text and graphics to communicate ideas. (SR)

  10. Alternate symbol inversion for improved symbol synchronization in convolutionally coded systems

    Science.gov (United States)

    Simon, M. K.; Smith, J. G.

    1980-01-01

    Inverting alternate symbols of the encoder output of a convolutionally coded system provides sufficient density of symbol transitions to guarantee adequate symbol synchronizer performance, a guarantee otherwise lacking. Although alternate symbol inversion may increase or decrease the average transition density, depending on the data source model, it produces a maximum number of contiguous symbols without transition for a particular class of convolutional codes, independent of the data source model. Further, this maximum is sufficiently small to guarantee acceptable symbol synchronizer performance for typical applications. Subsequent inversion of alternate detected symbols permits proper decoding.
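
    The operation itself is simple to express: XOR the encoder output with a fixed alternating 0/1 mask before transmission and apply the same mask again after detection. The sketch below (Python) illustrates the mechanism on the pathological all-zeros stream; the article's analysis of which convolutional code classes guarantee a bounded run length is not reproduced here.

        import numpy as np

        def alternate_invert(symbols):
            """Invert every other channel symbol (mask 0,1,0,1,...). Applying the
            same mask to the detected symbols restores the original stream."""
            mask = np.arange(len(symbols)) % 2
            return symbols ^ mask

        def longest_run_without_transition(symbols):
            run, worst = 1, 1
            for a, b in zip(symbols[:-1], symbols[1:]):
                run = run + 1 if a == b else 1
                worst = max(worst, run)
            return worst

        # Worst case for the symbol synchronizer: an encoder output with no transitions.
        coded = np.zeros(64, dtype=int)
        sent = alternate_invert(coded)        # inversion at the transmitter
        recovered = alternate_invert(sent)    # re-inversion after detection

        assert np.array_equal(recovered, coded)
        print("longest transition-free run, original:", longest_run_without_transition(coded))   # 64
        print("longest transition-free run, inverted:", longest_run_without_transition(sent))    # 1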

  11. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored

  12. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.

  13. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally-demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycles budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
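
    Because operations such as convolution are linear, an 8-bit image can be processed one bitplane at a time, most significant plane first, and the partial results summed; stopping early returns a coarser but usable approximation. A minimal sketch of that idea follows (Python/NumPy); the incremental packing framework, task scheduling and optimized kernels described in the paper are not reproduced.

        import numpy as np
        from scipy.signal import convolve2d

        def incremental_convolve(image, kernel, planes=8):
            """Convolve an 8-bit image bitplane by bitplane, MSB first; yields
            progressively refined results, so terminating early still returns a
            lower-precision approximation of the full convolution."""
            out_shape = (image.shape[0] + kernel.shape[0] - 1,
                         image.shape[1] + kernel.shape[1] - 1)
            result = np.zeros(out_shape)
            for b in range(7, 7 - planes, -1):            # bit 7 (MSB) downwards
                plane = (image >> b) & 1
                result += (1 << b) * convolve2d(plane, kernel)
                yield result.copy()

        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
        k = np.ones((3, 3)) / 9.0                         # simple box blur
        exact = convolve2d(img.astype(float), k)

        for i, approx in enumerate(incremental_convolve(img, k), start=1):
            print(f"after {i} bitplane(s): max abs error = {np.max(np.abs(exact - approx)):.2f}")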

  14. Fast Multi-Symbol Based Iterative Detectors for UWB Communications

    Directory of Open Access Journals (Sweden)

    Lottici Vincenzo

    2010-01-01

    Full Text Available Ultra-wideband (UWB) impulse radios have shown great potential in wireless local area networks for localization, coexistence with other services, and low probability of interception and detection. However, low transmission power and high multipath effect make the detection of UWB signals challenging. Recently, multi-symbol based detection has caught attention for UWB communications because it provides good performance and does not require explicit channel estimation. Most of the existing multi-symbol based methods incur a higher computational cost than can be afforded in the envisioned UWB systems. In this paper, we propose an iterative multi-symbol based method that has low complexity and provides near optimal performance. Our method uses only one initial symbol to start and applies a decision directed approach to iteratively update a filter template and information symbols. Simulations show that our method converges in only a few iterations (less than 5), and that when the number of symbols increases, the performance of our method approaches that of the ideal Rake receiver.

  15. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  16. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  17. Symbolic transfer entropy-based premature signal analysis

    International Nuclear Information System (INIS)

    Wang Jun; Yu Zheng-Feng

    2012-01-01

    In this paper, we use symbolic transfer entropy to study the coupling strength between premature signals. Numerical experiments show that three types of signal couplings are in the same direction. Among them, normal signal coupling is the strongest, followed by that of premature ventricular contractions, and that of atrial premature beats is the weakest. The T test shows that the entropies of the three signals are distinct. Symbolic transfer entropy requires less data, can distinguish the three types of signals and has very good computational efficiency. (interdisciplinary physics and related areas of science and technology)
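
    Symbolic transfer entropy first maps each series to ordinal symbols (the rank pattern of m consecutive samples) and then evaluates the usual transfer entropy on those symbols. A compact sketch follows (Python); the embedding choices, surrogate testing and ECG-specific preprocessing needed for real premature-beat recordings are omitted, and a synthetic coupled pair of series stands in for the data.

        import numpy as np
        from collections import Counter

        def symbolize(x, m=3):
            """Ordinal symbols: the permutation that sorts each window of m samples."""
            return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

        def symbolic_transfer_entropy(x, y, m=3):
            """T(X -> Y) in bits: sum p(y+, y, x) * log2[ p(y+ | y, x) / p(y+ | y) ]."""
            sx, sy = symbolize(x, m), symbolize(y, m)
            n = min(len(sx), len(sy)) - 1
            triples = Counter((sy[t + 1], sy[t], sx[t]) for t in range(n))
            pairs_yx = Counter((sy[t], sx[t]) for t in range(n))
            pairs_yy = Counter((sy[t + 1], sy[t]) for t in range(n))
            singles_y = Counter(sy[t] for t in range(n))
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_joint = c / n
                p_cond_full = c / pairs_yx[(y0, x0)]
                p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0]
                te += p_joint * np.log2(p_cond_full / p_cond_y)
            return te

        rng = np.random.default_rng(2)
        x = rng.normal(size=5000)
        y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)   # y is driven by x with a one-step lag
        print("T(x -> y) =", round(symbolic_transfer_entropy(x, y), 3))   # larger
        print("T(y -> x) =", round(symbolic_transfer_entropy(y, x), 3))   # smaller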

  18. Sigref - A Symbolic Bisimulation Tool Box

    NARCIS (Netherlands)

    Wimmer, Ralf; Herbstritt, Marc; Hermanns, Holger; Strampp, Kelley; Becker, Bernd; Graf, Susanne; Zhang, Wenhui

    2006-01-01

    We present a uniform signature-based approach to compute the most popular bisimulations. Our approach is implemented symbolically using BDDs, which enables the handling of very large transition systems. Signatures for the bisimulations are built up from a few generic building blocks, which naturally

  19. Generic multiset programming with discrimination-based joins and symbolic Cartesian products

    DEFF Research Database (Denmark)

    Henglein, Fritz; Larsen, Ken Friis

    2010-01-01

    This paper presents GMP, a library for generic, SQL-style programming with multisets. It generalizes the querying core of SQL in a number of ways: Multisets may contain elements of arbitrary first-order data types, including references (pointers), recursive data types and nested multisets......: symbolic (term) representations of multisets, specifically for Cartesian products, for facilitating dynamic symbolic computation, which intersperses algebraic simplification steps with conventional data processing; and discrimination-based joins, a generic technique for computing equijoins based...
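
    The idea behind a discrimination-based equijoin is to avoid pairwise comparisons altogether: group both multisets by their join key in one pass, then emit the Cartesian product of matching groups only. A small sketch of that idea is given below (Python), using dictionary bucketing as a stand-in for the paper's generic discriminators, which are considerably more general.

        from collections import defaultdict

        def discrimination_join(left, right, key_left, key_right):
            """Equijoin of two multisets in roughly O(|left| + |right| + |output|):
            bucket both sides by key, then pair up only the matching buckets."""
            buckets = defaultdict(lambda: ([], []))
            for rec in left:
                buckets[key_left(rec)][0].append(rec)
            for rec in right:
                buckets[key_right(rec)][1].append(rec)
            for ls, rs in buckets.values():
                for l in ls:          # Cartesian product of the two matching groups only
                    for r in rs:
                        yield (l, r)

        people = [("ada", 1), ("bob", 2), ("cyd", 1)]          # (name, dept_id)
        depts = [(1, "maths"), (2, "physics"), (3, "music")]   # (dept_id, dept_name)

        for (name, _), (_, dept) in discrimination_join(people, depts,
                                                        key_left=lambda p: p[1],
                                                        key_right=lambda d: d[0]):
            print(name, "works in", dept)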

  20. A NEW CONTROL CIRCUIT AND COMPUTER SOFTWARE FOR CONTROLING PHOTOVOLTAIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Mustafa Berkant SELEK

    2008-02-01

    Full Text Available In this study, a new microcontroller circuit was designed and new computer software was implemented to control the power flow currents of the renewable energy system established at the Solar Energy Institute, Ege University, Bornova, Izmir, Turkey. A PIC18F452 microcontroller based electronic circuit was designed to control another electronic circuit that includes power electronic switching components. Readily available standard control circuits are designed for switching single-level inverters; in contrast, the implemented circuit allows multilevel inverters to be switched. In addition, because the efficiency of solar panels is considerably low, they should be operated at the maximum power point (MPP); therefore, an MPP algorithm is included in the designed control circuit. The control circuit also includes a serial communication interface based on the RS232 standard; using this interface enables the user to select all functions available in the control circuit and obtain status reports via the computer software. Finally, a general purpose command set was designed to establish communication between the computer software and the microcontroller-based control circuit. It is intended that this study provide a basis for researchers who want to develop their own control circuits or more visual software.
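
    The record does not say which MPP tracking algorithm the firmware uses; the most common choice is a perturb-and-observe loop, which nudges the operating voltage, keeps the direction if output power rose and reverses it if power fell. A purely illustrative sketch of such a loop follows (Python), with a toy power curve standing in for the real panel and power stage.

        def pv_power(v):
            """Toy photovoltaic P(V) curve with a single maximum (illustrative only)."""
            return max(0.0, v * (8.0 - 0.35 * v))     # peak near V ~ 11.4

        def perturb_and_observe(v0=5.0, step=0.2, iterations=60):
            v, direction = v0, +1
            p_prev = pv_power(v)
            for _ in range(iterations):
                v += direction * step                 # perturb the operating voltage
                p = pv_power(v)
                if p < p_prev:                        # power dropped: reverse direction
                    direction = -direction
                p_prev = p
            return v, p_prev

        v_op, p_op = perturb_and_observe()
        print(f"operating point after tracking: V = {v_op:.2f}, P = {p_op:.2f}")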

  1. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  2. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision....... These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development....

  3. Painleve Analysis and Darboux Transformation for a Variable-Coefficient Boussinesq System in Fluid Dynamics with Symbolic Computation

    International Nuclear Information System (INIS)

    Li Hongzhe; Tian Bo; Li Lili; Zhang Haiqiang

    2010-01-01

    The new soliton solutions for the variable-coefficient Boussinesq system, whose applications are seen in fluid dynamics, are studied in this paper with symbolic computation. First, the Painleve analysis is used to investigate its integrability properties. For the case identified, the Lax pair of the system is found, and then the Darboux transformation is constructed. At last, some new soliton solutions are presented via the Darboux method. Those solutions might be of some value in fluid dynamics. (general)

  4. Symbol Synchronization for Diffusion-Based Molecular Communications.

    Science.gov (United States)

    Jamali, Vahid; Ahmadzadeh, Arman; Schober, Robert

    2017-12-01

    Symbol synchronization refers to the estimation of the start of a symbol interval and is needed for reliable detection. In this paper, we develop several symbol synchronization schemes for molecular communication (MC) systems where we consider some practical challenges, which have not been addressed in the literature yet. In particular, we take into account that in MC systems, the transmitter may not be equipped with an internal clock and may not be able to emit molecules with a fixed release frequency. Such restrictions hold for practical nanotransmitters, e.g., modified cells, where the lengths of the symbol intervals may vary due to the inherent randomness in the availability of food and energy for molecule generation, the process for molecule production, and the release process. To address this issue, we develop two synchronization-detection frameworks which both employ two types of molecule. In the first framework, one type of molecule is used for symbol synchronization and the other one is used for data detection, whereas in the second framework, both types of molecule are used for joint symbol synchronization and data detection. For both frameworks, we first derive the optimal maximum likelihood (ML) symbol synchronization schemes as performance upper bounds. Since ML synchronization entails high complexity, for each framework, we also propose three low-complexity suboptimal schemes, namely a linear filter-based scheme, a peak observation-based scheme, and a threshold-trigger scheme, which are suitable for MC systems with limited computational capabilities. Furthermore, we study the relative complexity and the constraints associated with the proposed schemes and the impact of the insertion and deletion errors that arise due to imperfect synchronization. Our simulation results reveal the effectiveness of the proposed synchronization schemes and suggest that the end-to-end performance of MC systems significantly depends on the accuracy of the symbol

  5. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service providing method. A different service ID is used to identify each service a device can offer. Finally, we experimentally demonstrate that the proposed service providing method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  6. Symbolic-computation study of the perturbed nonlinear Schrodinger model in inhomogeneous optical fibers

    International Nuclear Information System (INIS)

    Tian Bo; Gao Yitian

    2005-01-01

    A realistic, inhomogeneous fiber in the optical communication systems can be described by the perturbed nonlinear Schrodinger model (also named as the normalized nonlinear Schrodinger model with periodically varying coefficients, dispersion managed nonlinear Schrodinger model or nonlinear Schrodinger model with variable coefficients). Hereby, we extend to this model a direct method, perform symbolic computation and obtain two families of the exact, analytic bright-solitonic solutions, with or without the chirp respectively. The parameters addressed include the shape of the bright soliton, soliton amplitude, inverse width of the soliton, chirp, frequency, center of the soliton and center of the phase of the soliton. Of optical and physical interests, we discuss some previously-published special cases of our solutions. Those solutions could help the future studies on the optical communication systems.

  7. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  8. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
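
    The "aggregation limited to intermediate results" idea works for any likelihood whose gradient decomposes over sites: each site evaluates the gradient of its local log-likelihood at the current parameter value, only those summaries are pooled, and the coordinator updates the parameters. The toy sketch below (Python) uses a logistic log-likelihood as a stand-in for the site-stratified Cox model discussed in the paper; none of the privacy machinery, networking or R-based tooling is reproduced.

        import numpy as np

        def local_grad(X, y, beta):
            """Gradient of the logistic log-likelihood on one site's data only."""
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            return X.T @ (y - p)

        rng = np.random.default_rng(3)
        beta_true = np.array([1.0, -2.0, 0.5])

        # Three 'sites', each holding raw data that never leaves the site.
        sites = []
        for n in (200, 350, 150):
            X = rng.normal(size=(n, 3))
            y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
            sites.append((X, y))

        beta = np.zeros(3)
        for _ in range(200):                      # coordinator's gradient-ascent loop
            grad = sum(local_grad(X, y, beta) for X, y in sites)   # only summaries are pooled
            beta += 0.002 * grad
        print("pooled estimate:", np.round(beta, 2), " true:", beta_true)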

  9. Cross-spectrum symbol synchronization

    Science.gov (United States)

    Mccallister, R. D.; Simon, M. K.

    1981-01-01

    A popular method of symbol synchronization exploits one aspect of generalized harmonic analysis, normally referred to as the cross-spectrum. Utilizing nonlinear techniques, the input symbol energy is effectively concentrated onto multiples of the symbol clock frequency, facilitating application of conventional phase lock synchronization techniques. A general treatment of the cross-spectrum technique is developed and shown to be applicable across a broad class of symbol modulation formats. An important specific symbol synchronization application is then treated, focusing the general development to provide both insight and quantitative measure of the performance impact associated with variation in these key synchronization parameters: symbol modulation format, symbol transition probability, symbol energy to noise density ratio, and symbol rate to filter bandwidth ratio.
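
    The mechanism can be illustrated with a square-law nonlinearity standing in for the class of nonlinearities the article treats: squaring a matched-filtered NRZ waveform produces a spectral line at the symbol rate whose phase encodes the timing offset. In the sketch below (Python) the phase of that line is read directly rather than tracked by a phase-locked loop, and an ideal noiseless waveform is assumed, so only the mechanism is shown, not the performance trade-offs the article quantifies.

        import numpy as np

        fs, n_sym, offset = 32, 4096, 0.3        # samples per symbol, number of symbols, true offset (in T)
        rng = np.random.default_rng(4)
        a = rng.integers(0, 2, n_sym + 1) * 2.0 - 1.0      # random +/-1 data symbols

        # Matched-filtered NRZ waveform: piecewise linear, with |y| = 1 exactly at t = k + offset.
        t = np.arange(n_sym * fs) / fs
        k = np.floor(t - offset).astype(int) % n_sym
        frac = (t - offset) % 1.0
        y = a[k] * (1.0 - frac) + a[k + 1] * frac

        # The square-law nonlinearity concentrates symbol energy onto the clock frequency;
        # the phase of that spectral line carries the timing estimate.
        line = np.sum(y ** 2 * np.exp(-2j * np.pi * t))
        est = (-np.angle(line) / (2.0 * np.pi)) % 1.0
        print(f"true timing offset {offset:.3f} T, estimated {est:.3f} T")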

  10. Advances in Multimedia, Software Engineering and Computing Vol.1 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference focusing on Multimedia, Software Engineering, Computing and Education. In these proceedings, readers can learn about the work of researchers from all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange pillar for researchers working in these fields. In order to meet the high standard of Springer's AISC series, the organizing committee made the following efforts. Firstly, poor quality papers were rejected after review by anonymous referee experts. Secondly, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference will be successful and fruitful.

  11. Advances in Multimedia, Software Engineering and Computing Vol.2 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference focusing on Multimedia, Software Engineering, Computing and Education. In these proceedings, readers can learn about the work of researchers from all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange pillar for researchers working in these fields. In order to meet the high standard of Springer's AISC series, the organizing committee made the following efforts. Firstly, poor quality papers were rejected after review by anonymous referee experts. Secondly, review meetings were held with the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference will be successful and fruitful.

  12. Software Defined Radio Datalink Implementation Using PC-Type Computers

    National Research Council Canada - National Science Library

    Zafeiropoulos, Georgios

    2003-01-01

    The objective of this thesis was to examine the feasibility of implementation and the performance of a Software Defined Radio datalink, using a common PC type host computer and a high level programming language...

  13. Symbolic Number Comparison Is Not Processed by the Analog Number System: Different Symbolic and Non-symbolic Numerical Distance and Size Effects

    Directory of Open Access Journals (Sweden)

    Attila Krajcsi

    2018-02-01

    Full Text Available Highlights: We test whether symbolic number comparison is handled by an analog noisy system. The analog system model has systematic biases in describing symbolic number comparison. This suggests that symbolic and non-symbolic numbers are processed by different systems. Abstract: Dominant numerical cognition models suppose that both symbolic and non-symbolic numbers are processed by the Analog Number System (ANS) working according to Weber's law. It was proposed that in a number comparison task the numerical distance and size effects reflect a ratio-based performance which is the sign of the ANS activation. However, an increasing number of findings and alternative models propose that symbolic and non-symbolic numbers might be processed by different representations. Importantly, alternative explanations may offer similar predictions to the ANS prediction; therefore, former evidence usually utilizing only the goodness of fit of the ANS prediction is not sufficient to support the ANS account. To test the ANS model more rigorously, a more extensive test is offered here. Several properties of the ANS predictions for the error rates, reaction times, and diffusion model drift rates were systematically analyzed in both non-symbolic dot comparison and symbolic Indo-Arabic comparison tasks. It was consistently found that while the ANS model's prediction is relatively good for the non-symbolic dot comparison, its prediction is poorer and systematically biased for the symbolic Indo-Arabic comparison. We conclude that only non-symbolic comparison is supported by the ANS, and symbolic number comparisons are processed by other representations.
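
    The ratio-based prediction under test has a compact closed form in one common formalization of the linear ANS model: each magnitude is represented as a Gaussian whose spread grows with a Weber fraction w, so the predicted error rate for comparing n1 and n2 depends only on their ratio. A short sketch of that prediction follows (Python); w = 0.15 is an arbitrary illustrative value, not one estimated from the paper's data.

        import math

        def ans_error_rate(n1, n2, w=0.15):
            """Predicted comparison error rate under a linear ANS model: each
            magnitude ~ N(n, (w*n)^2); an error occurs when the noisy difference
            has the wrong sign."""
            d = abs(n1 - n2)
            spread = w * math.sqrt(n1 ** 2 + n2 ** 2)
            return 0.5 * math.erfc(d / (math.sqrt(2.0) * spread))

        # Same ratio, different sizes -> identical predicted error rate (the ANS signature).
        for n1, n2 in [(8, 6), (16, 12), (32, 24), (9, 8)]:
            print(f"compare {n1:2d} vs {n2:2d}: predicted error rate = {ans_error_rate(n1, n2):.3f}")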

  14. The benefit of introducing audit software into curricula for computer ...

    African Journals Online (AJOL)

    The benefit of introducing audit software into curricula for computer auditing students: a student perspective from the University of Pretoria. ... willing to sacrifice more of their time for practical computer classes because they are aware of the beneficial impact on their understanding of the subject as well as their future careers.

  15. Symbolic Dynamics of Reanalysis Data

    Science.gov (United States)

    Larson, J. W.; Dickens, P. M.

    2003-12-01

    Symbolic dynamics [1] is the study of sequences of symbols belonging to a discrete set of elements, the most common example being a sequence of ones and zeroes. Often the set of symbols is derived from a timeseries of a continuous variable through the introduction of a partition function--a process called symbolization. Symbolic dynamics has been used widely in the physical sciences; a geophysical example being the application of C1 and C2 complexity [2] to hourly precipitation station data [3]. The C1 and C2 complexities are computed by examining subsequences--or words--of fixed length L in the limit of large values of L. Recent advances in information theory have led to techniques focused on the growth rate of the Shannon entropy and its asymptotic behavior in the limit of long words--levels of entropy convergence [4]. The result is a set of measures one can use to quantify the amount of memory stored in the sequence, whether or not an observer is able to synchronize to the sequence, and with what confidence it may be predicted. These techniques may also be used to uncover periodic behavior in the sequence. We are currently applying complexity theory and levels of entropy convergence to gridpoint timeseries from the NCAR/NCEP 50-year reanalysis [5]. Topics to be discussed include: a brief introduction to symbolic dynamics; a description of the partition function/symbolization strategy; a discussion of C1 and C2 complexity and entropy convergence rates and their utility; and example applications of these techniques to NCAR/NCEP 50-year reanalysis gridpoint timeseries, resulting in maps of C1 and C2 complexities and entropy convergence rates. Finally, we will discuss how these results may be used to validate climate models. [1] Hao, Bai-Lin, Elementary Symbolic Dynamics and Chaos in Dissipative Systems, World Scientific, Singapore (1989). [2] d'Alessandro, G. and Politi, A., Phys. Rev. Lett., 64, 1609-1612 (1990). [3] Elsner, J. and Tsonis, A., J. Atmos. Sci., 50, 400-405 (1993). [4]
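
    As a concrete illustration of the symbolization and entropy-convergence ideas described above, the following sketch (not the authors' code; function names and the median-based partition are assumptions) turns a continuous timeseries into a binary symbol sequence and estimates the Shannon entropy of words of length L.

```python
# Minimal sketch of symbolization and block-entropy estimation (illustrative assumptions).
import numpy as np
from collections import Counter

def symbolize(series, threshold=None):
    """Map a continuous timeseries to symbols 0/1 via a partition function."""
    if threshold is None:
        threshold = np.median(series)  # assumed partition; domain-specific in practice
    return (np.asarray(series) > threshold).astype(int)

def block_entropy(symbols, L):
    """Shannon entropy (bits) of overlapping words of length L."""
    words = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=5000))   # surrogate "gridpoint" timeseries
s = symbolize(x)
# The entropy gain H(L) - H(L-1) approaches the entropy rate as L grows.
for L in range(1, 6):
    print(L, block_entropy(s, L))
```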

  16. The experimental modification of a computer software package for ...

    African Journals Online (AJOL)

    The experimental modification of a computer software package for graphing algebraic functions. ... No Abstract Available. South African Journal of Education Vol.25(2) 2005: 61-68.

  17. Computing with concepts, computing with numbers: Llull, Leibniz, and Boole

    NARCIS (Netherlands)

    Uckelman, S.L.

    2010-01-01

    We consider two ways to understand "reasoning as computation", one which focuses on the computation of concept symbols and the other on the computation of number symbols. We illustrate these two ways with Llull’s Ars Combinatoria and Leibniz’s attempts to arithmetize language, respectively. We then

  18. V-1 nuclear power plant standby RPP-16S computer software

    International Nuclear Information System (INIS)

    Suchy, R.

    1988-01-01

    The software structure and the functions of the program modules of the RPP-16S standby computer, which is part of the information system of the V-1 Bohunice nuclear power plant, are described. The multitasking AMOS operating system is used for the organization of programs in the computer. The program modules are classified into five groups by function, i.e., modules for the periodical collection of values and for the measurement of process quantities for both nuclear power plant units; for the primary processing of the values; for the monitoring of exceedance of preset limits; and for unit operators' communication with the computer. The fifth group consists of user program modules. The standby computer software was tested in the actual operating conditions of the V-1 power plant. The results showed it operated correctly; minor shortcomings were removed. (Z.M.). 1 fig

  19. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  20. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  1. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  2. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Taken up at the meeting were eight subjects: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distribution parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization method and maintenance system, (3) handling of the results, and (4) the evaluation of the program for research and development. In relation to category (1), it is stated that the software grows up with the passage of time, that the software is a commercial program, and that in the development of a commercial software program the process of basic study up to the preparation of a prototype should be completely separated from the process for its completion. (NEDO)

  3. Symbolism in prehistoric man.

    Science.gov (United States)

    Facchini, F

    2000-12-01

    The aptitude for symbolization, characteristic of man, is revealed not only in artistic representations and funerary practices. It is exhibited by every manifestation of human activity or representation of natural phenomena that assumes or refers to a meaning. We can recognize functional symbolism (tool-making, habitative or food technology), social symbolism (language and social communication) and spiritual symbolism (funerary practices and artistic expressions). On the basis of these concepts, research into symbolism in prehistoric man allows us to recognize forms of symbolism already in the manifestations of the most ancient humans, starting with Homo habilis (or rudolfensis). Toolmaking, social organization and organization of the territory are oriented toward survival and the life of the family group. They attest to symbolic behaviors and constitute symbolic systems by means of which man expresses himself, lives and transmits his symbolic world. The diverse forms of symbolism are discussed with reference to the different phases of prehistoric humanity.

  4. Can symbols be ‘promoted’ or ‘demoted’?: Symbols as religious phenomena

    Directory of Open Access Journals (Sweden)

    Jaco Beyers

    2013-03-01

    Full Text Available Religious symbols are part of our world, relating to another world. In order to understand the process by which symbols grow and develop, the particular context of a symbol is important. In this article a particular theory as to what symbols are is presented. Religion presupposes the existence of two worlds: this-worldly (profane) and the other-worldly (sacred). The means of communication and reference between these two worlds are symbols. Two examples are investigated so as to indicate how symbols can over time either be demoted or promoted. In the case of the Asherah and asherah as related in the Old Testament, a demotion of a symbol is illustrated. The growth of ancient Egyptian religion is an example of a possible promotion of symbols. The conditions under which these processes can occur are investigated.

  5. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Introductions of computer-aided software and simulators are implemented during the sophomore-year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  6. Image segmentation for enhancing symbol recognition in prosthetic vision.

    Science.gov (United States)

    Horne, Lachlan; Barnes, Nick; McCarthy, Chris; He, Xuming

    2012-01-01

    Current and near-term implantable prosthetic vision systems offer the potential to restore some visual function, but suffer from poor resolution and dynamic range of induced phosphenes. This can make it difficult for users of prosthetic vision systems to identify symbolic information (such as signs) except in controlled conditions. Using image segmentation techniques from computer vision, we show it is possible to improve the clarity of such symbolic information for users of prosthetic vision implants in uncontrolled conditions. We use image segmentation to automatically divide a natural image into regions, and using a fixation point controlled by the user, select a region to phosphenize. This technique improves the apparent contrast and clarity of symbolic information over traditional phosphenization approaches.

  7. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Today they describe the project of a new software package which allows the reporting and filing of roentgenograms. This program was realized by a radiologist, using a well known data base management system: dBASE III. The program was shaped to fit the radiologist's needs: it helps to report, and allows to file, radiological data with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the data base structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how the radiologist can himself manage some aspects of his work with the help of a personal computer.

  8. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01


  9. Real Time Decoding of Color Symbol for Optical Positioning System

    Directory of Open Access Journals (Sweden)

    Abdul Waheed Malik

    2015-01-01

    Full Text Available This paper presents the design and real-time decoding of a color symbol that can be used as a reference marker for optical navigation. The designed symbol has a circular shape and is printed on paper using two distinct colors. This pair of colors is selected based on the highest achievable signal to noise ratio. The symbol is designed to carry eight bits of information. Real time decoding of this symbol is performed using a heterogeneous combination of a Field Programmable Gate Array (FPGA) and a microcontroller. An image sensor having a resolution of 1600 by 1200 pixels is used to capture images of symbols in complex backgrounds. Dynamic image segmentation, component labeling and feature extraction were performed on the FPGA. The region of interest was further computed from the extracted features. Feature data belonging to the symbol was sent from the FPGA to the microcontroller. Image processing tasks are partitioned between the FPGA and microcontroller based on data intensity. Experiments were performed to verify the rotational independence of the symbols. The maximum distance between camera and symbol allowing for correct detection and decoding was analyzed. Experiments were also performed to analyze the number of generated image components and sub-pixel precision versus different light sources and intensities. The proposed hardware architecture can process up to 55 frames per second for accurate detection and decoding of symbols at two-megapixel resolution. The power consumption of the complete system is 342 mW.

  10. Symbolic PathFinder: Symbolic Execution of Java Bytecode

    Science.gov (United States)

    Pasareanu, Corina S.; Rungta, Neha

    2010-01-01

    Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
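
    The following sketch illustrates the symbolic-execution idea described above, not Symbolic PathFinder itself (which analyzes Java bytecode): an input is treated as a symbolic value, each branch contributes a path condition, and an off-the-shelf solver (here Z3 via the z3-solver Python package, an assumed stand-in) yields concrete test inputs for each path.

```python
# Minimal sketch of path-condition solving for test generation (not SPF itself).
from z3 import Int, Solver, sat

x = Int("x")  # symbolic input

# Path condition for the branch `if x * 2 > 10:` taken.
path_true = Solver()
path_true.add(x * 2 > 10)
if path_true.check() == sat:
    print("input covering the true branch:", path_true.model()[x])

# Path condition for the branch not taken.
path_false = Solver()
path_false.add(x * 2 <= 10)
if path_false.check() == sat:
    print("input covering the false branch:", path_false.model()[x])
```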

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  12. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Data from before and after orthodontic treatment were analyzed using a t-test. The reliability test using the interclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). Paired t-test comparison of the two software packages shows no statistically significant difference in the measurements made with the two packages. InVivoDental5.0 measurements are more reproducible and user friendly when compared to 3DCeph™. There is no statistical difference between the two packages in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.

  13. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
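
    A DRR is essentially a set of line integrals of attenuation through the planning CT. The sketch below (an illustration with assumed names, not the tool described above, which models divergent beams and source penumbra) computes a parallel-beam DRR by summing attenuation along one volume axis and applying the Beer-Lambert law.

```python
# Minimal parallel-beam DRR sketch (illustrative assumptions, not the described tool).
import numpy as np

def parallel_beam_drr(ct_volume, axis=0, mu_per_hu=2e-4):
    """Project a CT volume (in Hounsfield units) into a radiograph-like image."""
    attenuation = mu_per_hu * (np.asarray(ct_volume, dtype=float) + 1000.0)
    line_integrals = attenuation.sum(axis=axis)   # ray sums through the volume
    return np.exp(-line_integrals)                # Beer-Lambert transmitted intensity

rng = np.random.default_rng(1)
ct = rng.normal(0.0, 200.0, size=(64, 128, 128))  # toy CT stack (z, y, x)
drr = parallel_beam_drr(ct, axis=0)
print(drr.shape, drr.min(), drr.max())
```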

  14. Computer Games as Virtual Environments for Safety-Critical Software Validation

    Directory of Open Access Journals (Sweden)

    Štefan Korečko

    2017-01-01

    Full Text Available Computer games became an inseparable part of everyday life in modern society and the time people spend playing them every day is increasing. This trend has caused noticeable research activity focused on utilizing the time spent playing in a meaningful way, for example to help solve scientific problems or tasks related to computer systems development. In this paper we present one contribution to this activity, a software system consisting of a modified version of the Open Rails train simulator and an application called TS2JavaConn, which allows separately developed software controllers to be used with the simulator. The system is intended for validation of controllers developed by formal methods. The paper describes the overall architecture of the system and the operation of its components. It also compares the system with other approaches to purposeful utilization of computer games, specifies suitable formal methods and illustrates its intended use on an example.

  15. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.
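
    The kind of compartmental model the Virtual Cell supports can be illustrated by a minimal sketch (not the Virtual Cell software itself; species, compartments and rate constants are assumptions): two well-mixed compartments exchanging a species by first-order transport, integrated with SciPy.

```python
# Minimal compartmental-model sketch (assumed rates and names).
import numpy as np
from scipy.integrate import solve_ivp

k_in, k_out = 0.5, 0.2   # assumed transport rate constants (1/s)

def rhs(t, y):
    cytosol, nucleus = y
    flux = k_in * cytosol - k_out * nucleus
    return [-flux, flux]

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0], dense_output=True)
t = np.linspace(0.0, 20.0, 5)
print(sol.sol(t))        # concentrations in each compartment over time
```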

  16. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  17. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain thorough understanding about service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
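
    In the spirit of the composite service analysis described above (though not the paper's actual model; all rates are assumptions), a minimal sketch can treat the SDN transport stage and the compute stage each as an M/M/1 queue and take the end-to-end delay as the sum of the two mean sojourn times.

```python
# Minimal composite network-compute delay sketch (assumed M/M/1 stages and rates).
def mm1_sojourn(arrival_rate, service_rate):
    """Mean time in an M/M/1 queue; requires arrival_rate < service_rate."""
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

lam = 80.0                                              # offered requests per second
network_delay = mm1_sojourn(lam, service_rate=120.0)    # SDN transport stage
compute_delay = mm1_sojourn(lam, service_rate=100.0)    # virtualized compute stage
print("end-to-end mean delay (s):", network_delay + compute_delay)
```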

  18. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  19. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  20. New tools for digital medical image processing implemented in DIP software

    International Nuclear Information System (INIS)

    Araujo, Erica A.C.; Santana, Ivan E.; Lima, Fernando R.A.; Viera, Jose W.

    2011-01-01

    The anthropomorphic models used in computational dosimetry, also called phantoms, are mostly built from stacks of CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) images obtained from scans of patients or volunteers. The construction of voxel phantoms requires computational processing for transforming image formats, compressing two-dimensional (2D) images to form three-dimensional (3D) arrays, quantization, resampling, enhancement, restoration and image segmentation, among others. The computational dosimetry researcher rarely finds all these capabilities in a single software package, and this often slows the development of their research or leads to inadequate use of alternative tools. The need to integrate the various tasks of digital image processing to obtain an image that can be used in a computational model of exposure led to the development of the DIP (Digital Image Processing) software. This software reads, writes and edits binary files containing the 3D matrix corresponding to a stack of cross-sectional images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computer image and perform conversions. When the task involves only one output image, it is saved in the standard Windows JPEG format. When it involves a stack of images, the binary output file is called SGI (Interactive Graphic Simulations, a symbol already used in other publications of the Research Group in Numerical Dosimetry). The following paper presents the third version of the DIP software and emphasizes the new tools it implements. Currently it has the menus Basics, Views, Spatial Domain, Frequency Domain, Segmentations and Study. Each menu contains items and subitems with features that generally require an image as input and produce an image or an attribute as output. (author)
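
    Two of the DIP operations described above can be sketched as follows (an illustration with assumed array names, not the DIP software itself): stacking 2D slices into a 3D matrix and applying a simple threshold segmentation before writing the volume as a flat binary file.

```python
# Minimal sketch of slice stacking and threshold segmentation (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
slices = [rng.integers(0, 256, size=(128, 128), dtype=np.uint8) for _ in range(60)]

volume = np.stack(slices, axis=0)        # 2D images -> 3D matrix (z, y, x)
segmented = np.zeros_like(volume)
segmented[volume >= 128] = 1             # crude two-class segmentation
print(volume.shape, segmented.mean())

# The 3D matrix can then be written as a flat binary file, in the spirit of the
# SGI-style output described above (file name is an assumption).
volume.tofile("phantom_stack.bin")
```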

  1. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach -- allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000-study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base and the commercial GeoSteiner 4.0 code base.
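
    The smallest non-trivial Euclidean instance behind the problems GeoSteiner solves exactly can be sketched as follows (an illustration, not the GeoSteiner algorithm): for three terminals whose triangle has all angles below 120 degrees, the optimal tree uses a single Steiner point, the Fermat point, approximated here by Weiszfeld iteration.

```python
# Minimal three-terminal Steiner (Fermat point) sketch via Weiszfeld iteration.
import numpy as np

def fermat_point(terminals, iters=200):
    pts = np.asarray(terminals, dtype=float)
    s = pts.mean(axis=0)                       # initial guess: centroid
    for _ in range(iters):
        d = np.linalg.norm(pts - s, axis=1)
        w = 1.0 / np.maximum(d, 1e-12)
        s = (pts * w[:, None]).sum(axis=0) / w.sum()
    return s

terminals = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
s = fermat_point(terminals)
length = sum(np.linalg.norm(np.array(t) - s) for t in terminals)
print("Steiner point:", s, "tree length:", length)
```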

  2. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  3. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  4. Concealed identification symbols and nondestructive determination of the identification symbols

    Science.gov (United States)

    Nance, Thomas A.; Gibbs, Kenneth M.

    2014-09-16

    The concealing of one or more identification symbols into a target object and the subsequent determination or reading of such symbols through non-destructive testing is described. The symbols can be concealed in a manner so that they are not visible to the human eye and/or cannot be readily revealed to the human eye without damage or destruction of the target object. The identification symbols can be determined after concealment by e.g., the compilation of multiple X-ray images. As such, the present invention can also provide e.g., a deterrent to theft and the recovery of lost or stolen objects.

  5. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a punctual radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be reached. As a preliminary TRACO-1 evaluation, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain values of photon counting rates at four detector surfaces. These counts were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are incipient, their analysis suggests that the tracking of a radioactive source using TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
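
    A toy illustration of the reconstruction idea behind radioactive particle tracking is sketched below; it is not TRACO-1, and the detector layout, the idealized inverse-square response model and all numbers are assumptions: the particle position is estimated by least-squares fitting modeled count rates to the measured ones.

```python
# Toy position-reconstruction sketch from detector count rates (all values assumed).
import numpy as np
from scipy.optimize import least_squares

detectors = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
true_pos = np.array([0.3, 0.4, 0.2])
strength = 1.0e4

def model_counts(pos):
    d2 = ((detectors - pos) ** 2).sum(axis=1)
    return strength / d2                      # idealized inverse-square response

noise = 1 + 0.02 * np.random.default_rng(3).normal(size=4)
measured = model_counts(true_pos) * noise
fit = least_squares(lambda p: model_counts(p) - measured, x0=np.array([0.5, 0.5, 0.5]))
print("estimated particle position:", fit.x)
```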

  6. Flexibility of Bricard's linkages and other structures via resultants and computer algebra.

    Science.gov (United States)

    Lewis, Robert H; Coutsias, Evangelos A

    2016-07-01

    Flexibility of structures is extremely important for chemistry and robotics. Following our earlier work, we study flexibility using polynomial equations, resultants, and a symbolic algorithm of our creation that analyzes the resultant. We show that the software solves a classic arrangement of quadrilaterals in the plane due to Bricard. We fill in several gaps in Bricard's work and discover new flexible arrangements that he was apparently unaware of. This provides strong evidence for the maturity of the software, and is a wonderful example of mathematical discovery via computer assisted experiment.
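
    The resultant computation that the abstract builds on can be sketched with SymPy (an assumed stand-in for the authors' own symbolic algorithm; the polynomials are illustrative, not Bricard's): eliminating one coordinate from two polynomial link constraints.

```python
# Minimal resultant-based variable elimination sketch (illustrative polynomials).
from sympy import symbols, resultant, factor

x, y, a = symbols("x y a")

p1 = x**2 + y**2 - 1          # e.g. a link of unit length
p2 = (x - a)**2 + y**2 - 4    # e.g. a link of length 2 anchored at (a, 0)

# Eliminate y: the resultant vanishes exactly when p1 and p2 share a root in y.
r = resultant(p1, p2, y)
print(factor(r))
```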

  7. A community Q&A for HEP Software and Computing ?

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    How often do you use StackOverflow or ServerFault to find information in your daily work? Would you be interested in a community Q&A site for HEP Software and Computing, for instance a dedicated StackExchange site? I looked into this question...

  8. Proceedings of the Workshop on software tools for distributed intelligent control systems

    Energy Technology Data Exchange (ETDEWEB)

    Herget, C.J. (ed.)

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  9. Blind trials of computer-assisted structure elucidation software

    Directory of Open Access Journals (Sweden)

    Moser Arvin

    2012-02-01

    Full Text Available Abstract Background One of the largest challenges in chemistry today remains that of efficiently mining through vast amounts of data in order to elucidate the chemical structure for an unknown compound. The elucidated candidate compound must be fully consistent with the data and any other competing candidates efficiently eliminated without doubt by using additional data if necessary. It has become increasingly necessary to incorporate an in silico structure generation and verification tool to facilitate this elucidation process. An effective structure elucidation software technology aims to mimic the skills of a human in interpreting the complex nature of spectral data while producing a solution within a reasonable amount of time. This type of software is known as computer-assisted structure elucidation or CASE software. A systematic trial of the ACD/Structure Elucidator CASE software was conducted over an extended period of time by analysing a set of single and double-blind trials submitted by a global audience of scientists. The purpose of the blind trials was to reduce subjective bias. Double-blind trials comprised of data where the candidate compound was unknown to both the submitting scientist and the analyst. The level of expertise of the submitting scientist ranged from novice to expert structure elucidation specialists with experience in pharmaceutical, industrial, government and academic environments. Results Beginning in 2003, and for the following nine years, the algorithms and software technology contained within ACD/Structure Elucidator have been tested against 112 data sets; many of these were unique challenges. Of these challenges 9% were double-blind trials. The results of eighteen of the single-blind trials were investigated in detail and included problems of a diverse nature with many of the specific challenges associated with algorithmic structure elucidation such as deficiency in protons, structure symmetry, a large number of

  10. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MOD-FLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM)REFERENCES [1]. Herrera Ismael and George F. Pinder, Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial, Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  11. Protecting software agents from malicious hosts using quantum computing

    Science.gov (United States)

    Reisner, John; Donkor, Eric

    2000-07-01

    We evaluate how quantum computing can be applied to security problems for software agents. Agent-based computing, which merges technological advances in artificial intelligence and mobile computing, is a rapidly growing domain, especially in applications such as electronic commerce, network management, information retrieval, and mission planning. System security is one of the more eminent research areas in agent-based computing, and the specific problem of protecting a mobile agent from a potentially hostile host is one of the most difficult of these challenges. In this work, we describe our agent model, and discuss the capabilities and limitations of classical solutions to the malicious host problem. Quantum computing may be extremely helpful in addressing the limitations of classical solutions to this problem. This paper highlights some of the areas where quantum computing could be applied to agent security.

  12. Pushing the asymptotics of the 6j-symbol further

    International Nuclear Information System (INIS)

    Dupuis, Maiete; Livine, Etera R.

    2009-01-01

    In the context of spin-foam models for quantum gravity, we investigate the asymptotic behavior of the (6j)-symbol at next-to-leading order. This gives the first quantum gravity correction to the (3d) Regge action. We compute it analytically and check our results against numerical calculations. The (6j)-symbol is the building block of the Ponzano-Regge amplitudes for 3d quantum gravity, and the present analysis is directly relevant to deriving the quantum corrections to gravitational correlations in the spin-foam formalism.
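
    The object studied above, the (6j)-symbol at uniformly rescaled spins, can be evaluated exactly with SymPy; the sketch below (an illustration, not the authors' calculation, with an arbitrary choice of spins) prints the values entering such an asymptotic analysis.

```python
# Minimal sketch: exact (6j)-symbols for uniformly rescaled spins (arbitrary example).
from sympy.physics.wigner import wigner_6j

base = (2, 2, 2, 2, 2, 2)
for k in (1, 2, 4, 8):
    js = [k * j for j in base]
    val = wigner_6j(*js)
    print(k, float(val))
```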

  13. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  14. Software Safety Risk in Legacy Safety-Critical Computer Systems

    Science.gov (United States)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety Standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some address the system perspective and some cover just the software in the system. NASA-STD-8719.13B Software Safety Standard is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety Cases: a) Documented demonstration that a system complies with the specified safety requirements. b) Evidence is gathered on the integrity of the system and put forward as an argued case. [Gardener (ed.)] c) Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  15. Symbolic and non symbolic numerical representation in adults with and without developmental dyscalculia

    Directory of Open Access Journals (Sweden)

    Furman Tamar

    2012-11-01

    Full Text Available Abstract Background The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in scientific literature. It is accepted that the non symbolic system is divided into two different ranges, the subitizing range (i.e., quantities from 1-4) which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4) which is an attention demanding procedure and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non symbolic representation in DD participants separately for the subitizing and the counting ranges. Methods DD and control participants undergo a novel version of the Stroop task, i.e., the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non symbolic task) while ignoring the irrelevant aspect. Result DD participants, unlike the control group, didn't show any congruency effect in the subitizing range of the non symbolic task. Conclusion These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non symbolic). Additionally DD have deficiencies in the non symbolic counting range.

  16. Symbolic and non symbolic numerical representation in adults with and without developmental dyscalculia.

    Science.gov (United States)

    Furman, Tamar; Rubinsten, Orly

    2012-11-28

    The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in scientific literature. It is accepted that the non symbolic system is divided into two different ranges, the subitizing range (i.e., quantities from 1-4) which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4) which is an attention demanding procedure and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non symbolic representation in DD participants separately for the subitizing and the counting ranges. DD and control participants undergo a novel version of the Stroop task, i.e., the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non symbolic task) while ignoring the irrelevant aspect. DD participants, unlike the control group, didn't show any congruency effect in the subitizing range of the non symbolic task. These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non symbolic). Additionally DD have deficiencies in the non symbolic counting range.

  17. Waterloo Workshop on Computer Algebra

    CERN Document Server

    Zima, Eugene; WWCA-2016; Advances in computer algebra : in honour of Sergei Abramov's' 70th birthday

    2018-01-01

    This book discusses the latest advances in algorithms for symbolic summation, factorization, symbolic-numeric linear algebra and linear functional equations. It presents a collection of papers on original research topics from the Waterloo Workshop on Computer Algebra (WWCA-2016), a satellite workshop of the International Symposium on Symbolic and Algebraic Computation (ISSAC’2016), which was held at Wilfrid Laurier University (Waterloo, Ontario, Canada) on July 23–24, 2016.   This workshop and the resulting book celebrate the 70th birthday of Sergei Abramov (Dorodnicyn Computing Centre of the Russian Academy of Sciences, Moscow), whose highly regarded and inspirational contributions to symbolic methods have become a crucial benchmark of computer algebra and have been broadly adopted by many Computer Algebra systems.
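
    Symbolic summation, one of the topics listed above, can be illustrated with SymPy (an assumed example system, not one of the workshop's own implementations): a finite binomial sum and an infinite series evaluated in closed form.

```python
# Minimal symbolic-summation sketch (illustrative sums).
from sympy import symbols, binomial, summation, simplify, oo

k, n = symbols("k n", integer=True, nonnegative=True)

# Closed form of a finite binomial sum.
print(simplify(summation(binomial(n, k), (k, 0, n))))   # -> 2**n

# Closed form of an infinite series.
print(summation(1 / k**2, (k, 1, oo)))                  # -> pi**2/6
```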

  18. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  19. Learning Vocabulary in a Foreign Language: A Computer Software Based Model Attempt

    Science.gov (United States)

    Yelbay Yilmaz, Yasemin

    2015-01-01

    This study aimed at devising a vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundation linguistics and learning theories have been adapted to the foreign language vocabulary learning context using a computer software named Parole that was designed exclusively for this study. Experimental…

  20. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  1. Software for the ACP [Advanced Computer Program] multiprocessor system

    International Nuclear Information System (INIS)

    Biel, J.; Areti, H.; Atac, R.

    1987-01-01

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system

  2. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Perret-Gallix, D.; Wojcik, W.

    1990-01-01

    These proceedings relate in a pragmatic way the use of methods and techniques of software engineering and artificial intelligence in high energy and nuclear physics. Such fundamental research can only be done through the design, building and running of equipment and systems that are among the most complex ever undertaken by mankind. The use of these new methods is mandatory in such an environment. However, their proper integration in these real applications raises some unsolved problems, whose solution, beyond the research field, will lead to a better understanding of some fundamental aspects of software engineering and artificial intelligence. Here is a sample of subjects covered in the proceedings: software engineering in a multi-user, multi-version, multi-system environment; project management; software validation and quality control; data structure and management; object-oriented languages; multi-language applications; interactive data analysis; expert systems for diagnosis; expert systems for real-time applications; neural networks for pattern recognition; and symbolic manipulation for automatic computation of complex processes

  3. The Influence of Personal Characteristics, Interaction: (Computer/Individual), Computer Self-efficacy, Personal Innovativeness in Information Technology to Computer Anxiety in use of Mind your Own Business Accounting Software

    OpenAIRE

    Mayasari, Mega; ., Gudono

    2015-01-01

    The purpose of this study was to identify the factors that cause computer anxiety in the use of Mind Your Own Business (MYOB) accounting software, i.e., to assess whether there is any influence of age, gender, amount of training, ownership (usage of accounting software on a regular basis), computer self-efficacy, and personal innovativeness in Information Technology (IT) on computer anxiety. The study also examined whether there is a relationship between trait anxiety and negative affect and computer self-eff...

  4. Children’s Non-symbolic, Symbolic Addition and Their Mapping Capacity at 4–7 Years Old

    Directory of Open Access Journals (Sweden)

    Yanjun Li

    2017-07-01

    Full Text Available The study aimed to examine the developmental trajectories of non-symbolic and symbolic addition capacities in children and the mapping ability between these two. We assessed 106 4- to 7-year-old children and found that 4-year-olds were able to do non-symbolic addition but not symbolic addition. Five-year-olds and older were able to do symbolic addition and their performance in symbolic addition exceeded non-symbolic addition in grade 1 (approximate age 7). These results suggested non-symbolic addition ability emerges earlier and is less affected by formal mathematical education than symbolic addition. Meanwhile, we tested children’s bi-directional mapping ability using a novel task and found that children were able to map between symbolic and non-symbolic representations of number at age 5. Their ability in mapping non-symbolic to symbolic number became more proficient in grade 1 (approximate age 7). This suggests children at age 7 have developed a relatively mature symbolic representation system.

  5. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  6. The symbol grounding problem revisited: a thorough evaluation of the ANS mapping account and the proposal of an alternative account based on symbol-symbol associations.

    Directory of Open Access Journals (Sweden)

    Bert Reynvoet

    2016-10-01

    Full Text Available Recently, a lot of studies in the domain of numerical cognition have been published demonstrating a robust association between numerical symbol processing and individual differences in mathematics achievement. Because numerical symbols are so important for mathematics achievement, many researchers want to provide an answer to the ‘symbol grounding problem’, i.e., how does a symbol acquire its numerical meaning? The most popular account, the ANS mapping account, assumes that a symbol acquires its numerical meaning by being mapped onto a non-verbal and Approximate Number System (ANS). Here, we critically evaluate four arguments that are supposed to support this account, i.e., (1) there is an evolutionary system for approximate number processing, (2) non-symbolic and symbolic number processing show the same behavioral effects, (3) non-symbolic and symbolic numbers activate the same brain regions, which are also involved in more advanced calculation, and (4) non-symbolic comparison is related to performance on symbolic mathematics achievement tasks. Based on this evaluation, we conclude that all of these arguments, and consequently also the mapping account, are questionable. Next we explore a less popular alternative, where small numerical symbols are initially mapped onto a precise representation and then, in combination with increasing knowledge of the counting list, result in an independent and exact symbolic system based on order relations between symbols. We evaluate this account by reviewing evidence on order judgement tasks following the same four arguments. Although further research is necessary, the available evidence so far suggests that this symbol-symbol association account should be considered as a worthy alternative of how symbols acquire their meaning.

  7. The Symbol Grounding Problem Revisited: A Thorough Evaluation of the ANS Mapping Account and the Proposal of an Alternative Account Based on Symbol-Symbol Associations.

    Science.gov (United States)

    Reynvoet, Bert; Sasanguie, Delphine

    2016-01-01

    Recently, a lot of studies in the domain of numerical cognition have been published demonstrating a robust association between numerical symbol processing and individual differences in mathematics achievement. Because numerical symbols are so important for mathematics achievement, many researchers want to provide an answer to the 'symbol grounding problem,' i.e., how does a symbol acquire its numerical meaning? The most popular account, the approximate number system (ANS) mapping account, assumes that a symbol acquires its numerical meaning by being mapped onto a non-verbal and approximate number system (ANS). Here, we critically evaluate four arguments that are supposed to support this account, i.e., (1) there is an evolutionary system for approximate number processing, (2) non-symbolic and symbolic number processing show the same behavioral effects, (3) non-symbolic and symbolic numbers activate the same brain regions, which are also involved in more advanced calculation, and (4) non-symbolic comparison is related to performance on symbolic mathematics achievement tasks. Based on this evaluation, we conclude that all of these arguments, and consequently also the mapping account, are questionable. Next we explore a less popular alternative, where small numerical symbols are initially mapped onto a precise representation and then, in combination with increasing knowledge of the counting list, result in an independent and exact symbolic system based on order relations between symbols. We evaluate this account by reviewing evidence on order judgment tasks following the same four arguments. Although further research is necessary, the available evidence so far suggests that this symbol-symbol association account should be considered as a worthy alternative of how symbols acquire their meaning.

  8. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  9. Structured brain computing and its learning

    International Nuclear Information System (INIS)

    Ae, Tadashi; Araki, Hiroyuki; Sakai, Keiichi

    1999-01-01

    We have proposed a two-level architecture for brain computing, where two levels are introduced for processing of meta-symbols. At level 1 a conventional pattern recognition, including neural computation, is performed; its output gives the meta-symbol, which is a symbol enlarged from a symbol to a kind of pattern. At level 2 an algorithm acquisition is made by using a machine for abstract states. We are also developing the VLSI chips at each level for SBC (Structured Brain Computer) Ver. 1.0

  10. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  11. Automatic computation of cross sections in HEP. Status of GRACE system

    International Nuclear Information System (INIS)

    Yuasa, F.; Fujimoto, J.; Ishikawa, T.

    2000-01-01

    For the study of reactions in High Energy Physics (HEP), automatic computation systems have been developed and are widely used nowadays. GRACE is one such system and it has achieved much success in analyzing experimental data. Since we deal with cross sections whose values are given by calculating hundreds of Feynman diagrams, we must manage large scale calculations, so that effective symbolic manipulation and the treatment of singularities in the numerical integration are required. The talk will describe the software design of the GRACE system and the computational techniques in GRACE. (author)

  12. 'demoted'?: Symbols as religious phenomena

    African Journals Online (AJOL)

    2013-03-06

    Mar 6, 2013 ... process by which symbols grow and develop, the particular context of a symbol is important. In this article a particular theory as to what symbols are, is presented. ... of communication and reference between these two worlds are symbols. .... from a psychological perspective, understands symbols as a.

  13. Mathcad in the Chemistry Curriculum: Symbolic Software in the Chemistry Curriculum

    Science.gov (United States)

    Zielinski, Theresa Julia

    2000-05-01

    Physical chemistry is such a broad discipline that the topics we expect average students to complete in two semesters usually exceed their ability for meaningful learning. Consequently, the number and kind of topics and the efficiency with which students can learn them are important concerns. What topics are essential and what can we do to provide efficient and effective access to those topics? How do we accommodate the fact that students come to upper-division chemistry courses with a variety of nonuniformly distributed skills, a bit of calculus, and some physics studied one or more years before physical chemistry? The critical balance between depth and breadth of learning in courses and curricula may be achieved through appropriate use of technology and especially through the use of symbolic mathematics software. Software programs such as Mathcad, Mathematica, and Maple, however, have learning curves that diminish their effectiveness for novices. There are several ways to address the learning curve conundrum. First, basic instruction in the software provided during laboratory sessions should be followed by requiring laboratory reports that use the software. Second, one should assign weekly homework that requires the software and builds student skills within the discipline and with the software. Third, a complementary method, supported by this column, is to provide students with Mathcad worksheets or templates that focus on one set of related concepts and incorporate a variety of features of the software that they are to use to learn chemistry. In this column we focus on two significant topics for young chemists. The first is curve-fitting and the statistical analysis of the fitting parameters. The second is the analysis of the rotation/vibration spectrum of a diatomic molecule, HCl. A broad spectrum of Mathcad documents exists for teaching chemistry. One collection of 50 documents can be found at http://www.monmouth.edu/~tzielins/mathcad/Lists/index.htm. Another
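
    The first template topic above, curve fitting with statistical analysis of the fitting parameters, can be reproduced outside Mathcad as well. The sketch below uses Python/SciPy with a purely synthetic data set and an assumed exponential-decay model (neither is taken from the column's worksheets); it only illustrates the kind of exercise such documents cover.

        # Hedged sketch: least-squares fit plus one-sigma parameter uncertainties.
        # The model and the noisy data are invented for illustration only.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a, k):
            return a * np.exp(-k * t)          # assumed exponential-decay model

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 5.0, 25)
        y = model(t, 2.0, 0.8) + rng.normal(scale=0.05, size=t.size)   # synthetic data

        popt, pcov = curve_fit(model, t, y, p0=(1.0, 1.0))
        perr = np.sqrt(np.diag(pcov))          # one-sigma uncertainties of the parameters
        for name, val, err in zip(("a", "k"), popt, perr):
            print(f"{name} = {val:.3f} +/- {err:.3f}")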

  14. A directory of computer software applications: astronomy and astrophysics, 1970-May, 1979

    International Nuclear Information System (INIS)

    1979-05-01

    Astronomy and astrophysics reports that list computer programs and/or their documentation are cited. These software applications pertain to topics such as solar activity, atmospheric radiative transfer, stellar and galactic structure, lunar and planetary studies, and astrophysical data reduction. The directory contains complete bibliographic data for each report as well as a subject and a corporate author index. The computer software offered by NTIS was created by a variety of Federal agencies to meet their diverse but quite specific objectives. It is provided without installation, support, or maintenance services and sometimes requires customer modifications to run effectively in customer environments

  15. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
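
    The core idea described above, turning one parametrized simulation into many fully resolved per-node macros plus a generated submit file, can be sketched roughly as follows. All macro commands, file names and the launcher line below are placeholders rather than actual GATE or cluster syntax; the sketch only shows the splitting pattern.

        # Rough sketch of job splitting for a cluster run; every command shown is a
        # hypothetical placeholder, not the real GATE macro or scheduler syntax.
        from pathlib import Path

        N_JOBS = 10
        T_START, T_STOP = 0.0, 100.0                     # total simulated time window
        slice_len = (T_STOP - T_START) / N_JOBS

        submit_lines = []
        for i in range(N_JOBS):
            t0 = T_START + i * slice_len
            macro = Path(f"job_{i:03d}.mac")
            macro.write_text(
                f"# auto-generated, fully resolved macro for job {i}\n"
                f"/placeholder/setTimeStart {t0}\n"          # stand-in for the time-slice command
                f"/placeholder/setTimeStop  {t0 + slice_len}\n"
                f"/placeholder/setRandomSeed {1000 + i}\n"   # independent seed per job
            )
            submit_lines.append(f"run_simulation {macro}\n")  # placeholder launcher command

        Path("submit.sh").write_text("".join(submit_lines))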

  16. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple and kappa statistics agreements. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.

  17. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  18. A software for computer automated radioactive particle tracking

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Wilson S.; Brandao, Luis E. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: wilson@ien.gov.br; brandao@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)]. E-mail: delson@smb.lin.ufrj.br

    2008-07-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be reached. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of commercial software called MICROSHIELD, version 5.05, to obtain values of photon counting rates at four detector surfaces. These counting rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are incipient, their analysis suggests that the tracking of a radioactive source using TRACO-1 can be successful, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)

  19. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand its role in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  20. Symbolic and non-symbolic number magnitude processing in children with developmental dyscalculia.

    Science.gov (United States)

    Castro Cañizares, Danilka; Reigosa Crespo, Vivian; González Alemañy, Eduardo

    2012-11-01

    The aim of this study was to evaluate if children with Developmental Dyscalculia (DD) exhibit a general deficit in magnitude representations or a specific deficit in the connection of symbolic representations with the corresponding analogous magnitudes. DD was diagnosed using a timed arithmetic task. The experimental magnitude comparison tasks were presented in non-symbolic and symbolic formats. DD and typically developing (TD) children showed similar numerical distance and size congruity effects. However, DD children performed significantly slower in the symbolic task. These results are consistent with the access deficit hypothesis, according to which DD children's deficits are caused by difficulties accessing magnitude information from numerical symbols rather than in processing numerosities per se.

  1. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    Science.gov (United States)

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  2. Animated symbols

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth

    2008-01-01

    an analytic working model called Animated Symbols concerning critical reflection in a dialogic learning process. The model shows dialogue as interactions that involve two types of transformation: inner ‘learning processes' and outer signs and symbols. The classroom-based research study is part of a Ph...

  3. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  4. SYMBOL AND LOGO. THE WAY IN WHICH YOUNG PEOPLE IN KRAKOW PERCEIVE SYMBOLS

    Directory of Open Access Journals (Sweden)

    Marta Jarzyna

    2006-01-01

    Full Text Available Symbols are essential elements of every culture. Through them meaning is created and tradition is kept alive. Advertising and marketing specialists quite often use the meanings of symbols to create trademarks. In this way specialists refer to associations rooted in tradition. In my article I try to answer the following questions: Has the logo become a symbol? Has the logo taken over all the functions of the symbol? Can we tell the difference between the advertising meaning and the cultural meaning? I also want to find out what people understand by the meaning of a symbol. Therefore I conducted a survey among high school students and the customers of three banks in Krakow. My research has shown that most young people find it difficult to define the meaning of a symbol. Moreover, high school students cannot tell the difference between a symbol and a trademark.

  5. Mahotas: Open source software for scriptable computer vision

    Directory of Open Access Journals (Sweden)

    Luis Pedro Coelho

    2013-07-01

    Full Text Available Mahotas is a computer vision library for Python. It contains traditional image processing functionality such as filtering and morphological operations as well as more modern computer vision functions for feature computation, including interest point detection and local descriptors. The interface is in Python, a dynamic programming language, which is appropriate for fast development, but the algorithms are implemented in C++ and are tuned for speed. The library is designed to fit in with the scientific software ecosystem in this language and can leverage the existing infrastructure developed in that language. Mahotas is released under a liberal open source license (MIT License and is available from http://github.com/luispedro/mahotas and from the Python Package Index (http://pypi.python.org/pypi/mahotas. Tutorials and full API documentation are available online at http://mahotas.readthedocs.org/.
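
    As a small illustration of the scriptable pipeline described above, the sketch below chains a few core mahotas calls (Gaussian filtering, Otsu thresholding, connected-component labelling) on a synthetic image; the image and the particular choice of functions are assumptions made for illustration, not an example taken from the library's documentation.

        # Minimal mahotas sketch: smooth, threshold and label bright blobs in a
        # synthetic numpy image (a stand-in for real microscopy data).
        import numpy as np
        import mahotas as mh

        rng = np.random.default_rng(0)
        image = (rng.random((128, 128)) * 40).astype(np.uint8)   # noisy dark background
        image[20:40, 20:40] += 150                               # two bright rectangles
        image[80:100, 60:90] += 150

        smoothed = mh.gaussian_filter(image, 2.0).astype(np.uint8)  # Gaussian smoothing
        threshold = mh.thresholding.otsu(smoothed)                   # Otsu threshold value
        binary = smoothed > threshold
        labelled, n_objects = mh.label(binary)                       # connected components
        print("objects found:", n_objects)                           # expected: 2 here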

  6. 1986 CERN School of Computing

    International Nuclear Information System (INIS)

    Verkerk, C.

    1987-01-01

    These proceedings contain written versions of lectures delivered at the 1986 CERN School of Computing. Two lecture series treated trends in computer architecture and architectural requirements for high-energy physics. Three other courses concentrated on object-oriented programming, on objects in Ada, and on modularization and re-usability. Expert systems, their applications, and knowledge engineering for graphics were treated by three lectures. The programme of the School covered, in addition, practical aspects of networks, buses for high-energy physics, the design of data-acquisition systems, and examples of on-line systems for particle physics experiments. Optical storage methods, software for distributed systems, symbolic formula manipulation, and solid modelling and rendering were also covered. Experience with transputers was the subject of an unscheduled presentation. (orig.)

  7. Software development on the DIII-D control and data acquisition computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B. Jr.; Piglowski, D.

    1997-11-01

    The various software systems developed for the DIII-D tokamak have played a highly visible and important role in tokamak operations and fusion research. Because of the heavy reliance on in-house developed software encompassing all aspects of operating the tokamak, much attention has been given to the careful design, development and maintenance of these software systems. Software systems responsible for tokamak control and monitoring, neutral beam injection, and data acquisition demand the highest level of reliability during plasma operations. These systems, made up of hundreds of programs totaling thousands of lines of code, have presented a wide variety of software design and development issues ranging from low level hardware communications, database management, and distributed process control to man-machine interfaces. The focus of this paper will be to describe how software is developed and managed for the DIII-D control and data acquisition computers. It will include an overview and status of software systems implemented for tokamak control, neutral beam control, and data acquisition. The issues and challenges faced in developing and managing the large amounts of software in support of the dynamic and ever-changing needs of the DIII-D experimental program will be addressed

  8. Noncoherent Symbol Synchronization Techniques

    Science.gov (United States)

    Simon, Marvin

    2005-01-01

    Traditional methods for establishing symbol synchronization (sync) in digital communication receivers assume that carrier sync has already been established, i.e., the problem is addressed at the baseband level assuming that a 'perfect' estimate of carrier phase is available. We refer to this approach as coherent symbol sync. Since, for NRZ signaling, a suppressed carrier sync loop such as an I-Q Costas loop includes integrate-and-dump (I and D) filters in its in-phase (I) and quadrature (Q) arms, the traditional approach is to first track the carrier in the absence of symbol sync information, then feed back the symbol sync estimate to these filters, and then iterate between the two to a desirable operating level. In this paper, we revisit the symbol sync problem by examining methods for obtaining such sync in the absence of carrier phase information, i.e., so-called noncoherent symbol sync loops. We compare the performance of these loops with that of a well-known coherent symbol sync loop and examine the conditions under which one is preferable over the other.

  9. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
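
    One simple way to realize the "dynamic weight factors" idea described above, so that an index can still be computed when some parameters are missing, is to drop the missing parameters and renormalize the remaining weights. The sketch below assumes that rule together with invented parameter names, weights and sub-index values; it is not the actual IWQIS formula.

        # Sketch of a weighted index with dynamic weights: missing parameters are
        # skipped and the remaining weights renormalized (assumed rule, illustrative data).
        def quality_index(sub_indices, weights):
            """sub_indices: parameter -> sub-index on a 0-100 scale, or None if missing."""
            available = {p: v for p, v in sub_indices.items() if v is not None}
            total_w = sum(weights[p] for p in available)
            return sum(weights[p] / total_w * v for p, v in available.items())

        weights = {"pH": 0.2, "turbidity": 0.3, "nitrate": 0.25, "coliforms": 0.25}
        sample = {"pH": 85.0, "turbidity": 70.0, "nitrate": None, "coliforms": 90.0}
        print(round(quality_index(sample, weights), 1))   # index computed without nitrate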

  10. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological researches, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic process, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  11. Symbolic manipulation methods in general relativity and fluid mechanics

    International Nuclear Information System (INIS)

    Cohen, I.

    1976-03-01

    Algebraic manipulation by computer, or automatic symbol manipulation (ASM) has not been used much in theoretical physics, especially if one compares it with numerical methods. Three examples of the use of ASM as a tool in theoretical physics are discussed. (Auth.)

  12. Translator from the symbol coding language for the BUTs-20 processor of the in-core reactor control system

    International Nuclear Information System (INIS)

    Vorob'ev, D.M.; Golovanov, M.N.; Levin, G.L.; Parfenova, T.K.; Filatov, V.P.

    1978-01-01

    A symbolic-language code translator is described; it has been developed to automate the preparation of programs for in-core control systems. The translator is written in the ASSEMBLER language, which is included in the software of the M-6000 computer. Two passes over the source program are required to produce the operating program in the internal language of the BUTs-20 processor. The flowsheet and listing of the interrogation program of an analog-to-digital converter are presented. It is emphasized that the proposed translator allows the time for constructing programs for the in-core control systems to be reduced by a factor of 10-15 and their quality to be improved

  13. The impact of symbolic and non-symbolic quantity on spatial learning.

    Directory of Open Access Journals (Sweden)

    Koleen McCrink

    Full Text Available An implicit mapping of number to space via a "mental number line" occurs automatically in adulthood. Here, we systematically explore the influence of differing representations of quantity (no quantity, non-symbolic magnitudes, and symbolic numbers) and directional flow of stimuli (random flow, left-to-right, or right-to-left) on learning and attention via a match-to-sample working memory task. When recalling a cognitively demanding string of spatial locations, subjects performed best when information was presented right-to-left. When non-symbolic or symbolic numerical arrays were embedded in these spatial locations, and mental number line congruency prompted, this effect was attenuated and in some cases reversed. In particular, low-performing female participants who viewed increasing non-symbolic number arrays paired with the spatial locations exhibited better recall for left-to-right directional flow information relative to right-to-left, and better processing for the left side of space relative to the right side of space. The presence of symbolic number during spatial learning enhanced recall to a greater degree than non-symbolic number--especially for female participants, and especially when cognitive load is high--and this difference was independent of directional flow of information. We conclude that quantity representations have the potential to scaffold spatial memory, but this potential is subtle, and mediated by the nature of the quantity and the gender and performance level of the learner.

  14. Primary Health Care Software-A Computer Based Data Management System

    Directory of Open Access Journals (Sweden)

    Tuli K

    1990-01-01

    Full Text Available The duplication and time consumption observed in the usual manual system of data collection necessitated experimentation with a computer-based management system for primary health care in the primary health centres. The details of the population as available in the existing manual system were used for computerizing the data. Software was designed for data entry and analysis. It was written in the dBase III Plus language. It was so designed that a person with no knowledge of computers could use it. A cost analysis was done and the computer system was found more cost effective than the usual manual system.

  15. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
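
    The framework itself works on annotated Java programs through Java PathFinder; purely to illustrate what a loop-invariant annotation is checked against (initiation, preservation, and implication of the post-condition), the toy Python sketch below applies the three checks to a hypothetical summation loop. It is unrelated to the actual tool.

        # Toy illustration of the three loop-invariant obligations, not the real framework.
        # Invariant for "total = 0 + 1 + ... + (i-1)": total == i*(i-1)//2 and 0 <= i <= n.
        def invariant(i, total, n):
            return total == i * (i - 1) // 2 and 0 <= i <= n

        def summation_with_checks(n):
            i, total = 0, 0
            assert invariant(i, total, n)          # initiation: invariant holds on entry
            while i < n:
                total += i
                i += 1
                assert invariant(i, total, n)      # preservation: holds after every iteration
            assert total == n * (n - 1) // 2       # invariant + exit condition => post-condition

        for n in range(20):
            summation_with_checks(n)
        print("invariant held on all tested inputs")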

  16. Development of the JFT-2M data analysis software system on the mainframe computer

    International Nuclear Information System (INIS)

    Matsuda, Toshiaki; Amagai, Akira; Suda, Shuji; Maemura, Katsumi; Hata, Ken-ichiro.

    1990-11-01

    We developed software system on the FACOM mainframe computer to analyze JFT-2M experimental data archived by JFT-2M data acquisition system. Then we can reduce and distribute the CPU load of the data acquisition system. And we can analyze JFT-2M experimental data by using complicated computational code with raw data, such as equilibrium calculation and transport analysis, and useful software package like SAS statistic package on the mainframe. (author)

  17. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, our opinion is that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher who is generally satisfied with abstract but accurate displays for analysis purposes and the decision maker who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease in transferring and to facilitate the linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results

  18. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems ...

  19. Graphical symbol recognition

    OpenAIRE

    K.C. , Santosh; Wendling , Laurent

    2015-01-01

    The chapter focuses on one of the key issues in document image processing i.e., graphical symbol recognition. Graphical symbol recognition is a sub-field of a larger research domain: pattern recognition. The chapter covers several approaches (i.e., statistical, structural and syntactic) and specially designed symbol recognition techniques inspired by real-world industrial problems. It, in general, contains research problems, state-of-the-art methods that convey basic s...

  20. ℓ2 Optimized predictive image coding with ℓ∞ bound.

    Science.gov (United States)

    Chuah, Sceuchin; Dumitrescu, Sorina; Wu, Xiaolin

    2013-12-01

    In many scientific, medical, and defense applications of image/video compression, an ℓ∞ error bound is required. However, pure ℓ∞-optimized image coding, colloquially known as near-lossless image coding, is prone to structured errors such as contours and speckles if the bit rate is not sufficiently high; moreover, most of the previous ℓ∞-based image coding methods suffer from poor rate control. In contrast, the ℓ2 error metric aims for average fidelity and hence preserves the subtlety of smooth waveforms better than the ℓ∞ error metric, and it offers fine granularity in rate control, but pure ℓ2-based image coding methods (e.g., JPEG 2000) cannot bound individual errors as the ℓ∞-based methods can. This paper presents a new compression approach to retain the benefits and circumvent the pitfalls of the two error metrics. A common approach of near-lossless image coding is to embed into a DPCM prediction loop a uniform scalar quantizer of residual errors. The said uniform scalar quantizer is replaced, in the proposed new approach, by a set of context-based ℓ2-optimized quantizers. The optimization criterion is to minimize a weighted sum of the ℓ2 distortion and the entropy while maintaining a strict ℓ∞ error bound. The resulting method obtains good rate-distortion performance in both ℓ2 and ℓ∞ metrics and also increases the rate granularity. Compared with JPEG 2000, the new method not only guarantees lower ℓ∞ error for all bit rates, but also it achieves higher PSNR for relatively high bit rates.
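
    The starting point mentioned above, a uniform scalar quantizer of prediction residuals embedded in a DPCM loop so that every reconstructed sample stays within a fixed error bound, can be sketched as follows. A trivial previous-sample predictor on a 1-D signal is assumed here for brevity; the paper's context-based ℓ2-optimized quantizers are not reproduced.

        # Sketch of near-lossless DPCM: a uniform residual quantizer with step 2*delta+1
        # guarantees |reconstruction - original| <= delta at every sample.
        import numpy as np

        def dpcm_near_lossless(signal, delta):
            step = 2 * delta + 1
            recon = np.empty_like(signal)
            prev = 0                                   # assumed initial predictor state
            for n, x in enumerate(signal):
                residual = int(x) - prev
                q = int(np.round(residual / step))     # index that would go to the entropy coder
                recon[n] = prev + q * step             # decoder-side reconstruction
                prev = int(recon[n])                   # closed-loop prediction from reconstruction
            return recon

        x = np.array([10, 12, 15, 40, 41, 39, 100, 98], dtype=np.int64)
        y = dpcm_near_lossless(x, delta=2)
        assert np.max(np.abs(y - x)) <= 2              # the l-infinity bound holds
        print(y.tolist())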

  1. Symbol in View of Ambiguity

    Directory of Open Access Journals (Sweden)

    Mohamad reza Yousefi

    2013-11-01

    Full Text Available Abstract From the perspective of rhetoric, a symbol is a word, phrase or sentence that, beyond its apparent meaning, also suggests to the reader a wide range of further meanings. Since it is easier to explore complex social and political ideas in a mystical way and to reflect social and political thought indirectly and symbolically, the symbol and symbolism have a special presence in Persian literature, especially in the realm of Persian poetry. In addition to the factors mentioned, in contemporary literature familiarity with literature and the emergence of particular schools have further spread the interest in ambiguous symbolization; the symbol carries all the features of artistic ambiguity in the poem and is one of the major factors producing uncertainty. Thus, precise definitions of the symbols of contemporary poetry can help in unwinding ambiguity and in detecting the cryptic allusions and metaphors that match a given symbol, and so help readers. In literature, and especially in the language of poetry, the inability of language to reflect obscure mystical ideas and the avoidance of direct expression of political and social concerns are the main motivations for turning to the symbol and symbolization in the creation of ambiguous literary works. Given the widespread use of the symbol, its different species can be viewed from different perspectives. The creation of ambiguity is one of the main purposes of using symbols (especially in poetry), so many poets trying to achieve this goal have formed similar symbols, and the explanation and resolution of this issue can open a new window for understanding poetry in front of an audience. This paper examines the ambiguity of symbols and reviews its precise boundaries. Ambiguity is one of the important processes and a key feature of Iranian poetry, that is, of today's poetry. In such poetry ambiguity is needed to explore the world from a different perspective, or explore this

  2. Symbol generators with program control

    International Nuclear Information System (INIS)

    Gryaznov, V.M.; Tomik, J.

    1974-01-01

    Methods of constructing symbol generators are described which ensure program-controlled variation of symbol shape and setup. The symbols are formed on the basis of a point microraster. A symbol description code contains information on the symbol shape, with one digit corresponding to each microraster element. For a discrete traversal of the microraster, the description code is transformed into a succession of illuminating pulses by means of a shift register

  3. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  4. Political symbols and political transitions

    Directory of Open Access Journals (Sweden)

    Herrero de Miñón, Miguel

    2006-11-01

    Full Text Available Politics, Law and Psychology are fields that come together in the symbolic. This text takes evidence from those three areas to develop an analysis of political symbols and political transitions. The development of the analysis goes through three stages. The first succinctly describes the concept of transition and its meaning. The second closely examines the notion of the symbol, in terms of its definition, to explain aspects that allow us to understand it, characterise it and make its functions clear. Finally, from the author's experience as a witness and as an actor, I suggest three ways of understanding symbols in the processes of political transition: as symbols of change, as symbols of acknowledgment, and as symbols of support.

  5. Symbolic Algebra Development for Higher-Order Electron Propagator Formulation and Implementation.

    Science.gov (United States)

    Tamayo-Mendoza, Teresa; Flores-Moreno, Roberto

    2014-06-10

    Through the use of symbolic algebra, implemented in a program, the algebraic expressions for the elements of the self-energy matrix of the electron propagator at different orders were obtained. In addition, a module for the software package Lowdin was automatically generated. Second- and third-order electron propagator results have been calculated to test the correct operation of the program. It was found that the Fortran 90 modules obtained automatically with our algorithm succeeded in calculating ionization energies with the second- and third-order electron propagator in the diagonal approximation. The strategy for the development of this symbolic algebra program is described in detail. This represents a solid starting point for the automatic derivation and implementation of higher-order electron propagator methods.
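
    The general technique, deriving expressions symbolically and then emitting them as compilable Fortran, can be shown in miniature with a general-purpose computer algebra package. SymPy is used below only as an example of the approach, not the authors' own tool, and the expression is an arbitrary placeholder rather than a real self-energy matrix element.

        # Miniature "derive symbolically, emit Fortran" sketch; the expression is a
        # placeholder, not an actual electron-propagator self-energy term.
        import sympy as sp

        eps_a, eps_i, w = sp.symbols("eps_a eps_i w", real=True)
        expr = 1 / (w - (eps_a - eps_i))        # placeholder energy-denominator expression
        dexpr = sp.diff(expr, w)                # a symbolic manipulation step

        # Print Fortran 95 code that could be pasted into an auto-generated module.
        print(sp.fcode(sp.simplify(dexpr), assign_to="dsigma", standard=95))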

  6. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications in design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  7. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications in design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in the project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  8. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  9. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  10. Designing of a Computer Software for Detection of Approximal Caries in Posterior Teeth

    International Nuclear Information System (INIS)

    Valizadeh, Solmaz; Goodini, Mostafa; Ehsani, Sara; Mohseni, Hadis; Azimi, Fateme; Bakhshandeh, Hooman

    2015-01-01

    Radiographs, as an adjunct to clinical examination, are always valuable complementary methods for dental caries detection. Recently, progress in digital imaging systems has made it possible to design software for the automatic detection of dental caries. The aim of this study was to develop and assess the function of diagnostic computer software designed for the evaluation of approximal caries in posterior teeth. This software should be able to indicate the depth and location of caries on digital radiographic images. Digital radiographs were obtained of 93 teeth including 183 proximal surfaces. These images were used as a database for designing the software and training the software designer. In the design phase, considering the summed density of pixels in rows and columns of the images, the teeth were separated from each other and the unnecessary regions, for example the root area in the alveolar bone, were eliminated. Therefore, based on summed intensities, each image was segmented such that each segment contained only one tooth. Subsequently, based on fuzzy logic, a well-known data-clustering algorithm named fuzzy c-means (FCM) was applied to the images to cluster or segment each tooth. This algorithm is referred to as a soft clustering method, which assigns data elements to one or more clusters with a specific membership function. Using the extracted clusters, the tooth border was determined and assessed for cavities. The results of histological analysis were used as the gold standard for comparison with the results obtained from the software. Depth of caries was measured, and finally the Intraclass Correlation Coefficient (ICC) and Bland-Altman plot were used to show the agreement between the methods. The software diagnosed 60% of enamel caries. The ICC (for detection of enamel caries) between the computer software and histological analysis results was determined as 0.609 (95% confidence interval [CI] = 0.159-0.849) (P = 0.006). Also, the computer program diagnosed 97% of
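
    The clustering step named above, fuzzy c-means with a membership function that assigns each pixel to one or more clusters, follows standard alternating updates of memberships and cluster centres. The two-cluster grey-level sketch below implements those textbook updates in plain NumPy; it is illustrative only and not the authors' implementation.

        # Plain-numpy sketch of fuzzy c-means (FCM) on pixel intensities (fuzzifier m = 2),
        # using the standard alternating updates; illustrative only.
        import numpy as np

        def fcm(values, n_clusters=2, m=2.0, n_iter=100, seed=0):
            rng = np.random.default_rng(seed)
            u = rng.random((n_clusters, values.size))
            u /= u.sum(axis=0)                               # memberships sum to 1 per pixel
            for _ in range(n_iter):
                um = u ** m
                centres = um @ values / um.sum(axis=1)       # membership-weighted centres
                d = np.abs(values[None, :] - centres[:, None]) + 1e-12
                u = d ** (-2.0 / (m - 1.0))
                u /= u.sum(axis=0)                           # renormalize memberships
            return centres, u

        pixels = np.array([12, 15, 14, 200, 210, 190, 20, 205], dtype=float)  # toy grey levels
        centres, memberships = fcm(pixels)
        print(np.sort(centres))                              # one dark and one bright centre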

  11. Special software for computing the special functions of wave catastrophes

    Directory of Open Access Journals (Sweden)

    Andrey S. Kryukovsky

    2015-01-01

    Full Text Available The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for calculating special functions, and questions of portability, extensibility and interoperability.
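
    The special-function software itself is not part of the record. As a hedged, minimal illustration of the ODE method for this class of functions, the sketch below integrates the Airy equation y'' = x*y - whose solution Ai(x) is the standard function of the simplest (fold-type) wave catastrophe - with SciPy and checks it against the library value; the interval, initial point and tolerances are illustrative choices.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.special import airy

        # Airy equation y'' = x*y written as a first-order system y = (Ai, Ai').
        def airy_rhs(x, y):
            return [y[1], x * y[0]]

        x0, x1 = -10.0, 2.0
        ai0, aip0, _, _ = airy(x0)          # exact values used as initial data
        sol = solve_ivp(airy_rhs, (x0, x1), [ai0, aip0],
                        rtol=1e-10, atol=1e-12, dense_output=True)

        for x in np.linspace(x0, x1, 7):
            print(f"x={x:6.2f}  Ai(ODE)={sol.sol(x)[0]: .8e}  Ai(ref)={airy(x)[0]: .8e}")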

  12. Bruce Springsteen as a Symbol

    DEFF Research Database (Denmark)

    Gitz-Johansen, Thomas

    2018-01-01

    The article explores how Bruce Springsteen and his music function as a symbol. The article first presents the Jungian theory of symbols and of music as symbol. The central argument of the article is that, by functioning symbolically, Springsteen has the potential to influence the psyche of his au...

  13. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
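
    The paper's HPC implementation is not reproduced here. The sketch below only illustrates the embarrassingly parallel structure of contingency screening that makes such near-linear speedups possible: every outage case is an independent re-solve that can be farmed out to a worker pool. The toy network matrix, outage model and severity metric are invented for illustration and are not the authors' formulation.

        import numpy as np
        from multiprocessing import Pool

        # Toy "base case": a random symmetric positive-definite system standing in
        # for the network equations; each contingency perturbs one element.
        rng = np.random.default_rng(0)
        N = 200
        A = rng.random((N, N))
        BASE = A @ A.T + N * np.eye(N)
        INJECTIONS = rng.random(N)

        def run_contingency(k):
            """Re-solve the toy network with element k degraded (stand-in for an outage)."""
            modified = BASE.copy()
            modified[k, k] *= 0.5
            state = np.linalg.solve(modified, INJECTIONS)
            return k, float(np.abs(state).max())      # crude severity metric

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                results = pool.map(run_contingency, range(N))
            worst = max(results, key=lambda kv: kv[1])
            print("worst contingency:", worst)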

  14. Computer-aided software development

    International Nuclear Information System (INIS)

    Teichroew, D.; Hershey, E.A. III; Yamamoto, Y.

    1978-01-01

    In recent years, as the hardware cost/capability ratio has continued to decrease and as much of the routine data processing has been computerized, the emphasis in software development has shifted from just getting systems operational to the maintenance of existing systems, reduction of duplication by integration, selective addition of new applications, systems that are more usable, maintainable, portable and reliable and to improving the productivity of software developers. This paper examines a number of trends that are changing the methods by which software is being produced and used. (Auth.)

  15. Efficient multi-objective calibration of a computationally intensive hydrologic model with parallel computing software in Python

    Science.gov (United States)

    With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...

  16. SU-F-I-43: A Software-Based Statistical Method to Compute Low Contrast Detectability in Computed Tomography Images

    Energy Technology Data Exchange (ETDEWEB)

    Chacko, M; Aldoohan, S [University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: The low contrast detectability (LCD) of a CT scanner is its ability to detect and display faint lesions. The current approach to quantifying LCD uses vendor-specific methods and phantoms, typically by subjectively observing the smallest size object at a contrast level above phantom background. However, this approach does not yield clinically applicable values for LCD. The current study proposes a statistical LCD metric using software tools not only to assess scanner performance, but also to quantify the key factors affecting LCD. This approach was developed using uniform QC phantoms, and its applicability was then extended under simulated clinical conditions. Methods: MATLAB software was developed to compute LCD using a uniform image of a QC phantom. For a given virtual object size, the software randomly samples the image within a selected area, and uses statistical analysis based on Student's t-distribution to compute the LCD as the minimal Hounsfield Units that can be distinguished from the background at the 95% confidence level. Its validity was assessed by comparison with the behavior of a known QC phantom under various scan protocols and a tissue-mimicking phantom. The contributions of beam quality and scattered radiation to the computed LCD were quantified by using various external beam-hardening filters and phantom lengths. Results: As expected, the LCD was inversely related to object size under all scan conditions. The type of image reconstruction kernel filter and tissue/organ type strongly influenced the background noise characteristics and therefore the computed LCD for the associated image. Conclusion: The proposed metric and its associated software tools are vendor-independent and can be used to analyze the LCD performance of any scanner. Furthermore, the method employed can be used in conjunction with the relationships established in this study between LCD and tissue type to extend these concepts to patients' clinical CT
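
    The MATLAB tool described above is not available from the record. A rough Python analogue of the statistical idea - sample many ROIs of a given virtual object size from a uniform phantom image, then report the smallest HU offset distinguishable from the background at 95% confidence using Student's t-distribution - might look like the following; the ROI count, image size and noise level are assumptions.

        import numpy as np
        from scipy import stats

        def low_contrast_detectability(image, object_px, n_samples=500, conf=0.95, seed=0):
            """Estimate LCD for a square virtual object of side `object_px` pixels."""
            rng = np.random.default_rng(seed)
            h, w = image.shape
            means = []
            for _ in range(n_samples):
                r = rng.integers(0, h - object_px)
                c = rng.integers(0, w - object_px)
                means.append(image[r:r + object_px, c:c + object_px].mean())
            means = np.asarray(means)
            # smallest mean HU offset separable from background ROI fluctuations
            t_crit = stats.t.ppf(conf, df=n_samples - 1)
            return t_crit * means.std(ddof=1)

        # toy uniform water phantom: 0 HU background with Gaussian noise
        rng = np.random.default_rng(1)
        phantom = rng.normal(loc=0.0, scale=8.0, size=(512, 512))
        for size in (2, 5, 10, 20):
            lcd = low_contrast_detectability(phantom, size)
            print(f"{size:2d}px object: LCD ~ {lcd:.2f} HU")   # decreases with object size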

  17. Analysis of chromium-51 release assay data using personal computer spreadsheet software

    International Nuclear Information System (INIS)

    Lefor, A.T.; Steinberg, S.M.; Wiebke, E.A.

    1988-01-01

    The Chromium-51 release assay is a widely used technique to assess the lysis of labeled target cells in vitro. We have developed a simple technique to analyze data from Chromium-51 release assays using the widely available LOTUS 1-2-3 spreadsheet software. This package calculates percentage specific cytotoxicity and lytic units by linear regression. It uses all data points to compute the linear regression and can determine if there is a statistically significant difference between two lysis curves. The system is simple to use and easily modified, since its implementation requires neither knowledge of computer programming nor custom designed software. This package can help save considerable time when analyzing data from Chromium-51 release assays
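
    The LOTUS 1-2-3 worksheet itself is not shown in the record, but the same arithmetic - percent specific lysis from spontaneous and maximum release, followed by a linear regression of lysis against the logarithm of the effector-to-target ratio - can be sketched in a few lines of Python; all counts below are invented for illustration.

        import numpy as np

        def percent_specific_lysis(experimental, spontaneous, maximum):
            """Standard Cr-51 release formula; all arguments in counts per minute."""
            return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

        # invented example: counts at four effector-to-target (E:T) ratios
        et_ratios    = np.array([50.0, 25.0, 12.5, 6.25])
        experimental = np.array([4200.0, 3400.0, 2500.0, 1800.0])
        spontaneous  = 900.0     # target cells alone
        maximum      = 5200.0    # detergent-lysed targets

        lysis = percent_specific_lysis(experimental, spontaneous, maximum)

        # linear regression of % lysis on log10(E:T), as used for lytic-unit estimates
        slope, intercept = np.polyfit(np.log10(et_ratios), lysis, 1)
        print("percent specific lysis:", np.round(lysis, 1))
        print(f"fit: lysis = {slope:.1f} * log10(E:T) + {intercept:.1f}")

        # E:T ratio predicted to give 20% lysis
        et_20 = 10 ** ((20.0 - intercept) / slope)
        print(f"E:T for 20% lysis ~ {et_20:.1f}")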

  18. Clash of symbols a ride through the riches of glyphs

    CERN Document Server

    Webb, Stephen

    2018-01-01

    From the ampersat and ampersand, via smileys and runes, to the ubiquitous presence of mathematical and other symbols in science and technology: both old and modern documents abound with many familiar as well as lesser known characters, symbols and other glyphs. Yet who would be readily able to answer a question like: 'who chose π to represent the ratio of a circle's circumference to its diameter?' or 'what's the reasoning behind having a ⌘ key on my computer keyboard?' This book is precisely for those who have always asked themselves this sort of question. So, here are the stories behind one hundred glyphs, the book being evenly divided into five parts, with each featuring 20 symbols. Part 1, called Character sketches, looks at some of the glyphs we use in writing. Part 2, called Signs of the times, discusses some glyphs used in politics, religion, and other areas of everyday life. Some of these symbols are common; others are used only rarely. Some are modern inventions; others, which...

  19. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with MATLAB software, the COCGT - Cluster for Optimizing Computing in Gamma ray Transmission methods - is implemented. The implementation corresponds to the creation of a local network of computers, facilities and configuration of software, as well as the accomplishment of cluster tests to determine and optimize performance in data processing. The COCGT implementation was required for the computation of data from gamma transmission measurements applied to fluid dynamics and tomography reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulation data. As an initial test, the determination of the SVD - Singular Value Decomposition - of a random matrix with dimension (n, n), n=1000, using the modified Girko law, revealed that COCGT was faster in comparison to the literature [1] cluster, which is similar and operates under the same conditions. Solution of a system of linear equations provided a new test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster behavior with respect to 'parfor' (parallel for-loop) and 'spmd' (single program multiple data), two codes were used containing those two commands and the same problem: determination of the SVD of a square matrix with n=1000. The execution of the codes by means of COCGT showed: 1) for the code with 'parfor', the performance improved with the number of labs from 1 to 8; 2) for the code with 'spmd', just 1 lab (core) was enough to process and give results in less than 1 s. A similar situation was then run, with the difference that the SVD was determined from a square matrix with n=1500 for the code with 'parfor', and n=7000 for the code with 'spmd'. These results lead to the conclusions: 1) for the code with 'parfor', the behavior was the same as described above; 2) for the code with 'spmd', besides producing a larger performance, it supports a

  20. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  1. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    Full Text Available In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested with respect to a given configuration of the aircraft and airbrake, and then compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tailfin, perforation of the airbrake was proposed.

  2. Reciprocity Laws for the Higher Tame Symbol and the Witt Symbol on an Algebraic Surface

    OpenAIRE

    Syder, Kirsty

    2013-01-01

    Parshin's higher Witt pairing on an arithmetic surface can be combined with the higher tame pairing to form a symbol taking values in the absolute abelian Galois group of the function field. We prove reciprocity laws for this symbol using techniques of Morrow for the Witt symbol and Romo for the higher tame symbol.

  3. The myocardial microangiopathy in human and experimental diabetes mellitus. (A microscopic, ultrastructural, morphometric and computer-assisted symbolic-logic analysis).

    Science.gov (United States)

    Taşcă, C; Stefăneanu, L; Vasilescu, C

    1986-01-01

    The following microscopical aspects were found in the small intramural arteries in the myocardium of 30 diabetic patients: endothelial proliferations with focal protuberances leading to partial narrowing of the lumen, increased thickness of the arterial wall due to fibrosis and accumulations of neutral mucopolysaccharides: alteration of elastic fibres. Morphometrically, the arterial wall thickness and the arterial diameter were increased whereas the arterial density decreased in the diabetic heart. In 25 rats with streptozotocin-induced diabetes the small intramyocardial arteries were investigated at 11 to 40 weeks of diabetic state. Using morphometrical analysis a constant increase of arterial wall thickness paralleling the diabetes duration was found. Microscopically, the lesions consist in endothelial proliferation with bridging across the vascular lumen and slight perivascular and diffuse fibrosis. Ultrastructurally, the capillary basal lamina was thickened in the diabetic myocardium. In order to investigate the morphometrical data we used symbolic-logic as a decision method, by applying an original computer program based on the Quine-McCluskey algorithm. All our results together with the final symbolic-logic expression suggest that damage of the small intramyocardial arteries plays an important role in the pathogenesis of diabetic cardiomyopathy.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION: Draft... Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1207 is proposed Revision 1 of... for Digital Computer Software Used in Safety Systems of Nuclear Power Plants'' is temporarily...

  6. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    Science.gov (United States)

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed a computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown its high stability, reliability and ease of utility.

  7. Symbolic computation with finite biquandles

    OpenAIRE

    Creel, Conrad; Nelson, Sam

    2007-01-01

    A method of computing a basis for the second Yang-Baxter cohomology of a finite biquandle with coefficients in Q and Z_p from a matrix presentation of the finite biquandle is described. We also describe a method for computing the Yang-Baxter cocycle invariants of an oriented knot or link represented as a signed Gauss code. We provide a URL for our Maple implementations of these algorithms.

  8. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  9. THE COMPUTER SIMULATOR-CONTROLLER FOR LEARNING SIGN AND SYMBOLIC MEANS OF PHYSICS

    Directory of Open Access Journals (Sweden)

    N.I. Tikhonskaya

    2012-10-01

    Full Text Available The promising application of information technology to learning the sign and symbolic means of physics is proposed and theoretically justified in this article. This direction is connected with the idea of training students to present information in different forms.

  10. Ionizing-radiation warning - Supplementary symbol

    International Nuclear Information System (INIS)

    2007-01-01

    This International Standard specifies the symbol to warn of the presence of a dangerous level of ionizing radiation from a high-level sealed radioactive source that can cause death or serious injury if handled carelessly. This symbol is not intended to replace the basic ionizing radiation symbol [ISO 361, ISO 7010:2003, Table 1 (Reference number W003)], but to supplement it by providing further information on the danger associated with the source and the necessity for untrained or uninformed members of the public to stay away from it. This symbol is recommended for use with International Atomic Energy Agency (IAEA) Category 1, 2, and 3 sealed radioactive sources. These sources are defined by the IAEA as having the ability to cause death or serious injuries. The paper informs about scope, shape, proportions and colour of the symbol, and application of the symbol. An annex provides the technical specifications of the symbol

  11. Target recognition and scene interpretation in image/video understanding systems based on network-symbolic models

    Science.gov (United States)

    Kuvich, Gary

    2004-08-01

    Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition if the object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to the target recognition problems are possible only within the solution of a more generic Image Understanding Problem. The brain reduces informational and computational complexities, using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. Biologically inspired Network-Symbolic representation, where both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic Transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground and perceptual grouping are special kinds of network-symbolic transformations. Such Image/Video Understanding Systems will recognize targets reliably.

  12. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix-transposing application in CUDA. One particular interest was to research how well the optimization techniques, applied to software applications written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to our best knowledge) tried to validate whether the optimizations can apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.

  13. A state-of-the-art report on software operation structure of the digital control computer system

    International Nuclear Information System (INIS)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook

    1994-06-01

    CANDU Nuclear Power Plants, including Wolsong 1 and 2/3/4, are controlled by a real-time plant control computer system. This report was written to provide an overview of the station control computer software, which belongs to one of the most advanced real-time computing application areas, along with the Fuel Handling Machine design concepts. The combination of a well designed control computer and Fuel Handling Machine allows changing fuel bundles while the plant is in operation. Design methodologies and software structure are discussed along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author)

  14. A state-of-the-art report on software operation structure of the digital control computer system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    CANDU Nuclear Power Plants, including Wolsong 1 and 2/3/4, are controlled by a real-time plant control computer system. This report was written to provide an overview of the station control computer software, which belongs to one of the most advanced real-time computing application areas, along with the Fuel Handling Machine design concepts. The combination of a well designed control computer and Fuel Handling Machine allows changing fuel bundles while the plant is in operation. Design methodologies and software structure are discussed along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author).

  15. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  16. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Graf, F.A. Jr.

    1995-02-27

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, and some key aspects of the Liquid Effluent Retention Facility that stores condensate to be processed. Also controlled are the Treated Effluent Disposal System's pumping stations; the software also monitors waste generator flows in this system as well as the Phase Two Effluent Collection System.

  17. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1995-01-01

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, and some key aspects of the Liquid Effluent Retention Facility that stores condensate to be processed. Also controlled are the Treated Effluent Disposal System's pumping stations; the software also monitors waste generator flows in this system as well as the Phase Two Effluent Collection System.

  18. FREE SOFTWARE IN ELECTRONIC LEARNING FUTURE TEACHERS OF MATHEMATICS, PHYSICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vladyslav Ye. Velychko

    2016-05-01

    Full Text Available The popularity of free software in the IT industry is much higher than its popularity in educational activities. The disadvantages of free software and the problems of its implementation in the educational process are limiting factors for its use in the education system; however, openness, accessibility and functionality are the main arguments for introducing free software into the educational process. Moreover, for future teachers of mathematics, physics and informatics, free software is especially well suited because of the specific way in which it is created, and therefore the question arises of a systematic analysis of the possibilities of using open source software in e-learning for future teachers of mathematics, physics and computer science.

  19. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  20. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    National Research Council Canada - National Science Library

    Skjellum, Anthony

    2004-01-01

    Polymorphous Computing Architectures (PCA) rapidly "morph" (reorganize) software and hardware configurations in order to achieve high performance on computation styles ranging from specialized streaming to general threaded applications...

  1. A new modification of summary-based analysis method for large software system testing

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available As automated testing tools become a common practice, thorough computer-aided testing of large software systems, including system inter-component interfaces, is required. To achieve good coverage, one should overcome the scalability problems of different methods of analysis. These problems arise from the impossibility of analyzing all execution paths. The objective of this research is to build a method for inter-procedural analysis whose efficiency enables us to analyse large software systems (such as the Android OS codebase) as a whole in a reasonable time (no more than 4 hours). This article reviews existing methods of software analysis for detecting potential defects. It focuses on the symbolic execution method, since it is widely used both in static analysis of source code and in hybrid analysis of object files and intermediate representation (concolic testing). The method of symbolic execution involves separating the set of input data values into equivalence classes while choosing an execution path. The paper also considers the advantages of this method and its shortcomings. One of the main scalability problems is related to inter-procedural analysis. Analysis time grows rapidly if an inlining method is used for inter-procedural analysis, so this work proposes a summary-based analysis method to solve the scalability problems. Clang Static Analyzer, an open source static analyzer (a part of the LLVM project), has been chosen as a target system. It allows us to compare the performance of inlining and summary-based inter-procedural analysis. A mathematical model for preliminary estimations is described in order to identify possible factors of performance improvement.
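
    The analyzer described above builds on Clang Static Analyzer and is not reproduced here. Purely to illustrate why summaries scale better than inlining, the following toy Python example analyzes each callee once over an interval domain and reuses its summary (an interval transformer) at every call site; the functions and domain are invented for illustration.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Interval:
            lo: float
            hi: float

            def __add__(self, other):
                return Interval(self.lo + other.lo, self.hi + other.hi)

            def scale(self, k):
                a, b = self.lo * k, self.hi * k
                return Interval(min(a, b), max(a, b))

        # Summaries: each callee body is analyzed exactly once and stored as an
        # interval transformer instead of being re-analyzed (inlined) per call path.
        def summarize_double(x):          # summary of: def double(v): return 2 * v
            return x.scale(2)

        def summarize_shift(x):           # summary of: def shift(v): return v + 10
            return x + Interval(10, 10)

        SUMMARIES = {"double": summarize_double, "shift": summarize_shift}

        def analyze_caller(arg):
            # caller body: return shift(double(a)) - evaluated via stored summaries
            return SUMMARIES["shift"](SUMMARIES["double"](arg))

        print(analyze_caller(Interval(-3, 5)))   # Interval(lo=4, hi=20)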

  2. Interactive Numerical and Symbolic Analysis: A New Paradigm for Teaching Electronics

    Directory of Open Access Journals (Sweden)

    Jean-Claude Thomassian

    2008-09-01

    Full Text Available Analog Insydes, Mathematica's symbolic circuit analysis toolbox, uses modern algorithms of expression simplification based on comparisons with a numerical reference solution of the circuit under investigation. Some insight is offered into how the expression-complexity barrier is overcome, followed by two classical examples, a BJT emitter follower and a MOSFET common-gate amplifier stage, to illustrate the proposed method at work. A concluding section argues that time spent teaching introductory electronics by computer-aided circuit analysis, both interactive numerical and symbolic, is a worthwhile investment.
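
    Analog Insydes itself runs inside Mathematica and is not shown in the record. To give a flavor of interactive numerical and symbolic circuit analysis in an open environment, the sketch below uses SymPy to derive the transfer function and corner frequency of a first-order RC low-pass stage and then evaluates it numerically; the circuit and component values are illustrative, not taken from the article.

        import sympy as sp

        s, w, R, C = sp.symbols('s w R C', positive=True)

        # voltage divider between R and the capacitor impedance 1/(s*C)
        Zc = 1 / (s * C)
        H = sp.simplify(Zc / (R + Zc))                     # 1/(C*R*s + 1)
        print("H(s) =", H)

        # magnitude squared on the imaginary axis, then the -3 dB corner frequency
        Hjw = H.subs(s, sp.I * w)
        mag2 = sp.simplify(Hjw * sp.conjugate(Hjw))        # 1/(C**2*R**2*w**2 + 1)
        corner = sp.solve(sp.Eq(mag2, sp.Rational(1, 2)), w)
        print("corner frequency:", corner)                 # [1/(C*R)]

        # numerical check at an illustrative design point
        print(float(corner[0].subs({R: 1e3, C: 159e-9})))  # about 6.3e3 rad/s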

  3. Optical MSD symbolic substitution system based on a higher ordered rule

    Science.gov (United States)

    Reddy, A. K.; Mallikarjun, Tatipamula; Raina, J. P.

    1992-12-01

    The advantages provided by photonic computing have been well documented. An optical arithmetic processor has to take full advantage of the massive parallelism in optical signals. Such a processor, using the Modified-Signed-Digit (MSD) number representation, is presented here, based on symbolic substitution logic. Higher-order symbolic substitution rules are formulated for the addition operation, which is carried out in just two steps. Based on the addition operation, the other arithmetic operations - subtraction, multiplication and division - are implemented. Finally, the usefulness of this MSD system is studied.

  4. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    Science.gov (United States)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications utilizing scientific software in the past 5 years that is available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes are barriers to effective implementation of the emerging citation norms. Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.

  5. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  6. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    or maximizing the system reliability subject to budget constraints. These kinds of optimization problems have been considered in both deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a successful soft computing modelling approach... Firstly, a review of existing soft computing approaches to optimization is given. The main section extends the results, considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures which proved

  7. A New Minimum Trees-Based Approach for Shape Matching with Improved Time Computing: Application to Graphical Symbols Recognition

    Science.gov (United States)

    Franco, Patrick; Ogier, Jean-Marc; Loonis, Pierre; Mullot, Rémy

    Recently we have developed a model for shape description and matching. Based on minimum spanning tree construction and specific stages like the mixture, it seems to have many desirable properties. Recognition invariance to shifted, rotated and noisy shapes was checked through medium-scale tests on the GREC symbol reference database. Even if extracting the topology of a shape by mapping the shortest path connecting all the pixels seems to be powerful, the construction of the graph induces an expensive algorithmic cost. In this article we discuss ways to reduce computing time. An alternative solution based on image compression concepts is provided and evaluated. The model no longer operates in the image space but in a compact space, namely the Discrete Cosine space. The use of the block discrete cosine transform is discussed and justified. The experimental results obtained on the GREC2003 database show that the proposed method is characterized by a good discrimination power and a real robustness to noise, with an acceptable computing time.
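
    The tree-based model itself is not included in the record. As a hedged, much-simplified sketch of the underlying idea - a shape signature derived from the minimum spanning tree over its pixels, which is insensitive to translation - one can compute the total MST edge length of a binary shape with SciPy; the real model uses richer tree features and the DCT-domain variant discussed above.

        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial import distance_matrix

        def mst_length(mask):
            """Total edge length of the minimum spanning tree over foreground pixels."""
            pts = np.argwhere(mask)
            d = distance_matrix(pts, pts)          # dense pairwise distances
            return float(minimum_spanning_tree(d).sum())

        # toy binary shapes: a filled square and the same square shifted
        canvas = np.zeros((32, 32), dtype=bool)
        canvas[8:20, 8:20] = True
        shifted = np.roll(canvas, (5, 3), axis=(0, 1))

        print("square :", round(mst_length(canvas), 2))
        print("shifted:", round(mst_length(shifted), 2))   # identical - shift invariant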

  8. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    Science.gov (United States)

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  9. Explorations In Theoretical Computer Science For Kids (using paper toys)

    DEFF Research Database (Denmark)

    Valente, Andrea

    2004-01-01

    The computational card (c-cards for short) project is a study and realization of an educational tool based on playing cards. C-cards are an educational tool to introduce children aged 8 to 10 (or older) to the concept of computation, seen as manipulation of symbols. The game provides teachers and learners with a physical, tangible metaphor for exploring core concepts of computer science, such as deterministic and probabilistic state machines, frequencies and probability distributions, and the central elements of Shannon's information theory, like information, communication, errors and error detection. Our idea is implemented both with paper cards and by an editor/simulator software (a prototype realized in JavaScript). We also designed the structure of a course in (theoretical) computer science, based on c-cards, and we will test it this summer.
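
    No c-cards code accompanies the record. The kind of machine the cards embody can be mimicked in a few lines of Python - a probabilistic state machine that consumes a string of symbols - which is offered only as a rough analogy to the paper-and-software game described above; the states and probabilities are invented.

        import random

        # each (state, symbol) pair maps to a list of (next_state, probability)
        TRANSITIONS = {
            ("start", "0"): [("start", 1.0)],
            ("start", "1"): [("seen1", 0.7), ("start", 0.3)],   # a "noisy" card
            ("seen1", "0"): [("accept", 1.0)],
            ("seen1", "1"): [("seen1", 1.0)],
            ("accept", "0"): [("accept", 1.0)],
            ("accept", "1"): [("accept", 1.0)],
        }

        def run(word, rng):
            state = "start"
            for symbol in word:
                states, probs = zip(*TRANSITIONS[(state, symbol)])
                state = rng.choices(states, weights=probs, k=1)[0]
            return state

        rng = random.Random(0)
        outcomes = [run("110", rng) for _ in range(1000)]
        print("accepted fraction:", outcomes.count("accept") / len(outcomes))  # about 0.91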

  10. The dilemma of the symbols: analogies between philosophy, biology and artificial life.

    Science.gov (United States)

    Spadaro, Salvatore

    2013-01-01

    This article analyzes some analogies going from Artificial Life questions about the symbol-matter connection to Artificial Intelligence questions about symbol-grounding. It focuses on the notion of the interpretability of syntax and how the symbols are integrated in a unity ("binding problem"). Utilizing the DNA code as a model, this paper discusses how syntactic features could be defined as high-grade characteristics of the non syntactic relations in a material-dynamic structure, by using an emergentist approach. This topic furnishes the ground for a confutation of J. Searle's statement that syntax is observer-relative, as he wrote in his book "Mind: A Brief Introduction". Moreover the evolving discussion also modifies the classic symbol-processing doctrine in the mind which Searle attacks as a strong AL argument, that life could be implemented in a computational mode. Lastly, this paper furnishes a new way of support for the autonomous systems thesis in Artificial Life and Artificial Intelligence, using, inter alia, the "adaptive resonance theory" (ART).

  11. Computer software configuration management plan for the Honeywell modular automation system

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software management plan for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This type of system will be used to control new thermal stabilization furnaces, a vertical denitrator calciner, and a pyrolysis furnace

  12. The manual of a computer software 'FBR Plant Planning Design Prototype System'

    International Nuclear Information System (INIS)

    2003-10-01

    This is a manual for the computer software 'FBR Plant Planning Design Prototype System', which enables users to conduct case studies of FBR design concepts deviating from 'MONJU'. The calculations simply proceed as the user clicks the displayed buttons; therefore a step-by-step explanation should not be necessary. The following pages introduce only the particular features of this software, i.e., each interactive screen, the functions of buttons and the consequences of clicks, and the quitting procedure. (author)

  13. The influence of math anxiety on symbolic and non-symbolic magnitude processing.

    Science.gov (United States)

    Dietrich, Julia F; Huber, Stefan; Moeller, Korbinian; Klein, Elise

    2015-01-01

    Deficits in basic numerical abilities have been investigated repeatedly as potential risk factors of math anxiety. Previous research suggested that also a deficient approximate number system (ANS), which is discussed as being the foundation for later math abilities, underlies math anxiety. However, these studies examined this hypothesis by investigating ANS acuity using a symbolic number comparison task. Recent evidence questions the view that ANS acuity can be assessed using a symbolic number comparison task. To investigate whether there is an association between math anxiety and ANS acuity, we employed both a symbolic number comparison task and a non-symbolic dot comparison task, which is currently the standard task to assess ANS acuity. We replicated previous findings regarding the association between math anxiety and the symbolic distance effect for response times. High math anxious individuals showed a larger distance effect than less math anxious individuals. However, our results revealed no association between math anxiety and ANS acuity assessed using a non-symbolic dot comparison task. Thus, our results did not provide evidence for the hypothesis that a deficient ANS underlies math anxiety. Therefore, we propose that a deficient ANS does not constitute a risk factor for the development of math anxiety. Moreover, our results suggest that previous interpretations regarding the interaction of math anxiety and the symbolic distance effect have to be updated. We suggest that impaired number comparison processes in high math anxious individuals might account for the results rather than deficient ANS representations. Finally, impaired number comparison processes might constitute a risk factor for the development of math anxiety. Implications for current models regarding the origins of math anxiety are discussed.

  14. The influence of math anxiety on symbolic and non-symbolic magnitude processing

    Directory of Open Access Journals (Sweden)

    Julia Felicitas Dietrich

    2015-10-01

    Full Text Available Deficits in basic numerical abilities have been investigated repeatedly as potential risk factors of math anxiety. Previous research suggested that also a deficient approximate number system (ANS, which is discussed as being the foundation for later math abilities, underlies math anxiety. However, these studies examined this hypothesis by investigating ANS acuity using a symbolic number comparison task. Recent evidence questions the view that ANS acuity can be assessed using a symbolic number comparison task. To investigate whether there is an association between math anxiety and ANS acuity, we employed both a symbolic number comparison task and a non-symbolic dot comparison task, which is currently the standard task to assess ANS acuity. We replicated previous findings regarding the association between math anxiety and the symbolic distance effect for response times. High math anxious individuals showed a larger distance effect than less math anxious individuals. However, our results revealed no association between math anxiety and ANS acuity assessed using a non-symbolic dot comparison task. Thus, our results did not provide evidence for the hypothesis that a deficient ANS underlies math anxiety. Therefore, we propose that a deficient ANS does not constitute a risk factor for the development of math anxiety. Moreover, our results suggest that previous interpretations regarding the interaction of math anxiety and the symbolic distance effect have to be updated. We suggest that impaired number comparison processes in high math anxious individuals might account for the results rather than deficient ANS representations. Finally, impaired number comparison processes might constitute a risk factor for the development of math anxiety. Implications for current models regarding the origins of math anxiety are discussed.

  15. Mathematical symbol hypothesis recognition with rejection option

    OpenAIRE

    Julca-Aguilar , Frank; Hirata , Nina ,; Viard-Gaudin , Christian; Mouchère , Harold; Medjkoune , Sofiane

    2014-01-01

    International audience; In the context of handwritten mathematical expression recognition, a first step consists of grouping strokes (segmentation) to form symbol hypotheses: groups of strokes that might represent a symbol. Then, the symbol recognition step needs to cope with the identification of wrongly segmented symbols (false hypotheses). However, previous works on symbol recognition consider only correctly segmented symbols. In this work, we focus on the problem of mathematical symbol reco...

  16. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  17. Symbol in Point View of Ambiguity

    Directory of Open Access Journals (Sweden)

    Dr. M. R. Yousefi

    Full Text Available From the rhetorical perspective, a symbol is a word, phrase or sentence that, beyond its apparent meaning, also suggests to the reader a wide range of meanings. Since it is easier to explore complex social and political ideas in a mystical way and to reflect social and political thought indirectly and symbolically, the symbol and symbolism have a special presence in Persian literature, especially in the realm of Persian poetry. In addition to the factors mentioned, in contemporary literature familiarity with world literature and the emergence of particular schools have further spread the interest in ambiguous symbolization; the symbol in particular has all the features of artistic ambiguity in poetry and is one of the major factors causing uncertainty. Thus, precise definitions of the symbols of contemporary poetry can be decisive in unwinding ambiguity and in detecting the cryptic allusions and metaphors associated with a symbol, which helps readers. In literature, and especially in the language of poetry, the inability of language to reflect obscure mystical ideas and the avoidance of direct expression of political and social concerns are the main motivations for turning to the symbol and to symbolization, inviting the reader to participate in the creation of ambiguous literary works. Given the widespread use of the symbol, its different species can be viewed from different perspectives. The creation of ambiguity is one of the main purposes of using symbols (especially in poetry), so many poets have tried to achieve this goal through the formation of similar symbols, and the explanation and resolution of this issue can open a new window for understanding poetry in front of an audience. This paper examines the ambiguity of symbols and reviews its precise boundaries. Ambiguity is one of the important processes, and also a key feature, of Iranian poetry today. In such poetry, ambiguity is a need to explore the new world from a different perspective, or explore this complex world

  18. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently under research is the Computer Aided Prototyping System (CAPS), managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  19. USERDA computer software summaries: numbers 240 through 324

    International Nuclear Information System (INIS)

    1976-12-01

    Since 1960 the Argonne Code Center has served as a U.S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U.S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Software Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data

  20. Clutter-free Visualization of Large Point Symbols at Multiple Scales by Offset Quadtrees

    Directory of Open Access Journals (Sweden)

    ZHANG Xiang

    2016-08-01

    Full Text Available To address the cartographic problems in map mash-up applications in the Web 2.0 context, this paper studies a clutter-free technique for visualizing large symbols on Web maps. Basically, a quadtree is used to select one symbol in each grid cell at each zoom level. To resolve the symbol overlaps between neighboring quad-grids, multiple offsets are applied to the quadtree and a voting strategy is used to compute the significance level of symbols for their selection at multiple scales. The method is able to resolve spatial conflicts without explicit conflict detection, thus enabling highly efficient processing. The resulting map also forms a visual hierarchy of semantic importance. We discuss issues such as relative importance, the symbol-to-grid size ratio, and effective offset schemes, and propose two extensions to make better use of the free space available on the map. Experiments were carried out to validate the technique, which demonstrate its robustness and efficiency (a non-optimal implementation leads to sub-second processing for datasets of 10^5 magnitude).
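
    The offset-quadtree voting scheme itself is not reproduced here. As a simplified sketch of the basic selection step - keep at most one symbol per grid cell, with the grid doubling in resolution at each zoom level - consider the following Python fragment; the point generation and the significance score are invented, and the offsets and voting of the full method are omitted.

        import random

        def select_symbols(points, zoom, extent=1.0):
            """Keep the most significant symbol in each quadtree cell at `zoom`.

            `points` is a list of (x, y, significance); the grid has 2**zoom
            cells per axis over [0, extent) x [0, extent).
            """
            cell = extent / (2 ** zoom)
            best = {}
            for x, y, sig in points:
                key = (int(x // cell), int(y // cell))
                if key not in best or sig > best[key][2]:
                    best[key] = (x, y, sig)
            return list(best.values())

        rng = random.Random(0)
        points = [(rng.random(), rng.random(), rng.random()) for _ in range(10000)]
        for zoom in range(1, 6):
            print(f"zoom {zoom}: {len(select_symbols(points, zoom)):5d} symbols shown")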

  1. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands of future high energy physics experiments towards software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings have been broadly outlined.

  2. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are getting more attraction in orthodontics. The aim of this study was to compare two softwares to evaluate three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using i-CAT® imaging system from patient's records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  3. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    Science.gov (United States)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-01-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In…

  4. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a computational modules, (b data-processing scripts, and (c research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  5. Symbolic-Numeric Integration of the Dynamical Cosserat Equations

    KAUST Repository

    Lyakhov, Dmitry A.

    2017-08-29

    We devise a symbolic-numeric approach to the integration of the dynamical part of the Cosserat equations, a system of nonlinear partial differential equations describing the mechanical behavior of slender structures, like fibers and rods. This is based on our previous results on the construction of a closed form general solution to the kinematic part of the Cosserat system. Our approach combines methods of numerical exponential integration and symbolic integration of the intermediate system of nonlinear ordinary differential equations describing the dynamics of one of the arbitrary vector-functions in the general solution of the kinematic part in terms of the module of the twist vector-function. We present an experimental comparison with the well-established generalized α-method illustrating the computational efficiency of our approach for problems in structural mechanics.

  6. Symbolic-Numeric Integration of the Dynamical Cosserat Equations

    KAUST Repository

    Lyakhov, Dmitry A.; Gerdt, Vladimir P.; Weber, Andreas G.; Michels, Dominik L.

    2017-01-01

    We devise a symbolic-numeric approach to the integration of the dynamical part of the Cosserat equations, a system of nonlinear partial differential equations describing the mechanical behavior of slender structures, like fibers and rods. This is based on our previous results on the construction of a closed form general solution to the kinematic part of the Cosserat system. Our approach combines methods of numerical exponential integration and symbolic integration of the intermediate system of nonlinear ordinary differential equations describing the dynamics of one of the arbitrary vector-functions in the general solution of the kinematic part in terms of the module of the twist vector-function. We present an experimental comparison with the well-established generalized α-method illustrating the computational efficiency of our approach for problems in structural mechanics.

  7. Low-complexity linewidth-tolerant time domain sub-symbol optical phase noise suppression in CO-OFDM systems.

    Science.gov (United States)

    Hong, Xuezhi; Hong, Xiaojian; Zhang, Junwei; He, Sailing

    2016-03-07

    Two linewidth-tolerant optical phase noise suppression algorithms, non-decision aided sub-symbol optical phase noise suppression (NDA-SPS) and partial-decision aided sub-symbol optical phase noise suppression (PDA-SPS), based on low-complexity time domain sub-symbol processing are proposed for coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. High accuracy carrier phase estimation is achieved in the NDA-SPS algorithm without decision error propagation. Compared with NDA-SPS, partial-decision aided estimation is introduced in PDA-SPS to reduce the pilot-overhead by half, yet only a small performance degradation is induced. The principles and computational complexities of the proposed algorithms are theoretically analyzed. By adopting specially designed comb-type pilot subcarriers, multiplier-free observation-based matrix generation is realized in the proposed algorithms. Computationally intensive discrete Fourier transform (DFT) or inverse DFT (IDFT) operations, which are usually carried out in other high-performance inter-carrier-interference (ICI) mitigation algorithms multiple times, are completely avoided. Compared with several other sub-symbol algorithms, the proposed algorithms with lower complexities offer considerably larger laser linewidth tolerances as demonstrated by Monte-Carlo simulations. Numerical analysis verifies that the optimal performance of PDA-SPS can be achieved with moderate numbers of sub-symbols.

  8. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the Interactive Executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing are also explored

  9. 22 CFR 42.11 - Classification symbols.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Classification symbols. 42.11 Section 42.11... NATIONALITY ACT, AS AMENDED Classification and Foreign State Chargeability § 42.11 Classification symbols. A... visa symbol to show the classification of the alien. Immigrants Symbol Class Section of law Immediate...

  10. Analysis of radioactive waste contamination in soils: solution via symbolic manipulation

    International Nuclear Information System (INIS)

    Cotta, R.M.; Mikhailov, M.D.; Ruperti, N.J. Jr.

    1998-01-01

    A demonstration is made of the automatic symbolic-numerical solution of the one-dimensional linearized Burgers equation with linear decay, which models the migration of radionuclides in porous media, by using the generalized integral transform technique and the Mathematica software system. An example is considered to allow for comparisons between the proposed hybrid numerical-analytical solution and the exact solution. Different filtering strategies are also reviewed in terms of the effects on convergence rates. (author)
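    The hybrid symbolic-numerical idea can be illustrated, in a much simplified setting, with a computer algebra system such as SymPy: the transform coefficients are derived symbolically and the inverse-transform series is summed numerically. The sketch below treats only a diffusion-decay problem with a hypothetical initial profile; it is not the paper's GITT treatment of the linearized Burgers equation, and all parameter values are assumptions.

```python
import sympy as sp

x, t, L, D, lam = sp.symbols('x t L D lam', positive=True)
n = sp.symbols('n', integer=True, positive=True)

f = x * (L - x)                      # hypothetical initial concentration profile
phi = sp.sin(n * sp.pi * x / L)      # eigenfunctions of the diffusion operator

# Transform coefficients b_n derived symbolically (the "integral transform" step)
b_n = sp.simplify(2 / L * sp.integrate(f * phi, (x, 0, L)))

# Each transformed mode decays analytically; the inverse transform is a series
u_term = b_n * phi * sp.exp(-(D * (n * sp.pi / L) ** 2 + lam) * t)

# Numerical evaluation: truncate the series and substitute parameter values
u_num = sum(u_term.subs({n: k, L: 1, D: 0.01, lam: 0.001}) for k in range(1, 21))
print(sp.N(u_num.subs({x: 0.5, t: 10.0})))
```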

  11. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvement and enhancements of algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work a survey has been performed of various software quality assurance methods of software testing, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: the data flow diagram method has been shown to be particularly valuable for performing the functional/procedural software specification, while entity-relationship diagrams have proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
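    To make the selected black-box techniques concrete, the following sketch shows equivalence-class and boundary-value test cases for a hypothetical table lookup routine; the routine, its valid input range and the expected behaviour are assumptions for illustration only.

```python
import unittest

def lookup_cross_section(energy_ev):
    """Hypothetical routine: valid only for 1e-5 eV <= energy <= 2e7 eV."""
    if not (1e-5 <= energy_ev <= 2e7):
        raise ValueError("energy out of tabulated range")
    return 1.0 / energy_ev ** 0.5          # placeholder physics

class BoundaryValueTests(unittest.TestCase):
    def test_valid_equivalence_class(self):
        # Equivalence class partitioning: one representative of the valid class
        self.assertGreater(lookup_cross_section(1.0), 0.0)

    def test_lower_and_upper_boundaries(self):
        # Boundary value method: exercise the exact edges of the valid range
        lookup_cross_section(1e-5)
        lookup_cross_section(2e7)

    def test_just_outside_boundaries(self):
        with self.assertRaises(ValueError):
            lookup_cross_section(1e-5 * 0.999)
        with self.assertRaises(ValueError):
            lookup_cross_section(2e7 * 1.001)

if __name__ == "__main__":
    unittest.main()
```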

  12. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Science.gov (United States)

    2012-08-22

    ... regulations with respect to software verification and auditing of digital computer software used in the safety... Standards and Records,'' which requires, in part, that a quality assurance program be established and implemented to provide adequate assurance that systems and components important to safety will satisfactorily...

  13. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  14. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2013-08-02

    .... ML12354A524. 3. Revision 1 of RG 1.170, ``Test Documentation for Digital Computer Software used in Safety... is in ADAMS at Accession No. ML12354A531. 4. Revision 1 of RG 1.171, ``Software Unit Testing for... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION...

  15. Aging and the number sense: preserved basic non-symbolic numerical processing and enhanced basic symbolic processing

    Directory of Open Access Journals (Sweden)

    Jade Eloise eNorris

    2015-07-01

    Full Text Available Aging often leads to general cognitive decline in domains such as memory and attention. The effect of aging on numerical cognition, particularly on foundational numerical skills known as the Number Sense, is not well known. Early research focused on the effect of aging on arithmetic. Recent studies have begun to investigate the impact of healthy aging on basic numerical skills, but have focused on non-symbolic quantity discrimination alone. Moreover, contradictory findings have emerged. The current study aimed to further investigate the impact of aging on basic non-symbolic and symbolic numerical skills. A group of 25 younger (18-25 years) and 25 older adults (60-77 years) participated in non-symbolic and symbolic numerical comparison tasks. Mathematical and spelling abilities were also measured. Results showed that aging had no effect on foundational non-symbolic numerical skills, as both groups performed similarly (RTs, accuracy and Weber fractions, w). All participants showed decreased non-symbolic acuity (accuracy and w) in trials requiring inhibition. However, aging appears to be associated with a greater decline in discrimination speed in such trials. Furthermore, aging seems to have a positive impact on mathematical ability and basic symbolic numerical processing, as older participants attained significantly higher mathematical achievement scores, and performed significantly better on the symbolic comparison task than younger participants. The findings suggest that aging and its lifetime exposure to numbers may lead to better mathematical achievement and stronger basic symbolic numerical skills. Our results further support the observation that basic non-symbolic numerical skills are resilient to aging, but that aging may exacerbate poorer performance on trials requiring inhibitory processes. These findings lend further support to the notion that preserved basic numerical skills in aging may reflect the preservation of an innate, primitive and embedded Number

  16. What's New in Software? Computers and the Writing Process: Strategies That Work.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)

  17. Symbol synchronization in convolutionally coded systems

    Science.gov (United States)

    Baumert, L. D.; Mceliece, R. J.; Van Tilborg, H. C. A.

    1979-01-01

    Alternate symbol inversion is sometimes applied to the output of convolutional encoders to guarantee sufficient richness of symbol transition for the receiver symbol synchronizer. A bound is given for the length of the transition-free symbol stream in such systems, and those convolutional codes are characterized in which arbitrarily long transition-free runs occur.

  18. 7 CFR 29.1008 - Combination symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Combination symbols. 29.1008 Section 29.1008..., 13, 14 and Foreign Type 92) § 29.1008 Combination symbols. A color or group symbol used with another symbol to form the third factor of a grademark to denote a particular side or characteristic of the...

  19. The Analysis of Mythological Symbols in Shahnameh

    Directory of Open Access Journals (Sweden)

    موسی پرنیان

    2012-05-01

    Full Text Available Recognizing the symbols of the Shahnameh requires an understanding of the context and conditions of the creation and emergence of symbol, myth and epic. Symbol has a relationship with the consciousness and unconsciousness of man and constitutes the language of mythologies, legends, and epics. Thus the language of mythological and epic works is symbolic. The main theme in Iranian mythologies is the dual nature of creation, and during the passage from myth to epic the conflict between the two forces of good and evil appears in various aspects of existence. Some characters that represent symbolic and coded concepts more than other elements can be considered as symbols of the evolution of gods to kings, and against them there are devilish kings as symbols of drought (Apush). The other symbolic elements analyzed in this study are: epic-romance stories, imaginary creatures, symbolic dreams of kings and heroes, symbolic numbers, symbolic patterns of flags, and the symbolism of water, fire and charisma. The findings of the study illustrate that people, more than other elements, are the constitutive elements of mythological symbols, and the tension between these human elements depicts the mutual conflict between good and evil in Ferdowsi's Shahnameh. Like other elements, symbolic characters (especially kings) are of symbolic value and constitute a part of the constructing elements of mythological symbols in the Shahnameh. Moreover, their reputation depends on the extent of their benefit from "God charisma" as the most pivotal element of their personality. Kings like Afrasiab and Zahak, due to lack of it, are the most disreputable kings. On the other hand, Fereidoon and Kaikhosro are at the top of the most reputable kings because of their continuous benefit from it. This study has been conducted on the basis of library resources and has applied a descriptive-analytic method.

  20. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in an MCC-learning system as a service. Components hosted by MCC are used to empower developers to create…

  1. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is an open source and powerful numerical software package, but leaves much to be desired in terms of user friendliness. In this thesis the basic operation of OpenFOAM is introduced, and the work culminates in a graphical user interface written in PyQt. The graphical user interface makes the use of OpenFOAM simpler, and hopefully makes this powerful tool more available for the gene...

  2. Nuclear reactors; graphical symbols

    International Nuclear Information System (INIS)

    1987-11-01

    This standard contains graphical symbols that reveal the type of nuclear reactor and is used to design graphical and technical presentations. Distinguishing features for nuclear reactors are laid down in graphical symbols. (orig.) [de

  3. On the Symbolic Verification of Timed Systems

    DEFF Research Database (Denmark)

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif

    1999-01-01

    This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed...
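    In a drastically simplified finite-state setting (ignoring the continuous clock part), the forward step can be imitated by representing state sets explicitly and iterating the post-image to a fixed point; the transition relation below is hypothetical, and the Python sets merely stand in for a genuine symbolic representation such as BDDs or clock constraints.

```python
def post(states, transitions):
    """One forward step: all states reachable by firing a single transition."""
    return {t for s in states for t in transitions.get(s, ())}

def reachable(initial, transitions):
    """Least fixed point of the forward step (set-based stand-in for a
    symbolic reachability computation)."""
    reached = set(initial)
    frontier = set(initial)
    while frontier:
        frontier = post(frontier, transitions) - reached
        reached |= frontier
    return reached

# Hypothetical discrete locations of a small timed system
transitions = {"idle": {"armed"}, "armed": {"firing", "idle"}, "firing": {"idle"}}
print(reachable({"idle"}, transitions))   # {'idle', 'armed', 'firing'}
```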

  4. Symbolic Violence and Victimisation

    DEFF Research Database (Denmark)

    Pedersen, Bodil Maria

    2009-01-01

    has been criticised for over-generalisations, as well as for disregarding culture and the embeddedness of psychological problems in situated societal processes. The proposed paper is a contribution to this critique. It will draw on Bourdieu's concept of symbolic violence (1992). The concept connects ... Nay (1999). It also undertakes a critical discussion of symbolic violence in the meanings given to victimisation and its aftermaths, as when conceptualised with the help of PTSD (e.g. may the use of concepts of this kind and the practices developed in relation to it constitute symbolic violence and contribute to victimisation?). Furthermore the analysis aims at unfolding an understanding of victimisation inclusive of connections between cultural/societal practices, aspects of symbolic violence and the lives of concrete subjects. The discussion takes its point of departure in theoretical deliberations...

  5. The Effects of Computer-Aided Design Software on Engineering Students' Spatial Visualisation Skills

    Science.gov (United States)

    Kösa, Temel; Karakus, Fatih

    2018-01-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations…

  6. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code, due to possible unfamiliarity with the language and often due as well to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities, which include among others error detection, error correction and code modification for purposes of enhancing its performance or functionality, or adapting it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes, and several examples will be provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  7. Epistemic Opacity, Confirmation Holism and Technical Debt: Computer Simulation in the Light of Empirical Software Engineering

    OpenAIRE

    Newman , Julian

    2015-01-01

    Epistemic opacity vis a vis human agents has been presented as an essential, ineliminable characteristic of computer simulation models resulting from the characteristics of the human cognitive agent. This paper argues, on the contrary, that such epistemic opacity as does occur in computer simulations is not a consequence of human limitations but of a failure on the part of model developers to adopt good software engineering practice for managing human error and ensuring the software artefact ...

  8. Symbol synchronization for the TDRSS decoder

    Science.gov (United States)

    Costello, D. J., Jr.

    1983-01-01

    Each 8 bits out of the Viterbi decoder correspond to one symbol of the R/S code. Synchronization must be maintained here so that each 8-bit symbol delivered to the R/S decoder corresponds to an 8-bit symbol from the R/S encoder. Lack of synchronization would cause an error in almost every R/S symbol, since even a 1-bit sync slip shifts every bit in each 8-bit symbol by one position, thereby confusing the mapping between 8-bit sequences and symbols. The error correcting capability of the R/S code would be exceeded. Possible ways of correcting this condition include: (1) designing the R/S decoder to recognize the overload and shifting the output sequence of the inner decoder to establish a different sync state; (2) using the characteristics of the inner decoder to establish symbol synchronization for the outer code, with or without a deinterleaver and an interleaver; and (3) modifying the encoder to alternate periodically between two sets of generators.
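    A toy demonstration (unrelated to the actual TDRSS hardware or code parameters) of why even a one-bit slip corrupts essentially every 8-bit Reed-Solomon symbol:

```python
def to_symbols(bits, offset=0):
    """Group a bit stream into 8-bit R/S symbols starting at `offset`."""
    usable = bits[offset:]
    return [int(''.join(map(str, usable[i:i + 8])), 2)
            for i in range(0, len(usable) - 7, 8)]

bits = [1, 0, 1, 1, 0, 0, 1, 0] * 4          # four identical 8-bit symbols (0xB2)
print(to_symbols(bits, 0))                    # [178, 178, 178, 178]
print(to_symbols(bits, 1))                    # a 1-bit slip corrupts every symbol
```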

  9. Deficient symbol processing in Alzheimer disease.

    Science.gov (United States)

    Toepper, Max; Steuwe, Carolin; Beblo, Thomas; Bauer, Eva; Boedeker, Sebastian; Thomas, Christine; Markowitsch, Hans J; Driessen, Martin; Sammer, Gebhard

    2014-01-01

    Symbols and signs have been suggested to improve the orientation of patients suffering from Alzheimer disease (AD). However, there are hardly any studies that confirm whether AD patients benefit from signs or symbols and which symbol characteristics might improve or impede their symbol comprehension. To address these issues, 30 AD patients and 30 matched healthy controls performed a symbol processing task (SPT) with 4 different item categories. A repeated-measures analysis of variance was run to identify impact of different item categories on performance accuracy in both the experimental groups. Moreover, SPT scores were correlated with neuropsychological test scores in a broad range of other cognitive domains. Finally, diagnostic accuracy of the SPT was calculated by a receiver-operating characteristic curve analysis. Results revealed a global symbol processing dysfunction in AD that was associated with semantic memory and executive deficits. Moreover, AD patients showed a disproportional performance decline at SPT items with visual distraction. Finally, the SPT total score showed high sensitivity and specificity in differentiating between AD patients and healthy controls. The present findings suggest that specific symbol features impede symbol processing in AD and argue for a diagnostic benefit of the SPT in neuropsychological assessment.

  10. Sound Symbolism in Basic Vocabulary

    Directory of Open Access Journals (Sweden)

    Søren Wichmann

    2010-04-01

    Full Text Available The relationship between meanings of words and their sound shapes is to a large extent arbitrary, but it is well known that languages exhibit sound symbolism effects violating arbitrariness. Evidence for sound symbolism is typically anecdotal, however. Here we present a systematic approach. Using a selection of basic vocabulary in nearly one half of the world’s languages we find commonalities among sound shapes for words referring to same concepts. These are interpreted as due to sound symbolism. Studying the effects of sound symbolism cross-linguistically is of key importance for the understanding of language evolution.

  11. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD) the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities and graphical rendering of the structural outcomes. To showcase the potential of this approach, performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  13. Current trends in hardware and software for brain-computer interfaces (BCIs)

    Science.gov (United States)

    Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  14. The symbolism of zombie

    Directory of Open Access Journals (Sweden)

    Nadine BOUDOU

    2015-07-01

    Full Text Available The objective of this article is to show why the zombie can be presented as a legitimate object of study for symbolic communication. The zombie exists as a symbol because the word that names it has entered common usage, which allows a widened communication. The diversity of the interpretations it makes possible testifies to its ambivalence. Whether it is defined as a symbol or as a metaphor, we shall see that, far from being a passing fad, the zombie is rich in different meanings.

  15. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  16. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  17. [Rod of Asclepius. Symbol of medicine].

    Science.gov (United States)

    Young, Pablo; Finn, Bárbara C; Bruetman, Julio E; Cesaro Gelos, Jorge; Trimarchi, Hernán

    2013-09-01

    Symbolism is one of the most archaic forms of human thought. Symbol derives from the Latin word symbolum, and the latter from the Greek symbolon or symballo, which means "I coincide, I make matches". The Medicine symbol represents a whole series of historical and ethical values. The Rod of Asclepius, with one serpent entwined, has traditionally been the symbol of scientific medicine. In a misconception that has lasted 500 years, the Caduceus of Hermes, entwined by two serpents and with two wings, has been considered the symbol of Medicine. However, the Caduceus is the current symbol of Commerce. The Rod of Asclepius and the Caduceus of Hermes represent two professions, Medicine and Commerce, which, in ethical practice, should not be mixed. Physicians should be aware of their real emblem, its historical origin and meaning.

  18. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, of formulating new tasks on the basis of a limited number of tools, and of fast automated checking is specified. Some methodological comments on the application of computer tools and ...

  19. Computational mathematics and mathematical computer software. Vychislitel'naia matematika i matematicheskoe obespechenie EVM

    Energy Technology Data Exchange (ETDEWEB)

    Tikhonov, A.N.; Samarskii, A.A.

    1985-01-01

    Various aspects of mathematical modeling and problem-oriented computer software are examined with reference to numerical methods in mathematical physics, methods for solving inverse problems, development of automatic systems for experimental data processing, and mathematical modeling in plasma physics. Papers are presented on some properties of difference schemes in one-dimensional gas dynamics, an algorithm for processing signals reflected from multipoint targets, and the application of simplified Navier-Stokes equations for calculating flow of a viscous gas past long bodies.

  20. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures.

    Science.gov (United States)

    Ceroni, Alessio; Dell, Anne; Haslam, Stuart M

    2007-08-07

    Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other applications to create intuitive and appealing user

  1. The GlycanBuilder: a fast, intuitive and flexible software tool for building and displaying glycan structures

    Directory of Open Access Journals (Sweden)

    Dell Anne

    2007-08-01

    Full Text Available Abstract Background Carbohydrates play a critical role in human diseases and their potential utility as biomarkers for pathological conditions is a major driver for characterization of the glycome. However, the additional complexity of glycans compared to proteins and nucleic acids has slowed the advancement of glycomics in comparison to genomics and proteomics. The branched nature of carbohydrates, the great diversity of their constituents and the numerous alternative symbolic notations, make the input and display of glycans not as straightforward as for example the amino-acid sequence of a protein. Every glycoinformatic tool providing a user interface would benefit from a fast, intuitive, appealing mechanism for input and output of glycan structures in a computer readable format. Results A software tool for building and displaying glycan structures using a chosen symbolic notation is described here. The "GlycanBuilder" uses an automatic rendering algorithm to draw the saccharide symbols and to place them on the drawing board. The information about the symbolic notation is derived from a configurable graphical model as a set of rules governing the aspect and placement of residues and linkages. The algorithm is able to represent a structure using only few traversals of the tree and is inherently fast. The tool uses an XML format for import and export of encoded structures. Conclusion The rendering algorithm described here is able to produce high-quality representations of glycan structures in a chosen symbolic notation. The automated rendering process enables the "GlycanBuilder" to be used both as a user-independent component for displaying glycans and as an easy-to-use drawing tool. The "GlycanBuilder" can be integrated in web pages as a Java applet for the visual editing of glycans. The same component is available as a web service to render an encoded structure into a graphical format. Finally, the "GlycanBuilder" can be integrated into other
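    The claim that the renderer places residues "using only few traversals of the tree" can be illustrated with a generic two-pass tree layout: one bottom-up pass to measure subtree extents and one top-down pass to assign coordinates. This is a hypothetical sketch with invented residue names, not GlycanBuilder's actual algorithm.

```python
def measure(node, sizes):
    """Bottom-up pass: number of rows occupied by each subtree."""
    children = node.get("children", [])
    sizes[id(node)] = max(1, sum(measure(c, sizes) for c in children))
    return sizes[id(node)]

def place(node, sizes, depth=0, row=0, out=None):
    """Top-down pass: assign (column, row) drawing positions."""
    out = out if out is not None else {}
    span = sizes[id(node)]
    out[node["name"]] = (depth, row + span / 2.0)
    r = row
    for c in node.get("children", []):
        place(c, sizes, depth + 1, r, out)
        r += sizes[id(c)]
    return out

# Hypothetical branched glycan: a core residue with two antennae
tree = {"name": "GlcNAc", "children": [
    {"name": "Man", "children": [{"name": "Gal"}, {"name": "Fuc"}]},
    {"name": "Man2"}]}
sizes = {}
measure(tree, sizes)
print(place(tree, sizes))
```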

  2. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
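    For orientation, a minimal circuit in ProjectQ's Python-embedded DSL looks roughly as follows; this is patterned on the project's introductory Bell-pair example, and exact imports and backend defaults should be checked against the release in use.

```python
from projectq import MainEngine
from projectq.ops import H, CNOT, Measure, All

eng = MainEngine()                 # defaults to the built-in simulator backend
qureg = eng.allocate_qureg(2)

H | qureg[0]                       # put the first qubit into superposition
CNOT | (qureg[0], qureg[1])        # entangle the pair
All(Measure) | qureg               # measure both qubits

eng.flush()                        # send the circuit to the backend
print([int(q) for q in qureg])     # prints [0, 0] or [1, 1]
```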

  3. A New Adaptive Structural Signature for Symbol Recognition by Using a Galois Lattice as a Classifier.

    Science.gov (United States)

    Coustaty, M; Bertet, K; Visani, M; Ogier, J

    2011-08-01

    In this paper, we propose a new approach for symbol recognition using structural signatures and a Galois lattice as a classifier. The structural signatures are based on topological graphs computed from segments which are extracted from the symbol images by using an adapted Hough transform. These structural signatures, which can be seen as dynamic paths that carry high-level information, are robust toward various transformations. They are classified by using a Galois lattice as a classifier. The performance of the proposed approach is evaluated based on the GREC'03 symbol database, and the experimental results we obtain are encouraging.
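    A rough sketch of such a segment-extraction front end, using OpenCV's probabilistic Hough transform in place of the adapted variant described by the authors and with purely illustrative thresholds, followed by a naive topological graph built from near-touching endpoints:

```python
import cv2
import numpy as np

def extract_segments(image_path):
    """Detect line segments in a symbol image (illustrative thresholds)."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=10, maxLineGap=3)
    return [tuple(l[0]) for l in lines] if lines is not None else []

def topological_graph(segments, tol=5.0):
    """Adjacency between segments whose endpoints (nearly) touch."""
    def endpoints(seg):
        x1, y1, x2, y2 = seg
        return np.array([[x1, y1], [x2, y2]], dtype=float)

    graph = {i: set() for i in range(len(segments))}
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            d = np.min(np.linalg.norm(
                endpoints(segments[i])[:, None, :] - endpoints(segments[j])[None, :, :],
                axis=2))
            if d <= tol:
                graph[i].add(j)
                graph[j].add(i)
    return graph
```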

  4. An FMRI-compatible Symbol Search task.

    Science.gov (United States)

    Liebel, Spencer W; Clark, Uraina S; Xu, Xiaomeng; Riskin-Jones, Hannah H; Hawkshead, Brittany E; Schwarz, Nicolette F; Labbe, Donald; Jerskey, Beth A; Sweet, Lawrence H

    2015-03-01

    Our objective was to determine whether a Symbol Search paradigm developed for functional magnetic resonance imaging (FMRI) is a reliable and valid measure of cognitive processing speed (CPS) in healthy older adults. As all older adults are expected to experience cognitive declines due to aging, and CPS is one of the domains most affected by age, establishing a reliable and valid measure of CPS that can be administered inside an MR scanner may prove invaluable in future clinical and research settings. We evaluated the reliability and construct validity of a newly developed FMRI Symbol Search task by comparing participants' performance in and outside of the scanner and to the widely used and standardized Symbol Search subtest of the Wechsler Adult Intelligence Scale (WAIS). A brief battery of neuropsychological measures was also administered to assess the convergent and discriminant validity of the FMRI Symbol Search task. The FMRI Symbol Search task demonstrated high test-retest reliability when compared to performance on the same task administered out of the scanner (r = .791) and correlated strongly with the standardized WAIS Symbol Search (r = .717); expected effects of age on the FMRI Symbol Search task were also observed. The FMRI Symbol Search task is a reliable and valid measure of CPS in healthy older adults and exhibits expected sensitivity to the effects of age on CPS performance.

  5. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
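    As one concrete instance of the header "scrubbing" mentioned above, a library such as pydicom can blank patient-identifying elements before an image leaves the clinical environment; the short tag list below is a minimal illustration, not a complete de-identification profile.

```python
import pydicom

def anonymize(in_path, out_path):
    """Blank a few obvious patient-identifying header elements (illustrative only)."""
    ds = pydicom.dcmread(in_path)
    for tag in ("PatientName", "PatientID", "PatientBirthDate", "OtherPatientIDs"):
        if hasattr(ds, tag):
            setattr(ds, tag, "")
    ds.remove_private_tags()          # private vendor tags may also carry identifiers
    ds.save_as(out_path)

# Usage: anonymize("case001.dcm", "case001_anon.dcm")
```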

  6. Multiple symbol differential detection

    Science.gov (United States)

    Divsalar, Dariush (Inventor); Simon, Marvin K. (Inventor)

    1991-01-01

    A differential detection technique for multiple phase shift keying (MPSK) signals is provided which uses a multiple symbol observation interval on the basis of which a joint decision is made regarding the phase of the received symbols. In accordance with the invention, a first difference phase is created between first and second received symbols. Next, the first difference phase is correlated with the possible values thereof to provide a first plurality of intermediate output signals. A second difference phase is next created between second and third received symbols. The second difference phase is correlated with plural possible values thereof to provide a second plurality of intermediate output signals. Next, a third difference phase is created between the first and third symbols. The third difference phase is correlated with plural possible values thereof to provide a third plurality of intermediate output signals. Each of the first plurality of intermediate outputs are combined with each of the second plurality of intermediate outputs and each of the third plurality of intermediate outputs to provide a plurality of possible output values. Finally, a joint decision is made by choosing from the plurality of possible output values the value which represents the best combined correlation of the first, second and third difference values with the possible values thereof.
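    The joint decision described above can be rendered as a brute-force search over candidate phase-difference sequences; the block length and modulation order below are arbitrary, and the metric is the standard non-coherent correlation used for multiple-symbol differential detection rather than the patent's exact circuit.

```python
import itertools
import numpy as np

M = 4                                    # e.g. QPSK
DIFF_PHASES = 2 * np.pi * np.arange(M) / M

def msdd(received):
    """Joint decision over one block of received samples r_0..r_{N-1}.

    Searches all candidate sequences of N-1 phase differences and keeps the
    one maximizing the non-coherent correlation metric.
    """
    best_metric, best_seq = -np.inf, None
    for seq in itertools.product(range(M), repeat=len(received) - 1):
        cumulative = np.cumsum(DIFF_PHASES[list(seq)])
        metric = abs(received[0] + np.sum(received[1:] * np.exp(-1j * cumulative)))
        if metric > best_metric:
            best_metric, best_seq = metric, seq
    return best_seq                      # indices of detected phase differences

# Usage with a noiseless 3-symbol block carrying differences (1, 3):
r = np.exp(1j * np.array([0.0, DIFF_PHASES[1], DIFF_PHASES[1] + DIFF_PHASES[3]]))
print(msdd(r))                           # (1, 3)
```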

  7. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
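    One standard way to make such an architecture-based Markov analysis concrete is Cheung's user-oriented reliability model, sketched below with purely hypothetical component reliabilities and transfer probabilities; this is the generic textbook model, not the COSMIC-FFP-based extension proposed in the study.

```python
import numpy as np

# Hypothetical control-flow transfer probabilities between three tool components;
# component 3 (index 2) is the terminal "output produced" component.
P = np.array([[0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
R = np.array([0.99, 0.98, 0.995])        # assumed per-component reliabilities

# A transition succeeds only if the current component executes correctly
Q = R[:, None] * P

# S = (I - Q)^{-1}; S[0, -1] is the probability of reaching the terminal
# component failure-free when starting at component 1
S = np.linalg.inv(np.eye(len(R)) - Q)
system_reliability = S[0, -1] * R[-1]
print(round(system_reliability, 4))
```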

  8. Kindergarteners' performance in a sound-symbol paradigm predicts early reading.

    Science.gov (United States)

    Horbach, Josefine; Scharke, Wolfgang; Cröll, Jennifer; Heim, Stefan; Günther, Thomas

    2015-11-01

    The current study examined the role of serial processing of newly learned sound-symbol associations in early reading acquisition. A computer-based sound-symbol paradigm (SSP) was administered to 243 children during their last year of kindergarten (T1), and their reading performance was assessed 1 year later in first grade (T2). Results showed that performance on the SSP measured before formal reading instruction was associated with later reading development. At T1, early readers performed significantly better than nonreaders in learning correspondences between sounds and symbols as well as in applying those correspondences in a serial manner. At T2, SSP performance measured at T1 was positively associated with reading performance. Importantly, serial application of newly learned correspondences at T1 explained unique variance in first-grade reading performance in nonreaders over and above other verbal predictors, including phonological awareness, verbal short-term memory, and rapid automatized naming. Consequently, the SSP provides a promising way to study aspects of reading in preliterate children. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    Full Text Available In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, the development of an automatic system for fabricating a curved hull plate remains at the beginning stage, since hardware and software for the automation of the curved hull fabrication process must be developed differently depending on the dimensions of plates, forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a “plug-in” framework, which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed by using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  10. When math operations have visuospatial meanings versus purely symbolic definitions: Which solving stages and brain regions are affected?

    Science.gov (United States)

    Pyke, Aryn A; Fincham, Jon M; Anderson, John R

    2017-06-01

    How does processing differ during purely symbolic problem solving versus when mathematical operations can be mentally associated with meaningful (here, visuospatial) referents? Learners were trained on novel math operations (↓, ↑), that were defined strictly symbolically or in terms of a visuospatial interpretation (operands mapped to dimensions of shaded areas, answer = total area). During testing (scanner session), no visuospatial representations were displayed. However, we expected visuospatially-trained learners to form mental visuospatial representations for problems, and exhibit distinct activations. Since some solution intervals were long (~10s) and visuospatial representations might only be instantiated in some stages during solving, group differences were difficult to detect when treating the solving interval as a whole. However, an HSMM-MVPA process (Anderson and Fincham, 2014a) to parse fMRI data identified four distinct problem-solving stages in each group, dubbed: 1) encode; 2) plan; 3) compute; and 4) respond. We assessed stage-specific differences across groups. During encoding, several regions implicated in general semantic processing and/or mental imagery were more active in visuospatially-trained learners, including: bilateral supramarginal, precuneus, cuneus, parahippocampus, and left middle temporal regions. Four of these regions again emerged in the computation stage: precuneus, right supramarginal/angular, left supramarginal/inferior parietal, and left parahippocampal gyrus. Thus, mental visuospatial representations may not just inform initial problem interpretation (followed by symbolic computation), but may scaffold on-going computation. In the second stage, higher activations were found among symbolically-trained solvers in frontal regions (R. medial and inferior and L. superior) and the right angular and middle temporal gyrus. Activations in contrasting regions may shed light on solvers' degree of use of symbolic versus mental

  11. Trends in computerized structural analysis and synthesis; Proceedings of the Symposium, Washington, D.C., October 30-November 1, 1978

    Science.gov (United States)

    Noor, A. K. (Editor); Mccomb, H. G., Jr.

    1978-01-01

    The subjects considered are related to future directions of structural applications and potential of new computing systems, advances and trends in data management and engineering software development, advances in applied mathematics and symbolic computing, computer-aided instruction and interactive computer graphics, nonlinear analysis, dynamic analysis and transient response, structural synthesis, structural analysis and design systems, advanced structural applications, supercomputers, numerical analysis, and trends in software systems. Attention is given to the reliability and optimality of the finite element method, computerized symbolic manipulation in structural mechanics, a standard computer graphics subroutine package, and a drag method as a finite element mesh generation scheme.

  12. Contextualizing symbol, symbolizing context

    Science.gov (United States)

    Maudy, Septiani Yugni; Suryadi, Didi; Mulyana, Endang

    2017-08-01

    When students learn algebra for the first time, they inevitably experience the transition from arithmetic to algebraic thinking. Once students apprehend this essential mathematical knowledge, they cultivate their ability to solve daily life problems by applying algebra. However, as we dig into this transitional stage, we identified possible student learning obstacles that need to be dealt with seriously in order to forestall subsequent hindrances in studying more advanced algebra. We came to realize this recurring problem as we undertook the processes of re-personalization and re-contextualization, in which we scrutinize the very basic questions: 1) what is a variable, what is a linear equation with one variable, and what is their relationship with arithmetic-algebraic thinking? 2) Why should students learn such concepts? 3) How should those concepts be taught to students? By positioning ourselves as a seventh grade student, we address the tendency of children to think arithmetically when confronted with problems of linear equations with one variable. To help them think algebraically, Bruner's modes of representation, developed contextually from concrete to abstract, were employed to enhance their interpretation of the idea of variables. Hence, from the outset we designed the context for students to think symbolically, initiated by exploring various symbols that could be contextualized in order to bridge students' transition from arithmetic to algebraic thinking fruitfully.

  13. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    Science.gov (United States)

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  14. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work.

  15. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with its data handling model. For this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain: prompt reconstruction, data streaming, iterative calibration and alignment executions, data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project were also exercised to provide access to the data and the resources through a user-friendly interface for the physicists submitting production and analysis jobs. An overview of the status and results of CSA06 is presented in this work

  16. The future of commodity computing and many-core versus the interests of HEP software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  17. Developing Teaching Material Software Assisted for Numerical Methods

    Science.gov (United States)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision shows the importance of two things in school mathematics: knowing the mathematics of the 21st century, and the need to continue to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing of the existence of various tools for mathematical activity. One of the significant challenges in mathematical learning is how to teach students abstract concepts. Here, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity easier for students to accept. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without spending time calculating complex computational problems manually. The purpose of this research is to design and develop software-assisted teaching material for numerical methods. The development of the teaching material starts with the defining step, in which the learning material is specified on the basis of information obtained from the early analysis of learners, materials, and supporting tasks; this is followed by the design step and, finally, the development step. The software-assisted teaching material for numerical methods is valid in content, and the validators' assessment indicates that it is good and can be used with little revision.

  18. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, the algorithm, and the model of the software running on computer hardware included in the Grid network, which will allow a cloud computing environment to be implemented using Grid technologies.

  19. 36 CFR 264.11 - Use of symbol.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Use of symbol. 264.11 Section... MANAGEMENT Mount St. Helens National Volcanic Monument Symbol § 264.11 Use of symbol. Except as provided in § 264.12, use of the Mount St. Helens National Volcanic Monument official symbol, including a facsimile...

  20. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    Science.gov (United States)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  1. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    Science.gov (United States)

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which only differed with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement with low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that only for some
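    The study's own models are not reproduced here; as a minimal illustration of the quality measures it reports, the following Python sketch fits an ordinary least-squares prediction model on synthetic data (the predictor and outcome names are hypothetical) and computes R2 and the RMS prediction error.

```python
# Minimal illustration (synthetic data, not the study's models): fit a linear
# prediction model for a physical-exposure outcome and report the predictive
# quality in the same terms used above, R^2 and root-mean-squared (RMS) error.
import numpy as np

rng = np.random.default_rng(0)
n = 117                                         # same order as the study's sample size
X = rng.normal(size=(n, 3))                     # e.g. hours of mouse use, keying rate, desk height (hypothetical)
beta_true = np.array([0.6, 0.3, 0.1])
y = X @ beta_true + rng.normal(0, 0.8, size=n)  # hypothetical exposure outcome

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
rms = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"R^2 = {r2:.2f}, RMS = {rms:.2f}")
```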

  2. The effects of user factors and symbol referents on public symbol design using the stereotype production method.

    Science.gov (United States)

    Ng, Annie W Y; Siu, Kin Wai Michael; Chan, Chetwyn C H

    2012-01-01

    This study investigated the influence of user factors and symbol referents on public symbol design among older people, using the stereotype production method for collecting user ideas during the symbol design process. Thirty-one older adults were asked to draw images based on 28 public symbol referents and to indicate their familiarity with and ease with which they visualised each referent. Differences were found between the pictorial solutions generated by males and females. However, symbol design was not influenced by participants' education level, vividness of visual imagery, object imagery preference or spatial imagery preference. Both familiar and unfamiliar referents were illustrated pictorially without much difficulty by users. The more visual the referent, the less difficulty the users had in illustrating it. The findings of this study should aid the optimisation of the stereotype production method for user-involved symbol design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  3. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, a general-purpose input routine, and the efficient use of memory are also elaborated. This publication is intended…

  4. Goethe Gossips with Grass: Using Computer Chatting Software in an Introductory Literature Course.

    Science.gov (United States)

    Fraser, Catherine C.

    1999-01-01

    Students in a third-year introduction to German literature course chatted over networked computers, using "FirstClass" software. A brief description of the course design is provided with detailed information on how the three chat sessions were organized. (Author/VWL)

  5. 7 CFR 29.3012 - Color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Color symbols. 29.3012 Section 29.3012 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Color symbols. As applied to Burley, single color symbols are as follows: L—buff, F—tan, R—red, D—dark...

  6. 7 CFR 29.1066 - Symbol (S).

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Symbol (S). 29.1066 Section 29.1066 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Type 92) § 29.1066 Symbol (S). As applied to Flue-cured tobacco the symbol (S) when used (a) as the...

  7. 7 CFR 29.3510 - Color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Color symbols. 29.3510 Section 29.3510 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Type 95) § 29.3510 Color symbols. As applied to Dark Air-cured tobacco, color symbols are L—light brown...

  8. 7 CFR 29.2259 - Color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Color symbols. 29.2259 Section 29.2259 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... symbols. As applied to this type, color symbols are: L—light brown, F—medium brown, D—dark brown, M—mixed...

  9. 7 CFR 29.1007 - Color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Color symbols. 29.1007 Section 29.1007 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Type 92) § 29.1007 Color symbols. As applied to flue-cured tobacco, color symbols are L—lemon, F—orange...

  10. Data science and symbolic AI: Synergies, challenges and opportunities

    KAUST Repository

    Hoehndorf, Robert

    2017-06-02

    Symbolic approaches to artificial intelligence represent things within a domain of knowledge through physical symbols, combine symbols into symbol expressions, and manipulate symbols and symbol expressions through inference processes. While a large part of Data Science relies on statistics and applies statistical approaches to artificial intelligence, there is an increasing potential for successfully applying symbolic approaches as well. Symbolic representations and symbolic inference are close to human cognitive representations and therefore comprehensible and interpretable; they are widely used to represent data and metadata, and their specific semantic content must be taken into account for analysis of such information; and human communication largely relies on symbols, making symbolic representations a crucial part in the analysis of natural language. Here we discuss the role symbolic representations and inference can play in Data Science, highlight the research challenges from the perspective of the data scientist, and argue that symbolic methods should become a crucial component of the data scientists’ toolbox.

  11. Data science and symbolic AI: Synergies, challenges and opportunities

    KAUST Repository

    Hoehndorf, Robert; Queralt-Rosinach, Núria

    2017-01-01

    Symbolic approaches to artificial intelligence represent things within a domain of knowledge through physical symbols, combine symbols into symbol expressions, and manipulate symbols and symbol expressions through inference processes. While a large part of Data Science relies on statistics and applies statistical approaches to artificial intelligence, there is an increasing potential for successfully applying symbolic approaches as well. Symbolic representations and symbolic inference are close to human cognitive representations and therefore comprehensible and interpretable; they are widely used to represent data and metadata, and their specific semantic content must be taken into account for analysis of such information; and human communication largely relies on symbols, making symbolic representations a crucial part in the analysis of natural language. Here we discuss the role symbolic representations and inference can play in Data Science, highlight the research challenges from the perspective of the data scientist, and argue that symbolic methods should become a crucial component of the data scientists’ toolbox.

  12. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    The report describes an Automated Software Tool Monitoring System and the accompanying Automated Software Tool Monitoring Program (Appendix 1). Output features provide links from the tool to both the human user and the target machine (where applicable); they describe the types of information that are returned from the tools to the human user and the forms in which these outputs are presented.

  13. X-ray image processing software for computing object size and object location coordinates from acquired optical and x-ray images

    International Nuclear Information System (INIS)

    Tiwari, Akash; Tiwari, Shyam Sunder; Tiwari, Railesha; Panday, Lokesh; Panday, Jeet; Suri, Nitin

    2004-01-01

    X-ray and visible image data processing software has been developed in Visual Basic for real-time online and offline image information processing for NDT and medical applications. The software computes two-dimensional image size parameters from sharp boundary lines by raster scanning the image contrast data. The code accepts bitmap image data, hunts for multiple tumors of different sizes that may be present in the image, computes the size of each tumor, and locates its approximate center to register its location coordinates. The presence of foreign objects, such as metal and glass balls in industrial products like chocolate and other food items imaged with the x-ray technique, is also detected by the software, and their sizes and position coordinates are computed. The paper discusses ways to compute the size and coordinates of air-bubble-like objects present in x-ray and optical images, including multiple occurrences in the image of interest. (author)
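    The original software is written in Visual Basic and is not reproduced here; the following Python/NumPy/SciPy sketch is only an illustration of the kind of processing the abstract describes: segment bright regions in a bitmap, label them, and report each region's size and approximate centre coordinates. The image and threshold are invented for the example.

```python
# Minimal sketch (not the original Visual Basic code): find bright blobs in a
# bitmap, then report the size and centre coordinates of each one, in the
# spirit of the tumour/foreign-object search described above.
import numpy as np
from scipy import ndimage

def find_objects(image: np.ndarray, threshold: float):
    """Return (area_in_pixels, (row, col) centroid) for each bright region."""
    mask = image > threshold                 # crude contrast segmentation
    labels, n = ndimage.label(mask)          # connected-component labelling
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    centres = ndimage.center_of_mass(mask, labels, index=range(1, n + 1))
    return list(zip(areas, centres))

# Toy usage: a synthetic 100x100 image with two bright rectangular "objects".
img = np.zeros((100, 100))
img[10:20, 10:20] = 1.0
img[60:75, 40:55] = 1.0
for area, (r, c) in find_objects(img, threshold=0.5):
    print(f"object: {int(area)} px, centre ~ ({r:.1f}, {c:.1f})")
```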

  14. Evaluation of group theoretical characteristics using the symbolic manipulation language MAPLE

    International Nuclear Information System (INIS)

    Taneri, U.; Paldus, J.

    1994-01-01

    Relying on theoretical developments exploiting quasispin and the pseudo-orthogonal group in the Hubbard model of cyclic polyenes, the general expressions for generating polynomials, providing the dimensional information for relevant irreducible representations, were derived. These generating polynomials result from 1-dimensional formulas through rather tedious algebraic manipulations involving ratios of polynomials with fractional powers. It is shown that these expressions may be efficiently handled using the symbolic manipulation language MAPLE and the dimensional information for an arbitrary spin, isospin, and quasimomentum obtained. Exploitation of symbolic computation for other group theoretical problems that are relevant in quantum chemical calculations and their relationship with Gaussian-polynomial-based combinatorial approaches is also briefly addressed and various possible applications outlined.
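    The paper's actual generating polynomials are not given in the abstract, so the following SymPy fragment (Python rather than MAPLE) is only a generic illustration of reading dimensional information off a generating polynomial; the Gaussian (q-binomial) coefficient used here is a stand-in, chosen because the abstract mentions Gaussian-polynomial-based combinatorial approaches.

```python
# Illustrative only (SymPy rather than MAPLE): dimension information read off
# from a generating polynomial.  The polynomial here is the Gaussian
# (q-binomial) coefficient [4 choose 2]_q, standing in for the ratios of
# polynomials mentioned in the abstract.
import sympy as sp

q = sp.symbols('q')

def q_binomial(n, k):
    num = sp.Mul(*[1 - q**(n - k + i) for i in range(1, k + 1)])
    den = sp.Mul(*[1 - q**i for i in range(1, k + 1)])
    return sp.expand(sp.cancel(num / den))

poly = q_binomial(4, 2)
print(poly)                                  # q**4 + q**3 + 2*q**2 + q + 1
print(poly.subs(q, 1))                       # 6 = C(4,2), the total dimension
print([poly.coeff(q, d) for d in range(5)])  # dimensions per grading: [1, 1, 2, 1, 1]
```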

  15. The Impact Of Using Computer Software On Vocabulary Learning Of Iranian EFL University Students

    Directory of Open Access Journals (Sweden)

    Samira Pahlavanpoorfard

    2014-07-01

    Full Text Available Today, using computers is common in all fields, and education is not an exception. In fact, this technology has been used increasingly in language classrooms, and more and more language teachers are using computers in their classrooms. This research study investigates the impact of using computer software on the vocabulary learning of Iranian EFL university students. To this end, a sample of 40 university students at Islamic Azad University, Larestan branch, were randomly assigned to experimental and control groups. Prior to the treatment, and to capture initial differences between the participants, all the students sat for a pre-test, an Oxford Placement Test. The students then received the treatment for 10 weeks. The students in the experimental group were taught vocabulary with computer software, while the students in the control group were taught vocabulary through a traditional method. After the treatment, all the students sat for a post-test. Statistical analysis using Independent-Samples T-tests revealed that the students in the experimental group, who used the computer software for vocabulary learning, performed better than the students in the control group, who were taught through the traditional method.

  16. Sound-Symbolism Boosts Novel Word Learning

    Science.gov (United States)

    Lockwood, Gwilym; Dingemanse, Mark; Hagoort, Peter

    2016-01-01

    The existence of sound-symbolism (or a non-arbitrary link between form and meaning) is well-attested. However, sound-symbolism has mostly been investigated with nonwords in forced choice tasks, neither of which are representative of natural language. This study uses ideophones, which are naturally occurring sound-symbolic words that depict sensory…

  17. Symbol-String Sensitivity and Children's Reading

    Science.gov (United States)

    Pammer, Kristen; Lavis, Ruth; Hansen, Peter; Cornelissen, Piers L.

    2004-01-01

    In this study of primary school children, a novel "symbol-string" task is used to assess sensitivity to the position of briefly presented non-alphabetic but letter-like symbols. The results demonstrate that sensitivity in the symbol-string task explains a unique proportion of the variability in children's contextual reading accuracy. Moreover,…

  18. Performance of the split-symbol moments SNR estimator in the presence of inter-symbol interference

    Science.gov (United States)

    Shah, B.; Hinedi, S.

    1989-01-01

    The Split-Symbol Moments Estimator (SSME) is an algorithm that is designed to estimate symbol signal-to-noise ratio (SNR) in the presence of additive white Gaussian noise (AWGN). The performance of the SSME algorithm in band-limited channels is examined. The effects of the resulting inter-symbol interference (ISI) are quantified. All results obtained are in closed form and can be easily evaluated numerically for performance prediction purposes. Furthermore, they are validated through digital simulations.
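    A toy variant of the split-symbol idea follows, under simplifying assumptions (BPSK symbols, AWGN, no ISI, per-sample SNR as the target quantity); it is not the exact SSME of the paper, but it shows how half-symbol moments yield an SNR estimate.

```python
# Toy split-symbol SNR estimator (a simplified variant, not the exact SSME of
# the paper): estimate the per-sample SNR A^2/sigma^2 from half-symbol sums,
# assuming BPSK symbols in AWGN and no inter-symbol interference.
import numpy as np

rng = np.random.default_rng(0)
n_sym, N = 20000, 8            # number of symbols and samples per symbol (N even)
A, sigma = 1.0, 0.7            # amplitude and noise std -> true SNR ~ 2.04

bits = rng.choice([-1.0, 1.0], size=n_sym)
samples = np.repeat(bits, N) * A + rng.normal(0.0, sigma, size=n_sym * N)
sym = samples.reshape(n_sym, N)

U = sym[:, : N // 2].sum(axis=1)       # first-half sums
V = sym[:, N // 2 :].sum(axis=1)       # second-half sums

# E[U*V] = (N/2)^2 * A^2 (noise in the two halves is independent), while
# E[(U-V)^2] = N * sigma^2 (the signal part cancels in the difference).
snr_hat = 4.0 * np.mean(U * V) / (N * np.mean((U - V) ** 2))
print(snr_hat, (A / sigma) ** 2)       # estimated vs true per-sample SNR
```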

  19. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  20. Symbol recognition with kernel density matching.

    Science.gov (United States)

    Zhang, Wan; Wenyin, Liu; Zhang, Kun

    2006-12-01

    We propose a novel approach to similarity assessment for graphic symbols. Symbols are represented as 2D kernel densities and their similarity is measured by the Kullback-Leibler divergence. Symbol orientation is found by gradient-based angle searching or independent component analysis. Experimental results show the outstanding performance of this approach in various situations.
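    A rough sketch of the kernel-density matching idea follows (the orientation search by gradient methods or ICA is omitted): build a Gaussian kernel density for each symbol's 2-D points with SciPy and score similarity with a Monte-Carlo estimate of the Kullback-Leibler divergence. The toy "L"-shaped symbols are invented for the example.

```python
# Sketch of the kernel-density idea (orientation handling omitted): represent
# each symbol's 2-D points as a kernel density and score similarity with a
# Monte-Carlo estimate of the Kullback-Leibler divergence KL(p || q).
import numpy as np
from scipy.stats import gaussian_kde

def kl_divergence(points_p, points_q, n_samples=2000):
    """points_* : arrays of shape (n, 2) holding a symbol's stroke points."""
    p = gaussian_kde(points_p.T)
    q = gaussian_kde(points_q.T)
    x = p.resample(n_samples)                     # samples drawn from p
    return float(np.mean(np.log(p(x)) - np.log(q(x))))

# Toy usage: two noisy copies of the same "L"-shaped symbol vs a different blob.
rng = np.random.default_rng(1)
L1 = np.concatenate([np.c_[np.zeros(50), np.linspace(0, 1, 50)],
                     np.c_[np.linspace(0, 1, 50), np.zeros(50)]]) + rng.normal(0, 0.02, (100, 2))
L2 = L1 + rng.normal(0, 0.02, L1.shape)
blob = rng.normal(0.5, 0.3, (100, 2))
print(kl_divergence(L1, L2), kl_divergence(L1, blob))   # small vs larger divergence
```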

  1. History of international symbol for ionizing radiation

    International Nuclear Information System (INIS)

    Franic, Z.

    1996-01-01

    The year 1996 marks the 50th anniversary of the radiation warning symbol as we currently know it. It was (except for the colours used) doodled out at the University of California, Berkeley, sometime in 1946 by a small group of people. The key person responsible was Nelson Garden, then the head of the Health Chemistry Group at the Radiation Laboratory. The radiation warning symbol should not be confused with the civil defence symbol (a circle divided into six equal sections, three of them black and three yellow), designed to identify fallout shelters. The basic radiation symbol was eventually internationally standardized by ISO code 361-1975 (E). Variations of this symbol are frequently used in the logotypes of radiation protection organizations or associations. Particularly nice are those of the International Radiation Protection Association (IRPA) and the Croatian Radiation Protection Association (CRPA), the latter combining traditional Croatian motifs with high technology. However, apart from speculation, there is no definite answer as to why the Berkeley people chose this particular symbol. Whatever the reason, it was a very good choice because the ionizing radiation symbol is simple, readily identifiable, i.e., not similar to other warning symbols, and discernible at a large distance. (author)

  2. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software...
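    DiFX itself is not shown here; the following NumPy sketch only illustrates the "FX" principle the abstract refers to: Fourier-transform each station's stream in blocks (F), cross-multiply the spectra (X), accumulate, and recover the relative delay from the averaged cross-spectrum. The two-station toy signal and its delay are invented for the example.

```python
# Toy single-baseline FX correlation (not DiFX itself): channelise each
# station's time stream with an FFT ("F"), cross-multiply the spectra ("X"),
# and accumulate.  Station 2 receives the common signal `delay` samples later
# than station 1, which shows up as a peak in the lag-space cross-correlation.
import numpy as np

rng = np.random.default_rng(0)
n_blocks, n_chan, delay = 200, 256, 3

common = rng.normal(size=n_blocks * n_chan + delay)          # shared sky signal
s1 = common[delay:] + 0.5 * rng.normal(size=n_blocks * n_chan)
s2 = common[:-delay] + 0.5 * rng.normal(size=n_blocks * n_chan)   # lags s1 by `delay`

cross = np.zeros(n_chan, dtype=complex)
for b in range(n_blocks):                                    # "F", then "X", then accumulate
    f1 = np.fft.fft(s1[b * n_chan:(b + 1) * n_chan])
    f2 = np.fft.fft(s2[b * n_chan:(b + 1) * n_chan])
    cross += np.conj(f1) * f2
cross /= n_blocks

lag = np.fft.ifft(cross)                                     # back to lag space
print("recovered delay:", np.argmax(np.abs(lag)))            # ~ 3 samples
```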

  3. A software package to process an INIS magnetic tape on the VAX computer

    International Nuclear Information System (INIS)

    Omar, A.A.; Mohamed, F.A.

    1991-01-01

    This paper presents a software package whose function is to process, on the VAX computers, the magnetic tapes distributed by the Atomic Energy Agency. These tapes contain abstracts of papers in the different branches of the nuclear field and are supplied by the International Nuclear Information System (INIS). This paper has two goals. First, it gives a procedure for processing any foreign magnetic tape on the VAX computers. Second, it solves the problem of reading the INIS tapes on a non-IBM computer, thus allowing specialists to benefit from the large amount of information contained in these tapes. 11 fig

  4. 7 CFR 29.2509 - Color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Color symbols. 29.2509 Section 29.2509 Agriculture...-Cured Tobacco (u.s. Types 22, 23, and Foreign Type 96) § 29.2509 Color symbols. As applied to these types, color symbols are L—light brown, F—medium brown, D—dark brown, M—mixed or variegated VF—greenish...

  5. Associations of non-symbolic and symbolic numerical magnitude processing with mathematical competence: a meta-analysis.

    Science.gov (United States)

    Schneider, Michael; Beeres, Kassandra; Coban, Leyla; Merz, Simon; Susan Schmidt, S; Stricker, Johannes; De Smedt, Bert

    2017-05-01

    Many studies have investigated the association between numerical magnitude processing skills, as assessed by the numerical magnitude comparison task, and broader mathematical competence, e.g. counting, arithmetic, or algebra. Most correlations were positive but varied considerably in their strengths. It remains unclear whether and to what extent the strength of these associations differs systematically between non-symbolic and symbolic magnitude comparison tasks and whether age, magnitude comparison measures or mathematical competence measures are additional moderators. We investigated these questions by means of a meta-analysis. The literature search yielded 45 articles reporting 284 effect sizes found with 17,201 participants. Effect sizes were combined by means of a two-level random-effects regression model. The effect size was significantly higher for the symbolic (r = .302, 95% CI [.243, .361]) than for the non-symbolic (r = .241, 95% CI [.198, .284]) magnitude comparison task and decreased very slightly with age. The correlation was higher for solution rates and Weber fractions than for alternative measures of comparison proficiency. It was higher for mathematical competencies that rely more heavily on the processing of magnitudes (i.e. mental arithmetic and early mathematical abilities) than for others. The results support the view that magnitude processing is reliably associated with mathematical competence over the lifespan in a wide range of tasks, measures and mathematical subdomains. The association is stronger for symbolic than for non-symbolic numerical magnitude processing. So symbolic magnitude processing might be a more eligible candidate to be targeted by diagnostic screening instruments and interventions for school-aged children and for adults. © 2016 John Wiley & Sons Ltd.

  6. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator is presented in this paper. The feasibility prototype was established with the existing software of the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments

  7. Social Symbolic Work in Context

    DEFF Research Database (Denmark)

    Brincker, Benedikte

    This paper reports on a research project that explores social symbolic work. The social symbolic work in question seeks to introduce education in entrepreneurship into the school curriculum in a remote part of Greenland – in order to contribute to regional development. The paper investigates how ‘the good organisation’ may offer a supportive organisational framework for social symbolic work, thus promoting regional development in peripheral and poorly developed regions. Exploring what qualifies as a ‘good organisation’, the paper identifies three key elements: management, motivation...

  8. (3 + 1)-dimensional cylindrical Korteweg-de Vries equation for nonextensive dust acoustic waves: Symbolic computation and exact solutions

    International Nuclear Information System (INIS)

    Guo Shimin; Wang Hongli; Mei Liquan

    2012-01-01

    By combining the effects of bounded cylindrical geometry, azimuthal and axial perturbations, the nonlinear dust acoustic waves (DAWs) in an unmagnetized plasma consisting of negatively charged dust grains, nonextensive ions, and nonextensive electrons are studied in this paper. Using the reductive perturbation method, a (3 + 1)-dimensional variable-coefficient cylindrical Korteweg-de Vries (KdV) equation describing the nonlinear propagation of DAWs is derived. Via the homogeneous balance principle, improved F-expansion technique and symbolic computation, the exact traveling and solitary wave solutions of the KdV equation are presented in terms of Jacobi elliptic functions. Moreover, the effects of the plasma parameters on the solitary wave structures are discussed in detail. The obtained results could help in providing a good fit between theoretical analysis and real applications in space physics and future laboratory plasma experiments where long-range interactions are present.
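    The variable-coefficient cylindrical KdV equation and the F-expansion solutions of the paper are not reproduced here; as a small symbolic-computation illustration in the same spirit, the SymPy fragment below verifies the standard one-soliton solution of the classical KdV equation u_t + 6uu_x + u_xxx = 0.

```python
# Symbolic-computation sketch (SymPy): verify the one-soliton solution of the
# *classical* KdV equation u_t + 6*u*u_x + u_xxx = 0.  This is only a toy
# stand-in for the variable-coefficient cylindrical KdV derived in the paper.
import sympy as sp

x, t, c = sp.symbols('x t c', positive=True)
u = c / 2 / sp.cosh(sp.sqrt(c) / 2 * (x - c * t)) ** 2   # one-soliton ansatz

residual = sp.diff(u, t) + 6 * u * sp.diff(u, x) + sp.diff(u, x, 3)
print(sp.simplify(residual.rewrite(sp.exp)))             # -> 0, so u solves the equation
```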

  9. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • Code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation only takes 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  10. The rhetoric of disenchantment through symbolism

    Directory of Open Access Journals (Sweden)

    Théophile Munyangeyo

    2012-10-01

    Full Text Available The symbolism of flowers has always been a significant part of cultures around the world due to their functional meaning in daily life. From their decorative to their aromatic role, flowers and their symbolic meaning trigger emotions, convey wishes and represent thoughts that cannot be explicitly expressed. In this regard, an elaborate language based on flower symbolism was developed in many societies to convey clear messages to the recipient. However, in some cultural contexts, although flower symbolism has social connotations, it is mainly associated with economic references. As flowers are an essential precursor to fruits, they are inevitably a source of expectations and hence foster a set of hopes and dreams, which can ultimately lead to excitement or disappointment. Through a discourse analysis based on factional narratives, this article explores the parameters through which the bifaceted symbolic meaning of flowers fictionalises a space that refers to social reality. This association between the fictional world and its social reference highlights that writing can profoundly be a means of representing social events through the rhetoric of symbolism. Through a sociological reading approach, this paper aims to analyse how the symbolism of flowers informs the rhetoric of disenchantment that can foster a content-based pedagogy in language learning where silencing practices engender imagery to exercise the freedom of expression.

  11. The shift to Cloud Computing : The impact of disruptive technology on the enterprise software business ecosystem

    NARCIS (Netherlands)

    Nieuwenhuis, Lambert J.M.; Ehrenhard, Michel L.; Prause, Lars

    2017-01-01

    The rapid diffusion of Cloud Computing influences the way enterprise software is developed, distributed, and implemented. This uptake of Cloud Computing has profound implications for the IT industry and related industries, as it not only affects the vendors' business models but also the other

  12. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  13. Symbolic reasoning about myocardial scintigrams in PROLOG

    International Nuclear Information System (INIS)

    Rosenberg, S.; Itti, R.; Benjelloun, L.

    1986-01-01

    PROLOG (PROgramming in LOGic) is the declarative programming language at the heart of the Japanese fifth-generation computer project. It is proposed that PROLOG is a suitable tool for symbolic image processing, once standard preprocessing has been done. In the present application, the problem of prediction of coronary anatomy from myocardial scintigrams is addressed. Uncertainty is dealt with by a combination of fuzzy-set theoretic and probabilistic reasoning. Heuristic classification rules are based on clinical experience and on a set of 247 myocardial scintigrams with their corresponding coronary angiograms. (orig.)

  14. Symbolic reasoning about myocardial scintigrams in PROLOG

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, S; Itti, R; Benjelloun, L

    1986-06-01

    PROLOG (PROgramming in LOGic) is the declarative programming language at the heart of the Japanese fifth-generation computer project. It is proposed that PROLOG is a suitable tool for symbolic image processing, once standard preprocessing has been done. In the present application, the problem of prediction of coronary anatomy from myocardial scintigrams is addressed. Uncertainty is dealt with by a combination of fuzzy-set theoretic and probabilistic reasoning. Heuristic classification rules are based on clinical experience and on a set of 247 myocardial scintigrams with their corresponding coronary angiograms.

  15. Two cultures - one symbol

    Directory of Open Access Journals (Sweden)

    O. G. Shostak

    2003-06-01

    Full Text Available This paper is dedicated to the question of similarities in the approach to multilevel symbolism in Slav and Native American cultures. The ambivalent symbol of the snake is analyzed within the frame of mythological thinking. At the end the author comes to the conclusion that elements of mythological thinking are still present in everyday life and influence levels of human behavior.

  16. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    Science.gov (United States)

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on the two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemical Development Kit library for the manipulation of the chemical structures and the calculation of the atomic properties. This software is composed of a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. This program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. Complexity studies of the main algorithms demonstrate that these were implemented efficiently relative to their trivial implementations. Lastly, the performance tests reveal that this software behaves suitably as the number of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of the molecular indices based on N-linear algebraic maps and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
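    The actual QuBiLS-MIDAS definitions are considerably richer; the following NumPy fragment is only a toy illustration of a "two-linear" (bilinear) index, combining two atomic-property vectors through a hypothetical relation matrix as x^T M y.

```python
# Toy illustration of a "two-linear" (bilinear) descriptor, not the actual
# QuBiLS-MIDAS definitions: combine two atomic-property vectors x and y
# through a relation matrix M (here a hypothetical adjacency-like matrix),
# giving the scalar index x^T M y.
import numpy as np

# Hypothetical 4-atom molecule: atomic masses and Pauling electronegativities.
mass = np.array([12.011, 12.011, 15.999, 1.008])
elneg = np.array([2.55, 2.55, 3.44, 2.20])

# Hypothetical connectivity (adjacency) matrix playing the role of M.
M = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)

bilinear_index = mass @ M @ elneg      # x^T M y
print(bilinear_index)
```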

  17. The number and its symbolism in ancient Greece

    Directory of Open Access Journals (Sweden)

    Doc. dr Milena Bogdanović

    2013-07-01

    Full Text Available Symbols are of particular importance. They are the heart of creative life; indeed, they are its core. They reveal the secrets of the unconscious mind and open it to the unknown and the infinite. Whether we notice it or not, we use symbols when we talk or gesture. All spiritual science, all art, and all artistic techniques encounter symbols on their way. History confirms that any object can acquire symbolic value, whether natural (rocks, trees, animals, planets, fire, lightning, etc.) or abstract (a geometrical shape, a number, a pace, an idea, etc.). The use of numbers as symbols is as old as language itself and even precedes the writing that symbolizes numbers (that is, where reality lies behind the external characters). Numbers and their symbolism in ancient Greece are closely associated with philosophy and mathematics (namely arithmetic); they summarize the Greeks' view of the world and of everything around them. This paper draws attention to the symbolism of numbers in ancient Greece.

  18. Trauma and Symbolic Violence

    DEFF Research Database (Denmark)

    Pedersen, Bodil Maria

    2011-01-01

    … to praxis, and drawing on the concept of symbolic violence, this article contributes to their critique. In order to develop the analysis of difficulties victims may experience, they will be reconceptualised using critical psychological concepts such as 1st person perspectives and participation. The analysis seeks to undertake a discussion of personal meanings attributed to 'traumatisation'. It raises questions as to whether concepts of this kind and related practices may constitute symbolic violence and contribute to victimisation through looping-processes. Furthermore it aims at unfolding an understanding inclusive of connections between societal practices, aspects of symbolic violence, and the conduct of lives. The analysis is based on an empirical study of victimisation through rape and other forms of sexualised coercion.

  19. Graphic symbols as "the mind on paper": links between children's interpretive theory of mind and symbol understanding.

    Science.gov (United States)

    Myers, Lauren J; Liben, Lynn S

    2012-01-01

    Children gradually develop interpretive theory of mind (iToM)-the understanding that different people may interpret identical events or stimuli differently. The present study tested whether more advanced iToM underlies children's recognition that map symbols' meanings must be communicated to others when symbols are iconic (resemble their referents). Children (6-9 years; N = 80) made maps using either iconic or abstract symbols. After accounting for age, intelligence, vocabulary, and memory, iToM predicted children's success in communicating symbols' meaning to a naïve map-user when mapping tasks involved iconic (but not abstract) symbols. Findings suggest children's growing appreciation of alternative representations and of the intentional assignment of meaning, and support the contention that ToM progresses beyond mastery of false belief. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.

  20. Symbolic dynamics of noisy chaos

    Energy Technology Data Exchange (ETDEWEB)

    Crutchfield, J P; Packard, N H

    1983-05-01

    One model of randomness observed in physical systems is that low-dimensional deterministic chaotic attractors underlie the observations. A phenomenological theory of chaotic dynamics requires an accounting of the information flow from the observed system to the observer, the amount of information available in observations, and just how this information affects predictions of the system's future behavior. In an effort to develop such a description, the information theory of highly discretized observations of random behavior is discussed. Metric entropy and topological entropy are well-defined invariant measures of such an attractor's level of chaos, and are computable using symbolic dynamics. Real physical systems that display low dimensional dynamics are, however, inevitably coupled to high-dimensional randomness, e.g. thermal noise. We investigate the effects of such fluctuations coupled to deterministic chaotic systems, in particular, the metric entropy's response to the fluctuations. It is found that the entropy increases with a power law in the noise level, and that the convergence of the entropy and the effect of fluctuations can be cast as a scaling theory. It is also argued that in addition to the metric entropy, there is a second scaling invariant quantity that characterizes a deterministic system with added fluctuations: I/sub 0/, the maximum average information obtainable about the initial condition that produces a particular sequence of measurements (or symbols). 46 references, 14 figures, 1 table.
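    As a minimal numerical illustration of the symbolic-dynamics entropy estimate discussed above (not the paper's scaling analysis), the Python sketch below symbolises logistic-map orbits with the binary partition at x = 1/2, estimates the entropy rate from block entropies, and shows the rate rising when observational noise is added to the measurements. The map parameter, noise levels and block length are arbitrary choices for the example.

```python
# Minimal sketch of a symbolic-dynamics entropy estimate: symbolise
# logistic-map orbits with the partition at x = 1/2, estimate the entropy rate
# as h ~ H(n) - H(n-1) from block entropies, and watch added observational
# noise raise it.  Illustration only, not the paper's scaling analysis.
import numpy as np
from collections import Counter

def block_entropy(symbols, n):
    """Shannon entropy (bits) of length-n blocks of a symbol sequence."""
    blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def entropy_rate(noise_level, r=3.8, n_steps=200000, n_block=10, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.4
    for _ in range(1000):                 # discard the transient
        x = r * x * (1.0 - x)
    xs = np.empty(n_steps)
    for i in range(n_steps):
        x = r * x * (1.0 - x)
        xs[i] = x
    observed = xs + noise_level * rng.normal(size=n_steps)   # measurement noise
    symbols = (observed > 0.5).astype(np.int8)               # partition at the critical point
    return block_entropy(symbols, n_block) - block_entropy(symbols, n_block - 1)

for eps in (0.0, 0.01, 0.05):
    # the noiseless value approximates the metric entropy (in bits per step);
    # the estimate rises as the observational noise level grows
    print(eps, entropy_rate(eps))
```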

  1. Does skill retention benefit from retentivity and symbolic rehearsal? - two studies with a simulated process control task.

    Science.gov (United States)

    Kluge, Annette; Frank, Barbara; Maafi, Sanaz; Kuzmanovska, Aleksandra

    2016-05-01

    Two experiments were designed to compare two symbolic rehearsal refresher interventions (imaginary practice, a hidden introspective process) and investigate the role of retentivity in skill retention. Retentivity is investigated as the ability to memorise and reproduce information and associations that were learned a short time ago. Both experiments comprised initial training (week 1), a symbolic rehearsal for the experimental group (week 2) and a retention assessment (week 3). In the first study, the experimental group received a symbolic rehearsal, while the control group received no rehearsal. In the second study, the experimental group received the same symbolic rehearsal used in study 1, enhanced with rehearsal tasks addressing human-computer interaction. The results showed that both symbolic rehearsal interventions were equally likely to mitigate skill decay. The retentivity showed medium to high correlations with skill retention in both studies, and the results suggest that subjects high in retentivity benefit more from a symbolic rehearsal refresher intervention. Practitioner Summary: Skill decay becomes a problem in situations in which jobs require the correct mastery of non-routine situations. Two experimental studies with simulated process control tasks showed that symbolic rehearsal and retentivity can significantly mitigate skill decay and that subjects higher in retentivity benefit more from refresher interventions.

  2. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools on which the teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks on the basis of a limited number of tools, and fast automated checking are specified. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's rethinking of forms and methods of training, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  3. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  4. A low-cost system for graphical process monitoring with colour video symbol display units

    International Nuclear Information System (INIS)

    Grauer, H.; Jarsch, V.; Mueller, W.

    1977-01-01

    A system for computer-controlled graphic process supervision using colour symbol video displays is described. It has the following characteristics: a compact unit with no external memory for image storage; a problem-oriented, simple descriptive cut to the process program; no restriction on the graphical representation of process variables; and independence from computer and display, achieved through the implementation of colours and parameterized code creation for the display. (WB) [de

  5. A transformation with symbolic computation and abundant new soliton-like solutions for the (1+2)-dimensional generalized Burgers equation

    International Nuclear Information System (INIS)

    Yan Zhenya

    2002-01-01

    In this paper, an auto-Baecklund transformation is presented for the generalized Burgers equation $u_t + u_{xy} + \alpha u u_y + \alpha u_x \partial_x^{-1} u_y = 0$ ($\alpha$ is a constant) by using an ansatz and symbolic computation. Particularly, this equation is transformed into a (1+2)-dimensional generalized heat equation $\omega_t + \omega_{xy} = 0$ by the Cole-Hopf transformation. This shows that this equation is C-integrable. Abundant types of new soliton-like solutions are obtained by virtue of the obtained transformation. These solutions contain n-soliton-like solutions, shock wave solutions and singular soliton-like solutions, which may be of important significance in explaining some physical phenomena. The approach can also be extended to other types of nonlinear partial differential equations in mathematical physics
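    As a small symbolic-computation check of the Cole-Hopf idea mentioned above, the SymPy fragment below verifies the classical (1+1)-dimensional case, in which u = -2*nu*phi_x/phi turns the ordinary Burgers equation into the heat equation; the (1+2)-dimensional transformation of the paper is analogous but not reproduced here.

```python
# Symbolic check (SymPy) of the classical Cole-Hopf substitution: with phi an
# explicit solution of the heat equation phi_t = nu*phi_xx, the field
# u = -2*nu*phi_x/phi satisfies the ordinary Burgers equation
# u_t + u*u_x = nu*u_xx.
import sympy as sp

x, t, nu, k = sp.symbols('x t nu k', positive=True)

phi = 1 + sp.exp(k * x + nu * k**2 * t)       # solves phi_t = nu*phi_xx
u = -2 * nu * sp.diff(phi, x) / phi           # Cole-Hopf substitution

residual = sp.diff(u, t) + u * sp.diff(u, x) - nu * sp.diff(u, x, 2)
print(sp.simplify(residual))                  # -> 0
```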

  6. Multiple-Symbol Decision-Feedback Space-Time Differential Decoding in Fading Channels

    Directory of Open Access Journals (Sweden)

    Wang Xiaodong

    2002-01-01

    Full Text Available Space-time differential coding (STDC is an effective technique for exploiting transmitter diversity while it does not require the channel state information at the receiver. However, like conventional differential modulation schemes, it exhibits an error floor in fading channels. In this paper, we develop an STDC decoding technique based on multiple-symbol detection and decision-feedback, which makes use of the second-order statistic of the fading processes and has a very low computational complexity. This decoding method can significantly lower the error floor of the conventional STDC decoding algorithm, especially in fast fading channels. The application of the proposed multiple-symbol decision-feedback STDC decoding technique in orthogonal frequency-division multiplexing (OFDM system is also discussed.

  7. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service (SaaS), software asset management. Abbreviations include PaaS (Platform as a Service), SaaS (Software as a Service), SAM (Software Asset Management), SMS (System Management Server), and SEWP (Solutions for Enterprise Wide...). With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  8. Symbolic and Nonsymbolic Equivalence Tasks: The Influence of Symbols on Students with Mathematics Difficulty

    Science.gov (United States)

    Driver, Melissa K.; Powell, Sarah R.

    2015-01-01

    Students often experience difficulty with attaching meaning to mathematics symbols. Many students react to symbols, such as the equal sign, as a command to "do something" or "write an answer" without reflecting upon the proper relational meaning of the equal sign. One method for assessing equal-sign understanding is through…

  9. Seeing red? : The agency of computer software in the production and management of students’ school absences

    OpenAIRE

    Bodén, Linnea

    2013-01-01

    An increasing number of Swedish municipalities use digital software to manage the registration of students’ school absences. The software is regarded as a problem-solving tool to make registration more efficient, but its effects on the educational setting have been largely neglected. Focusing on an event with two students from a class of 11-year-olds, the aim of the paper is to explore schools’ common uses of computer software for registering absence in order to understand how materialities –...

  10. Nonlinear Algorithms for Channel Equalization and Map Symbol Detection.

    Science.gov (United States)

    Giridhar, K.

    The transfer of information through a communication medium invariably results in various kinds of distortion to the transmitted signal. In this dissertation, a feed-forward neural network-based equalizer and a family of maximum a posteriori (MAP) symbol detectors are proposed for signal recovery in the presence of intersymbol interference (ISI) and additive white Gaussian noise. The proposed neural network-based equalizer employs a novel bit-mapping strategy to handle multilevel data signals in an equivalent bipolar representation. It uses a training procedure to learn the channel characteristics, and at the end of training, the multilevel symbols are recovered from the corresponding inverse bit-mapping. When the channel characteristics are unknown and no training sequences are available, blind estimation of the channel (or its inverse) and simultaneous data recovery is required. Convergence properties of several existing Bussgang-type blind equalization algorithms are studied through computer simulations, and a unique gain-independent approach is used to obtain a fair comparison of their rates of convergence. Although simple to implement, the slow convergence of these Bussgang-type blind equalizers makes them unsuitable for many high data-rate applications. Rapidly converging blind algorithms based on the principle of MAP symbol-by-symbol detection are proposed, which adaptively estimate the channel impulse response (CIR) and simultaneously decode the received data sequence. Assuming a linear and Gaussian measurement model, the near-optimal blind MAP symbol detector (MAPSD) consists of a parallel bank of conditional Kalman channel estimators, where the conditioning is done on each possible data subsequence that can convolve with the CIR. This algorithm is also extended to the recovery of convolutionally encoded waveforms in the presence of ISI. Since the complexity of the MAPSD algorithm increases exponentially with the length of the assumed CIR, a suboptimal

  11. Consumers recall and recognition for brand symbols

    OpenAIRE

    Subhani, Muhammad Imtiaz; Hasan, Syed Akif; Osman, Ms. Amber

    2012-01-01

    Brand symbols are important for any brand, helping consumers remember it at the point of purchase. In advertising, different techniques are used to capture consumers’ attention, largely through brand recall and recognition. This research examines the brand symbol concept and determines whether symbols play an important role in creating a differential impact relative to other brands. Secondly, it also considers whether a brand symbol is the cause of creating positive associatio...

  12. Integrating Symbolic and Statistical Methods for Testing Intelligent Systems Applications to Machine Learning and Computer Vision

    Energy Technology Data Exchange (ETDEWEB)

    Jha, Sumit Kumar [University of Central Florida, Orlando; Pullum, Laura L [ORNL; Ramanathan, Arvind [ORNL

    2016-01-01

    Embedded intelligent systems ranging from tiny implantable biomedical devices to large swarms of autonomous unmanned aerial systems are becoming pervasive in our daily lives. While we depend on the flawless functioning of such intelligent systems, and often take their behavioral correctness and safety for granted, it is notoriously difficult to generate test cases that expose subtle errors in the implementations of machine learning algorithms. Hence, the validation of intelligent systems is usually achieved by studying their behavior on representative data sets, using methods such as cross-validation and bootstrapping. In this paper, we present a new testing methodology for studying the correctness of intelligent systems. Our approach uses symbolic decision procedures coupled with statistical hypothesis testing to expose errors in such implementations. We also use our algorithm to analyze the robustness of a human detection algorithm built using the OpenCV open-source computer vision library. We show that the human detection implementation can fail to detect humans in perturbed video frames even when the perturbations are so small that the corresponding frames look identical to the naked eye.
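
    As a hedged sketch of the robustness probe described above (not the authors' actual test harness), the snippet below runs OpenCV's stock HOG people detector on a frame and on an imperceptibly perturbed copy, then compares the detections. The file name and noise magnitude are assumptions for illustration.

```python
import cv2
import numpy as np

# Sketch only: probe the robustness of OpenCV's default HOG people detector
# by comparing detections on a clean frame and a slightly perturbed copy.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("frame.png")                        # hypothetical video frame
boxes_clean, _ = hog.detectMultiScale(frame)           # detections on the clean frame

# Add small integer noise (imperceptible to the eye), clipped to valid pixel values.
noise = np.random.randint(-2, 3, frame.shape, dtype=np.int16)
perturbed = np.clip(frame.astype(np.int16) + noise, 0, 255).astype(np.uint8)
boxes_pert, _ = hog.detectMultiScale(perturbed)        # detections on the perturbed frame

# A mismatch in the number of detected people flags a potential robustness failure.
print(len(boxes_clean), "people in clean frame,", len(boxes_pert), "in perturbed frame")
```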

  13. Self-symbols as implicit motivators

    NARCIS (Netherlands)

    Holland, R.W.; Wennekers, A.M.; Bijlstra, G.; Jongenelen, M.M.; van Knippenberg, A.

    2009-01-01

    The present research explored the nonconscious motivational influence of self-symbols. In line with recent findings on the motivational influence of positive affect, we hypothesized that positive affect associated with self-symbols may boost motivation. In Study 1 people drank more of a beverage

  14. Self-symbols as implicit motivators

    NARCIS (Netherlands)

    Holland, R.W.; Wennekers, A.M.; Bijlstra, G.; Jongenelen, M.M.; Knippenberg, A.F.M. van

    2009-01-01

    The present research explored the nonconscious motivational influence of self-symbols. In line with recent findings on the motivational influence of positive affect, we hypothesized that positive affect associated with self-symbols may boost motivation. In Study 1 people drank more of a beverage

  15. The symbolic economy of drugs.

    Science.gov (United States)

    Lentacker, Antoine

    2016-02-01

    This essay reviews four recent studies representing a new direction in the history of pharmaceuticals and pharmaceutical science. To this end, it introduces the notion of a symbolic economy of drugs, defined as the production, circulation, and reception of signs that convey information about drugs and establish trust in them. Each of the studies under review focuses on one key signifier in this symbolic economy, namely the brand, the patent, the clinical trial, and the drug itself. Drawing on Pierre Bourdieu's theory of the economy of symbolic goods, I conceptualize these signifiers as symbolic assets, that is, as instruments of communication and credit, delivering knowledge, carrying value, and producing authority. The notion of a symbolic economy is offered with a threefold intention. First, I introduce it in order to highlight the implications of historical and anthropological work for a broader theory of the economy of drugs, thus suggesting a language for interdisciplinary conversations in the study of pharmaceuticals. Second, I deploy it in an attempt to emphasize the contributions of the recent scholarship on drugs to a critical understanding of our own contemporary ways of organizing access to drugs and information about drugs. Finally, I suggest ways in which it might be of use to scholars of other commodities and technologies.

  16. Symbolic computation of the Hartree-Fock energy from a chiral EFT three-nucleon interaction at N2LO

    International Nuclear Information System (INIS)

    Gebremariam, B.; Bogner, S.K.; Duguet, T.

    2010-01-01

    We present the first of a two-part Mathematica notebook collection that implements a symbolic approach for the application of the density matrix expansion (DME) to the Hartree-Fock (HF) energy from a chiral effective field theory (EFT) three-nucleon interaction at N2LO. The final output from the notebooks is a Skyrme-like energy density functional that provides a quasi-local approximation to the non-local HF energy. In this paper, we discuss the derivation of the HF energy and its simplification in terms of the scalar/vector-isoscalar/isovector parts of the one-body density matrix. Furthermore, a set of steps is described and illustrated on how to extend the approach to other three-nucleon interactions. Program summary: Program title: SymbHFNNN; Catalogue identifier: AEGC_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGC_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 96 666; No. of bytes in distributed program, including test data, etc.: 378 083; Distribution format: tar.gz; Programming language: Mathematica 7.1; Computer: Any computer running Mathematica 6.0 and later versions; Operating system: Windows XP, Linux/Unix; RAM: 256 MB; Classification: 5, 17.16, 17.22; Nature of problem: The calculation of the HF energy from the chiral EFT three-nucleon interaction at N2LO involves tremendous spin-isospin algebra. The problem is compounded by the need to eventually obtain a quasi-local approximation to the HF energy, which requires the HF energy to be expressed in terms of scalar/vector-isoscalar/isovector parts of the one-body density matrix. The Mathematica notebooks discussed in this paper solve the latter issue. Solution method: The HF energy from the chiral EFT three-nucleon interaction at N2LO is cast into a form suitable for an automatic…
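
    The notebooks themselves are written in Mathematica; purely as an illustration of the kind of spin algebra such symbolic tools automate, the SymPy toy below verifies the Pauli-matrix trace identity Tr(σ_i σ_j) = 2 δ_ij. The identity is standard, but the code is an illustrative assumption of ours and not part of the SymbHFNNN package.

```python
import sympy as sp

# Illustrative toy: verify Tr(sigma_i sigma_j) = 2*delta_ij symbolically.
# This only hints at the far larger spin-isospin algebra the notebooks handle.
sigma = [
    sp.Matrix([[0, 1], [1, 0]]),          # sigma_x
    sp.Matrix([[0, -sp.I], [sp.I, 0]]),   # sigma_y
    sp.Matrix([[1, 0], [0, -1]]),         # sigma_z
]

for i in range(3):
    for j in range(3):
        trace = sp.simplify((sigma[i] * sigma[j]).trace())
        assert trace == (2 if i == j else 0)

print("Tr(sigma_i sigma_j) = 2*delta_ij verified symbolically")
```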

  17. Exploring a business to business recurring revenue framework for the delivery of software as a service through a cloud computing channel

    OpenAIRE

    Dempsey, David

    2015-01-01

    Cloud Computing (CC) is creating a new paradigm for the distribution of computer software applications. Within this context CC enabled Software as a Service (SaaS) fundamentally changes the revenue expectations and business model for the application software industry. This study considers the revenue expectation of the CC industry and its dependency on renewal subscriptions, while the study focuses on SaaS in the Business-to-Business (B2B) domain, delivered through the CC chann...

  18. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  19. Recurrent autoassociative networks and holistic computations

    NARCIS (Netherlands)

    Stoianov; Amari, SI; Giles, CL; Gori, M; Piuri

    2000-01-01

    The paper presents an experimental study of holistic computations over distributed representations (DRs) of sequences developed by the Recurrent Autoassociative Networks (RAN). Three groups of holistic operators are studied: extracting symbols at fixed position, extracting symbols at a variable…

  20. 7 CFR 29.3013 - Combination color symbols.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 Combination color symbols. 29.3013 Section 29.3013..., Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD CONTAINER... Type 93) § 29.3013 Combination color symbols. As applied to Burley, combination color symbols are as...