WorldWideScience

Sample records for sampling-based computational strategy

  1. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
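
To make the general idea concrete (this is not the authors' algorithm, just a minimal sketch of sampling-based propagation of an evidence-theory input description), the Python fragment below pushes interval-valued focal elements with basic probability assignments through a toy model by sampling, and accumulates belief and plausibility for an output event. The model, focal elements, and threshold are all invented for illustration.

```python
# Minimal sketch of sampling-based propagation of an evidence-theory
# (Dempster-Shafer) input description through a model; the model f and
# the focal elements are made up for illustration.
import random

def f(x, y):
    """Toy model prediction; stands in for an expensive simulation."""
    return x ** 2 + y

# Epistemic inputs: focal elements (intervals) with basic probability assignments.
focal_x = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
focal_y = [((1.0, 2.0), 0.7), ((0.0, 3.0), 0.3)]

def output_range(fx, fy, n=1000):
    """Approximate min/max of f over a joint focal element by sampling."""
    vals = [f(random.uniform(*fx), random.uniform(*fy)) for _ in range(n)]
    return min(vals), max(vals)

threshold = 3.0          # event of interest: f <= threshold
belief = plausibility = 0.0
for (fx, mx) in focal_x:
    for (fy, my) in focal_y:
        lo, hi = output_range(fx, fy)
        m = mx * my                       # joint basic probability assignment
        if hi <= threshold:
            belief += m                   # focal element entirely inside event
        if lo <= threshold:
            plausibility += m             # focal element intersects event

print(f"Bel(f <= {threshold}) ~= {belief:.2f}, Pl(f <= {threshold}) ~= {plausibility:.2f}")
```

The cost driver the abstract refers to is visible here: every joint focal element requires its own set of model evaluations, which is exactly what a more sophisticated sampling strategy tries to reduce.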

  2. Development of a Sampling-Based Global Sensitivity Analysis Workflow for Multiscale Computational Cancer Models

    Science.gov (United States)

    Wang, Zhihui; Deisboeck, Thomas S.; Cristini, Vittorio

    2014-01-01

    There are two challenges that researchers face when performing global sensitivity analysis (GSA) on multiscale in silico cancer models. The first is increased computational intensity, since a multiscale cancer model generally takes longer to run than does a scale-specific model. The second problem is the lack of a best GSA method that fits all types of models, which implies that multiple methods and their sequence need to be taken into account. In this article, we therefore propose a sampling-based GSA workflow consisting of three phases – pre-analysis, analysis, and post-analysis – by integrating Monte Carlo and resampling methods with the repeated use of analysis of variance (ANOVA); we then exemplify this workflow using a two-dimensional multiscale lung cancer model. By accounting for all parameter rankings produced by multiple GSA methods, a summarized ranking is created at the end of the workflow based on the weighted mean of the rankings for each input parameter. For the cancer model investigated here, this analysis reveals that ERK, a downstream molecule of the EGFR signaling pathway, has the most important impact on regulating both the tumor volume and expansion rate in the algorithm used. PMID:25257020
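
As a concrete illustration of the final aggregation step described above (the weighted mean of per-parameter rankings), the sketch below combines hypothetical rankings from three GSA methods with hypothetical weights; the method names, weights, and parameter set are placeholders, not the study's actual values.

```python
# Sketch of the summarization step: combine parameter rankings from several
# GSA methods into one summarized ranking via a weighted mean of the ranks.
# Method names, weights, and rankings are hypothetical.
rankings = {
    "Sobol":  {"ERK": 1, "EGFR": 2, "MEK": 3},
    "PRCC":   {"ERK": 1, "MEK": 2, "EGFR": 3},
    "Morris": {"EGFR": 1, "ERK": 2, "MEK": 3},
}
weights = {"Sobol": 0.5, "PRCC": 0.3, "Morris": 0.2}

params = rankings["Sobol"].keys()
summary = {
    p: sum(weights[m] * rankings[m][p] for m in rankings) / sum(weights.values())
    for p in params
}
# Lower weighted-mean rank = more influential parameter.
for p, r in sorted(summary.items(), key=lambda kv: kv[1]):
    print(f"{p}: weighted mean rank = {r:.2f}")
```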

  3. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  4. Evaluation strategies for monadic computations

    Directory of Open Access Journals (Sweden)

    Tomas Petricek

    2012-02-01

Monads have become a powerful tool for structuring effectful computations in functional programming, because they make the order of effects explicit. When translating pure code to a monadic version, we need to specify the evaluation order explicitly. Two standard translations give call-by-value and call-by-name semantics. The resulting programs have different structure and types, which makes revisiting the choice difficult. In this paper, we translate pure code to monadic form using an additional operation malias that abstracts out the evaluation strategy. The malias operation is based on computational comonads; we use a categorical framework to specify the laws that the operation is required to satisfy. For any monad, we show implementations of malias that give call-by-value and call-by-name semantics. Although we do not give call-by-need semantics for all monads, we show how to turn certain monads into an extended monad with call-by-need semantics, which partly answers an open question. Moreover, using our unified translation, it is possible to change the evaluation strategy of functional code translated to the monadic form without changing its structure or types.

  5. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated according to retention time and accurate mass to identify differentially expressed proteins. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
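
The decoupling idea (matching LC-MS quantification features to LC-MS/MS identifications by retention time and accurate mass) can be sketched as a simple tolerance match; the tolerances, feature list, and peptide identifications below are hypothetical, not the study's data.

```python
# Sketch of the quantitation/identification decoupling idea: match LC-MS
# quantification features to LC-MS/MS identifications by retention time and
# accurate mass. Tolerances and records are hypothetical.
RT_TOL_MIN = 0.5        # retention-time tolerance (minutes)
MASS_TOL_PPM = 10.0     # mass tolerance (parts per million)

quant_features = [  # (retention time, m/z, intensity) from the LC-MS runs
    (21.3, 785.842, 1.2e6),
    (34.7, 652.331, 4.5e5),
]
identifications = [  # (retention time, m/z, peptide) from the pooled LC-MS/MS run
    (21.1, 785.845, "LVNELTEFAK"),
    (40.2, 512.290, "HLVDEPQNLIK"),
]

def matches(feature, ident):
    rt_ok = abs(feature[0] - ident[0]) <= RT_TOL_MIN
    ppm = abs(feature[1] - ident[1]) / ident[1] * 1e6
    return rt_ok and ppm <= MASS_TOL_PPM

for feat in quant_features:
    hits = [ident[2] for ident in identifications if matches(feat, ident)]
    print(feat, "->", hits or "unidentified")
```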

  6. A Sample-Based Forest Monitoring Strategy Using Landsat, AVHRR and MODIS Data to Estimate Gross Forest Cover Loss in Malaysia between 1990 and 2005

    Directory of Open Access Journals (Sweden)

    Peter Potapov

    2013-04-01

Insular Southeast Asia is a hotspot of humid tropical forest cover loss. A sample-based monitoring approach quantifying forest cover loss from Landsat imagery was implemented to estimate gross forest cover loss for two eras, 1990–2000 and 2000–2005. For each time interval, a probability sample of 18.5 km × 18.5 km blocks was selected, and pairs of Landsat images acquired per sample block were interpreted to quantify forest cover area and gross forest cover loss. Stratified random sampling was implemented for 2000–2005, with MODIS-derived forest cover loss used to define the strata. A probability proportional to x (πpx) design was implemented for 1990–2000, with AVHRR-derived forest cover loss used as the x variable to increase the likelihood of including forest loss area in the sample. The estimated annual gross forest cover loss for Malaysia was 0.43 Mha/yr (SE = 0.04) during 1990–2000 and 0.64 Mha/yr (SE = 0.055) during 2000–2005. Our use of the πpx sampling design represents a first practical trial of this design for sampling satellite imagery. Although the design performed adequately in this study, a thorough comparative investigation of the πpx design relative to other sampling strategies is needed before general design recommendations can be put forth.
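
The πpx idea (selecting blocks with probability proportional to an auxiliary loss indicator, then expanding the Landsat-measured loss by the inverse selection probability) can be sketched as follows. The block data, sample size, and the with-replacement Hansen-Hurwitz-style estimator used here are illustrative assumptions, not the study's actual design or estimator.

```python
# Sketch of probability-proportional-to-x (pps) block selection and a
# Hansen-Hurwitz-style estimate of total forest cover loss; the block
# data and sample size are hypothetical, not the study's.
import random

blocks = [  # (block id, x = AVHRR-indicated loss, y = Landsat-measured loss)
    (i, x, x * random.uniform(0.8, 1.2))
    for i, x in enumerate(random.uniform(0.0, 50.0) for _ in range(500))
]
total_x = sum(b[1] for b in blocks)
n = 30

# Draw n blocks with replacement, probability proportional to x.
sample = random.choices(blocks, weights=[b[1] for b in blocks], k=n)

# Hansen-Hurwitz estimator of the population total of y and its standard error.
ratios = [b[2] / (b[1] / total_x) for b in sample]
est_total = sum(ratios) / n
var = sum((r - est_total) ** 2 for r in ratios) / (n * (n - 1))
print(f"Estimated total loss: {est_total:.1f}, SE: {var ** 0.5:.1f}")
```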

  7. Integrating Computer-Mediated Communication Strategy Instruction

    Science.gov (United States)

    McNeil, Levi

    2016-01-01

    Communication strategies (CSs) play important roles in resolving problematic second language interaction and facilitating language learning. While studies in face-to-face contexts demonstrate the benefits of communication strategy instruction (CSI), there have been few attempts to integrate computer-mediated communication and CSI. The study…

  8. Precise fixpoint computation through strategy iteration

    DEFF Research Database (Denmark)

    Gawlitza, Thomas; Seidl, Helmut

    2007-01-01

We present a practical algorithm for computing least solutions of systems of equations over the integers with addition, multiplication with positive constants, maximum and minimum. The algorithm is based on strategy iteration. Its run-time (w.r.t. the uniform cost measure) is independent of the s…

  9. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  10. Parallel Computing Strategies for Irregular Algorithms

    Science.gov (United States)

    Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.
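
One of the load-balancing ideas mentioned above can be made concrete for an irregular sparse-matrix computation: partition rows so that each worker receives roughly the same number of nonzeros (work) rather than the same number of rows. The matrix size, worker count, and the simple contiguous-block heuristic below are assumptions for illustration, not the paper's actual partitioner.

```python
# Sketch of work-based partitioning for an irregular sparse-matrix kernel:
# assign contiguous row blocks so that each worker gets a roughly equal
# share of the nonzeros. The matrix and worker count are made up.
import random

n_rows, n_workers = 20, 4
nnz_per_row = [random.randint(1, 50) for _ in range(n_rows)]  # irregular work

target = sum(nnz_per_row) / n_workers
partitions, current, load = [[] for _ in range(n_workers)], 0, 0
for row, nnz in enumerate(nnz_per_row):
    # Move to the next worker once the running load reaches its share,
    # keeping rows contiguous (a simple 1-D block partitioning heuristic).
    if load >= target and current < n_workers - 1:
        current, load = current + 1, 0
    partitions[current].append(row)
    load += nnz

for w, rows in enumerate(partitions):
    print(f"worker {w}: rows {rows}, nonzeros {sum(nnz_per_row[r] for r in rows)}")
```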

  11. Computer games: Apprehension of learning strategies

    Directory of Open Access Journals (Sweden)

    Carlos Antonio Bruno da Silva

    2003-12-01

Computer games, and especially video games, have proved to be an important trend in Brazilian children's play. They are part of the playful culture that associates modern technology with traditional play while preserving the importance of the latter. Based on the ideas of Vygotsky and Chadwick, this work studies alternatives for the use of video games by occupational therapists, educators, or parents, aiming to prevent learning difficulties through the apprehension of learning strategies. Sixty children were investigated using a dialectic, descriptive qualitative/quantitative approach; a semi-structured interview, direct observation, and a focus group were applied to this intentional sample. The 60 children played in 3 video game rental shops in Fortaleza-CE and Quixadá-CE; 30 were aged 4 to 6 years and the other 30 were aged 7 and 8. Results indicate that requiring the video game to be played in groups favors the apprehension of learning and affective strategies, processing, and meta-cognition. Video games can therefore be considered an excellent resource for preventing learning difficulties, connecting children to their reality.

  12. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  13. Replacement strategy for obsolete plant computers

    International Nuclear Information System (INIS)

    Schaefer, J.P.

    1985-01-01

The plant computers of the first generation of larger nuclear power plants are reaching the end of their useful lifetime with respect to the hardware. The software alone would not justify a system exchange, but new tasks for the supervisory computer system, questions about the availability of maintenance personnel and spare parts, and the demand for improved operating procedures for the computer users have prompted consideration of how to exchange a computer system in a nuclear power plant without extending plant outage times due to the exchange work. In the Federal Republic of Germany the planning phase of such backfitting projects is well under way, and some projects are about to be implemented. The basis for these backfitting projects is a modular supervisory computer concept designated for the new line of KWU PWRs. The main characteristic of this computer system is the splitting of the system into a data acquisition level and a data processing level. This principle allows an extension of the processing level or even repeated replacements of the processing computers. With the existing computer system still in operation, the new system can be installed in a step-by-step procedure. As soon as the first of the redundant process computers of the data processing level is in operation and the data link to the data acquisition computers is established, the old computer system can be taken out of service. Then the back-up processing computer can be commissioned to complete the new system. (author)

  14. Cloud computing and ROI a new framework for it strategy

    CERN Document Server

    Mohapatra, Sanjay

    2014-01-01

This book develops an IT strategy for cloud computing that helps businesses evaluate their readiness for cloud services and calculate the ROI. The framework provided helps reduce the risks involved in transitioning from a traditional "on site" IT strategy to virtual "cloud computing." Since the advent of cloud computing, many organizations have made substantial gains implementing this innovation. Cloud computing allows companies to focus more on their core competencies, as IT enablement is taken care of through cloud services. Cloud Computing and ROI includes case studies covering the retail, automobile and food processing industries. Each of these case studies has successfully implemented the cloud computing framework, and their strategies are explained. As cloud computing may not be ideal for all businesses, criteria are also offered to help determine whether this strategy should be adopted.

  15. Computer Ethics Topics and Teaching Strategies.

    Science.gov (United States)

    DeLay, Jeanine A.

    An overview of six major issues in computer ethics is provided in this paper: (1) unauthorized and illegal database entry, surveillance and monitoring, and privacy issues; (2) piracy and intellectual property theft; (3) equity and equal access; (4) philosophical implications of artificial intelligence and computer rights; (5) social consequences…

  16. Computer-based learning: games as an instructional strategy.

    Science.gov (United States)

    Blake, J; Goodman, J

    1999-01-01

    Games are a creative teaching strategy that enhances learning and problem solving. Gaming strategies are being used by the authors to make learning interesting, stimulating and fun. This article focuses on the development and implementation of computer games as an instructional strategy. Positive outcomes have resulted from the use of games in the classroom.

  17. Cloud Computing:Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  18. Food Security Strategy Based on Computer Innovation

    OpenAIRE

    Ruihui Mu

    2015-01-01

A case analysis was carried out to identify innovative strategies for food security at the Oriental Hotel, which voluntarily implements food safety control. The investigation of the food security strategy and the reasons for its use drew on multiple data sources, including interviews with key decision makers in the hotel's accommodation and catering operations and document reviews, and the business environment was observed. The findings suggest that addressing food security involves not only food control…

  19. ANL statement of site strategy for computing workstations

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O' Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general-purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  20. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

This paperback edition is a reprint of the 2001 Springer edition. The book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for master's or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  1. Investigation of measuring strategies in computed tomography

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2011-01-01

Computed tomography entered the industrial world in the 1980s as a technique for non-destructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates…

  2. The Strategy Blueprint: A Strategy Process Computer-Aided Design Tool

    OpenAIRE

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based tool, known as ‘the Strategy Blueprint’, consisting of a combination of nine strategy techniques, which can help organizations define the most suitable strategy, based on the internal and external f...

  3. Computer Solution to the Game of Pure Strategy

    Directory of Open Access Journals (Sweden)

    Laurent Bartholdi

    2012-11-01

We numerically solve the classical "Game of Pure Strategy" using linear programming. We notice an intricate even-odd behaviour in the results of our computations that seems to encourage odd or maximal bids.
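
The building block of such a solver, computing the value and an optimal mixed strategy of a single zero-sum matrix game by linear programming, can be sketched as below. The payoff matrix is a toy example and the formulation is the standard zero-sum LP, not the paper's GOPS-specific recursion over card states.

```python
# Sketch of the core LP building block: value and optimal mixed strategy of
# a zero-sum matrix game. The payoff matrix is a toy example.
import numpy as np
from scipy.optimize import linprog

A = np.array([[0, -1,  2],      # payoff to the row player
              [1,  0, -1],
              [-2, 1,  0]])
m, n = A.shape

# Variables: x (row player's mixed strategy, length m) and v (game value).
# Maximize v  <=>  minimize -v, subject to A.T @ x >= v and sum(x) = 1.
c = np.concatenate([np.zeros(m), [-1.0]])
A_ub = np.hstack([-A.T, np.ones((n, 1))])          # v - (A.T @ x) <= 0
b_ub = np.zeros(n)
A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
b_eq = [1.0]
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[m]
print("optimal mixed strategy:", np.round(x, 3), "game value:", round(v, 3))
```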

  4. Scalable Strategies for Computing with Massive Data

    Directory of Open Access Journals (Sweden)

    Michael Kane

    2013-11-01

This paper presents two complementary statistical computing frameworks that address challenges in parallel processing and the analysis of massive data. First, the foreach package allows users of the R programming environment to define parallel loops that may be run sequentially on a single machine, in parallel on a symmetric multiprocessing (SMP) machine, or in cluster environments without platform-specific code. Second, the bigmemory package implements memory- and file-mapped data structures that provide (a) access to arbitrarily large data while retaining a look and feel that is familiar to R users and (b) data structures that are shared across processor cores in order to support efficient parallel computing techniques. Although these packages may be used independently, this paper shows how they can be used in combination to address challenges that have effectively been beyond the reach of researchers who lack specialized software development skills or expensive hardware.

  5. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  6. Teaching Concept Mapping and University Level Study Strategies Using Computers.

    Science.gov (United States)

    Mikulecky, Larry; And Others

    1989-01-01

    Assesses the utility and effectiveness of three interactive computer programs and associated print materials in instructing and modeling for undergraduates how to comprehend and reconceptualize scientific textbook material. Finds that "how to" reading strategies can be taught via computer and transferred to new material. (RS)

  7. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  8. Strategies for MCMC computation inquantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánēz-Escriche, Noelia; Sorensen, Daniel

… another extension of the linear mixed model, introducing genetic random effects that influence the log residual variances of the observations, thereby producing a genetically structured variance heterogeneity. Considerable computational problems arise when abandoning the standard linear mixed model. Maximum … the various algorithms in the context of the heterogeneous variance model. Apart from being a model of great interest in its own right, this model has proven to be a hard test for MCMC methods. We compare the performances of the different algorithms when applied to three real datasets which differ markedly … results of applying two MCMC schemes to data sets with pig litter sizes, rabbit litter sizes, and snail weights. Some concluding remarks are given in Section 5.

  9. Three-dimensional protein structure prediction: Methods and computational strategies.

    Science.gov (United States)

    Dorn, Márcio; E Silva, Mariel Barbachan; Buriol, Luciana S; Lamb, Luis C

    2014-10-12

A long-standing problem in structural bioinformatics is to determine the three-dimensional (3-D) structure of a protein when only a sequence of amino acid residues is given. Many computational methodologies and algorithms have been proposed as a solution to the 3-D Protein Structure Prediction (3-D-PSP) problem. These methods can be divided into four main classes: (a) first principle methods without database information; (b) first principle methods with database information; (c) fold recognition and threading methods; and (d) comparative modeling methods and sequence alignment strategies. Deterministic computational techniques, optimization techniques, data mining and machine learning approaches are typically used in the construction of computational solutions for the PSP problem. Our main goal with this work is to review the methods and computational strategies that are currently used in 3-D protein prediction. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MENChaoguang; ZUODecheng; YANGXiaozong

    2005-01-01

Adaptive checkpointing is an efficient recovery scheme that is well suited to mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs during certain periods. In this paper, the issues that lead to system inconsistency are first discussed, and then a new adaptive strategy that can recover the system to a correct consistent state is proposed. Our algorithm improves system recovery performance because only the failed process needs to roll back, using logging.

  11. The Strategy Blueprint : A Strategy Process Computer-Aided Design Tool

    NARCIS (Netherlands)

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based

  12. Efficient Strategy Computation in Zero-Sum Asymmetric Repeated Games

    KAUST Repository

    Li, Lichun

    2017-03-06

Zero-sum asymmetric games model decision making scenarios involving two competing players who have different information about the game being played. A particular case is that of nested information, where one (informed) player has superior information over the other (uninformed) player. This paper considers the case of nested information in repeated zero-sum games and studies the computation of strategies for both the informed and uninformed players for finite-horizon and discounted infinite-horizon nested information games. For finite-horizon settings, we exploit the fact that for both players, the security strategy, and also the opponent's corresponding best response, depend only on the informed player's history of actions. Using this property, we refine the sequence form, and formulate an LP computation of player strategies that is linear in the size of the uninformed player's action set. For the infinite-horizon discounted game, we construct LP formulations to compute the approximated security strategies for both players, and provide a bound on the performance difference between the approximated security strategies and the security strategies. Finally, we illustrate the results on a network interdiction game between an informed system administrator and an uninformed intruder.

  13. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
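
The manager-worker strategy mentioned in task (2) can be sketched with a dynamic task queue, where idle workers pull the next block of work on demand; the simulated work function, block count, and process count below are placeholders, not the TEAM solver itself.

```python
# Minimal sketch of a manager-worker strategy with dynamic load balancing:
# a shared task queue hands out work units so faster workers naturally take
# more tasks. The "solve_block" work is simulated.
import multiprocessing as mp
import random
import time

def solve_block(block_id):
    time.sleep(random.uniform(0.01, 0.05))   # stand-in for solving one grid block
    return block_id, mp.current_process().name

if __name__ == "__main__":
    blocks = range(16)                        # work units (e.g., grid blocks)
    with mp.Pool(processes=4) as pool:        # worker processes
        # imap_unordered pulls tasks on demand: idle workers grab the next block.
        for block_id, worker in pool.imap_unordered(solve_block, blocks):
            print(f"block {block_id} solved by {worker}")
```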

  14. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  15. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  16. CLOUD COMPUTING ADOPTION STRATEGIES AT PT TASPEN INDONESIA, Tbk

    Directory of Open Access Journals (Sweden)

    Julirzal Sarmedy

    2014-10-01

PT. Taspen, an Indonesian institution, is responsible for managing social insurance programs for civil servants. With branch offices and business partners geographically dispersed throughout Indonesia, information technology is very important for supporting its business processes. Cloud computing is a model of information technology services that could potentially increase the effectiveness and efficiency of PT. Taspen's information system. This study examines the conditions at PT. Taspen for adopting cloud computing in its information system, using the Technology-Organization-Environment framework, Diffusion of Innovation theory, and the Partial Least Squares method. The organizational factor is the most dominant one for PT. Taspen in adopting cloud computing. Based on these findings, a SWOT analysis and TOWS matrix are performed, which in this study recommend implementing a strategy model of cloud computing services that is private and introduced gradually.

  17. Computational assessment of visual search strategies in volumetric medical images.

    Science.gov (United States)

    Wen, Gezheng; Aizenman, Avigael; Drew, Trafton; Wolfe, Jeremy M; Haygood, Tamara Miner; Markey, Mia K

    2016-01-01

    When searching through volumetric images [e.g., computed tomography (CT)], radiologists appear to use two different search strategies: "drilling" (restrict eye movements to a small region of the image while quickly scrolling through slices), or "scanning" (search over large areas at a given depth before moving on to the next slice). To computationally identify the type of image information that is used in these two strategies, 23 naïve observers were instructed with either "drilling" or "scanning" when searching for target T's in 20 volumes of faux lung CTs. We computed saliency maps using both classical two-dimensional (2-D) saliency, and a three-dimensional (3-D) dynamic saliency that captures the characteristics of scrolling through slices. Comparing observers' gaze distributions with the saliency maps showed that search strategy alters the type of saliency that attracts fixations. Drillers' fixations aligned better with dynamic saliency and scanners with 2-D saliency. The computed saliency was greater for detected targets than for missed targets. Similar results were observed in data from 19 radiologists who searched five stacks of clinical chest CTs for lung nodules. Dynamic saliency may be superior to the 2-D saliency for detecting targets embedded in volumetric images, and thus "drilling" may be more efficient than "scanning."

  18. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

Today, cloud computing has become a key technology for the online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time, and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which shows the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.

  19. Computing security strategies in finite horizon repeated Bayesian games

    KAUST Repository

    Lichun Li

    2017-07-10

This paper studies security strategies in two-player zero-sum repeated Bayesian games with finite horizon. In such games, each player has a private type which is independently chosen according to a publicly known a priori probability. Players' types are fixed all through the game. The game is played for finite stages. At every stage, players simultaneously choose their actions which are observed by the public. The one-stage payoff of player 1 (or penalty to player 2) depends on both players' types and actions, and is not directly observed by any player. While player 1 aims to maximize the total payoff over the game, player 2 wants to minimize it. This paper provides each player two ways to compute the security strategy, i.e. the optimal strategy in the worst case. First, a security strategy that directly depends on both players' history actions is derived by refining the sequence form. Noticing that history action space grows exponentially with respect to the time horizon, this paper further presents a security strategy that depends on player's fixed sized sufficient statistics. The sufficient statistics is shown to consist of the belief on one's own type, the regret on the other player's type, and the stage, and is independent of the other player's strategy.

  20. Retrieval and organizational strategies in conceptual memory a computer model

    CERN Document Server

    Kolodner, Janet L

    2014-01-01

    'Someday we expect that computers will be able to keep us informed about the news. People have imagined being able to ask their home computers questions such as "What's going on in the world?"…'. Originally published in 1984, this book is a fascinating look at the world of memory and computers before the internet became the mainstream phenomenon it is today. It looks at the early development of a computer system that could keep us informed in a way that we now take for granted. Presenting a theory of remembering, based on human information processing, it begins to address many of the hard problems implicated in the quest to make computers remember. The book had two purposes in presenting this theory of remembering. First, to be used in implementing intelligent computer systems, including fact retrieval systems and intelligent systems in general. Any intelligent program needs to use and store and use a great deal of knowledge. The strategies and structures in the book were designed to be used for that purpos...

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. A strategy to compute plastic post-buckling of structures

    International Nuclear Information System (INIS)

    Combescure, A.

    1983-08-01

The paper gives a general framework for the different strategies used to compute the post-buckling of structures. Two particular strategies are studied in more detail, and it is shown how they can be applied in the plastic regime. All the methods assume that the loads F are proportional to a single parameter lambda; more precisely (eq. 1): F = λF0. The paper shows how these methods can be implemented in a very simple way. In the elastic case we show the application of the method to the calculation of the post-buckling response of a clamped arch. The method is also applied to a very simple case of two bars which can be calculated analytically. In the plastic range, the method is applied to the post-buckling of an imperfect ring which can be calculated analytically. Another example is the comparison of the computed post-buckling of a thin cylinder under axial compression with the experimental behavior of the same cylinder. The limitations of these types of strategies are also mentioned, and the physical significance of calculations in the post-buckling regime is discussed

  3. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows reduction in computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach to minimize the expected execution cost and Conditional Value-at-Risk (CVaR).
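
A minimal sketch of the evaluation step in such a parametric approach is shown below: for one candidate (static) execution schedule, the expected execution cost and its CVaR are estimated by Monte Carlo. The price dynamics, impact parameters, schedule, and confidence level are invented for illustration and are not the authors' model.

```python
# Sketch of Monte Carlo evaluation of one candidate (static) liquidation
# schedule: estimate the expected execution cost and its CVaR.
# Market-impact parameters and price dynamics are hypothetical.
import random

S0, shares, periods = 100.0, 10_000, 10
eta, sigma = 1e-4, 0.02                  # temporary impact, per-period volatility
schedule = [shares / periods] * periods  # candidate strategy: sell evenly
alpha, n_paths = 0.95, 20_000

costs = []
for _ in range(n_paths):
    price, cost = S0, 0.0
    for q in schedule:
        price *= 1.0 + sigma * random.gauss(0.0, 1.0)   # exogenous price move
        cost += q * (S0 - (price - eta * q))            # shortfall vs. initial price
    costs.append(cost)

costs.sort()
expected = sum(costs) / n_paths
tail = costs[int(alpha * n_paths):]                     # worst (1 - alpha) share
cvar = sum(tail) / len(tail)
print(f"expected cost: {expected:,.0f}   CVaR_{alpha:.0%}: {cvar:,.0f}")
```

An outer static optimization over the schedule's parameters (here, the per-period quantities) would then search for the strategy that trades off the two estimates.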

  4. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

… the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
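
The byte-ordering issue mentioned at the end can be illustrated with a small sketch that reads and writes binary samples using explicit endianness codes rather than the platform default; the file layout used here (a 4-byte count followed by 32-bit floats) is a hypothetical example, not the authors' actual seismic format.

```python
# Sketch of portable binary I/O: write and read samples with an explicit
# byte order so the file is interpreted identically on any platform.
# The layout (big-endian count + 32-bit floats) is hypothetical.
import struct

def read_samples(path, big_endian=True):
    prefix = ">" if big_endian else "<"      # ">" big-endian, "<" little-endian
    with open(path, "rb") as f:
        (count,) = struct.unpack(prefix + "i", f.read(4))
        return struct.unpack(prefix + f"{count}f", f.read(4 * count))

# Writing with an explicit byte order makes the file readable on any platform.
with open("demo.bin", "wb") as f:
    data = (1.0, -2.5, 3.25)
    f.write(struct.pack(">i", len(data)) + struct.pack(f">{len(data)}f", *data))

print(read_samples("demo.bin", big_endian=True))
```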

  5. Cleared risks? Proactive strategies as service in Cloud Computing contexts

    Directory of Open Access Journals (Sweden)

    Manuela Moro-Cabero

    2018-02-01

The increasing use of Cloud Computing services in organizations is an undeniable reality nowadays. Records managers must therefore adopt a proactive and committed position, giving advice to users. However, as a consequence of a lack of knowledge of how the field functions and is regulated, the implementation of those services meets considerable resistance. This descriptive essay, supported by a relevant number of heterogeneous resources, systematizes the threats involved in managing and storing information with these services. At the same time, the study establishes a set of action strategies for both reaching and contracting agreements. The objective of the paper is, therefore, to make the professional aware of the potential of these services, of assessment and control tools, and of the need to ensure digital continuity and the preservation of records in the Cloud.

  6. Strategy Ranges: Describing Change in Prospective Elementary Teachers' Approaches to Mental Computation of Sums and Differences

    Science.gov (United States)

    Whitacre, Ian

    2015-01-01

    This study investigated the sets of mental computation strategies used by prospective elementary teachers to compute sums and differences of whole numbers. In the context of an intervention designed to improve the number sense of prospective elementary teachers, participants were interviewed pre/post, and their mental computation strategies were…

  7. Vanguard/rearguard strategy for the evaluation of the degradation of yoghurt samples based on the direct analysis of the volatiles profile through headspace-gas chromatography-mass spectrometry.

    Science.gov (United States)

    Carrillo-Carrión, C; Cárdenas, S; Valcárcel, M

    2007-02-02

A vanguard/rearguard analytical strategy for monitoring the degradation of yoghurt samples is proposed. The method is based on the headspace-gas chromatography-mass spectrometry (HS-GC-MS) instrumental coupling. In this combination, the chromatographic column is first used as an interface between the HS and the MS (vanguard mode), avoiding separation of the volatile components by maintaining the chromatographic oven at a high, constant temperature. By changing the thermal conditions of the oven, the aldehydes can be properly separated for individual identification/quantification (rearguard mode). In the vanguard method, the quantification of the volatile aldehydes was calculated through partial least squares and given as a total index. The rearguard method permits the detection of the aldehydes at concentrations between 12 and 35 ng/g. Both methods were applied to the study of the environmental factors favouring the presence of the volatile aldehydes (C5–C9) in the yoghurt samples. Principal component analysis of the total concentration of aldehydes over time (from 0 to 30 days) demonstrates the capability of the HS-MS coupling for estimating the quality losses of the samples. The results were corroborated by HS-GC-MS, which also indicated that pentanal was present in the yoghurt from the beginning of the study and that the combination of light and oxygen was the most negative influence on sample conservation.

  8. A strategy to compute plastic post-buckling of structures

    International Nuclear Information System (INIS)

    Combescure, A.

    1983-01-01

All the methods presented here give, in some cases, interesting computed solutions. It has been observed that the different strategies do not always give the same post-buckling path. More fundamentally, it has been observed that when buckling is unstable, the post-buckling path is characterized by a dynamic movement. All inertial effects are neglected in the approaches presented here, so the post-buckling load-deflection curve is valid only if very little kinetic energy is associated with the post-buckling. The method, as presented, is also limited to a load depending on a single parameter lambda; the case of more than one parameter is not yet clear. In conclusion, the methods presented here give a way to solve a class of post-buckling problems, namely when the post-buckling occurs with small kinetic energy (displacement-controlled buckling) and the loads depend on only one parameter. In such cases these methods should give good results even in the plastic range. If the buckling is unstable and a large kinetic energy is involved in the post-buckling, these methods are not realistic. (orig./RW)

  9. Strategies for improving approximate Bayesian computation tests for synchronous diversification.

    Science.gov (United States)

    Overcast, Isaac; Bagley, Justin C; Hickerson, Michael J

    2017-08-24

    Estimating the variability in isolation times across co-distributed taxon pairs that may have experienced the same allopatric isolating mechanism is a core goal of comparative phylogeography. The use of hierarchical Approximate Bayesian Computation (ABC) and coalescent models to infer temporal dynamics of lineage co-diversification has been a contentious topic in recent years. Key issues that remain unresolved include the choice of an appropriate prior on the number of co-divergence events (Ψ), as well as the optimal strategies for data summarization. Through simulation-based cross validation we explore the impact of the strategy for sorting summary statistics and the choice of prior on Ψ on the estimation of co-divergence variability. We also introduce a new setting (β) that can potentially improve estimation of Ψ by enforcing a minimal temporal difference between pulses of co-divergence. We apply this new method to three empirical datasets: one dataset each of co-distributed taxon pairs of Panamanian frogs and freshwater fishes, and a large set of Neotropical butterfly sister-taxon pairs. We demonstrate that the choice of prior on Ψ has little impact on inference, but that sorting summary statistics yields substantially more reliable estimates of co-divergence variability despite violations of assumptions about exchangeability. We find the implementation of β improves estimation of Ψ, with improvement being most dramatic given larger numbers of taxon pairs. We find equivocal support for synchronous co-divergence for both of the Panamanian groups, but we find considerable support for asynchronous divergence among the Neotropical butterflies. Our simulation experiments demonstrate that using sorted summary statistics results in improved estimates of the variability in divergence times, whereas the choice of hyperprior on Ψ has negligible effect. Additionally, we demonstrate that estimating the number of pulses of co-divergence across co-distributed taxon
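
The data-summarization choice at the heart of the study (sorting per-taxon-pair summary statistics before computing distances, so the comparison does not depend on pair ordering) can be sketched with a toy ABC rejection step; the simulator, number of pairs, and tolerance below are placeholders, not the hierarchical model of the paper.

```python
# Sketch of sorted summary statistics in an ABC rejection step: sorting the
# per-taxon-pair statistics before the distance makes the comparison
# exchangeable across pairs. The toy simulator and tolerance are hypothetical.
import random

def simulate(n_pairs):
    """Return one divergence-time summary statistic per taxon pair."""
    return [random.expovariate(1.0) for _ in range(n_pairs)]

def distance(obs, sim, sort=True):
    if sort:
        obs, sim = sorted(obs), sorted(sim)
    return sum((o - s) ** 2 for o, s in zip(obs, sim)) ** 0.5

observed = simulate(10)
accepted = []
for _ in range(5000):
    candidate = simulate(10)
    if distance(observed, candidate, sort=True) < 1.0:   # tolerance epsilon
        accepted.append(candidate)

print(f"accepted {len(accepted)} of 5000 simulations")
```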

  10. The Effects of Computer-Assisted Feedback Strategies in Technology Education: A Comparison of Learning Outcomes

    Science.gov (United States)

    Adams, Ruifang Hope; Strickland, Jane

    2012-01-01

    This study investigated the effects of computer-assisted feedback strategies that have been utilized by university students in a technology education curriculum. Specifically, the study examined the effectiveness of the computer-assisted feedback strategy "Knowledge of Response feedback" (KOR), and the "Knowledge of Correct Responses feedback"…

  11. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    Science.gov (United States)

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  12. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    Science.gov (United States)

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  13. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis tests, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
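
    The record above describes a general recipe: draw random input samples, compute a per-variable sensitivity measure from those same samples, and screen variables against an acceptance limit derived from a hypothesis test. The sketch below illustrates that recipe with a deliberately simple measure (the absolute sample correlation between each input and the output) and a large-sample critical value for zero correlation; it is not the CDF-based or mean-response-based measures of the cited paper, and the model and inputs are hypothetical.

```python
import numpy as np

def sensitivity_screening(model, X):
    """Rank inputs by a simple sampling-based sensitivity measure.

    X : (n_samples, n_inputs) array of random input samples.
    Uses the absolute sample correlation between each input and the model
    output (an illustrative measure only) and a large-sample acceptance
    limit for the two-sided test of zero correlation.
    """
    y = np.apply_along_axis(model, 1, X)
    n, k = X.shape
    sens = np.array([abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(k)])
    limit = 1.96 / np.sqrt(n)                 # approximate 5% acceptance limit
    ranking = np.argsort(sens)[::-1]          # most to least important
    return ranking, sens, sens > limit

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 4))            # 4 hypothetical random inputs
    # Inputs 0 and 2 matter; inputs 1 and 3 are (nearly) irrelevant.
    model = lambda x: np.sin(x[0]) + 2.0 * x[2] + 0.01 * x[1]
    ranking, sens, significant = sensitivity_screening(model, X)
    print("ranking:", ranking)
    print("measures:", np.round(sens, 3))
    print("significant:", significant)
```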

  14. Pedagogical Strategies to Increase Pre-service Teachers’ Confidence in Computer Learning

    Directory of Open Access Journals (Sweden)

    Li-Ling Chen

    2004-07-01

    Full Text Available Pre-service teachers’ attitudes towards computers significantly influence their future adoption of integrating computer technology into their teaching. What are the pedagogical strategies that a teacher education instructor or an instructional designer can incorporate to enhance a pre-service teacher’s comfort level in using computers? In this exploratory report, the researcher synthesizes related literature, provides a comprehensive list of theory-based instructional strategies, and describes a study of the perceptions of 189 pre-service teachers regarding strategies related to increasing their comfort in using computers.

  15. Listening Strategy Use and Influential Factors in Web-Based Computer Assisted Language Learning

    Science.gov (United States)

    Chen, L.; Zhang, R.; Liu, C.

    2014-01-01

    This study investigates second and foreign language (L2) learners' listening strategy use and factors that influence their strategy use in a Web-based computer assisted language learning (CALL) system. A strategy inventory, a factor questionnaire and a standardized listening test were used to collect data from a group of 82 Chinese students…

  16. The research of computer network security and protection strategy

    Science.gov (United States)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network security are complex, and doing network security well is systematic work that poses a considerable challenge. Addressing the safety and reliability problems of computer network systems, this paper draws on practical work experience to discuss network security threats, security technologies, system design principles, and corresponding suggestions and measures, so that the broad base of computer network users can enhance their security awareness and master basic network security techniques.

  17. Efficient Strategy Computation in Zero-Sum Asymmetric Repeated Games

    KAUST Repository

    Li, Lichun; Shamma, Jeff S.

    2017-01-01

    -horizon nested information games. For finite-horizon settings, we exploit that for both players, the security strategy, and also the opponent's corresponding best response depend only on the informed player's history of actions. Using this property, we refine

  18. Darwinian Spacecraft: Soft Computing Strategies Breeding Better, Faster Cheaper

    Science.gov (United States)

    Noever, David A.; Baskaran, Subbiah

    1999-01-01

    Computers can generate vast numbers of candidate combinations in the search for a solution to a particular problem, a process called "soft computing." This process uses statistical comparisons, neural networks, genetic algorithms, fuzzy variables in uncertain environments, and flexible machine learning to create systems that increase spacecraft robustness and support metric evaluation. These concepts enable the development of spacecraft that can perform missions at lower cost.

  19. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  20. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    International Nuclear Information System (INIS)

    Joubert, Wayne; Kothe, Douglas B.; Nam, Hai Ah

    2009-01-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  1. Problems and accommodation strategies reported by computer users with rheumatoid arthritis or fibromyalgia.

    Science.gov (United States)

    Baker, Nancy A; Rubinstein, Elaine N; Rogers, Joan C

    2012-09-01

    Little is known about the problems experienced by and the accommodation strategies used by computer users with rheumatoid arthritis (RA) or fibromyalgia (FM). This study (1) describes specific problems and accommodation strategies used by people with RA and FM during computer use; and (2) examines if there were significant differences in the problems and accommodation strategies between the different equipment items for each diagnosis. Subjects were recruited from the Arthritis Network Disease Registry. Respondents completed a self-report survey, the Computer Problems Survey. Data were analyzed descriptively (percentages; 95% confidence intervals). Differences in the number of problems and accommodation strategies were calculated using nonparametric tests (Friedman's test and Wilcoxon Signed Rank Test). Eighty-four percent of respondents reported at least one problem with at least one equipment item (RA = 81.5%; FM = 88.9%), with most respondents reporting problems with their chair. Respondents most commonly used timing accommodation strategies to cope with mouse and keyboard problems, personal accommodation strategies to cope with chair problems and environmental accommodation strategies to cope with monitor problems. The number of problems during computer use was substantial in our sample, and our respondents with RA and FM may not implement the most effective strategies to deal with their chair, keyboard, or mouse problems. This study suggests that workers with RA and FM might potentially benefit from education and interventions to assist with the development of accommodation strategies to reduce problems related to computer use.

  2. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    To replace traditional Internet software usage patterns and enterprise management modes, this paper considers the cloud computing business model, in which the resource scheduling strategy is a key technology. Based on a study of the cloud computing system structure and its mode of operation, the paper focuses on the job scheduling and resource allocation problems in cloud computing using an ant colony algorithm, with detailed analysis and design of the...

  3. Computing security strategies in finite horizon repeated Bayesian games

    KAUST Repository

    Lichun Li; Langbort, Cedric; Shamma, Jeff S.

    2017-01-01

    in the worst case. First, a security strategy that directly depends on both players' history actions is derived by refining the sequence form. Noticing that history action space grows exponentially with respect to the time horizon, this paper further presents a

  4. Strategies for Involving Communities in the Funding of Computer ...

    African Journals Online (AJOL)

    Toshiba

    An International Multidisciplinary Journal, Ethiopia. Vol. 8 (1), Serial ... This is a survey design which investigated the strategies for involving communities in ... logical operations on the data and finally gives the result in the form of information. ... Adesina (1981) observed that the major constraints of educational financing in ...

  5. Computer Networking Strategies for Building Collaboration among Science Educators.

    Science.gov (United States)

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  6. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains to provide wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.
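
    As a rough illustration of the kind of analysis the record describes, the sketch below simulates session requests arriving at an SDR-cloud-like data center split into clusters and compares two simple allocation strategies (first-fit and best-fit) by acceptance ratio and mean resource occupation. The workload model, cluster sizes, and strategies are assumptions made for illustration, not the tools or algorithms of the cited paper.

```python
import heapq
import random

def simulate(allocator, n_clusters=8, capacity=100, n_requests=2000, seed=1):
    """Toy evaluation of a computing-resource allocation strategy for a
    cluster-partitioned data center (a simplified stand-in for the tools
    described in the abstract)."""
    random.seed(seed)
    load = [0] * n_clusters
    departures = []                # (end_time, cluster_index, demand)
    accepted = 0
    occupation_sum = 0.0
    for t in range(n_requests):
        # Release resources of sessions that have ended.
        while departures and departures[0][0] <= t:
            _, i, d = heapq.heappop(departures)
            load[i] -= d
        demand = random.randint(1, 20)        # processing units per session
        duration = random.randint(5, 50)      # session length in time steps
        i = allocator(load, capacity, demand)
        if i is not None:
            load[i] += demand
            heapq.heappush(departures, (t + duration, i, demand))
            accepted += 1
        occupation_sum += sum(load) / (n_clusters * capacity)
    return accepted / n_requests, occupation_sum / n_requests

def first_fit(load, capacity, demand):
    for i, l in enumerate(load):
        if l + demand <= capacity:
            return i
    return None

def best_fit(load, capacity, demand):
    feasible = [(capacity - (l + demand), i)
                for i, l in enumerate(load) if l + demand <= capacity]
    return min(feasible)[1] if feasible else None

if __name__ == "__main__":
    for name, alloc in [("first-fit", first_fit), ("best-fit", best_fit)]:
        acc, occ = simulate(alloc)
        print(f"{name}: acceptance ratio {acc:.2%}, mean occupation {occ:.2%}")
```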

  7. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
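
    The idea of farming independent analyses out to several machines can be mimicked on a single host with worker processes. The sketch below, a minimal stand-in rather than the paper's networked implementation, times a set of independent "analysis" evaluations serially and with a small process pool; the analysis function and design points are hypothetical.

```python
import time
from multiprocessing import Pool

def analysis(design_point):
    """Stand-in for an expensive, independent engineering analysis
    (e.g., one structural evaluation in a decomposed design problem)."""
    time.sleep(0.2)                      # pretend this takes real compute time
    x, y = design_point
    return x ** 2 + 3.0 * y ** 2         # toy objective value

if __name__ == "__main__":
    design_points = [(i * 0.1, 1.0 - i * 0.05) for i in range(16)]

    t0 = time.perf_counter()
    serial = [analysis(p) for p in design_points]
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(processes=4) as pool:      # 4 workers stand in for 4 machines
        parallel = pool.map(analysis, design_points)
    t_parallel = time.perf_counter() - t0

    assert serial == parallel            # same results, less wall-clock time
    print(f"serial {t_serial:.2f}s, parallel {t_parallel:.2f}s")
```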

  8. Computing strategy of Alpha-Magnetic Spectrometer experiment

    International Nuclear Information System (INIS)

    Choutko, V.; Klimentov, A.

    2003-01-01

    The Alpha-Magnetic Spectrometer (AMS) is an experiment to search in space for dark matter, missing matter, and antimatter, scheduled to fly on the International Space Station in the fall of 2005 for at least 3 consecutive years. This paper gives an overview of the AMS software, with emphasis on the distributed production system based on a client/server approach. We also describe our choice of hardware components to build a processing farm with TByte RAID arrays of IDE disks, and highlight the strategies that make our system different from many other experimental systems.

  9. Interactive uncertainty reduction strategies and verbal affection in computer-mediated communication

    NARCIS (Netherlands)

    Antheunis, M.L.; Schouten, A.P.; Valkenburg, P.M.; Peter, J.

    2012-01-01

    The goal of this study was to investigate the language-based strategies that computer-mediated communication (CMC) users employ to reduce uncertainty in the absence of nonverbal cues. Specifically, this study investigated the prevalence of three interactive uncertainty reduction strategies (i.e.,

  10. Effects of Computer-Assisted Jigsaw II Cooperative Learning Strategy on Physics Achievement and Retention

    Science.gov (United States)

    Gambari, Isiaka Amosa; Yusuf, Mudasiru Olalere

    2016-01-01

    This study investigated the effects of a computer-assisted Jigsaw II cooperative strategy on physics achievement and retention. The study also determined how the moderating variable of achievement level affects students' performance in physics when Jigsaw II cooperative learning is used as an instructional strategy. Purposive sampling technique…

  11. Note-Taking with Computers: Exploring Alternative Strategies for Improved Recall

    Science.gov (United States)

    Bui, Dung C.; Myerson, Joel; Hale, Sandra

    2013-01-01

    Three experiments examined note-taking strategies and their relation to recall. In Experiment 1, participants were instructed either to take organized lecture notes or to try and transcribe the lecture, and they either took their notes by hand or typed them into a computer. Those instructed to transcribe the lecture using a computer showed the…

  12. New Perspectives on Computational and Cognitive Strategies for Word Sense Disambiguation

    CERN Document Server

    Kwong, Oi Yee

    2013-01-01

    Cognitive and Computational Strategies for Word Sense Disambiguation examines cognitive strategies by humans and computational strategies by machines, for WSD in parallel.  Focusing on a psychologically valid property of words and senses, author Oi Yee Kwong discusses their concreteness or abstractness and draws on psycholinguistic data to examine the extent to which existing lexical resources resemble the mental lexicon as far as the concreteness distinction is concerned. The text also investigates the contribution of different knowledge sources to WSD in relation to this very intrinsic nature of words and senses. 

  13. Current strategies for dosage reduction in computed tomography

    International Nuclear Information System (INIS)

    May, M.S.; Wuest, W.; Lell, M.M.; Uder, M.; Kalender, W.A.; Schmidt, B.

    2012-01-01

    The potential risks of radiation exposure associated with computed tomography (CT) imaging are a reason for ongoing concern for both medical staff and patients. Radiation dose reduction is, according to the as-low-as-reasonably-achievable principle, an important issue in clinical routine, research and development. The complex interaction of preparation, examination and post-processing offers a high potential for optimization on the one hand, but a high risk of errors on the other. The radiologist is responsible for the quality of the CT examination, which requires specialized and up-to-date knowledge. Most of the techniques for radiation dose reduction are independent of the system and manufacturer. The basic principle should be radiation dose optimization without loss of diagnostic image quality, rather than mere reduction. (orig.) [de

  14. Sample-Based Extreme Learning Machine with Missing Data

    Directory of Open Access Journals (Sweden)

    Hang Gao

    2015-01-01

    Full Text Available Extreme learning machine (ELM) has been extensively studied in the machine learning community during the last few decades due to its high efficiency and its unification of classification, regression, and so forth. Despite these merits, existing ELM algorithms cannot efficiently handle missing data, which is relatively common in practical applications. The problem of missing data is commonly handled by imputation (i.e., replacing missing values with substituted values according to available information). However, imputation methods are not always effective. In this paper, we propose a sample-based learning framework to address this issue. Based on this framework, we develop two sample-based ELM algorithms for classification and regression, respectively. Comprehensive experiments have been conducted on synthetic data sets, UCI benchmark data sets, and a real-world fingerprint image data set. As indicated, without introducing extra computational complexity, the proposed algorithms achieve more accurate and stable learning than other state-of-the-art ones, especially in the case of higher missing ratios.
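
    For readers unfamiliar with the base learner, the sketch below shows a minimal extreme learning machine for regression: a random hidden layer followed by a least-squares (pseudo-inverse) solve for the output weights. It deliberately omits the paper's sample-based treatment of missing data, and the data set is synthetic.

```python
import numpy as np

class ELMRegressor:
    """Minimal extreme learning machine: random hidden layer, least-squares
    output weights. This is the standard base learner only; the paper's
    sample-based handling of missing data is not reproduced here."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Moore-Penrose pseudo-inverse gives the least-squares output weights.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(400, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=400)
    model = ELMRegressor(n_hidden=40).fit(X[:300], y[:300])
    rmse = np.sqrt(np.mean((model.predict(X[300:]) - y[300:]) ** 2))
    print(f"test RMSE: {rmse:.3f}")
```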

  15. Defense strategies for cloud computing multi-site server infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Ma, Chris Y. T. [Hang Seng Management College, Hong Kong; He, Fei [Texas A&M University, Kingsville, TX, USA

    2018-01-01

    We consider cloud computing server infrastructures for big data applications, which consist of multiple server sites connected over a wide-area network. The sites house a number of servers, network elements and local-area connections, and the wide-area network plays a critical, asymmetric role of providing vital connectivity between them. We model this infrastructure as a system of systems, wherein the sites and wide-area network are represented by their cyber and physical components. These components can be disabled by cyber and physical attacks, and can also be protected against them using component reinforcements. The effects of attacks propagate within the systems, and also beyond them via the wide-area network. We characterize these effects at two levels using: (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual site or network, and (b) first-order differential conditions on system survival probabilities that characterize the component-level correlations within individual systems. We formulate a game between an attacker and a provider using utility functions composed of survival probability and cost terms. At Nash Equilibrium, we derive expressions for the expected capacity of the infrastructure, given by the number of operational servers connected to the network, for sum-form, product-form and composite utility functions.
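
    The game-theoretic core of such an analysis is the computation of a security (maximin) strategy. The sketch below solves a deliberately simplified zero-sum abstraction, where the provider chooses what to reinforce and the attacker chooses what to target, as a small linear program; the payoff matrix is hypothetical, and the cited paper's survival-probability and cost utility functions are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def security_strategy(payoff):
    """Maximin (security) strategy for the row player of a zero-sum matrix
    game, via linear programming. payoff[i, j] is the row player's payoff
    when the row player plays i and the column player plays j."""
    m, n = payoff.shape
    # Variables: x_1..x_m (mixed strategy) and v (game value); minimize -v.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    # For every column j: v - sum_i x_i * payoff[i, j] <= 0.
    A_ub = np.hstack([-payoff.T, np.ones((n, 1))])
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])   # probabilities sum to 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]

if __name__ == "__main__":
    # Hypothetical example: rows = provider reinforces {site 1, site 2, WAN},
    # columns = attacker targets {site 1, site 2, WAN}; entries are expected
    # numbers of operational servers still connected after the attack.
    payoff = np.array([[80.0, 55.0, 40.0],
                       [50.0, 85.0, 40.0],
                       [60.0, 60.0, 70.0]])
    x, value = security_strategy(payoff)
    print("provider mixed strategy:", np.round(x, 3))
    print("guaranteed expected capacity:", round(value, 2))
```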

  16. A Novel Strategy for Mechanism Based Computational Drug Discovery

    Science.gov (United States)

    Subha, Kalyaanamoorthy; Kumar, Gopal Ramesh; Rajalakshmi, Rajasekaran; Aravindhan, Ganesan

    2010-01-01

    Glioma, the common brain tumor, which arises from the glial cells, offers a worse prognosis and therapy than other tumors. Despite the genetic and pathological diversities of malignant gliomas, common signaling pathways that drive cellular proliferation, survival, invasion and angiogenesis have been identified. Very often, various tyrosine kinase receptors are inappropriately activated in human brain tumors and contribute to tumor malignancy. During such tumourous states where multiple pathways are involved, a few of them are responsible for cell differentiation, proliferation and anti-apoptosis. Computational simulation studies of normal EGFR signaling in glioma together with the mutant EGFR mediated signaling and the MAPK signaling in glioma were carried out. There were no significant cross talks observed between the mutant EGFR and the MAPK pathways, and thus from the simulation results, we propose a novel concept of ‘multiple-targeting’ that combines EGFR and Ras targeted therapy, thereby providing a better therapeutic value against glioma. Diallyl Disulfide (DADS), which has been commonly used for Ras inhibition in glioma, was taken for analysis, and the effect of inhibiting the EGFR downstream signaling protein with DADS was analyzed using simulation and docking studies. PMID:24179383

  17. A Novel Strategy for Mechanism Based Computational Drug Discovery

    Directory of Open Access Journals (Sweden)

    Kalyaanamoorthy Subha

    2010-01-01

    Full Text Available Glioma, the common brain tumor, which arises from the glial cells, offers a worse prognosis and therapy than other tumors. Despite the genetic and pathological diversities of malignant gliomas, common signaling pathways that drive cellular proliferation, survival, invasion and angiogenesis have been identified. Very often, various tyrosine kinase receptors are inappropriately activated in human brain tumors and contribute to tumor malignancy. During such tumourous states where multiple pathways are involved, a few of them are responsible for cell differentiation, proliferation and anti-apoptosis. Computational simulation studies of normal EGFR signaling in glioma together with the mutant EGFR mediated signaling and the MAPK signaling in glioma were carried out. There were no significant cross talks observed between the mutant EGFR and the MAPK pathways, and thus from the simulation results, we propose a novel concept of ‘multiple-targeting’ that combines EGFR and Ras targeted therapy, thereby providing a better therapeutic value against glioma. Diallyl Disulfide (DADS), which has been commonly used for Ras inhibition in glioma, was taken for analysis, and the effect of inhibiting the EGFR downstream signaling protein with DADS was analyzed using simulation and docking studies.

  18. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, the specifications and advantages of experimental and computational FBDD are compared, and limitations and future prospects are discussed and emphasized.

  19. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    To optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is built and a fitness function is derived from it; the fitness function is then optimized with an improved differential evolution algorithm in which a generation-dependent dynamic selection strategy and a dynamic mutation strategy are used to balance global and local search ability. Performance tests were carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and user cost, achieving good optimal scheduling of cloud computing tasks.
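
    A plain differential evolution loop for task-to-VM assignment conveys the basic mechanics. The sketch below encodes each schedule as a continuous vector, decodes it to VM indices, and minimizes the makespan with standard DE/rand/1/bin; the dynamic selection and mutation refinements of the cited paper are not included, and the task lengths and VM speeds are invented.

```python
import numpy as np

def makespan(assignment, task_lengths, vm_speeds):
    """Completion time of the slowest VM under a task-to-VM assignment."""
    finish = np.zeros(len(vm_speeds))
    for task, vm in enumerate(assignment):
        finish[vm] += task_lengths[task] / vm_speeds[vm]
    return finish.max()

def de_schedule(task_lengths, vm_speeds, pop_size=40, gens=200, F=0.5, CR=0.9, seed=0):
    """Basic differential evolution (DE/rand/1/bin) for task scheduling.
    Each individual is a continuous vector decoded to a discrete assignment."""
    rng = np.random.default_rng(seed)
    n_tasks, n_vms = len(task_lengths), len(vm_speeds)
    pop = rng.uniform(0, n_vms, size=(pop_size, n_tasks))
    decode = lambda v: np.clip(v.astype(int), 0, n_vms - 1)
    fit = np.array([makespan(decode(p), task_lengths, vm_speeds) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            cross = rng.random(n_tasks) < CR
            cross[rng.integers(n_tasks)] = True          # keep at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), 0, n_vms - 1e-9)
            f_trial = makespan(decode(trial), task_lengths, vm_speeds)
            if f_trial <= fit[i]:                        # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = np.argmin(fit)
    return decode(pop[best]), fit[best]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    tasks = rng.uniform(10, 100, size=30)    # hypothetical task lengths
    vms = np.array([1.0, 1.5, 2.0, 3.0])     # hypothetical relative VM speeds
    schedule, cost = de_schedule(tasks, vms)
    print("best makespan:", round(cost, 2))
```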

  20. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations based on importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms; dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of the dose shows average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times when considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization were proven to be possible at constant time complexity.
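
    The contrast between closed-form and sampled uncertainty propagation can be seen already in one dimension. Assuming an idealized Gaussian lateral dose profile and a Gaussian setup error, the expected dose is again Gaussian and can be written in closed form; the sketch below compares that closed form with a random-sampling estimate. This is only a toy analogue of APM, not the paper's pencil-beam dose algorithm.

```python
import numpy as np

def dose(x, mu=0.0, sigma=5.0):
    """Idealized 1-D Gaussian lateral dose profile of a pencil beam."""
    return np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

def expected_dose_analytical(x, sigma=5.0, sigma_setup=2.0):
    """Closed-form expectation of the profile under a Gaussian setup shift,
    i.e. the profile convolved with the setup-error distribution."""
    s2 = sigma ** 2 + sigma_setup ** 2
    return sigma / np.sqrt(s2) * np.exp(-x ** 2 / (2.0 * s2))

def expected_dose_sampled(x, sigma=5.0, sigma_setup=2.0, n=5000, seed=0):
    """Reference estimate by random sampling of setup shifts."""
    shifts = np.random.default_rng(seed).normal(0.0, sigma_setup, size=n)
    return np.mean([dose(x - s, sigma=sigma) for s in shifts], axis=0)

if __name__ == "__main__":
    x = np.linspace(-15, 15, 61)
    analytical = expected_dose_analytical(x)
    sampled = expected_dose_sampled(x)
    print("max |analytical - sampled|:", round(np.abs(analytical - sampled).max(), 4))
```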

  1. Using machine learning to accelerate sampling-based inversion

    Science.gov (United States)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
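
    A minimal version of the surrogate idea, assuming scikit-learn is available: fit a Gaussian Process to a handful of expensive forward evaluations and use its cheap prediction (with the predictive standard deviation as a refinement signal) in place of the forward operator. The forward function here is a made-up stand-in for a synthetic-seismogram calculation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def forward(m):
    """Stand-in for an expensive forward operator (e.g. a synthetic-seismogram
    misfit as a function of one model parameter)."""
    return np.sin(3.0 * m) + 0.5 * m ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A few expensive forward evaluations serve as training data.
    M_train = rng.uniform(-2.0, 2.0, size=(12, 1))
    d_train = forward(M_train[:, 0])

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                                  normalize_y=True)
    gp.fit(M_train, d_train)

    # The cheap surrogate replaces the forward operator during sampling;
    # its predictive std flags where more exact evaluations are needed.
    M_test = np.linspace(-2.0, 2.0, 9).reshape(-1, 1)
    pred, std = gp.predict(M_test, return_std=True)
    for m, p, s, t in zip(M_test[:, 0], pred, std, forward(M_test[:, 0])):
        print(f"m={m:+.2f}  surrogate={p:+.3f}±{s:.3f}  exact={t:+.3f}")
```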

  2. Attenuation Correction Strategies for Positron Emission Tomography/Computed Tomography and 4-Dimensional Positron Emission Tomography/Computed Tomography

    OpenAIRE

    Pan, Tinsu; Zaidi, Habib

    2013-01-01

    This article discusses attenuation correction strategies in positron emission tomography/computed tomography (PET/CT) and 4 dimensional PET/CT imaging. Average CT scan derived from averaging the high temporal resolution CT images is effective in improving the registration of the CT and the PET images and quantification of the PET data. It underscores list mode data acquisition in 4 dimensional PET and introduces 4 dimensional CT popular in thoracic treatment planning to 4 dimensional PET/CT. ...

  3. Axial power deviation control strategy and computer simulation for Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Liao Yehong; Zhou Xiaoling, Xiao Min

    2004-01-01

    Daya Bay Nuclear Power Station has a very tight operation diagram, especially at its right side, so successful control of the axial power deviation in a PWR is crucial to nuclear safety. After analyzing the effect of various core characteristics on the axial power distribution, several axial power deviation control strategies have been proposed to comply with different power-varying operation scenarios. Application and computer simulation of the strategies have shown that our predictions of the axial power deviation evolution are comparable to the measured values and that our control strategies are effective. Engineering experience shows that the methodology accurately predicts axial power deviation transients and has therefore become a useful tool for reactor operation and safety control. This paper presents the axial power control characteristics, reactor operation strategy research, computer simulation, and comparison with measurement results at Daya Bay Nuclear Power Station. (author)

  4. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    Full Text Available The Pareto-Nash equilibrium set (PNES) is described as the intersection of the graphs of efficient response mappings. The problem of computing the PNES in finite multi-objective mixed-strategy games (Pareto-Nash games) is considered, and a method for computing it is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.

  5. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
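
    The Grid Convergence Index mentioned above has a standard closed form (Roache's formulation) for three systematically refined meshes. The sketch below computes the observed order of accuracy, the GCI on the fine mesh, and a Richardson-extrapolated value; the sample solution values and refinement ratio are hypothetical, not data from the rod-bundle simulations.

```python
import math

def grid_convergence_index(f_fine, f_medium, f_coarse, r, safety_factor=1.25):
    """Roache-style GCI for a scalar quantity of interest from three
    systematically refined meshes with constant refinement ratio r."""
    # Observed order of accuracy from the three solutions.
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    # Relative error between the two finest solutions.
    e21 = abs((f_fine - f_medium) / f_fine)
    gci_fine = safety_factor * e21 / (r ** p - 1.0)
    # Richardson extrapolation toward the mesh-independent value.
    f_extrap = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return p, gci_fine, f_extrap

if __name__ == "__main__":
    # Hypothetical outlet velocity magnitudes (m/s) on fine/medium/coarse meshes.
    p, gci, f_ext = grid_convergence_index(6.000, 6.012, 6.060, r=2.0)
    print(f"observed order p = {p:.2f}")
    print(f"GCI (fine mesh)  = {gci * 100:.3f}%")
    print(f"extrapolated value ~ {f_ext:.3f} m/s")
```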

  6. A Computer Assisted Method to Track Listening Strategies in Second Language Learning

    Science.gov (United States)

    Roussel, Stephanie

    2011-01-01

    Many studies about listening strategies are based on what learners report while listening to an oral message in the second language (Vandergrift, 2003; Graham, 2006). By recording a video of the computer screen while L2 learners (L1 French) were listening to an MP3-track in German, this study uses a novel approach and recent developments in…

  7. Portraits of PBL: Course Objectives and Students' Study Strategies in Computer Engineering, Psychology and Physiotherapy.

    Science.gov (United States)

    Dahlgren, Madeleine Abrandt

    2000-01-01

    Compares the role of course objectives in relation to students' study strategies in problem-based learning (PBL). Results comprise data from three PBL programs at Linkopings University (Sweden), in physiotherapy, psychology, and computer engineering. Faculty provided course objectives to function as supportive structures and guides for students'…

  8. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs underlying technology interaction are then discussed. Following this, the…

  9. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.

    2008-01-01

    To compare effects on premature infants' weight gain of a computer-generated and a nurse-determined incubator humidity strategy. An optimal humidity protocol is thought to reduce time to regain birthweight. Prospective randomized controlled design. Level IIIC neonatal intensive care unit in the

  10. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  11. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    Science.gov (United States)

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  12. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    Science.gov (United States)

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses, as they should be effective in increasing student knowledge and positively impacting motivation and learning strategies without increasing costs. This…

  13. ON SAMPLING BASED METHODS FOR THE DUBINS TRAVELING SALESMAN PROBLEM WITH NEIGHBORHOODS

    Directory of Open Access Journals (Sweden)

    Petr Váňa

    2015-12-01

    Full Text Available In this paper, we address the problem of planning a path for a Dubins vehicle to visit a set of regions, which is known as the Dubins Traveling Salesman Problem with Neighborhoods (DTSPN). We propose a modification of an existing sampling-based approach that uses an increasing number of samples per goal region and thus improves the solution quality when more computational time is available. The proposed modification of the sampling-based algorithm is compared with existing approaches for the DTSPN, and results on the quality of the solutions found and the required computational time are presented in the paper.

  14. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    Full Text Available For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.
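
    The acquisition threshold described above, the ratio of the highest to the second-highest correlation peak, is straightforward to compute once the circular correlation over code phase is available, and plain decimation gives a rough feel for the resampling idea. The sketch below uses a random ±1 code and FFT-based circular correlation; it is a toy, not the broadband GPS L2C processing of the cited paper.

```python
import numpy as np

def acquisition_metric(received, code, exclude=2):
    """Circular-correlation search over code phase; returns the ratio of the
    highest to the second-highest correlation peak (the acquisition metric
    described in the abstract), ignoring bins adjacent to the main peak."""
    corr = np.abs(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))))
    peak = int(np.argmax(corr))
    mask = np.ones(len(corr), dtype=bool)
    for k in range(-exclude, exclude + 1):
        mask[(peak + k) % len(corr)] = False
    return corr[peak] / corr[mask].max(), peak

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy +/-1 spreading code and a noisy, circularly shifted copy of it.
    code = rng.choice([-1.0, 1.0], size=2046)
    true_shift = 518
    received = np.roll(code, true_shift) + 0.8 * rng.normal(size=code.size)

    # Plain decimation by 2 stands in for resampling the signal at a lower
    # rate to cut the correlation cost.
    ratio_full, phase_full = acquisition_metric(received, code)
    ratio_dec, phase_dec = acquisition_metric(received[::2], code[::2])
    print(f"full rate : peak/second = {ratio_full:.2f}, phase = {phase_full}")
    print(f"decimated : peak/second = {ratio_dec:.2f}, phase = {phase_dec * 2}")
```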

  15. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Science.gov (United States)

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  16. A Cloud-Computing-Based Data Placement Strategy in High-Speed Railway

    Directory of Open Access Journals (Sweden)

    Hanning Wang

    2012-01-01

    Full Text Available As an important component of China’s transportation data sharing system, high-speed railway data sharing is a typical application of data-intensive computing. Currently, most high-speed railway data is shared in a cloud computing environment, so there is an urgent need for an effective cloud-computing-based data placement strategy for high-speed railway. In this paper, a new data placement strategy, named the hierarchical structure data placement strategy, is proposed. The proposed method combines the semidefinite programming algorithm with the dynamic interval mapping algorithm. The semidefinite programming algorithm is suitable for the placement of files with various replications, ensuring that different replications of a file are placed on different storage devices, while the dynamic interval mapping algorithm ensures better self-adaptability of the data storage system. A hierarchical data placement strategy is proposed for large-scale networks. A new theoretical analysis is provided and compared with several previous data placement approaches, and several experiments show the efficacy of the proposed strategy.

  17. Influence of Learning Strategy of Cognitive Conflict on Student Misconception in Computational Physics Course

    Science.gov (United States)

    Akmam, A.; Anshari, R.; Amir, H.; Jalinus, N.; Amran, A.

    2018-04-01

    Misconceptions are one of the factors that lead students to choose unsuitable methods for problem solving. The Computational Physics course is a major subject in the Department of Physics, FMIPA UNP Padang. A recent problem in Computational Physics learning is that students have difficulties in constructing knowledge; an indication of this problem is that student learning outcomes do not reach mastery. The root of the problem is students' weak critical thinking ability, which can be improved using cognitive conflict learning strategies. The research aims to determine the effect of a cognitive conflict learning strategy on student misconceptions in the Computational Physics course at the Department of Physics, Faculty of Mathematics and Science, Universitas Negeri Padang. The experiment used a before-after design with a sample of 60 students selected by cluster random sampling, and data were analyzed using repeated-measures ANOVA. The cognitive conflict learning strategy has a significant effect on student misconceptions in the Computational Physics course.

  18. Chemometric classification of casework arson samples based on gasoline content.

    Science.gov (United States)

    Sinkov, Nikolai A; Sandercock, P Mark L; Harynuk, James J

    2014-02-01

    Detection and identification of ignitable liquids (ILs) in arson debris is a critical part of arson investigations. The challenge of this task is due to the complex and unpredictable chemical nature of arson debris, which also contains pyrolysis products from the fire. ILs, most commonly gasoline, are complex chemical mixtures containing hundreds of compounds that will be consumed or otherwise weathered by the fire to varying extents depending on factors such as temperature, air flow, the surface on which IL was placed, etc. While methods such as ASTM E-1618 are effective, data interpretation can be a costly bottleneck in the analytical process for some laboratories. In this study, we address this issue through the application of chemometric tools. Prior to the application of chemometric tools such as PLS-DA and SIMCA, issues of chromatographic alignment and variable selection need to be addressed. Here we use an alignment strategy based on a ladder consisting of perdeuterated n-alkanes. Variable selection and model optimization was automated using a hybrid backward elimination (BE) and forward selection (FS) approach guided by the cluster resolution (CR) metric. In this work, we demonstrate the automated construction, optimization, and application of chemometric tools to casework arson data. The resulting PLS-DA and SIMCA classification models, trained with 165 training set samples, have provided classification of 55 validation set samples based on gasoline content with 100% specificity and sensitivity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
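
    A bare-bones PLS-DA step of the kind used above can be written with scikit-learn's PLS regression on a 0/1 class code and a 0.5 decision threshold. The sketch below runs on synthetic features standing in for aligned, variable-selected GC-MS data; the alkane-ladder alignment and the cluster-resolution-guided variable selection of the paper are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for aligned, variable-selected chromatographic features:
# class 1 ("gasoline present") shifts a handful of informative variables.
rng = np.random.default_rng(0)
n_per_class, n_vars = 80, 50
X0 = rng.normal(0.0, 1.0, size=(n_per_class, n_vars))
X1 = rng.normal(0.0, 1.0, size=(n_per_class, n_vars))
X1[:, :8] += 1.5                      # informative variables for gasoline
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=1, stratify=y)

# PLS-DA: PLS regression onto a 0/1 class code, thresholded at 0.5.
pls = PLSRegression(n_components=2).fit(X_tr, y_tr)
y_pred = (pls.predict(X_te).ravel() > 0.5).astype(float)

tp = np.sum((y_pred == 1) & (y_te == 1))
tn = np.sum((y_pred == 0) & (y_te == 0))
sensitivity = tp / np.sum(y_te == 1)
specificity = tn / np.sum(y_te == 0)
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
```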

  19. A strategy for improved computational efficiency of the method of anchored distributions

    Science.gov (United States)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability of a set of similar model parametrizations "bundle" replicating field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.

  20. A Fast GPU-accelerated Mixed-precision Strategy for Fully NonlinearWater Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    We present performance results of a mixed-precision strategy developed to improve a recently developed massively parallel GPU-accelerated tool for fast and scalable simulation of unsteady fully nonlinear free surface water waves over uneven depths (Engsig-Karup et al. 2011). The underlying wave ...-preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze...
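
    The mixed-precision ingredient, independent of GPUs and water waves, is defect (residual) correction: factorize or precondition in low precision and accumulate residuals and the solution in high precision. The sketch below shows the generic pattern with a single-precision LU factorization refined in double precision; it is an illustration of the idea only, not the wave solver of the cited work.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def mixed_precision_solve(A, b, iters=5):
    """Mixed-precision defect correction: factorize once in single precision,
    accumulate the solution and residuals in double precision."""
    lu32 = lu_factor(A.astype(np.float32))            # cheap low-precision factorization
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x                                  # defect in double precision
        dx = lu_solve(lu32, r.astype(np.float32))      # low-precision correction
        x = x + dx.astype(np.float64)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    A = rng.normal(size=(n, n)) + n * np.eye(n)        # well-conditioned test matrix
    x_true = rng.normal(size=n)
    b = A @ x_true
    x = mixed_precision_solve(A, b)
    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```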

  1. A strategy for the phased replacement of CANDU digital control computers

    International Nuclear Information System (INIS)

    Hepburn, G.A.

    2001-01-01

    Significant developments have occurred with respect to the replacement of the Plant Digital Control Computers (DCCs) on CANDU plants in the past six months. This paper summarises the conclusions of the condition assessment carried out on these machines at Point Lepreau Generating Station, and describes a strategy for a phased transition to a replacement system based on today's technology. Most elements of the strategy are already in place, and sufficient technical work has been done to allow those components which have been assessed as requiring prompt attention to be replaced in a matter of months. (author)

  2. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

    Full Text Available Because of the large impact that computer networks have today, their study in related fields such as Telecommunications and Electronics Engineering initially holds great appeal for students. However, as the content deepens and lacks a strong practical component, this interest can decrease considerably. This paper proposes the use of teaching strategies, analogies, media and interactive applications that enhance the teaching of the networks discipline and encourage its study. It starts from an analysis of how the discipline is currently taught and then describes each of these strategies and its respective contribution to student learning.

  3. Functioning strategy study on control systems of large physical installations used with a digital computer

    International Nuclear Information System (INIS)

    Bel'man, L.B.; Lavrikov, S.A.; Lenskij, O.D.

    1975-01-01

    Criteria are proposed to evaluate the efficiency of the functioning of a control system for large physical installations operated by means of a control computer; the criteria are the object utilization factor and the computer load factor. Different strategies for control system functioning are described and compared, and such important parameters as the sampling time and the parameter correction time are chosen. A single factor for evaluating the efficiency of system functioning is introduced and its dependence on the sampling interval is given. Using the attached diagrams, it is easy to find the optimum value of the sampling interval and the corresponding maximum value of the proposed single efficiency factor.

  4. Incorporating electronic-based and computer-based strategies: graduate nursing courses in administration.

    Science.gov (United States)

    Graveley, E; Fullerton, J T

    1998-04-01

    The use of electronic technology allows faculty to improve their course offerings. Four graduate courses in nursing administration were contemporized to incorporate fundamental computer-based skills that would be expected of graduates in the work setting. Principles of adult learning offered a philosophical foundation that guided course development and revision. Course delivery strategies included computer-assisted instructional modules, e-mail interactive discussion groups, and use of the electronic classroom. Classroom seminar discussions and two-way interactive video conferencing focused on group resolution of problems derived from employment settings and assigned readings. Using these electronic technologies, a variety of courses can be revised to accommodate the learners' needs.

  5. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    Science.gov (United States)

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  6. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    Full Text Available This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered to be a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated an improved performance from their repeated use of the simulation.

  7. A Strategy for Automatic Performance Tuning of Stencil Computations on GPUs

    Directory of Open Access Journals (Sweden)

    Joseph D. Garvey

    2018-01-01

    Full Text Available We propose and evaluate a novel strategy for tuning the performance of a class of stencil computations on Graphics Processing Units. The strategy uses a machine learning model to predict the optimal way to load data from memory followed by a heuristic that divides other optimizations into groups and exhaustively explores one group at a time. We use a set of 104 synthetic OpenCL stencil benchmarks that are representative of many real stencil computations. We first demonstrate the need for auto-tuning by showing that the optimization space is sufficiently complex that simple approaches to determining a high-performing configuration fail. We then demonstrate the effectiveness of our approach on NVIDIA and AMD GPUs. Relative to a random sampling of the space, we find configurations that are 12%/32% faster on the NVIDIA/AMD platform in 71% and 4% less time, respectively. Relative to an expert search, we achieve 5% and 9% better performance on the two platforms in 89% and 76% less time. We also evaluate our strategy for different stencil computational intensities, varying array sizes and shapes, and in combination with expert search.
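
    The group-wise exhaustive exploration described above can be sketched in a few lines. This is a minimal illustration, not the authors' tool: the parameter groups, their values and the benchmark(config) function are hypothetical placeholders.

      from itertools import product

      # Hypothetical optimization groups, explored one at a time; names and values are illustrative.
      GROUPS = [
          {"work_group_x": [8, 16, 32], "work_group_y": [4, 8, 16]},
          {"unroll_factor": [1, 2, 4], "tile_depth": [1, 2, 4]},
      ]

      def tune(benchmark, base_config):
          """Exhaustively explore each group in turn, freezing the best settings found."""
          best = dict(base_config)
          for group in GROUPS:
              keys = list(group)
              best_time, best_choice = float("inf"), None
              for values in product(*(group[k] for k in keys)):
                  trial = {**best, **dict(zip(keys, values))}
                  t = benchmark(trial)                  # measured kernel runtime for this config
                  if t < best_time:
                      best_time, best_choice = t, dict(zip(keys, values))
              best.update(best_choice)                  # keep the winning settings before moving on
          return best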

  8. The Toggle Local Planner for sampling-based motion planning

    KAUST Repository

    Denny, Jory; Amato, Nancy M.

    2012-01-01

    Sampling-based solutions to the motion planning problem, such as the probabilistic roadmap method (PRM), have become commonplace in robotics applications. These solutions are the norm as the dimensionality of the planning space grows, i.e., d > 5

  9. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented

  10. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copyright was initially applied in the cultural and art industries. Since that time there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors, and the cultural approach, which is especially concerned with the rights of the author. The first approach is rooted in Anglo-American countries, while the other is originally French. The expansion of the computer market and the separation of the software and hardware markets led to the so-called "velvet-rubbery", which refers to illegal reproduction in the market. Consequently, efforts have been made all over the world to protect the rights of producers. The present study reviews the different strategies for facing this challenge, alongside the domestic and international difficulties these strategies encounter.

  11. Time Synchronization Strategy Between On-Board Computer and FIMS on STSAT-1

    Directory of Open Access Journals (Sweden)

    Seong Woo Kwak

    2004-06-01

    Full Text Available STSAT-1 was launched in September 2003 with the Far Ultra-violet Imaging Spectrograph (FIMS) as its main payload. The mission of FIMS is to observe the universe and the aurora. In this paper, we suggest a simple and reliable strategy adopted in STSAT-1 to synchronize time between the On-board Computer (OBC) and FIMS. Given the characteristics of STSAT-1, this strategy is devised to maintain the reliability of the satellite system and to reduce implementation cost by using minimal electronic circuitry. We suggest two methods with different synchronization resolutions to cope with unexpected faults in space; the low-resolution backup method can be activated when the main method has problems.

  12. Estimation of kinetic and thermodynamic ligand-binding parameters using computational strategies.

    Science.gov (United States)

    Deganutti, Giuseppe; Moro, Stefano

    2017-04-01

    Kinetic and thermodynamic ligand-protein binding parameters are gaining growing importance as key information to consider in drug discovery. The determination of molecular structures, particularly using X-ray and NMR techniques, is crucial for understanding how a ligand recognizes its target in the final binding complex. However, for a better understanding of the recognition process, experimental studies of ligand-protein interactions are also needed. Even though several techniques can be used to investigate both the thermodynamic and kinetic profiles of a ligand-protein complex, these procedures are very often laborious, time consuming and expensive. In the last 10 years, computational approaches have shown enormous potential in providing insights into each of the above effects and in parsing their contributions to the changes in both kinetic and thermodynamic binding parameters. The main purpose of this review is to summarize the state of the art of computational strategies for estimating the kinetic and thermodynamic parameters of ligand-protein binding.

  13. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae.

    Directory of Open Access Journals (Sweden)

    Emanuel C Mora

    2013-06-01

    Full Text Available Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either heteroharmonic or homoharmonic. Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e. heteroharmonic neurons receive information from call and echo in different frequency-bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy. We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability of using long constant-frequency echolocation calls, high duty cycle echolocation and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been of advantage for categorizing prey size, hunting eared insects and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of the heteroharmonic echolocation in Mormoopidae and other families.

  14. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam; Jacobs, Sam Ade; Sharma, Shishir; Amato, Nancy M.; Rauchwerger, Lawrence

    2014-01-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the sub problems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  15. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam

    2014-05-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the sub problems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  16. A hybrid computational strategy to address WGS variant analysis in >5000 samples.

    Science.gov (United States)

    Huang, Zhuoyi; Rustagi, Navin; Veeraraghavan, Narayanan; Carroll, Andrew; Gibbs, Richard; Boerwinkle, Eric; Venkata, Manjunath Gorentla; Yu, Fuli

    2016-09-10

    The decreasing costs of sequencing are driving the need for cost-effective and real-time variant calling of whole genome sequencing data. The scale of these projects is far beyond the capacity of the typical computing resources available to most research labs. Other infrastructures, such as the cloud AWS environment and supercomputers, also have limitations that make large-scale joint variant calling infeasible, and infrastructure-specific variant calling strategies either fail to scale up to large datasets or abandon joint calling. We present a high-throughput framework including multiple variant callers for single nucleotide variant (SNV) calling, which leverages a hybrid computing infrastructure consisting of cloud AWS, supercomputers and local high performance computing infrastructures. We present a novel binning approach for large-scale joint variant calling and imputation which can scale up to over 10,000 samples while producing SNV callsets with high sensitivity and specificity. As a proof of principle, we present results of analysis on the Cohorts for Heart And Aging Research in Genomic Epidemiology (CHARGE) WGS freeze 3 dataset, in which joint calling, imputation and phasing of over 5300 whole genome samples was produced in under 6 weeks using four state-of-the-art callers. The callers used were SNPTools, GATK-HaplotypeCaller, GATK-UnifiedGenotyper and GotCloud. We used Amazon AWS, a 4000-core in-house cluster at Baylor College of Medicine, IBM power PC Blue BioU at Rice and Rhea at Oak Ridge National Laboratory (ORNL) for the computation. AWS was used for joint calling of 180 TB of BAM files, and the ORNL and Rice supercomputers were used for the imputation and phasing step. All other steps were carried out on the local compute cluster. The entire operation used 5.2 million core hours and transferred only a total of 6 TB of data across the platforms. Even with increasing sizes of whole genome datasets, ensemble joint calling of SNVs for low

  17. Axial power difference control strategy and computer simulation for GNPS during stretch-out and power decrease

    International Nuclear Information System (INIS)

    Liao Yehong; Xiao Min; Li Xianfeng; Zhu Minhong

    2004-01-01

    Successful control of the axial power difference for a PWR is crucial to nuclear safety. After analyzing the effect of various elements on the axial power distribution, different axial power deviation control strategies have been proposed to comply with different power decrease scenarios. Application of the strategy in computer simulation shows that our prediction of the axial power deviation evolution is comparable to the measured values, and that our control strategy is effective.

  18. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screenings) and to determine the most useful cut-off values for risk. Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100
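
    The incremental cost-effectiveness ratio that drives comparisons of this kind is simple arithmetic; a hedged sketch with made-up numbers (not the SURUSS or Québec figures) is shown below.

      def icer(cost_a, effect_a, cost_b, effect_b):
          """Extra cost per additional unit of effect when moving from strategy B to strategy A."""
          return (cost_a - cost_b) / (effect_a - effect_b)

      # Hypothetical example: strategy A detects 120 cases for $3.2M,
      # strategy B detects 100 cases for $2.6M.
      print(icer(3_200_000, 120, 2_600_000, 100))   # 30000.0, i.e. $30,000 per additional case detected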

  19. IMPLEMENTING THE COMPUTER-BASED NATIONAL EXAMINATION IN INDONESIAN SCHOOLS: THE CHALLENGES AND STRATEGIES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-12-01

    Full Text Available In line with technological development, the computer-based national examination (CBNE) has become an urgent matter as its implementation faces various challenges, especially in developing countries. Strategies in implementing CBNE are thus needed to face the challenges. The aim of this research was to analyse the challenges and strategies of Indonesian schools in implementing CBNE. This research was qualitative phenomenological in nature. The data were collected through a questionnaire and a focus group discussion. The research participants were teachers who were test supervisors and technicians at junior high schools and senior high schools (i.e. Levels 1 and 2) and vocational high schools implementing CBNE in Yogyakarta, Indonesia. The data were analysed using the Bogdan and Biklen model. The results indicate that (1) in implementing CBNE, the schools should initially make efforts to provide the electronic equipment supporting it; (2) the implementation of CBNE is challenged by problems concerning the Internet and the electricity supply; (3) the test supervisors have to learn their duties by themselves; and (4) the students are not yet familiar with the beneficial use of information technology. To deal with such challenges, the schools employed strategies by making efforts to provide the standard electronic equipment through collaboration with the students’ parents and improving the curriculum content by adding information technology as a school subject.

  20. Computationally efficient design of optimal output feedback strategies for controllable passive damping devices

    International Nuclear Information System (INIS)

    Kamalzare, Mahmoud; Johnson, Erik A; Wojtkiewicz, Steven F

    2014-01-01

    Designing control strategies for smart structures, such as those with semiactive devices, is complicated by the nonlinear nature of the feedback control, secondary clipping control and other additional requirements such as device saturation. The usual design approach resorts to large-scale simulation parameter studies that are computationally expensive. The authors have previously developed an approach for state-feedback semiactive clipped-optimal control design, based on a nonlinear Volterra integral equation that provides for the computationally efficient simulation of such systems. This paper expands the applicability of the approach by demonstrating that it can also be adapted to accommodate more realistic cases when, instead of full state feedback, only a limited set of noisy response measurements is available to the controller. This extension requires incorporating a Kalman filter (KF) estimator, which is linear, into the nominal model of the uncontrolled system. The efficacy of the approach is demonstrated by a numerical study of a 100-degree-of-freedom frame model, excited by a filtered Gaussian random excitation, with noisy acceleration sensor measurements to determine the semiactive control commands. The results show that the proposed method can improve computational efficiency by more than two orders of magnitude relative to a conventional solver, while retaining a comparable level of accuracy. Further, the proposed approach is shown to be similarly efficient for an extensive Monte Carlo simulation to evaluate the effects of sensor noise levels and KF tuning on the accuracy of the response. (paper)
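
    As a rough illustration of the estimator component, the sketch below shows a textbook discrete-time Kalman filter update feeding a clipped (saturated) state-feedback command. The system matrices, noise covariances, gain and saturation limit are assumptions for illustration; this is not the authors' Volterra-integral formulation.

      import numpy as np

      def kalman_step(x_hat, P, u, y, A, B, C, Q, R):
          """One predict/update cycle of a linear Kalman filter driven by a noisy measurement y."""
          # Predict
          x_pred = A @ x_hat + B @ u
          P_pred = A @ P @ A.T + Q
          # Update with the noisy measurement
          S = C @ P_pred @ C.T + R
          K = P_pred @ C.T @ np.linalg.inv(S)
          x_new = x_pred + K @ (y - C @ x_pred)
          P_new = (np.eye(len(x_hat)) - K @ C) @ P_pred
          return x_new, P_new

      def clipped_command(x_hat, K_fb, u_max):
          """Clipped-optimal command: nominal state-feedback force, saturated to device limits."""
          u = -K_fb @ x_hat
          return np.clip(u, -u_max, u_max)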

  1. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.

  2. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject’s true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have a significant impact on determining partitioning and species dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human
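
    A drastically simplified illustration of the pH-partition reasoning (not the Schmitt algorithm itself) is sketched below: only the unbound, un-ionized species is assumed to cross into saliva, so the chemical pKa, the pH of each fluid and the plasma protein binding set the saliva:plasma ratio. All values in the example are hypothetical.

      def neutral_fraction(pH, pKa, acid=True):
          """Henderson-Hasselbalch fraction of the un-ionized species at a given pH."""
          if acid:
              return 1.0 / (1.0 + 10 ** (pH - pKa))
          return 1.0 / (1.0 + 10 ** (pKa - pH))

      def saliva_to_plasma_ratio(pKa, fu_plasma, fu_saliva=1.0,
                                 pH_plasma=7.4, pH_saliva=6.8, acid=True):
          """Classic pH-partition estimate, scaled by the free (unbound) fraction in each fluid."""
          fn_p = neutral_fraction(pH_plasma, pKa, acid)
          fn_s = neutral_fraction(pH_saliva, pKa, acid)
          return (fn_p / fn_s) * (fu_plasma / fu_saliva)

      # Example: a weak acid with pKa 3.5 that is 95% protein bound in plasma
      print(saliva_to_plasma_ratio(pKa=3.5, fu_plasma=0.05, acid=True))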

  3. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps to understand the relationships between form and performance, and is crucial for developing well-performing final designs. Computer aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric design, and building performance simulation tools to support performance assessments. However, current tools and methods for computer aided conceptual design in architecture do not explicitly reveal nor allow for backtracking the relationships between performance and geometry of the design. They currently support post-engineering, rather than the early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance driven explorations. The approach is meant to facilitate the multidisciplinary integration and the learning process of the designer, and not to constrain the process in precompiled procedures or in hard engineering formulations, nor to automate it by delegating the design creativity to computational procedures. PAS (Performance Assessment Strategies) as a method is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling. It is structured in three parts. Pre-PAS provides guidelines for defining a design strategy, leading toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports the assessment of solutions based on numeric evaluations and performance simulations, until a suitable design solution is identified. PAS has been developed based on action research. Several case studies

  4. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species dependent differences based upon physiological variance between

  5. Computer use problems and accommodation strategies at work and home for people with systemic sclerosis: a needs assessment.

    Science.gov (United States)

    Baker, Nancy A; Aufman, Elyse L; Poole, Janet L

    2012-01-01

    We identified the extent of the need for interventions and assistive technology to prevent computer use problems in people with systemic sclerosis (SSc) and the accommodation strategies they use to alleviate such problems. Respondents were recruited through the Scleroderma Foundation. Twenty-seven people with SSc who used a computer and reported difficulty in working completed the Computer Problems Survey. All but one of the respondents reported at least one problem with at least one equipment type. The highest number of respondents reported problems with keyboards (88%) and chairs (85%). More than half reported discomfort in the past month associated with the chair, keyboard, and mouse. Respondents used a variety of accommodation strategies. Many respondents experienced problems and discomfort related to computer use. The characteristic symptoms of SSc may contribute to these problems. Occupational therapy interventions for computer use problems in clients with SSc need to be tested. Copyright © 2012 by the American Occupational Therapy Association, Inc.

  6. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  7. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes DEACO, a task scheduling strategy based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). To address the limitation of a single optimization objective in cloud computing task scheduling, the strategy jointly considers the shortest task completion time, cost and load balancing. DEACO uses the solution of the DE to initialize the pheromone of the ACO, which reduces the time spent accumulating pheromone in the early stage of the ACO, and improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on cloudsim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.
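
    The key coupling step, seeding the ACO pheromone matrix from a DE solution, can be sketched as follows. The assignment, matrix sizes, base pheromone and boost value are illustrative assumptions, not values from the paper, and the surrounding DE and ACO loops are omitted.

      import numpy as np

      def init_pheromone(de_assignment, n_tasks, n_vms, tau0=1.0, boost=3.0):
          """Uniform base pheromone, with the task-to-VM edges chosen by DE reinforced."""
          tau = np.full((n_tasks, n_vms), tau0)
          for task, vm in enumerate(de_assignment):
              tau[task, vm] += boost          # the DE solution seeds the colony
          return tau

      # Example: DE assigned 5 tasks to 3 VMs as [0, 2, 1, 0, 2]
      tau = init_pheromone([0, 2, 1, 0, 2], n_tasks=5, n_vms=3)
      print(tau)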

  8. Proxy-equation paradigm: A strategy for massively parallel asynchronous computations

    Science.gov (United States)

    Mittal, Ankita; Girimaji, Sharath

    2017-09-01

    Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order and, thus, (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
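
    For reference, the synchronous baseline of that proof-of-concept, an explicit upwind/central scheme for the 1-D advection-diffusion equation, is sketched below with illustrative grid and time-step values; the proxy-equation modification at PE boundaries is not reproduced here.

      import numpy as np

      def step(u, c, nu, dx, dt):
          """One explicit time step: upwind advection (c > 0) plus central diffusion, periodic BC."""
          adv = -c * (u - np.roll(u, 1)) / dx
          dif = nu * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
          return u + dt * (adv + dif)

      nx = 200
      dx = 1.0 / nx
      x = np.linspace(0.0, 1.0, nx, endpoint=False)
      u = np.exp(-200 * (x - 0.3) ** 2)                 # initial Gaussian pulse
      for _ in range(500):
          u = step(u, c=1.0, nu=1e-3, dx=dx, dt=2e-4)   # CFL and diffusion numbers well below 1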

  9. Digital mammography: Signal-extraction strategies for computer-aided detection of microcalcifications

    International Nuclear Information System (INIS)

    Chan, H.P.; Doi, K.; Metz, C.E.; Vyborny, C.J.; Lam, K.L.; Schmidt, R.A.

    1987-01-01

    The authors found that the structured background of a mammogram can be removed effectively by either a difference-image technique (using a matched filter in combination with a median filter, a contrast-reversal filter, or a box-rim filter) or a visual response filter alone. Locally adaptive gray-level thresholding and region-growing techniques can then be employed to extract microcalcifications from the processed image. Signals are further distinguished from noise or artifacts by their size, contrast, and clustering properties. The authors studied the dependence of the detectability of microcalcifications on the various signal-extraction strategies. Potential application of the computer-aided system to mammography is assessed by its performance on clinical mammograms
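
    A hedged sketch of the difference-image idea follows, with a Gaussian filter standing in for the matched (signal-enhancement) filter and purely illustrative filter sizes and threshold; it is not the authors' implementation.

      import numpy as np
      from scipy import ndimage

      def difference_image_candidates(mammogram, sigma=1.0, median_size=9, k=3.0):
          """Enhance small bright signals, suppress them with a median filter, subtract, threshold."""
          img = np.asarray(mammogram, dtype=float)
          enhanced = ndimage.gaussian_filter(img, sigma)        # stand-in for the matched filter
          background = ndimage.median_filter(img, size=median_size)
          diff = enhanced - background                          # structured background removed
          thresh = diff.mean() + k * diff.std()                 # simple global threshold
          labels, n_candidates = ndimage.label(diff > thresh)   # connected candidate regions
          return labels, n_candidates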

  10. Using computer-aided drug design and medicinal chemistry strategies in the fight against diabetes.

    Science.gov (United States)

    Semighini, Evandro P; Resende, Jonathan A; de Andrade, Peterson; Morais, Pedro A B; Carvalho, Ivone; Taft, Carlton A; Silva, Carlos H T P

    2011-04-01

    The aim of this work is to present a simple, practical and efficient protocol for drug design, applied here to diabetes, which includes selection of the disease, careful choice of a target as well as a bioactive ligand, and then the use of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates. We selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition contributes to reducing glucose levels in type 2 diabetes patients. The most active inhibitor with a reported complex X-ray structure was initially extracted from the BindingDB database. By using molecular modification strategies widely used in medicinal chemistry, together with current state-of-the-art tools in drug design (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, ADME and toxicity predictions), we propose four novel potential DPP-IV inhibitors with drug properties for diabetes control, which have been supported and validated by all the computational tools used herein.

  11. Examining the Effects of Two Computer Programming Learning Strategies: Self-Explanation versus Reading Questions and Answers

    Directory of Open Access Journals (Sweden)

    Nancy Lee

    2017-08-01

    Full Text Available The study described here explored the differential effects of two learning strategies, self-explanation and reading questions and answers, on learning the computer programming language JavaScript. Students’ test performance and perceptions of effectiveness toward the two strategies were examined. An online interactive tutorial instruction implementing worked-examples and multimedia learning principles was developed for this study. Participants were 147 high school students (ages 14 to 18) of an introductory computer course in six periods, which were randomly divided into two groups (n = 78; n = 69) of three periods each. The two groups alternated learning strategies to learn five lessons. Students’ prerequisite knowledge of XHTML and motivation to learn computer programming languages were measured before starting the tutorial. Students largely expressed their preference toward self-explanation over reading questions and answers. They thought self-explanation incurred much more work yet was more effective. However, the two learning strategies did not have differential effects on students’ test performance. The seeming discrepancy arising from students’ preferred strategy and their test performance was discussed in the areas of familiar versus new strategy, difficulty of learning materials and testing method, and experimental duration.

  12. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

    Directory of Open Access Journals (Sweden)

    Deyan Luan

    2007-07-01

    Full Text Available The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from the literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa) activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
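
    The Monte Carlo screening step can be illustrated generically as below; `model` is a hypothetical placeholder for the 92-protein cascade, and the correlation-based coefficient used here is only a simple stand-in for the overall state sensitivity coefficients computed in the study.

      import numpy as np

      def mc_sensitivity(model, p_nominal, n_samples=1000, spread=0.2, rng=None):
          """Sample parameters around nominal values, run the model, and rank parameters
          by the strength of their association with the model output."""
          rng = rng or np.random.default_rng(0)
          p_nominal = np.asarray(p_nominal, dtype=float)
          samples = p_nominal * rng.uniform(1 - spread, 1 + spread,
                                            size=(n_samples, p_nominal.size))
          outputs = np.array([model(p) for p in samples])
          sens = [abs(np.corrcoef(samples[:, j], outputs)[0, 1])
                  for j in range(p_nominal.size)]
          return np.array(sens)      # larger value = more fragile (sensitive) mechanism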

  13. CLOUD COMPUTING AND BIG DATA AS CONVERGENT TECHNOLOGIES FOR RETAIL PRICING STRATEGIES OF SMEs

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2014-05-01

    Full Text Available Most retailers know that technology has played an increasingly important role in helping retailers set prices. Online business decision systems are at the core of an SME's management and reporting activities. But, until recently, these efforts have been rooted in advances in computing technology, such as cloud computing and big data mining, rather than in newfound applications of scientific principles. In addition, in previous approaches big data mining solutions were implemented locally on private clouds, and no SME could aggregate and analyze the information that consumers are exchanging with each other. Real science is a powerful, pervasive force in retail today, particularly so for addressing the complex challenge of retail pricing. Cloud computing comes in to provide access to entirely new business capabilities through sharing resources and services and managing and assigning resources effectively. Done right, the application of scientific principles to the creation of a true price optimization strategy can lead to significant sales, margin, and profit lift for retailers. In this paper we describe a method to provide mobile retail consumers with reviews, brand ratings and detailed product information at the point of sale. Furthermore, we present how we use the Exalead CloudView platform to search for weak signals in big data by analyzing multimedia data (text, voice, picture, video) and mining online social networks. The analysis makes not only customer profiling possible, but also brand promotion in the form of coupons, discounts or upselling to generate more sales, thus providing the opportunity for retailer SMEs to connect directly to their customers in real time. The paper explains why retailers can no longer thrive without a science-based pricing system, defines and illustrates the right science-based approach, and calls out the key features and functionalities of leading science-based price optimization systems. In particular, given

  14. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.
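
    As a much lighter-weight proxy for the mixture-prior MCMC scheme described above, the sketch below fits a Gaussian process with a per-dimension (ARD) length-scale and ranks predictors by inverse length-scale; the data are synthetic and the approach is only an illustrative stand-in for the paper's method.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 5))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=120)   # only x0, x1 matter

      kernel = RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=0.1)     # anisotropic (ARD) kernel
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
      relevance = 1.0 / gp.kernel_.k1.length_scale      # short length-scale suggests a relevant predictor
      print(np.argsort(relevance)[::-1])                # expected to rank predictors 0 and 1 first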

  15. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  16. Game-Based Practice versus Traditional Practice in Computer-Based Writing Strategy Training: Effects on Motivation and Achievement

    Science.gov (United States)

    Proske, Antje; Roscoe, Rod D.; McNamara, Danielle S.

    2014-01-01

    Achieving sustained student engagement with practice in computer-based writing strategy training can be a challenge. One potential solution is to foster engagement by embedding practice in educational games; yet there is currently little research comparing the effectiveness of game-based practice versus more traditional forms of practice. In this…

  17. The Effectiveness of Interactive Computer Assisted Modeling in Teaching Study Strategies and Concept Mapping of College Textbook Material.

    Science.gov (United States)

    Mikulecky, Larry

    A study evaluated the effectiveness of a series of print materials and interactive computer-guided study programs designed to lead undergraduate students to apply basic textbook reading and concept mapping strategies to the study of science and social science textbooks. Following field testing with 25 learning skills students, 50 freshman biology…

  18. Technologies and Business Value of Cloud Computing: Strategy for the Department of Information Processing

    OpenAIRE

    Manyando, Obyster

    2013-01-01

    This research focuses on the area of cloud computing, an important research area in computer and service science. The general aim of this research is to emphasize the business value of cloud computing to businesses. The objective of this research is to explore the economic benefits of cloud computing in order to promote its use in business and the education sector. Consequently, the research recommends the adoption of cloud computing by the Kemi-Tornio University of Applied Sciences. The ...

  19. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  20. Individual Differences in Strategy Use on Division Problems: Mental versus Written Computation

    Science.gov (United States)

    Hickendorff, Marian; van Putten, Cornelis M.; Verhelst, Norman D.; Heiser, Willem J.

    2010-01-01

    Individual differences in strategy use (choice and accuracy) were analyzed. A sample of 362 Grade 6 students solved complex division problems under 2 different conditions. In the choice condition students were allowed to use either a mental or a written strategy. In the subsequent no-choice condition, they were required to use a written strategy.…

  1. Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies.

    Science.gov (United States)

    Khalil, Mohammed K; Mansour, Mahmoud M; Wilhite, Dewey R

    2010-01-01

    Strategies of presenting instructional information affect the type of cognitive load imposed on the learner's working memory. Effective instruction reduces extraneous (ineffective) cognitive load and promotes germane (effective) cognitive load. Eighty first-year students from two veterinary schools completed a two-section questionnaire that evaluated their perspectives on the educational value of a computer-based instructional program. They compared the difference between cognitive loads imposed by paper-based and computer-based instructional strategies used to teach the anatomy of the canine skeleton. Section I included 17 closed-ended items, rated on a five-point Likert scale, that assessed the use of graphics, content, and the learning process. Section II included a nine-point mental effort rating scale to measure the level of difficulty of instruction; students were asked to indicate the amount of mental effort invested in the learning task using both paper-based and computer-based presentation formats. The closed-ended data were expressed as means and standard deviations. A paired t test with an alpha level of 0.05 was used to determine the overall mean difference between the two presentation formats. Students positively evaluated their experience with the computer-based instructional program with a mean score of 4.69 (SD=0.53) for use of graphics, 4.70 (SD=0.56) for instructional content, and 4.45 (SD=0.67) for the learning process. The mean difference of mental effort (1.50) between the two presentation formats was significant, t=8.26, p≤.0001, df=76, for two-tailed distribution. Consistent with cognitive load theory, innovative computer-based instructional strategies decrease extraneous cognitive load compared with traditional paper-based instructional strategies.
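
    The paired comparison itself is a routine computation; a sketch with made-up ratings (not the study's data) is shown below.

      import numpy as np
      from scipy import stats

      # Hypothetical mental-effort ratings (1-9 scale) for the same learners under two formats
      effort_paper = np.array([7, 6, 8, 7, 5, 6, 7, 8])
      effort_computer = np.array([5, 5, 6, 5, 4, 5, 5, 6])

      t, p = stats.ttest_rel(effort_paper, effort_computer)   # paired, two-tailed by default
      print(f"t = {t:.2f}, p = {p:.4f}, df = {len(effort_paper) - 1}")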

  2. A computational approach to compare regression modelling strategies in prediction research.

    Science.gov (United States)

    Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H

    2016-08-25

    It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.
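
    A minimal sketch of comparing two logistic-regression modelling strategies by out-of-sample Brier score is given below on synthetic data; the two strategies shown (default versus stronger ridge shrinkage) are illustrative, not the five compared in the paper.

      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import brier_score_loss
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4, random_state=0)

      strategies = {
          "default ridge (C=1.0)": LogisticRegression(max_iter=1000),
          "strong ridge shrinkage (C=0.1)": LogisticRegression(C=0.1, max_iter=1000),
      }
      for name, model in strategies.items():
          p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
          print(name, round(brier_score_loss(y_te, p), 4))   # lower Brier score = better calibration/accuracy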

  3. Comparison between conservative perturbation and sampling based methods for propagation of Non-Neutronic uncertainties

    International Nuclear Information System (INIS)

    Campolina, Daniel de A.M.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2013-01-01

    For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for a best estimate calculation, which has been replacing conservative model calculations as the computational power increases. The propagation of uncertainty in a simulation using sampling-based methods is recent because of the huge computational effort required. In this work a sample space of MCNP calculations was used as a black-box model to propagate the uncertainty of system parameters. The efficiency of the method was compared to a conservative method. The uncertainties considered in the reactor input parameters were non-neutronic, including geometry dimensions and density. The effect of the uncertainties on the effective multiplication factor of the system was analyzed with respect to the possibility of using many uncertainties in the same input. If the case includes more than 46 parameters with uncertainty in the same input, the sampling-based method proves to be more efficient than the conservative method. (author)
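
    The sampling-based propagation loop can be sketched generically as below; `run_mcnp` is a hypothetical stand-in for executing one transport calculation, and the Gaussian sampling of each parameter is an assumption made purely for illustration.

      import numpy as np

      def propagate(run_mcnp, nominal, rel_unc, n_samples=200, rng=None):
          """Draw input parameters from their uncertainty ranges, run the black-box model
          for each sample, and summarize the spread of the resulting k-eff values."""
          rng = rng or np.random.default_rng(42)
          keff = []
          for _ in range(n_samples):
              sample = {k: rng.normal(nominal[k], rel_unc[k] * nominal[k]) for k in nominal}
              keff.append(run_mcnp(sample))       # one full transport calculation per sample
          keff = np.array(keff)
          return keff.mean(), keff.std(ddof=1)    # best estimate and its sampled uncertainty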

  4. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Full Text Available Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which is also dependent on the robot positioning in the Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing the stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled in terms of its components (product, process and resource), and by automatically configuring a sampling-based motion planning problem that is solved with the transition-based rapidly-exploring random tree algorithm. The approach is simulated in CAM software for a machining path, revealing its functionality and outlining future potential for optimal motion generation in robotic machining processes.

  5. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called complex-plane strategy, is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model, in the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are slightly distorted. For an accurate simulation of differential rotation, a versatile method, called multiple partition technique is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  6. The Toggle Local Planner for sampling-based motion planning

    KAUST Repository

    Denny, Jory

    2012-05-01

    Sampling-based solutions to the motion planning problem, such as the probabilistic roadmap method (PRM), have become commonplace in robotics applications. These solutions are the norm as the dimensionality of the planning space grows, i.e., d > 5. An important primitive of these methods is the local planner, which is used for validation of simple paths between two configurations. The most common is the straight-line local planner which interpolates along the straight line between the two configurations. In this paper, we introduce a new local planner, Toggle Local Planner (Toggle LP), which extends local planning to a two-dimensional subspace of the overall planning space. If no path exists between the two configurations in the subspace, then Toggle LP is guaranteed to correctly return false. Intuitively, more connections could be found by Toggle LP than by the straight-line planner, resulting in better connected roadmaps. As shown in our results, this is the case, and additionally, the extra cost, in terms of time or storage, for Toggle LP is minimal. Additionally, our experimental analysis of the planner shows the benefit for a wide array of robots, with DOF as high as 70. © 2012 IEEE.
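
    For context, the baseline straight-line local planner that Toggle LP extends can be sketched in a few lines; `is_valid` is a hypothetical collision checker supplied by the planner and the interpolation resolution is an illustrative assumption.

      import numpy as np

      def straight_line_lp(q1, q2, is_valid, resolution=0.05):
          """Interpolate between two configurations at a fixed resolution and accept the edge
          only if every intermediate configuration is collision-free."""
          q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
          n_steps = max(1, int(np.ceil(np.linalg.norm(q2 - q1) / resolution)))
          for i in range(n_steps + 1):
              q = q1 + (q2 - q1) * i / n_steps
              if not is_valid(q):
                  return False      # one invalid intermediate configuration blocks the edge
          return True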

  7. Computer-aided detection of breast masses: Four-view strategy for screening mammography

    International Nuclear Information System (INIS)

    Wei Jun; Chan Heangping; Zhou Chuan; Wu Yita; Sahiner, Berkman; Hadjiiski, Lubomir M.; Roubidoux, Marilyn A.; Helvie, Mark A.

    2011-01-01

    Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding views of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively

  8. Biliary drainage strategy of unresectable malignant hilar strictures by computed tomography volumetry.

    Science.gov (United States)

    Takahashi, Ei; Fukasawa, Mitsuharu; Sato, Tadashi; Takano, Shinichi; Kadokura, Makoto; Shindo, Hiroko; Yokota, Yudai; Enomoto, Nobuyuki

    2015-04-28

    To identify criteria for predicting successful drainage of unresectable malignant hilar biliary strictures (UMHBS) because no ideal strategy currently exists. We examined 78 patients with UMHBS who underwent biliary drainage. Drainage was considered effective when the serum bilirubin level decreased by ≥ 50% from the value before stent placement within 2 wk after drainage, without additional intervention. Complications that occurred within 7 d after stent placement were considered as early complications. Before drainage, the liver volume of each section (lateral and medial sections of the left liver and anterior and posterior sections of the right liver) was measured using computed tomography (CT) volumetry. Drained liver volume was calculated based on the volume of each liver section and the type of bile duct stricture (according to the Bismuth classification). Tumor volume, which was calculated by using CT volumetry, was excluded from the volume of each section. Receiver operating characteristic (ROC) analysis was performed to identify the optimal cutoff values for drained liver volume. In addition, factors associated with the effectiveness of drainage and early complications were evaluated. Multivariate analysis showed that drained liver volume [odds ratio (OR) = 2.92, 95%CI: 1.648-5.197; P < 0.001] and impaired liver function (with decompensated liver cirrhosis) (OR = 0.06, 95%CI: 0.009-0.426; P = 0.005) were independent factors contributing to the effectiveness of drainage. ROC analysis for effective drainage showed cutoff values of 33% of liver volume for patients with preserved liver function (with normal liver or compensated liver cirrhosis) and 50% for patients with impaired liver function (with decompensated liver cirrhosis). The sensitivity and specificity of these cutoff values were 82% and 80% for preserved liver function, and 100% and 67% for impaired liver function, respectively. Among patients who met these criteria, the rate of effective drainage

  9. Computer simulation of wolf-removal strategies for animal damage control

    Science.gov (United States)

    Robert G. Haight; Laurel E. Travis; Kevin Nimerfro; L. David Mech

    2002-01-01

    Because of the sustained growth of the gray wolf (Canis lupus) population in the western Great Lakes region of the United States, management agencies are anticipating gray wolf removal from the federal endangered species list and are proposing strategies for wolf management. Strategies are needed that would balance conflicting public demands for wolf...

  10. Towards a strategy for the introduction of information and computer literacy (ICL) courses

    NARCIS (Netherlands)

    Plomp, T.; Carleer, G.J.

    1987-01-01

    An important goal of the national policy on computers in education in the Netherlands is the familiarization of all citizens with information technology. This policy was a plea for some basic education in information and computer literacy. In the beginning of the implementation of this basic

  11. Different strategies in solving series completion inductive reasoning problems: an fMRI and computational study.

    Science.gov (United States)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng

    2014-08-01

    The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks, even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves directly retrieving arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. High performance computing of density matrix renormalization group method for 2-dimensional model. Parallelization strategy toward peta computing

    International Nuclear Information System (INIS)

    Yamada, Susumu; Igarashi, Ryo; Machida, Masahiko; Imamura, Toshiyuki; Okumura, Masahiko; Onishi, Hiroaki

    2010-01-01

    We parallelize the density matrix renormalization group (DMRG) method, which is a ground-state solver for one-dimensional quantum lattice systems. The parallelization allows us to extend the applicable range of the DMRG to n-leg ladders, i.e., quasi-two-dimensional cases. Such an extension is expected to bring about several breakthroughs in, e.g., quantum physics, chemistry, and nano-engineering. However, the straightforward parallelization requires all-to-all communications between all processes, which are unsuitable for multi-core systems, the mainstream of current parallel computers. Therefore, we optimize the all-to-all communications in the following two steps. The first is the elimination of the communications between all processes by merely rearranging the data distribution while keeping the amount of communicated data unchanged. The second is the avoidance of communication conflicts by rescheduling the calculation and the communication. We evaluate the performance of the DMRG method on multi-core supercomputers and confirm that our two-step tuning is quite effective. (author)

  13. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    In using the perturbation theory, the uncertainty of the response can be estimated with a single transport simulation, and therefore it requires a small computational load. However, it has the disadvantage that the computation methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data to analyze the uncertainty of the response. XSUSA is a code based on the statistical approach. Only the cross sections are modified with the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the results, the code simulation must be repeated many times with randomly sampled cross sections. This inefficiency is a known disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method for the cross sections is proposed and verified. The main feature of the proposed method is that the cross section averaged over the individually sampled cross sections is used. For the use of the proposed method, the validation was performed using the perturbation theory
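
    To make the sampling-based workflow concrete, the sketch below propagates an input uncertainty described by a covariance matrix through a stand-in response function and estimates the output uncertainty from the sample spread. The response function, covariance values, and sample size are illustrative assumptions, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative input uncertainty: mean cross sections and a covariance matrix.
mean_xs = np.array([1.20, 0.80, 2.50])
cov_xs = np.array([[0.010, 0.002, 0.000],
                   [0.002, 0.008, 0.001],
                   [0.000, 0.001, 0.020]])

def response(xs):
    """Stand-in for a transport calculation returning a scalar response (e.g., k_eff)."""
    return 1.0 + 0.05 * xs[0] - 0.03 * xs[1] + 0.01 * xs[2] ** 2

# Sampling-based propagation: sample the inputs, run the "code" once per sample.
samples = rng.multivariate_normal(mean_xs, cov_xs, size=500)
outputs = np.array([response(s) for s in samples])

print(f"response mean = {outputs.mean():.5f}, 1-sigma uncertainty = {outputs.std(ddof=1):.5f}")
```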

  14. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    Directory of Open Access Journals (Sweden)

    Li MingChu

    2017-01-01

    Full Text Available Coordinated terrorist attacks are becoming an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. Therefore, an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous studies on monitoring strategy generation fail to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure to watch potential terrorists and destroy the plot, and cause a huge risk to public security. This paper makes two major contributions. First, we develop a new Stackelberg game model for the security agency to generate an optimal monitoring strategy that takes information leakage into account. Second, we provide a double-oracle framework, DO-TPDIL, for efficient calculation. The experimental results show that our approach can obtain robust strategies against information leakage with high feasibility and efficiency.

  15. Organizational Strategy and Business Environment Effects Based on a Computation Method

    Science.gov (United States)

    Reklitis, Panagiotis; Konstantopoulos, Nikolaos; Trivellas, Panagiotis

    2007-12-01

    According to many researchers in organizational theory, a great number of problems encountered by manufacturing firms are due to their ineffectiveness in responding to significant changes in their external environment and in aligning their competitive strategy accordingly. From this point of view, the pursuit of the appropriate generic strategy is vital for firms facing a dynamic and highly competitive environment. In the present paper, we adopt Porter's typology to operationalise organizational strategy (cost leadership, innovative and marketing differentiation, and focus) while considering changes in the external business environment (dynamism, complexity and munificence). Although the simulation of social events is quite a difficult task, since so many considerations (not all well understood) are involved, in the present study we developed a dynamic system based on the conceptual framework of strategy-environment associations.

  16. The Pitzer-Lee-Kesler-Teja (PLKT) Strategy and Its Implementation by Meta-Computing Software

    Czech Academy of Sciences Publication Activity Database

    Smith, W. R.; Lísal, Martin; Missen, R. W.

    2001-01-01

    Vol. 35, No. 1 (2001), pp. 68-73, ISSN 0009-2479 Institutional research plan: CEZ:AV0Z4072921 Keywords: The Pitzer-Lee-Kesler-Teja (PLKT) strategy * implementation Subject RIV: CF - Physical; Theoretical Chemistry

  17. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    OpenAIRE

    Li MingChu; Yang Zekun; Lu Kun; Guo Cheng

    2017-01-01

    Coordinated terrorist attacks are becoming an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. Therefore, an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous studies on monitoring strategy generation fail to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure to watch...

  18. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

    Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).

  19. Fuzzy Cognitive and Social Negotiation Agent Strategy for Computational Collective Intelligence

    Science.gov (United States)

    Chohra, Amine; Madani, Kurosh; Kanzari, Dalel

    Finding an adequate (win-win for both parties) negotiation strategy with incomplete information for autonomous agents, even in one-to-one negotiation, is a complex problem. Moreover, negotiation behaviors, in which characters such as conciliatory or aggressive define a 'psychological' aspect of the negotiator's personality, play an important role. The aim of this paper is to develop a fuzzy cognitive and social negotiation strategy for autonomous agents with incomplete information, where the conciliatory, neutral, or aggressive characters are integrated into negotiation behaviors (inspired by research works analyzing human behavior and by social negotiation psychology). For this purpose, first, a one-to-one bargaining process, in which a buyer agent and a seller agent negotiate over a single issue (price), is developed for a time-dependent strategy (based on the time-dependent behaviors of Faratin et al.) and for a fuzzy cognitive and social strategy. Second, experimental environments and measures, allowing a set of experiments carried out for different negotiation deadlines of buyer and seller agents, are detailed. Third, experimental results for both the time-dependent and the fuzzy cognitive and social strategies are presented, analyzed, and compared for different agent deadlines. The suggested fuzzy cognitive and social strategy allows agents to improve the negotiation process, with respect to the time-dependent one, in terms of agent utilities, number of rounds to reach an agreement, and percentage of agreements.
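
    For reference, a minimal sketch of a Faratin-style time-dependent concession tactic is given below. The record does not spell out the exact parameterization used in the paper, so the polynomial form and the parameter values here are assumptions.

```python
def time_dependent_offer(t, t_max, p_min, p_max, beta, buyer=True, kappa=0.0):
    """Faratin-style time-dependent tactic (polynomial form, assumed here).

    beta < 1 concedes late (Boulware-like), beta > 1 concedes early (Conceder-like).
    """
    alpha = kappa + (1.0 - kappa) * (min(t, t_max) / t_max) ** (1.0 / beta)
    if buyer:   # the buyer starts low and concedes upward toward its reservation price
        return p_min + alpha * (p_max - p_min)
    else:       # the seller starts high and concedes downward
        return p_max - alpha * (p_max - p_min)

# Example: a buyer with a 10-round deadline, price range [50, 100], late concession.
offers = [round(time_dependent_offer(t, 10, 50, 100, beta=0.5), 2) for t in range(11)]
print(offers)
```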

  20. What's New in Software? Computers and the Writing Process: Strategies That Work.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)

  1. Tree Based Decision Strategies and Auctions in Computational Multi-Agent Systems

    Czech Academy of Sciences Publication Activity Database

    Šlapák, M.; Neruda, Roman

    2017-01-01

    Vol. 38, No. 4 (2017), pp. 335-342, ISSN 0257-4306 Institutional support: RVO:67985807 Keywords: auction systems * decision making * genetic programming * multi-agent system * task distribution Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) http://rev-inv-ope.univ-paris1.fr/fileadmin/rev-inv-ope/files/38417/38417-04.pdf

  2. Development and Evaluation of a Computer-Based Learning Environment for Teachers: Assessment of Learning Strategies in Learning Journals

    Directory of Open Access Journals (Sweden)

    Inga Glogger

    2013-01-01

    Full Text Available Training teachers to assess important components of self-regulated learning, such as learning strategies, is an important, yet somewhat neglected, aspect of the integration of self-regulated learning at school. Learning journals can be used to assess learning strategies in line with cyclical process models of self-regulated learning, allowing for rich formative feedback. Against this background, we developed a computer-based learning environment (CBLE) that trains teachers to assess learning strategies with learning journals. The contents of the CBLE and its instructional design were derived from theory. The CBLE was further shaped by research in a design-based manner. Finally, in two evaluation studies, student teachers (N1 = 44; N2 = 89) worked with the CBLE. We analyzed satisfaction, interest, usability, and assessment skills. Additionally, in evaluation study 2, the effects of an experimental variation on motivation and assessment skills were tested. We found high satisfaction, interest, and good usability, as well as satisfactory assessment skills, after working with the CBLE. The results show that teachers can be trained to assess learning strategies in learning journals. The developed CBLE offers new perspectives on how to support teachers in fostering learning strategies as a central component of effective self-regulated learning at school.

  3. The Effectiveness of Using Contextual Clues, Dictionary Strategy and Computer Assisted Language Learning (CALL) in Learning Vocabulary

    Directory of Open Access Journals (Sweden)

    Zuraina Ali

    2013-07-01

    Full Text Available This study investigates the effectiveness of three vocabulary learning methods, namely Contextual Clues, Dictionary Strategy, and Computer Assisted Language Learning (CALL), in learning vocabulary among ESL learners. First, it aims at finding which of the vocabulary learning methods, namely Dictionary Strategy, Contextual Clues, and CALL, results in the highest number of words learnt in the immediate and delayed recall tests. Second, it compares the results of the Pre-test and the Delayed Recall Post-test to determine the differences in learning vocabulary using the three methods. A quasi-experiment that tested the effectiveness of learning vocabulary using Dictionary Strategy, Contextual Clues, and CALL involved 123 first-year university students. Qualitative procedures included the collection of data from interviews, which were conducted to triangulate the data obtained from the quantitative inquiries. Findings from the study using ANOVA revealed that there were significant differences when students were exposed to Dictionary Strategy, Contextual Clues, and CALL in the immediate recall tests but not in the Delayed Recall Post-test. Also, there were significant differences when a t-test was used to compare the scores between the Pre-test and the Delayed Recall Post-test for the three methods of vocabulary learning. Although many researchers have advocated the relative effectiveness of Dictionary Strategy, Contextual Clues, and CALL in learning vocabulary, the study is nevertheless still important, since no prior study has empirically investigated the relative efficacy of these three methods in a single study.

  4. Improving Learning Outcomes in the Chemistry Teaching and Learning Strategies Course through Independent Study with the Help of Computer-Based Media

    Science.gov (United States)

    Sugiharti, Gulmah

    2018-03-01

    This study aims to assess the improvement of student learning outcomes through independent learning using computer-based learning media in the STBM (Teaching and Learning Strategy) Chemistry course. The population in this research was all students of the class of 2014 taking the STBM Chemistry course, comprising 4 classes. The sample was taken purposively: 2 classes of 32 students each, serving as the control class and the experiment class. The instrument used was a learning outcomes test in the form of multiple choice with 20 questions that had been declared valid and reliable. Data analysis used a one-sided t-test, and the improvement in learning outcomes was assessed with a normalized gain test. Based on the learning result data, the average normalized gain value is 0.530 for the experimental class and 0.224 for the control class, i.e., an improvement of 53% for the experimental class and 22.4% for the control class. Hypothesis testing gives t_count > t_table (9.02 > 1.6723) at the significance level α = 0.05 with df = 58. This means that Ha is accepted: the use of computer-based learning media (CAI) can improve student learning outcomes in the Teaching and Learning Strategy (STBM) Chemistry course in the 2017/2018 academic year.
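
    The normalized gain mentioned above is presumably the Hake gain; a short calculation is sketched below, assuming scores on a 0-100 scale (the scale is not stated in the record) and using invented numbers, not data from the study.

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake-style normalized gain: fraction of the possible improvement actually achieved."""
    return (post - pre) / (max_score - pre)

# Illustrative numbers only: pre-test score 40, post-test score 72 out of 100.
print(round(normalized_gain(40, 72), 3))
```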

  5. Survey of Storage and Fault Tolerance Strategies Used in Cloud Computing

    Science.gov (United States)

    Ericson, Kathleen; Pallickara, Shrideep

    Cloud computing has gained significant traction in recent years. Companies such as Google, Amazon and Microsoft have been building massive data centers over the past few years. Spanning geographic and administrative domains, these data centers tend to be built out of commodity desktops, with the total number of computers managed by these companies being on the order of millions. Additionally, the use of virtualization allows a physical node to be presented as a set of virtual nodes, resulting in a seemingly inexhaustible set of computational resources. By leveraging economies of scale, these data centers can provision CPU, networking, and storage at substantially reduced prices, which in turn underpins the move by many institutions to host their services in the cloud.

  6. A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing.

    Energy Technology Data Exchange (ETDEWEB)

    Vineyard, Craig Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    As high performance computing architectures pursue more computational power there is a need for increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types with different characteristics as varying levels of the same architecture. How to efficiently utilize this memory infrastructure is an unknown challenge, and in this research we sought to investigate whether neural inspired approaches can meaningfully help with memory management. In particular we explored neurogenesis inspired resource allocation, and were able to show a neural inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.

  7. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sample efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate a LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range, and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 n-sample replicates was adopted as the convergence criterion of the method. An estimate of 75 pcm uncertainty on the reactor k_eff was obtained by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than double the number of particles followed in the Monte Carlo process of the MCNPX code. (author)
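
    The replicate-based convergence criterion described above can be mimicked as follows: repeat the whole sampling-based propagation several times and monitor the spread of the resulting uncertainty estimates. The toy model, noise level, and sample sizes are illustrative assumptions, not the actual MCNPX setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def propagated_uncertainty(n_samples, mc_noise=0.0003):
    """One replicate: propagate a 1-sigma input uncertainty through a toy model.

    Each "code run" is polluted with pseudo Monte Carlo noise (mc_noise) to mimic
    the computational uncertainty of a stochastic transport calculation.
    """
    radius = rng.normal(1.0, 0.02, size=n_samples)            # uncertain input parameter
    keff = 1.0 - 0.05 * (radius - 1.0) + rng.normal(0.0, mc_noise, size=n_samples)
    return keff.std(ddof=1) * 1.0e5                           # expressed in pcm

# Convergence check: repeat the replicate and look at the spread of the estimates.
replicates = np.array([propagated_uncertainty(100) for _ in range(10)])
print(f"mean propagated uncertainty = {replicates.mean():.1f} pcm, "
      f"spread over replicates = {replicates.std(ddof=1):.1f} pcm")
```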

  8. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied, along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va, to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a large reduction of computing time by factors on the order of 100. (authors)
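
    One way to see why two appropriately paired series of samples can suffice is sketched below: if both series reuse the same epistemic samples but carry independent Monte Carlo (aleatoric) noise, the covariance between the series estimates the purely epistemic variance, because the independent noise cancels on average. This is an illustrative reading of the idea, not necessarily the exact estimator used by the authors.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200                                        # number of epistemic samples (e.g., sampled cross sections)
epistemic = rng.normal(0.0, 1.0, size=n)       # epistemic contribution to the response (true variance 1.0)
aleatoric_sigma = 1.5                          # noise from using only few Monte Carlo histories

# Two series: same epistemic samples, independent Monte Carlo (aleatoric) noise.
series_a = epistemic + rng.normal(0.0, aleatoric_sigma, size=n)
series_b = epistemic + rng.normal(0.0, aleatoric_sigma, size=n)

total_var = series_a.var(ddof=1)                          # epistemic + aleatoric variance
epistemic_var = np.cov(series_a, series_b, ddof=1)[0, 1]  # the noise cancels in the covariance

print(f"total variance = {total_var:.2f}")
print(f"epistemic variance (true 1.0, estimated) = {epistemic_var:.2f}")
```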

  9. Monitoring and Depth of Strategy Use in Computer-Based Learning Environments for Science and History

    Science.gov (United States)

    Deekens, Victor M.; Greene, Jeffrey A.; Lobczowski, Nikki G.

    2018-01-01

    Background: Self-regulated learning (SRL) models position metacognitive monitoring as central to SRL processing and predictive of student learning outcomes (Winne & Hadwin, 2008; Zimmerman, 2000). A body of research evidence also indicates that depth of strategy use, ranging from surface to deep processing, is predictive of learning…

  10. Different strategies in solving series completion inductive reasoning problems : An fMRI and computational study

    NARCIS (Netherlands)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A.; Zhong, Ning; Li, Kuncheng

    The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series

  11. InfoMall: An Innovative Strategy for High-Performance Computing and Communications Applications Development.

    Science.gov (United States)

    Mills, Kim; Fox, Geoffrey

    1994-01-01

    Describes the InfoMall, a program led by the Northeast Parallel Architectures Center (NPAC) at Syracuse University (New York). The InfoMall features a partnership of approximately 24 organizations offering linked programs in High Performance Computing and Communications (HPCC) technology integration, software development, marketing, education and…

  12. A dimension reduction strategy for improving the efficiency of computer-aided detection for CT colonography

    Science.gov (United States)

    Song, Bowen; Zhang, Guopeng; Wang, Huafeng; Zhu, Wei; Liang, Zhengrong

    2013-02-01

    Various types of features, e.g., geometric features, texture features, projection features, etc., have been introduced for polyp detection and differentiation tasks via computer-aided detection and diagnosis (CAD) for computed tomography colonography (CTC). Although these features together cover more information of the data, some of them are statistically highly related to others, which makes the feature set redundant and burdens the computation task of CAD. In this paper, we propose a new dimension reduction method which combines hierarchical clustering and principal component analysis (PCA) for the false positive (FP) reduction task. First, we group all the features based on their similarity using hierarchical clustering, and then PCA is employed within each group. Different numbers of principal components are selected from each group to form the final feature set. A support vector machine is used to perform the classification. The results show that when three principal components were chosen from each group we achieved an area under the receiver operating characteristic curve of 0.905, which is as high as with the original dataset. Meanwhile, the computation time is reduced by 70% and the feature set size is reduced by 77%. It can be concluded that the proposed method captures the most important information of the feature set and the classification accuracy is not affected after the dimension reduction. The result is promising and further investigation, such as automatic threshold setting, is worthwhile and in progress.
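
    A minimal sketch of the group-then-reduce idea follows: cluster features by correlation distance, then run PCA within each cluster and keep a few components per group. The library choices (SciPy/scikit-learn), the synthetic data, and the numbers of clusters and components are assumptions for illustration only.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 12))                 # 300 samples, 12 features
X[:, 6:] += 0.8 * X[:, :6]                     # induce correlation between feature groups

# 1) Cluster features by correlation distance (1 - |corr|).
corr = np.corrcoef(X, rowvar=False)
dist_condensed = (1.0 - np.abs(corr))[np.triu_indices(12, k=1)]
labels = fcluster(linkage(dist_condensed, method="average"),
                  t=4, criterion="maxclust")   # assume 4 feature groups

# 2) PCA within each group, keeping a few components per group.
reduced = []
for g in np.unique(labels):
    cols = np.where(labels == g)[0]
    n_comp = min(3, len(cols))                 # assume up to 3 components per group
    reduced.append(PCA(n_components=n_comp).fit_transform(X[:, cols]))

X_reduced = np.hstack(reduced)
print("original features:", X.shape[1], "-> reduced features:", X_reduced.shape[1])
```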

  13. Do company strategies and structures converge in global markets? Evidence from the computer industry

    NARCIS (Netherlands)

    Duysters, G.M.; Hagedoorn, J.

    2001-01-01

    This note examines isomorphism and diversity in a global industry. We study how the ongoing internationalisation process has affected companies from various regions of the world. Empirical research is focussed on the international computer industry. We find that companies in this sector have become

  14. Text-mining strategies to support computational research in chemical toxicity (ACS 2017 Spring meeting)

    Science.gov (United States)

    With 26 million citations, PubMed is one of the largest sources of information about the activity of chemicals in biological systems. Because this information is expressed in natural language and not stored as data, using the biomedical literature directly in computational resear...

  15. Exploring the Strategies for a Community College Transition into a Cloud-Computing Environment

    Science.gov (United States)

    DeBary, Narges

    2017-01-01

    The use of the Internet has resulted in the birth of an innovative virtualization technology called cloud computing. Virtualization can tremendously improve the instructional and operational systems of a community college. Although the incidental adoption of the cloud solutions in the community colleges of higher education has been increased,…

  16. A study on evaluation strategies in dimensional X-ray computed tomography by estimation of measurement uncertainties

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2012-01-01

    Computed tomography entered the industrial world in the 1980s as a technique for nondestructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates...... measurement results using different measuring strategies applied in different inspection software packages for volume and surface data analysis. The strategy influence is determined by calculating the measurement uncertainty. This investigation includes measurements of two industrial items, an aluminium pipe...... connector and a plastic toggle, a hearing aid component. These are measured using a commercial CT scanner. Traceability is transferred using tactile and optical coordinate measuring machines, which are used to produce reference measurements. Results show that measurements of diameter for both parts resulted...

  17. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention

    Science.gov (United States)

    2013-01-01

    Background The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. Methods The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants’ characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. Results A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. Conclusions With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle

  18. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention.

    Science.gov (United States)

    Schneider, Francine; Schulz, Daniela N; Pouwels, Loes H L; de Vries, Hein; van Osch, Liesbeth A D M

    2013-08-05

    The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants' characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle intervention, the employed proactive

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy

    2014-05-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.

  1. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.
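
    The Minkowski-sum property above takes a particularly simple form when every joint is an unconstrained spherical joint: the reachable set of the end effector is a spherical shell around the base. The sketch below computes that shell's inner and outer radii for a serial chain; this simplified special case is an illustration inferred from the abstract, not the paper's general construction.

```python
def reachable_interval(link_lengths):
    """Reachable end-effector distances for a serial chain with free spherical joints.

    The reachable set is a spherical shell around the base: the Minkowski sum of the
    per-link shells gives outer radius sum(l_i) and inner radius max(0, 2*max(l_i) - sum(l_i)).
    """
    total = sum(link_lengths)
    longest = max(link_lengths)
    return max(0.0, 2.0 * longest - total), total

print(reachable_interval([1.0, 1.0]))        # (0.0, 2.0): a 2-link chain can fold back onto its base
print(reachable_interval([5.0, 1.0, 1.0]))   # (3.0, 7.0): the long link keeps the end effector away
```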

  2. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

    Full Text Available Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors, are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of the anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  3. COMPUTATION IN CLOUD; A COMPETITIVE STRATEGY FOR THE SMALL AND MEDIUM COMPANIES IN MEXICO

    Directory of Open Access Journals (Sweden)

    Alma Lilia Sapién Aguilar

    2014-05-01

    Full Text Available Cloud computing is a technology that provides services via the Internet. With this technology, companies can gain a competitive advantage, provide better customer service, and reduce costs. The objective of this research was to analyze the cloud computing services of companies in the city of Chihuahua, Mexico. This was a non-experimental, descriptive and empirical study with a quantitative approach, based on a survey conducted from January 2012 through April 2013. The study population consisted of small and medium enterprises (SMEs) in the industrial, commercial and service sectors. The finite population formula was used to obtain the sample size, and the sample was selected randomly. The results showed that 93% of companies obtain reduced costs using cloud computing. Storing and sharing information were detected as some of the most used services. Companies want savings in technology infrastructure in order to extend the life cycle of their equipment, in addition to providing a higher-quality service to customers.

  4. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    Full Text Available The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
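
    As a small example of the diversity methods listed under (i), the snippet below computes the Shannon entropy and a clonality index from clonotype counts; the counts are invented for illustration.

```python
import numpy as np

def shannon_diversity(clone_counts):
    """Shannon entropy (in nats) of a clonotype frequency distribution."""
    p = np.asarray(clone_counts, float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

counts = [500, 120, 80, 40, 10, 5, 5]          # illustrative clonotype counts
h = shannon_diversity(counts)
clonality = 1.0 - h / np.log(len(counts))      # 0 = perfectly even repertoire, 1 = monoclonal
print(f"Shannon diversity = {h:.3f} nats, clonality = {clonality:.3f}")
```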

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  6. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as k_eff, reaction rates, flux and power distribution can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load for statistically reliable results (within a 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and the previous method. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k_eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method

  9. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impairs tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
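
    A hedged sketch of such a combination model is given below: a logistic regression trained on a CAD score plus clinical features, compared against the CAD score alone by ROC AUC. The synthetic data, the feature names, and the choice of classifier are illustrative assumptions, not the authors' actual framework.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 400
tb = rng.integers(0, 2, size=n)                              # 1 = active TB (synthetic labels)

# Synthetic predictors: a CXR CAD score plus two stand-in clinical features.
cad_score = 0.6 * tb + rng.normal(0.0, 0.4, size=n)
cough_weeks = 2.0 + 3.0 * tb + rng.normal(0.0, 1.5, size=n)
hiv_status = rng.integers(0, 2, size=n)
X = np.column_stack([cad_score, cough_weeks, hiv_status])

X_tr, X_te, y_tr, y_te = train_test_split(X, tb, test_size=0.3, random_state=0)

combined = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc_combined = roc_auc_score(y_te, combined.predict_proba(X_te)[:, 1])
auc_cad_only = roc_auc_score(y_te, X_te[:, 0])               # CAD score used directly as a score

print(f"AUC, CAD score alone: {auc_cad_only:.2f}; AUC, combined model: {auc_combined:.2f}")
```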

  10. Nuclear cycle length economics strategy using stochastic and deterministic Monte Carlo computation models

    International Nuclear Information System (INIS)

    Wook Ahn, T.

    2014-01-01

    Nuclear power plants (NPP) have historically been a low cost base-load electricity source because of their high fuel density and operational reliability. In the United States, NPPs typically run 18- to 24-month cycles to limit outage times and maximize capacity factor. Recently, however, increased volatility in energy and fuel prices, lower natural gas prices, higher material costs, and new sources are challenging the nuclear industry. This warrants a study in developing a more robust cycle length and fuel burnup strategy to make NPPs more competitive. (Author)

  11. Nuclear cycle length economics strategy using stochastic and deterministic Monte Carlo computation models

    Energy Technology Data Exchange (ETDEWEB)

    Wook Ahn, T.

    2014-07-01

    Nuclear power plants (NPP) have historically been a low cost base-load electricity source because of their high fuel density and operational reliability. In the United States, NPPs typically run 18- to 24-month cycles to limit outage times and maximize capacity factor. Recently, however, increased volatility in energy and fuel prices, lower natural gas prices, higher material costs, and new sources are challenging the nuclear industry. This warrants a study in developing a more robust cycle length and fuel burnup strategy to make NPPs more competitive. (Author)

  12. Calculation of demands for nuclear fuels and fuel cycle services. Description of computer model and strategies developed by Working Group 1

    International Nuclear Information System (INIS)

    Working Group 1 examined a range of reactor deployment strategies and fuel cycle options, in order to estimate the range of nuclear fuel requirements and fuel cycle service needs which would result. The computer model, its verification in comparison with other models, the strategies to be examined through use of the model, and the range of results obtained are described

  13. Computational metabolic engineering strategies for growth-coupled biofuel production by Synechocystis

    Directory of Open Access Journals (Sweden)

    Kiyan Shabestary

    2016-12-01

    Full Text Available Chemical and fuel production by photosynthetic cyanobacteria is a promising technology but to date has not reached competitive rates and titers. Genome-scale metabolic modeling can reveal limitations in cyanobacteria metabolism and guide genetic engineering strategies to increase chemical production. Here, we used constraint-based modeling and optimization algorithms on a genome-scale model of Synechocystis PCC6803 to find ways to improve productivity of fermentative, fatty-acid, and terpene-derived fuels. OptGene and MOMA were used to find heuristics for knockout strategies that could increase biofuel productivity. OptKnock was used to find a set of knockouts that led to coupling between biofuel and growth. Our results show that high productivity of fermentation or reversed beta-oxidation derived alcohols such as 1-butanol requires elimination of NADH sinks, while terpenes and fatty-acid based fuels require creating imbalances in intracellular ATP and NADPH production and consumption. The FBA-predicted productivities of these fuels are at least 10-fold higher than those reported so far in the literature. We also discuss the physiological and practical feasibility of implementing these knockouts. This work gives insight into how cyanobacteria could be engineered to reach competitive biofuel productivities. Keywords: Cyanobacteria, Modeling, Flux balance analysis, Biofuel, MOMA, OptFlux, OptKnock
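
    The growth-coupling idea behind OptKnock-style designs can be illustrated with plain linear programming on a toy stoichiometric network: maximize growth, then check the minimum product flux attainable at that growth, with and without a knockout that removes an alternative NADH sink. The network below is invented for illustration and is far simpler than the genome-scale Synechocystis model used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, NADH; columns: reactions).
# Reactions: uptake (-> A), biomass (A -> biomass + NADH),
#            respiration (NADH ->), product (NADH -> product, an NADH sink).
S = np.array([[ 1, -1,  0,  0],    # A balance
              [ 0,  1, -1, -1]])   # NADH balance
names = ["uptake", "biomass", "respiration", "product"]

def growth_and_coupled_production(knockouts=()):
    """Max growth, then the minimum product flux attainable at that growth (coupling test)."""
    bounds = [(0, 10), (0, None), (0, None), (0, None)]
    for r in knockouts:
        bounds[names.index(r)] = (0, 0)                      # knockout: force the flux to zero
    # Step 1: maximize the biomass flux (minimize its negative).
    fba = linprog(c=[0, -1, 0, 0], A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
    mu = fba.x[1]
    # Step 2: fix growth at its maximum and minimize the product flux to test coupling.
    bounds_fixed = list(bounds)
    bounds_fixed[1] = (mu, mu)
    fva = linprog(c=[0, 0, 0, 1], A_eq=S, b_eq=[0, 0], bounds=bounds_fixed, method="highs")
    return mu, fva.x[3]

print("wild type           :", growth_and_coupled_production())
print("respiration knockout:", growth_and_coupled_production(["respiration"]))
```

    In the wild type the product flux can drop to zero at maximal growth (no coupling); with the respiration knockout the NADH produced by growth can only leave through the product reaction, so maximal growth forces product secretion, which mirrors the abstract's point about eliminating NADH sinks.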

  14. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    Science.gov (United States)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely-packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emissions estimates from signals with a variety of nearfield influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to hone in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.
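
    One of the node-pairing strategies described above can be sketched very simply: difference a downwind node against an upwind (background) node and screen by wind direction to isolate the local enhancement from a source of interest. All numbers below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
hours = 24 * 14                                             # two weeks of hourly data (synthetic)

background = 410.0 + 2.0 * np.sin(np.arange(hours) / 24.0)  # regional CO2 signal, ppm
wind_dir = rng.uniform(0.0, 360.0, size=hours)              # wind direction, degrees

# Synthetic node pair: the downwind node sees a local source only when the wind
# blows from the assumed source sector (225-315 degrees here).
from_source = (wind_dir > 225) & (wind_dir < 315)
upwind = background + rng.normal(0.0, 0.5, size=hours)
downwind = background + rng.normal(0.0, 0.5, size=hours) + np.where(from_source, 6.0, 0.0)

# Node-pairing strategy: enhancement = downwind - upwind, screened by wind sector.
enhancement = downwind - upwind
signal = enhancement[from_source].mean()
baseline = enhancement[~from_source].mean()
print(f"mean enhancement from the source sector: {signal:.1f} ppm "
      f"(vs {baseline:.1f} ppm otherwise)")
```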

  15. On a numerical strategy to compute gravity currents of non-Newtonian fluids

    International Nuclear Information System (INIS)

    Vola, D.; Babik, F.; Latche, J.-C.

    2004-01-01

    This paper presents a numerical scheme for the simulation of gravity currents of non-Newtonian fluids. The two-dimensional computational grid is fixed, and the free surface is described as a polygonal interface independent of the grid and advanced in time by a Lagrangian technique. The Navier-Stokes equations are semi-discretized in time by the Characteristic-Galerkin method, which finally leads to a generalized Stokes problem posed on the physical domain, bounded by the free surface and covering only part of the computational grid. To this purpose, we implement a Galerkin technique with a particular approximation space, defined as the restriction to the fluid domain of functions of a finite element space. The decomposition-coordination method allows a variety of non-linear and possibly non-differentiable constitutive laws to be handled without any regularization. Besides more analytical tests, we revisit with this numerical method some gravity-current simulations from the literature that had so far only been investigated within the simplified thin-flow approximation framework.
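
    A minimal sketch of the Lagrangian free-surface update mentioned above is shown below: the interface is a polygon whose vertices are advected with the fluid velocity, while the grid stays fixed. The velocity field, time step, and initial polygon are illustrative assumptions; the paper's Characteristic-Galerkin and decomposition-coordination solver is not reproduced.

```python
# Sketch: advancing a polygonal free-surface interface with a Lagrangian step.
# The velocity field is an analytic stand-in, not a computed Stokes solution.
import numpy as np

def velocity(points, t):
    """Stand-in velocity field u(x, y, t) for a spreading current."""
    x, y = points[:, 0], points[:, 1]
    return np.column_stack([0.5 * x, -0.2 * y])

# Initial free surface: polygonal approximation of a fluid mound.
theta = np.linspace(0.0, np.pi, 20)
surface = np.column_stack([np.cos(theta), 0.3 * np.sin(theta)])

dt, t = 0.01, 0.0
for _ in range(100):                                  # explicit Lagrangian advection
    surface = surface + dt * velocity(surface, t)     # forward Euler step per vertex
    t += dt

print("front position after advection:", surface[:, 0].max())
```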

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  19. Effects of Ordering Strategies and Programming Paradigms on Sparse Matrix Computations

    Science.gov (United States)

    Oliker, Leonid; Li, Xiaoye; Husbands, Parry; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The Conjugate Gradient (CG) algorithm is perhaps the best-known iterative technique for solving sparse linear systems that are symmetric and positive definite. For systems that are ill-conditioned, it is often necessary to use a preconditioning technique. In this paper, we investigate the effects of various ordering and partitioning strategies on the performance of parallel CG and ILU(0)-preconditioned CG (PCG) using different programming paradigms and architectures. Results show that, for this class of applications: ordering significantly improves overall performance on both distributed and distributed shared-memory systems; cache reuse may be more important than reducing communication; it is possible to achieve message-passing performance using shared-memory constructs through careful data ordering and distribution; and a hybrid MPI+OpenMP paradigm increases programming complexity with little performance gain. An implementation of CG on the Cray MTA does not require special ordering or partitioning to obtain high efficiency and scalability, giving it a distinct advantage for adaptive applications; however, it shows limited scalability for PCG due to a lack of thread-level parallelism.
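
    As a small, serial illustration of the interplay between ordering and preconditioned CG studied above, the sketch below compares ILU-preconditioned CG iteration counts on a model Laplacian under the natural ordering and under a reverse Cuthill-McKee reordering. It is a SciPy stand-in under simplifying assumptions (an incomplete factorisation with unit fill factor approximating ILU(0)), not the parallel MPI/OpenMP or Cray MTA implementations evaluated in the paper.

```python
# Sketch: effect of matrix reordering on ILU-preconditioned CG (serial SciPy
# stand-in for the parallel experiments described in the record above).
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import cg, spilu, LinearOperator

# Symmetric positive definite test matrix: 2-D Laplacian on a 40x40 grid.
n = 40
lap1d = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = sp.kronsum(lap1d, lap1d).tocsr()
b = np.ones(A.shape[0])

def pcg_iterations(mat):
    """Run ILU-preconditioned CG and return the iteration count."""
    ilu = spilu(mat.tocsc(), drop_tol=0.0, fill_factor=1)  # ILU(0)-like factorisation
    M = LinearOperator(mat.shape, matvec=ilu.solve)
    counter = {"it": 0}
    def cb(xk):
        counter["it"] += 1
    x, info = cg(mat, b, M=M, callback=cb)
    return counter["it"]

perm = reverse_cuthill_mckee(A, symmetric_mode=True)       # bandwidth-reducing order
A_rcm = A[perm, :][:, perm]
print("PCG iterations, natural order:", pcg_iterations(A))
print("PCG iterations, RCM order    :", pcg_iterations(A_rcm))
```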

  20. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    Directory of Open Access Journals (Sweden)

    Kursat Zuhtuogullari

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large amounts of memory. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by the feature reduction software developed here, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed to reduce the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking onto local solutions is a further problem, which the developed software also eliminates. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced sets of input attributes) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.

  1. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    Science.gov (United States)

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and large amounts of memory. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by the feature reduction software developed here, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed to reduce the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism, and linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking onto local solutions is a further problem, which the developed software also eliminates. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced sets of input attributes) with seven, six, and five elements. The obtained results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
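
    The two records above describe genetic-algorithm-based feature reduction with roulette wheel selection. The sketch below shows that general mechanism on a public dataset; the fitness function, the single-point crossover (used here in place of the paper's linear order crossover and modified middle-region selection), and all parameter values are simplifying assumptions rather than the authors' software.

```python
# Minimal genetic-algorithm feature-reduction sketch with roulette-wheel
# selection. Dataset, fitness, and crossover choice are illustrative only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X, y = load_breast_cancer(return_X_y=True)
n_feat, pop_size, n_gen = X.shape[1], 20, 15

def fitness(mask):
    """Cross-validated accuracy on the selected features, lightly penalised
    by the number of features kept."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.002 * mask.sum()

pop = rng.integers(0, 2, size=(pop_size, n_feat))
for _ in range(n_gen):
    fit = np.array([fitness(ind) for ind in pop])
    probs = fit / fit.sum()                      # roulette-wheel selection
    parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):          # single-point crossover
        cut = rng.integers(1, n_feat)
        children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:],
                                                    parents[i, cut:])
    flip = rng.random(children.shape) < 0.01     # bit-flip mutation
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected feature indices:", np.flatnonzero(best))
```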

  2. Beat the Cheater: Computing Game-Theoretic Strategies for When to Kick a Gambler out of a Casino

    DEFF Research Database (Denmark)

    Sørensen, Troels Bjerre; Dalis, Melissa; Korzhyk, Dmytro

    2014-01-01

    Gambles in casinos are usually set up so that the casino makes a profit in expectation -- as long as gamblers play honestly. However, some gamblers are able to cheat, reducing the casino’s profit. How should the casino address this? A common strategy is to selectively kick gamblers out, possibly ... We model the problem as a Bayesian game in which the casino is a Stackelberg leader that can commit to a (possibly randomized) policy for when to kick gamblers out, and we provide efficient algorithms for computing the optimal policy. Besides being potentially useful to casinos, we imagine that similar techniques could be useful for addressing related problems -- for example, illegal trades in financial markets.
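
    The computational core referred to above is committing to an optimal (possibly randomized) leader policy. As a hedged illustration, the sketch below solves the simplest non-Bayesian version of that problem: for a single follower type, one linear programme per follower action yields the optimal mixed strategy to commit to. The payoff matrices are made up, and the full Bayesian-game algorithms of the paper are not reproduced.

```python
# Sketch of computing an optimal mixed strategy to commit to in a simple
# two-player Stackelberg game (single follower type). One LP is solved per
# follower pure strategy j: maximise the leader's expected payoff subject to
# j being a best response to the leader's mix. Payoffs are illustrative.
import numpy as np
from scipy.optimize import linprog

L = np.array([[2.0, 4.0],   # leader ("casino") payoffs, rows = leader actions
              [1.0, 3.0]])
F = np.array([[1.0, 0.0],   # follower ("gambler") payoffs
              [0.0, 2.0]])
n_lead, n_fol = L.shape

best_val, best_mix = -np.inf, None
for j in range(n_fol):
    # Variables: x = leader mixed strategy. Constraints: for every other
    # follower action k, x.F[:, j] >= x.F[:, k]; x is a distribution.
    A_ub = np.array([F[:, k] - F[:, j] for k in range(n_fol) if k != j])
    b_ub = np.zeros(A_ub.shape[0])
    res = linprog(-L[:, j],                      # maximise leader payoff
                  A_ub=A_ub, b_ub=b_ub,
                  A_eq=np.ones((1, n_lead)), b_eq=[1.0],
                  bounds=[(0, 1)] * n_lead, method="highs")
    if res.success and -res.fun > best_val:
        best_val, best_mix = -res.fun, res.x

print("leader's optimal commitment:", best_mix, "value:", best_val)
```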

  3. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling based uncertainty analysis was carried out to quantify uncertainty in predictions of the best estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as a part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both steady-state and transient levels by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty bands between the 5th and 95th percentiles of the output parameters were evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. Importance analysis was also carried out and standardised rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure.
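
    The workflow described above (Latin hypercube sampling of uncertain inputs, repeated code runs, percentile bands, and rank-regression-based importance measures) can be sketched generically as below. The input names, ranges, and the cheap surrogate model are assumptions standing in for the RELAP5/MOD3.2 calculations; only the statistical post-processing is illustrated.

```python
# Sketch: LHS-based uncertainty propagation with standardised rank regression
# coefficients (SRRCs). The "model" is a stand-in, not a thermal-hydraulic code.
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(42)
names = ["break_discharge_coeff", "SG_relief_setpoint", "decay_heat_mult"]
lo = np.array([0.8, 7.0, 0.95])     # illustrative lower bounds
hi = np.array([1.2, 8.5, 1.05])     # illustrative upper bounds

sampler = qmc.LatinHypercube(d=len(names), seed=rng)
X = qmc.scale(sampler.random(n=200), lo, hi)        # 200 LHS input sets

def model(x):
    """Cheap surrogate standing in for one code run; returns one output."""
    return 15.0 - 4.0 * x[0] + 0.8 * x[1] + rng.normal(0, 0.1)

y = np.array([model(x) for x in X])
print("5th-95th percentile band:", np.percentile(y, [5, 95]))

# SRRCs: regress rank-transformed output on rank-transformed inputs and
# scale each coefficient by the standard-deviation ratio.
Xr = np.column_stack([rankdata(X[:, i]) for i in range(X.shape[1])])
yr = rankdata(y)
A = np.column_stack([np.ones(len(yr)), Xr])
beta = np.linalg.lstsq(A, yr, rcond=None)[0][1:]
srrc = beta * Xr.std(axis=0) / yr.std()
for name, coeff in zip(names, srrc):
    print(f"SRRC[{name}] = {coeff:+.2f}")
```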

  11. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; and 2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, and the proposed scheme therefore has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive compared with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
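
    The encoder side described above can be sketched in a few lines: a local random binary convolution kernel replaces the anti-aliasing low-pass filter, and polyphase down-sampling of the filtered image yields measurements that still form a conventional image. The kernel size, down-sampling factor, and test image below are assumptions; the sparsity-based soft decoder of the paper is not reproduced.

```python
# Sketch of the compressive-sampling encoder: random binary pre-filter
# followed by polyphase down-sampling. Parameters are illustrative only.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(7)
image = rng.integers(0, 256, size=(256, 256)).astype(float)  # stand-in image

kernel = rng.integers(0, 2, size=(3, 3)).astype(float)       # random binary kernel
kernel /= kernel.sum() if kernel.sum() > 0 else 1.0           # keep DC gain near 1

filtered = convolve2d(image, kernel, mode="same", boundary="symm")
measurements = filtered[::2, ::2]     # polyphase down-sampling by 2 in each axis
print("compact representation shape:", measurements.shape)    # (128, 128)
# `measurements` is itself an ordinary image, so it can be handed to any
# standardized codec (JPEG, HEVC, ...) to remove larger-scale redundancy.
```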

  12. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  13. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  15. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  16. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  19. Final Report: Sampling-Based Algorithms for Estimating Structure in Big Data.

    Energy Technology Data Exchange (ETDEWEB)

    Matulef, Kevin Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-02-01

    The purpose of this project was to develop sampling-based algorithms to discover hidden structure in massive data sets. Inferring structure in large data sets is an increasingly common task in many critical national security applications. These data sets come from myriad sources, such as network traffic, sensor data, and data generated by large-scale simulations. They are often so large that traditional data mining techniques are time consuming or even infeasible. To address this problem, we focus on a class of algorithms that do not compute an exact answer, but instead use sampling to compute an approximate answer using fewer resources. The particular class of algorithms that we focus on are streaming algorithms, so called because they are designed to handle high-throughput streams of data. Streaming algorithms have only a small amount of working storage - much less than the size of the full data stream - so they must necessarily use sampling to approximate the correct answer. We present two results: * A streaming algorithm called HyperHeadTail, which estimates the degree distribution of a graph (i.e., the distribution of the number of connections for each node in a network). The degree distribution is a fundamental graph property, but prior work on estimating the degree distribution in a streaming setting was impractical for many real-world applications. We improve upon prior work by developing an algorithm that can handle streams with repeated edges and graph structures that evolve over time. * An algorithm for the task of maintaining a weighted subsample of items in a stream, when the items must be sampled according to their weight, and the weights are dynamically changing. To our knowledge, this is the first such algorithm designed for dynamically evolving weights. We expect it may be useful as a building block for other streaming algorithms on dynamic data sets.
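
    For the second result above, the baseline technique for weight-proportional sampling from a stream is the Efraimidis-Spirakis "A-Res" reservoir scheme, sketched below for static weights; handling dynamically changing weights, which is the report's contribution, is not reproduced here.

```python
# Sketch of maintaining a weighted subsample of a stream: the classic A-Res
# scheme keeps the k items with the largest keys u**(1/w).
import heapq
import random

def weighted_reservoir(stream, k):
    """Keep k items, each with probability proportional to its weight.
    `stream` yields (item, weight) pairs; weights must be positive."""
    heap = []                                    # min-heap of (key, item)
    for item, weight in stream:
        key = random.random() ** (1.0 / weight)  # larger weight -> larger key
        if len(heap) < k:
            heapq.heappush(heap, (key, item))
        elif key > heap[0][0]:
            heapq.heapreplace(heap, (key, item))
    return [item for _, item in heap]

# Usage: sample 5 edges from a synthetic weighted edge stream.
edges = (((i, i + 1), 1.0 + (i % 10)) for i in range(100_000))
print(weighted_reservoir(edges, k=5))
```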

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation is now deployed at CERN, which has been added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  2. Cloud Computing Strategy

    Science.gov (United States)

    2012-07-01

    regardless of access point or the device being used across the Global Information Grid (GIG). These data centers will host existing applications... state. It illustrates that the DoD Enterprise Cloud is an integrated environment on the GIG, consisting of DoD Components, commercial entities... Operations and Maintenance (O&M) costs by leveraging economies of scale, and automate monitoring and provisioning to reduce the human cost of service...

  3. An efficient rhythmic component expression and weighting synthesis strategy for classifying motor imagery EEG in a brain computer interface

    Science.gov (United States)

    Wang, Tao; He, Bin

    2004-03-01

    The recognition of mental states during motor imagery tasks is crucial for EEG-based brain computer interface research. We have developed a new algorithm based on frequency decomposition and a weighting synthesis strategy for recognizing imagined right- and left-hand movements. A frequency range from 5 to 25 Hz was divided into 20 band bins for each trial, and the corresponding envelopes of the filtered EEG signals for each trial were extracted as a measure of instantaneous power at each frequency band. The dimensionality of the feature space was reduced from 200 (corresponding to 2 s) to 3 by down-sampling the envelopes of the feature signals and subsequently applying principal component analysis. The linear discriminant analysis algorithm was then used to classify the features, owing to its generalization capability. Each frequency band bin was weighted by a function determined according to the classification accuracy during the training process. The present classification algorithm was applied to a dataset of nine human subjects and achieved a classification success rate of 90% in training and 77% in testing. These promising results suggest that the present classification algorithm can be used to initiate general-purpose mental state recognition based on motor imagery tasks.
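
    A sketch of the overall pipeline (band-wise filtering, envelope extraction, down-sampling, PCA, and LDA) is given below on synthetic single-channel signals. The sampling rate, band layout, and toy data are assumptions for illustration, and the weighting of band bins by training accuracy is omitted.

```python
# Sketch of a band-envelope + PCA + LDA motor-imagery pipeline on synthetic
# signals standing in for real EEG recordings.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

fs, n_trials, n_samples = 250, 80, 2 * 250           # 2-second trials
rng = np.random.default_rng(3)
labels = rng.integers(0, 2, n_trials)                 # left vs right imagery
t = np.arange(n_samples) / fs
# Synthetic single-channel EEG: class-dependent 10 Hz (mu) power plus noise.
trials = np.array([(1.0 + 0.5 * y) * np.sin(2 * np.pi * 10 * t)
                   + rng.normal(0, 1, n_samples) for y in labels])

bands = [(f, f + 1) for f in range(5, 25)]            # twenty 1 Hz band bins
features = []
for x in trials:
    envs = []
    for lo, hi in bands:
        sos = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band", output="sos")
        envs.append(np.abs(hilbert(sosfiltfilt(sos, x)))[::25])  # envelope, down-sampled
    features.append(np.concatenate(envs))
features = np.array(features)

clf = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, features, labels, cv=5).mean())
```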

  4. Cation-π interactions: computational analyses of the aromatic box motif and the fluorination strategy for experimental evaluation.

    Science.gov (United States)

    Davis, Matthew R; Dougherty, Dennis A

    2015-11-21

    Cation-π interactions are common in biological systems, and many structural studies have revealed the aromatic box as a common motif. With the aim of understanding the nature of the aromatic box, several computational methods were evaluated for their ability to reproduce experimental cation-π binding energies. We find the DFT method M06 with the 6-31G(d,p) basis set performs best of several methods tested. The binding of benzene to a number of different cations (sodium, potassium, ammonium, tetramethylammonium, and guanidinium) was studied. In addition, the binding of the organic cations NH4(+) and NMe4(+) to ab initio generated aromatic boxes as well as examples of aromatic boxes from protein crystal structures were investigated. These data, along with a study of the distance dependence of the cation-π interaction, indicate that multiple aromatic residues can meaningfully contribute to cation binding, even with displacements of more than an angstrom from the optimal cation-π interaction. Progressive fluorination of benzene and indole was studied as well, and binding energies obtained were used to reaffirm the validity of the "fluorination strategy" to study cation-π interactions in vivo.

  5. Relative Effectiveness of Computer-Supported Jigsaw II, STAD and TAI Cooperative Learning Strategies on Performance, Attitude, and Retention of Secondary School Students in Physics

    Science.gov (United States)

    Gambari, Amosa Isiaka; Yusuf, Mudasiru Olalere

    2017-01-01

    This study investigated the relative effectiveness of computer-supported cooperative learning strategies on the performance, attitudes, and retention of secondary school students in physics. A purposive sampling technique was used to select four senior secondary schools from Minna, Nigeria. The students were allocated to one of four groups:…

  6. Effects of Computer-Assisted Instruction in Using Formal Decision-Making Strategies to Choose a College Major.

    Science.gov (United States)

    Mau, Wei-Cheng; Jepsen, David A.

    1992-01-01

    Compared decision-making strategies and college major choice among 113 first-year students assigned to Elimination by Aspects Strategy (EBA), Subjective Expected Utility Strategy (SEU), and control groups. "Rational" EBA students scored significantly higher on choice certainty; lower on choice anxiety and career indecision than "rational"…

  7. Influence of Cone-beam Computed Tomography on Endodontic Retreatment Strategies among General Dental Practitioners and Endodontists.

    Science.gov (United States)

    Rodríguez, Gustavo; Patel, Shanon; Durán-Sindreu, Fernando; Roig, Miguel; Abella, Francesc

    2017-09-01

    Treatment options for endodontic failure include nonsurgical or surgical endodontic retreatment, intentional replantation, and extraction with or without replacement of the tooth. The aim of the present study was to determine the impact of cone-beam computed tomographic (CBCT) imaging on clinical decision making among general dental practitioners and endodontists after failed root canal treatment. A second objective was to assess the self-reported level of difficulty in making a treatment choice before and after viewing a preoperative CBCT scan. Eight patients with endodontically treated teeth diagnosed as symptomatic apical periodontitis, acute apical abscess, or chronic apical abscess were selected. In the first session, the examiners were given the details of each case, including any relevant radiographs, and were asked to choose 1 of the proposed treatment alternatives and assess the difficulty of making a decision. One month later, the examiners reviewed randomly the same 8 cases with the additional information from the CBCT data. The examiners altered their treatment plan after viewing the CBCT scan in 49.8% of the cases. A significant difference in the treatment plan between the 2 imaging modalities was recorded for endodontists and general practitioners (P < .05). After CBCT evaluation, neither group altered their self-reported level of difficulty when choosing a treatment plan (P = .0524). The extraction option rose significantly to 20% after viewing the CBCT scan (P < .05). CBCT imaging directly influences endodontic retreatment strategies among general dental practitioners and endodontists. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  8. Rayleigh’s quotient–based damage detection algorithm: Theoretical concepts, computational techniques, and field implementation strategies

    DEFF Research Database (Denmark)

    NJOMO WANDJI, Wilfried

    2017-01-01

    ... levels are targeted: existence, location, and severity. The proposed algorithm is analytically developed from dynamics theory and the virtual energy principle. Some computational techniques are proposed for carrying out the computations, including discretization, integration, derivation, and suitable...
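
    The quantity at the heart of the algorithm is Rayleigh's quotient R(φ) = (φᵀ K φ) / (φᵀ M φ), whose change for a fixed mode shape indicates stiffness loss. A minimal numerical illustration on an assumed 3-DOF shear-building model (not the field structures addressed in the record) follows.

```python
# Minimal illustration of Rayleigh's quotient as a damage-sensitive feature.
# The 3-DOF spring-mass system and the 20% stiffness reduction in one element
# are assumptions for illustration only.
import numpy as np
from scipy.linalg import eigh

def shear_building(k_elems, m=1.0):
    """Stiffness and mass matrices of a simple 3-storey shear building."""
    k1, k2, k3 = k_elems
    K = np.array([[k1 + k2, -k2,      0.0],
                  [-k2,      k2 + k3, -k3],
                  [0.0,     -k3,       k3]])
    M = np.eye(3) * m
    return K, M

K_healthy, M = shear_building([1000.0, 1000.0, 1000.0])
K_damaged, _ = shear_building([1000.0, 800.0, 1000.0])   # 20% loss in element 2

# First mode shape of the healthy structure, from the generalised eigenproblem.
_, vecs = eigh(K_healthy, M)
phi = vecs[:, 0]

def rayleigh_quotient(K, M, phi):
    return (phi @ K @ phi) / (phi @ M @ phi)

r_h = rayleigh_quotient(K_healthy, M, phi)
r_d = rayleigh_quotient(K_damaged, M, phi)   # same mode shape, "measured" case
print(f"healthy R = {r_h:.1f}, damaged R = {r_d:.1f}, drop = {100*(r_h-r_d)/r_h:.1f}%")
```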

  9. STRATEGY FOR IMPROVEMENT OF SAFETY AND EFFICIENCY OF COMPUTER-AIDED DESIGN ANALYSIS OF CIVIL ENGINEERING STRUCTURES ON THE BASIS OF THE SYSTEM APPROACH

    Directory of Open Access Journals (Sweden)

    Zaikin Vladimir Genrikhovich

    2012-12-01

    The authors highlight three problems of the information technology age and propose a strategy for resolving them in relation to the computer-aided design of civil engineering structures. They express concern about the globalization of software packages for the analysis of civil engineering structures that are employed outside of Russia. The problem of poor-quality input data has reached Russia, and lately the rate of accidents involving buildings and structures has been growing, not only in Russia. Control over the efficiency of design projects is hardly ever performed; this attitude should change. The development and introduction of CAD, together with the application of efficient methods for predicting the behaviour of building structures, are in demand. Computer-aided calculations serve as a logical nucleus of the design process and need proper control. The authors formulate a system approach to computer-aided calculations and to technologies for predicting accidents, stating two tasks of this approach and the fundamentals of a strategy for its implementation. Cases with negative outcomes of computer-aided design of engineering structures were studied, and multi-component design patterns were developed. Conclusions are drawn concerning research aimed at the regular and wide-scale implementation of the strategy fundamentals. The organizational and innovative actions proposed in the strategy, concerning the predicted behaviour of civil engineering structures, are intended to facilitate: improved safety and reliability of buildings and structures; savings in building materials and resources; improved labour efficiency of designers; modernization and improved accuracy of the predicted behaviour of buildings and of building standards; closer ties between civil and building engineering researchers and construction companies; and the development of a competitive environment to boost

  10. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action

    NARCIS (Netherlands)

    Pawlik, A.; Gelder, C.W.G. van; Nenadic, A.; Palagi, P.M.; Korpelainen, E.; Lijnzaad, P.; Marek, D.; Sansone, S.A.; Hancock, J.; Goble, C.

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training

  11. Scalable Game Design: A Strategy to Bring Systemic Computer Science Education to Schools through Game Design and Simulation Creation

    Science.gov (United States)

    Repenning, Alexander; Webb, David C.; Koh, Kyu Han; Nickerson, Hilarie; Miller, Susan B.; Brand, Catharine; Her Many Horses, Ian; Basawapatna, Ashok; Gluck, Fred; Grover, Ryan; Gutierrez, Kris; Repenning, Nadia

    2015-01-01

    An educated citizenry that participates in and contributes to science technology engineering and mathematics innovation in the 21st century will require broad literacy and skills in computer science (CS). School systems will need to give increased attention to opportunities for students to engage in computational thinking and ways to promote a…

  12. Strategies for Ensuring Computer Literacy among Undergraduate Business Students: A Marketing Survey of AACSB-Accredited Schools

    Science.gov (United States)

    Hungerford, Bruce C.; Baxter, Joseph T.; LeMay, Stephen; Helms, Marilyn M.

    2012-01-01

    There is broad agreement that college students need computer and information literacy for their studies and to be competitive as graduates in an environment that increasingly relies on information technology. However, as information technology changes, what constitutes computer literacy changes. Colleges have traditionally used the freshman- or…

  13. Individualizing Educational Strategies: An Apple Computer Managed System for the Diagnosis and Evaluation of Reading, Math and Behavior.

    Science.gov (United States)

    Kern, Richard

    1985-01-01

    A computer-based interactive system for diagnosing academic and school behavior problems is described. Elements include criterion-referenced testing, an instructional management system, and a behavior evaluation tool developed by the author. (JW)

  14. Five-Year-Olds’ Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection: A Computational Modeling Study

    Science.gov (United States)

    Arslan, Burcu; Taatgen, Niels A.; Verbrugge, Rineke

    2017-01-01

    The focus of studies on second-order false belief reasoning generally was on investigating the roles of executive functions and language with correlational studies. Different from those studies, we focus on the question how 5-year-olds select and revise reasoning strategies in second-order false belief tasks by constructing two computational cognitive models of this process: an instance-based learning model and a reinforcement learning model. Unlike the reinforcement learning model, the instance-based learning model predicted that children who fail second-order false belief tasks would give answers based on first-order theory of mind (ToM) reasoning as opposed to zero-order reasoning. This prediction was confirmed with an empirical study that we conducted with 72 5- to 6-year-old children. The results showed that 17% of the answers were correct and 83% of the answers were wrong. In line with our prediction, 65% of the wrong answers were based on a first-order ToM strategy, while only 29% of them were based on a zero-order strategy (the remaining 6% of subjects did not provide any answer). Based on our instance-based learning model, we propose that when children get feedback “Wrong,” they explicitly revise their strategy to a higher level instead of implicitly selecting one of the available ToM strategies. Moreover, we predict that children’s failures are due to lack of experience and that with exposure to second-order false belief reasoning, children can revise their wrong first-order reasoning strategy to a correct second-order reasoning strategy. PMID:28293206

  15. Five-Year-Olds' Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection: A Computational Modeling Study.

    Science.gov (United States)

    Arslan, Burcu; Taatgen, Niels A; Verbrugge, Rineke

    2017-01-01

    The focus of studies on second-order false belief reasoning generally was on investigating the roles of executive functions and language with correlational studies. Different from those studies, we focus on the question how 5-year-olds select and revise reasoning strategies in second-order false belief tasks by constructing two computational cognitive models of this process: an instance-based learning model and a reinforcement learning model. Unlike the reinforcement learning model, the instance-based learning model predicted that children who fail second-order false belief tasks would give answers based on first-order theory of mind (ToM) reasoning as opposed to zero-order reasoning. This prediction was confirmed with an empirical study that we conducted with 72 5- to 6-year-old children. The results showed that 17% of the answers were correct and 83% of the answers were wrong. In line with our prediction, 65% of the wrong answers were based on a first-order ToM strategy, while only 29% of them were based on a zero-order strategy (the remaining 6% of subjects did not provide any answer). Based on our instance-based learning model, we propose that when children get feedback "Wrong," they explicitly revise their strategy to a higher level instead of implicitly selecting one of the available ToM strategies. Moreover, we predict that children's failures are due to lack of experience and that with exposure to second-order false belief reasoning, children can revise their wrong first-order reasoning strategy to a correct second-order reasoning strategy.
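
    A hedged sketch of a reinforcement-learning account of strategy selection of the kind compared above is shown below: a learner chooses among zero-, first-, and second-order ToM strategies in proportion to learned utilities and updates them from "right/wrong" feedback. The initial utilities, learning rate, temperature, and the simulated task are assumptions, not the authors' ACT-R models.

```python
# Sketch: feedback-driven selection among ToM reasoning strategies.
import numpy as np

rng = np.random.default_rng(0)
strategies = ["zero-order", "first-order", "second-order"]
utility = np.array([0.0, 0.5, 0.0])   # assumed initial bias toward first-order
alpha, temperature = 0.2, 0.3         # learning rate and softmax temperature

def softmax(u, tau):
    z = np.exp((u - u.max()) / tau)
    return z / z.sum()

history = []
for trial in range(40):
    probs = softmax(utility, temperature)
    choice = rng.choice(len(strategies), p=probs)
    reward = 1.0 if choice == 2 else 0.0      # only second-order answers correctly
    utility[choice] += alpha * (reward - utility[choice])
    history.append(strategies[choice])

print("first 10 choices:", history[:10])
print("last 10 choices :", history[-10:])
```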

  16. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    International Nuclear Information System (INIS)

    Zhu, T.

    2015-01-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy κ_eff sensitivity to nuclear data demonstrated in certain M-C codes, by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner, due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same

  17. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, T.

    2015-07-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy κ_eff sensitivity to nuclear data demonstrated in certain M-C codes, by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner, due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same
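
    The stochastic-sampling idea common to the two records above can be sketched generically: draw correlated perturbation factors for multigroup cross-sections from an assumed covariance matrix, apply them to the nominal data, re-run the model, and attribute the output spread to nuclear data uncertainty. The 3-group data, covariance values, and toy k-eff response below are illustrative assumptions; the actual NUSS tool perturbs ACE-format files and drives MCNPX/MCNP/SERPENT runs.

```python
# Generic sketch of sampling-based nuclear data uncertainty quantification.
import numpy as np

rng = np.random.default_rng(2015)
sigma_nominal = np.array([1.20, 0.80, 0.30])          # toy 3-group cross-sections
rel_cov = np.array([[0.0100, 0.0040, 0.0010],          # assumed relative covariance
                    [0.0040, 0.0064, 0.0020],
                    [0.0010, 0.0020, 0.0025]])

def toy_keff(sigma):
    """Stand-in response; a real run would be a Monte-Carlo transport calc."""
    return 1.0 + 0.05 * sigma[0] - 0.03 * sigma[1] + 0.01 * sigma[2]

n_samples = 500
factors = rng.multivariate_normal(np.ones(3), rel_cov, size=n_samples)
keff = np.array([toy_keff(sigma_nominal * f) for f in factors])

print(f"k-eff mean = {keff.mean():.5f}, std from nuclear data = {keff.std():.5f}")
```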

  18. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action.

    Science.gov (United States)

    Pawlik, Aleksandra; van Gelder, Celia W G; Nenadic, Aleksandra; Palagi, Patricia M; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole

    2017-01-01

    Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  19. Individualized Nonadaptive and Online-Adaptive Intensity-Modulated Radiotherapy Treatment Strategies for Cervical Cancer Patients Based on Pretreatment Acquired Variable Bladder Filling Computed Tomography Scans

    International Nuclear Information System (INIS)

    Bondar, M.L.; Hoogeman, M.S.; Mens, J.W.; Quint, S.; Ahmad, R.; Dhawtal, G.; Heijmen, B.J.

    2012-01-01

    Purpose: To design and evaluate individualized nonadaptive and online-adaptive strategies based on a pretreatment established motion model for the highly deformable target volume in cervical cancer patients. Methods and Materials: For 14 patients, nine to ten variable bladder filling computed tomography (CT) scans were acquired at pretreatment and after 40 Gy. Individualized model-based internal target volumes (mbITVs) accounting for the cervix and uterus motion due to bladder volume changes were generated by using a motion-model constructed from two pretreatment CT scans (full and empty bladder). Two individualized strategies were designed: a nonadaptive strategy, using an mbITV accounting for the full-range of bladder volume changes throughout the treatment; and an online-adaptive strategy, using mbITVs of bladder volume subranges to construct a library of plans. The latter adapts the treatment online by selecting the plan-of-the-day from the library based on the measured bladder volume. The individualized strategies were evaluated by the seven to eight CT scans not used for mbITVs construction, and compared with a population-based approach. Geometric uniform margins around planning cervix–uterus and mbITVs were determined to ensure adequate coverage. For each strategy, the percentage of the cervix–uterus, bladder, and rectum volumes inside the planning target volume (PTV), and the clinical target volume (CTV)-to-PTV volume (volume difference between PTV and CTV) were calculated. Results: The margin for the population-based approach was 38 mm and for the individualized strategies was 7 to 10 mm. Compared with the population-based approach, the individualized nonadaptive strategy decreased the CTV-to-PTV volume by 48% ± 6% and the percentage of bladder and rectum inside the PTV by 5% to 45% and 26% to 74% (p < 0.001), respectively. Replacing the individualized nonadaptive strategy by an online-adaptive, two-plan library further decreased the percentage of
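
    The online-adaptive part of the strategy above amounts to a simple selection rule at treatment time: a library of plans, each built for a bladder-volume subrange, and the plan whose subrange contains the measured volume is chosen. The subranges and volumes in the sketch below are illustrative assumptions, not clinical values.

```python
# Sketch of "plan of the day" selection from a library of plans keyed by
# bladder-volume subranges. Subranges and volumes are illustrative only.
plan_library = [
    {"name": "plan_small",  "bladder_cc": (0.0, 150.0)},
    {"name": "plan_medium", "bladder_cc": (150.0, 350.0)},
    {"name": "plan_large",  "bladder_cc": (350.0, 700.0)},
]

def plan_of_the_day(measured_volume_cc):
    """Return the library plan whose bladder-volume subrange matches."""
    for plan in plan_library:
        lo, hi = plan["bladder_cc"]
        if lo <= measured_volume_cc < hi:
            return plan["name"]
    return plan_library[-1]["name"]          # fall back to the widest plan

print(plan_of_the_day(220.0))                # -> plan_medium
```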

  20. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-10-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the prediction-observation-explanation (POE) strategy (White and Gunstone in Probing understanding. Routledge, New York, 1992) on facilitating preschoolers' acquisition of scientific concepts regarding light and shadow. The children's alternative conceptions were explored as well. Fifty participants were randomly assigned into either an experimental group that played a computer game integrating the POE model or a control group that played a non-POE computer game. By assessing the students' conceptual understanding through interviews, this study revealed that the students in the experimental group significantly outperformed their counterparts in the concepts regarding "shadow formation in daylight" and "shadow orientation." However, children in both groups, after playing the games, still expressed some alternative conceptions such as "Shadows always appear behind a person" and "Shadows should be on the same side as the sun."

  1. Computational reduction strategies for the detection of steady bifurcations in incompressible fluid-dynamics: Applications to Coanda effect in cardiology

    Science.gov (United States)

    Pitton, Giuseppe; Quaini, Annalisa; Rozza, Gianluigi

    2017-09-01

    We focus on reducing the computational costs associated with the hydrodynamic stability of solutions of the incompressible Navier-Stokes equations for a Newtonian and viscous fluid in contraction-expansion channels. In particular, we are interested in studying steady bifurcations, occurring when non-unique stable solutions appear as physical and/or geometric control parameters are varied. The formulation of the stability problem requires solving an eigenvalue problem for a partial differential operator. An alternative to this approach is the direct simulation of the flow to characterize the asymptotic behavior of the solution. Both approaches can be extremely expensive in terms of computational time. We propose to apply Reduced Order Modeling (ROM) techniques to reduce the demanding computational costs associated with the detection of a type of steady bifurcations in fluid dynamics. The application that motivated the present study is the onset of asymmetries (i.e., symmetry breaking bifurcation) in blood flow through a regurgitant mitral valve, depending on the Reynolds number and the regurgitant mitral valve orifice shape.

  2. New strategies of the LHC experiments to meet the computing requirements of the HL-LHC era

    CERN Document Server

    Adamova, Dagmar

    2017-01-01

    The performance of the Large Hadron Collider (LHC) during the ongoing Run 2 is above expectations both concerning the delivered luminosity and the LHC live time. This resulted in a volume of data much larger than originally anticipated. Based on the current data production levels and the structure of the LHC experiment computing models, the estimates of the data production rates and resource needs were re-evaluated for the era leading into the High Luminosity LHC (HL-LHC), the Run 3 and Run 4 phases of LHC operation. It turns out that the raw data volume will grow by a factor of 10 by the HL-LHC era and the processing capacity needs will grow more than 60 times. While the growth of storage requirements might in principle be satisfied with a 20 per cent budget increase and technology advancements, there is a gap of a factor of 6 to 10 between the needed and available computing resources. The threat of a lack of computing and storage resources was present already in the beginning of Run 2, but could still be mitigated, e.g....

  3. Sampling-based exploration of folded state of a protein under kinematic and geometric constraints

    KAUST Repository

    Yao, Peggy

    2011-10-04

    Flexibility is critical for a folded protein to bind to other molecules (ligands) and achieve its functions. The conformational selection theory suggests that a folded protein deforms continuously and its ligand selects the most favorable conformations to bind to. Therefore, one of the best options to study protein-ligand binding is to sample conformations broadly distributed over the protein-folded state. This article presents a new sampler, called kino-geometric sampler (KGS). This sampler encodes dominant energy terms implicitly by simple kinematic and geometric constraints. Two key technical contributions of KGS are (1) a robotics-inspired Jacobian-based method to simultaneously deform a large number of interdependent kinematic cycles without any significant break-up of the closure constraints, and (2) a diffusive strategy to generate conformation distributions that diffuse quickly throughout the protein folded state. Experiments on four very different test proteins demonstrate that KGS can efficiently compute distributions containing conformations close to target (e.g., functional) conformations. These targets are not given to KGS, hence are not used to bias the sampling process. In particular, for a lysine-binding protein, KGS was able to sample conformations in both the intermediate and functional states without the ligand, while previous work using molecular dynamics simulation had required the ligand to be taken into account in the potential function. Overall, KGS demonstrates that kino-geometric constraints characterize the folded subset of a protein conformation space and that this subset is small enough to be approximated by a relatively small distribution of conformations. © 2011 Wiley Periodicals, Inc.
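
    The Jacobian-based deformation named in the abstract can be illustrated, at its simplest, by projecting a random torsion-angle move onto the null space of the closure-constraint Jacobian, so that all kinematic cycles stay closed to first order. The sketch below is a minimal illustration of that idea with a stand-in Jacobian; it is not the authors' KGS implementation, and all names and parameter values are hypothetical.

```python
import numpy as np

def nullspace_step(jacobian, step_size=0.05, rng=np.random.default_rng(0)):
    """Project a random torsion-angle perturbation onto the null space of the
    closure-constraint Jacobian so that, to first order, all kinematic cycles
    stay closed. A sketch of the idea only, not the authors' KGS sampler."""
    n_dof = jacobian.shape[1]
    dq_random = rng.normal(size=n_dof)              # unconstrained random move
    # Null-space projector P = I - J^+ J (J^+ = Moore-Penrose pseudoinverse)
    projector = np.eye(n_dof) - np.linalg.pinv(jacobian) @ jacobian
    dq = projector @ dq_random
    norm = np.linalg.norm(dq)
    return step_size * dq / norm if norm > 0 else dq

# Toy example: 3 closure constraints over 10 torsional degrees of freedom.
J = np.random.default_rng(1).normal(size=(3, 10))   # stand-in constraint Jacobian
dq = nullspace_step(J)
print("constraint violation (should be ~0):", np.linalg.norm(J @ dq))
```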

  4. Sampling-Based Motion Planning Algorithms for Replanning and Spatial Load Balancing

    Energy Technology Data Exchange (ETDEWEB)

    Boardman, Beth Leigh [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-12

    The common theme of this dissertation is sampling-based motion planning, with the two key contributions being in the areas of replanning and spatial load balancing for robotic systems. Here, we begin by recalling two sampling-based motion planners: the asymptotically optimal rapidly-exploring random tree (RRT*), and the asymptotically optimal probabilistic roadmap (PRM*). We also provide a brief background on collision cones and the Distributed Reactive Collision Avoidance (DRCA) algorithm. The next four chapters detail novel contributions for motion replanning in environments with unexpected static obstacles, for multi-agent collision avoidance, and for spatial load balancing. First, we show improved performance of the RRT* when using the proposed Grandparent-Connection (GP) or Focused-Refinement (FR) algorithms. Next, the Goal Tree algorithm for replanning with unexpected static obstacles is detailed and proven to be asymptotically optimal. A multi-agent collision avoidance problem in obstacle environments is approached via the RRT*, leading to the novel Sampling-Based Collision Avoidance (SBCA) algorithm. The SBCA algorithm is proven to guarantee collision-free trajectories for all of the agents, even when subject to uncertainties in the knowledge of the other agents' positions and velocities. Given that a solution exists, we prove that livelocks and deadlocks will lead to the cost to the goal being decreased. We introduce a new deconfliction maneuver that decreases the cost-to-come at each step. This new maneuver removes the possibility of livelocks and allows us to prove convergence to the goal configurations. Finally, we present a limited-range Graph-based Spatial Load Balancing (GSLB) algorithm which fairly divides a non-convex space among multiple agents that are subject to differential constraints and have a limited travel distance. The GSLB is proven to converge to a solution when maximizing the area covered by the agents. The analysis

  5. Effects of a radiation dose reduction strategy for computed tomography in severely injured trauma patients in the emergency department: an observational study

    Directory of Open Access Journals (Sweden)

    Kim Soo Hyun

    2011-11-01

    Background: Severely injured trauma patients are exposed to clinically significant radiation doses from computed tomography (CT) imaging in the emergency department. Moreover, this radiation exposure is associated with an increased risk of cancer. The purpose of this study was to determine some effects of a radiation dose reduction strategy for CT in severely injured trauma patients in the emergency department. Methods: We implemented the radiation dose reduction strategy in May 2009. A prospective observational study design was used to collect data from patients who met the inclusion criteria during this one-year study (intervention group, May 2009 to April 2010). The prospective data were compared with data collected retrospectively for one year prior to the implementation of the radiation dose reduction strategy (control group). By comparing the cumulative effective dose and the number of CT examinations in the two groups, we evaluated the effects of the radiation dose reduction strategy. All the patients met the institutional adult trauma team activation criteria. The radiation doses calculated by the CT scanner were converted to effective doses by multiplication by a conversion coefficient. Results: A total of 118 patients were included in this study. Among them, 33 were admitted before May 2009 (control group), and 85 were admitted after May 2009 (intervention group). There were no significant differences between the two groups regarding baseline characteristics, such as injury severity and mortality. Additionally, there was no difference between the two groups in the mean number of total CT examinations per patient (4.8 vs. 4.5, respectively; p = 0.227). However, the mean effective dose of the total CT examinations per patient significantly decreased from 78.71 mSv to 29.50 mSv (p …). Conclusions: The radiation dose reduction strategy for CT in severely injured trauma patients effectively decreased the cumulative effective dose of the total

  6. Strategies and methodologies to develop techniques for computer-assisted analysis of gas phase formation during altitude decompression

    Science.gov (United States)

    Powell, Michael R.; Hall, W. A.

    1993-01-01

    It would be of operational significance if one possessed a device that would indicate the presence of gas phase formation in the body during hypobaric decompression. Automated analysis of Doppler gas bubble signals has been attempted for 2 decades but with generally unfavorable results, except with surgically implanted transducers. Recently, efforts have intensified with the introduction of low-cost computer programs. Current NASA work is directed towards the development of a computer-assisted method specifically targeted to EVA, and we are most interested in Spencer Grade 4. We note that for Spencer Doppler Grades 1 to 3, the FFT sonogram and spectrogram show increases in the amplitude domain, and the frequency domain is sometimes increased over that created by the normal blood flow envelope. The amplitude perturbations are of very short duration, in both systole and diastole and at random temporal positions. Grade 4 is characteristic in the amplitude domain but with modest increases in the FFT sonogram and spectral frequency power from 2K to 4K over all of the cardiac cycle. Heart valve motion appears to display characteristic signals: (1) the demodulated Doppler signal amplitude is considerably above the Doppler-shifted blood flow signal (even Grade 4); and (2) demodulated Doppler frequency shifts are considerably greater (often several kHz) than the upper edge of the blood flow envelope. Knowledge of these facts will aid in the construction of a real-time, computer-assisted discriminator to eliminate cardiac motion artifacts. There could also exist perturbations in the following: (1) modifications of the pattern of blood flow in accordance with Poiseuille's Law, (2) flow changes with a change in the Reynolds number, (3) an increase in the pulsatility index, and/or (4) diminished diastolic flow or 'runoff.' Doppler ultrasound devices have been constructed with a three-transducer array and a pulsed frequency generator.
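
    The two discrimination criteria listed above (demodulated amplitude well above the blood flow signal, and frequency shifts above the upper edge of the blood flow envelope) suggest a simple windowed test. The following sketch applies those two thresholds to a synthetic signal; the window length, amplitude margin, and baseline estimate are illustrative assumptions, not the discriminator under development at NASA.

```python
import numpy as np

def flag_valve_artifacts(signal, fs, envelope_upper_hz, amp_margin=4.0,
                         win=1024, hop=512):
    """Flag short-time windows whose peak amplitude and dominant Doppler-shift
    frequency both exceed the blood-flow envelope, the two criteria the
    abstract attributes to heart-valve motion. Thresholds are illustrative
    assumptions, not a validated discriminator."""
    flags = []
    baseline = np.median(np.abs(signal))          # crude blood-flow amplitude level
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    for start in range(0, len(signal) - win, hop):
        seg = signal[start:start + win] * np.hanning(win)
        spec = np.abs(np.fft.rfft(seg))
        peak_freq = freqs[np.argmax(spec)]
        peak_amp = np.max(np.abs(seg))
        is_artifact = (peak_amp > amp_margin * baseline and
                       peak_freq > envelope_upper_hz)
        flags.append((start / fs, bool(is_artifact)))
    return flags

# Synthetic test: a 2 kHz blood-flow tone with a strong 5 kHz burst ("valve click").
fs = 20000
t = np.arange(0, 1.0, 1.0 / fs)
x = 0.2 * np.sin(2 * np.pi * 2000 * t)
x[5000:5400] += 3.0 * np.sin(2 * np.pi * 5000 * t[5000:5400])
print([w for w in flag_valve_artifacts(x, fs, envelope_upper_hz=3000) if w[1]])
```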

  7. A parallel offline CFD and closed-form approximation strategy for computationally efficient analysis of complex fluid flows

    Science.gov (United States)

    Allphin, Devin

    Computational fluid dynamics (CFD) solution approximations for complex fluid flow problems have become a common and powerful engineering analysis technique. These tools, though qualitatively useful, remain limited in practice by their underlying inverse relationship between simulation accuracy and overall computational expense. While a great volume of research has focused on remedying these issues inherent to CFD, one traditionally overlooked area of resource reduction for engineering analysis concerns the basic definition and determination of functional relationships for the studied fluid flow variables. This artificial relationship-building technique, called meta-modeling or surrogate/offline approximation, uses design of experiments (DOE) theory to efficiently approximate non-physical coupling between the variables of interest in a fluid flow analysis problem. By mathematically approximating these variables, DOE methods can effectively reduce the required quantity of CFD simulations, freeing computational resources for other analytical focuses. An idealized interpretation of a fluid flow problem can also be employed to create suitably accurate approximations of fluid flow variables for the purposes of engineering analysis. When used in parallel with a meta-modeling approximation, a closed-form approximation can provide useful feedback concerning proper construction, suitability, or even necessity of an offline approximation tool. It also provides a short-circuit pathway for further reducing the overall computational demands of a fluid flow analysis, again freeing resources for otherwise unsuitable resource expenditures. To validate these inferences, a design optimization problem was presented requiring the inexpensive estimation of aerodynamic forces applied to a valve operating on a simulated piston-cylinder heat engine. The determination of these forces was to be found using parallel surrogate and exact approximation methods, thus evidencing the comparative
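
    As a minimal illustration of the surrogate/offline approximation idea described above, the sketch below runs a small design of experiments against a cheap stand-in for an expensive CFD evaluation, fits a quadratic response surface by least squares, and then queries the surrogate instead of the simulator. The stand-in function, sample sizes, and polynomial basis are assumptions for illustration only.

```python
import numpy as np

def expensive_simulation(x1, x2):
    """Stand-in for a CFD run (e.g. an aerodynamic force coefficient)."""
    return 1.5 + 0.8 * x1 - 0.3 * x2 + 0.5 * x1 * x2 + 0.1 * x1 ** 2

def quadratic_design_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# 1. Small space-filling DOE over the two input variables.
rng = np.random.default_rng(42)
X_doe = rng.uniform(-1, 1, size=(12, 2))
y_doe = np.array([expensive_simulation(*x) for x in X_doe])

# 2. Fit the surrogate (quadratic response surface) by least squares.
coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X_doe), y_doe, rcond=None)

# 3. Use the surrogate instead of further simulations.
X_test = rng.uniform(-1, 1, size=(5, 2))
y_pred = quadratic_design_matrix(X_test) @ coeffs
y_true = np.array([expensive_simulation(*x) for x in X_test])
print("max surrogate error:", np.max(np.abs(y_pred - y_true)))
```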

  8. Self-regulated learning in higher education: strategies adopted by computer programming students when supported by the SimProgramming approach

    Directory of Open Access Journals (Sweden)

    Daniela Pedrosa

    The goal of the SimProgramming approach is to help students overcome their learning difficulties in the transition from entry-level to advanced computer programming, developing an appropriate set of learning strategies. We implemented it at the University of Trás-os-Montes e Alto Douro (Portugal), in two courses (PM3 and PM4) of the bachelor programmes in Informatics Engineering and ICT. We conducted semi-structured interviews with students (n=38) at the end of the courses, to identify the students' strategies for self-regulation of learning in the assignment. We found that students changed some of their strategies from one course edition to the following one and that the changes are related to the SimProgramming approach. We believe that changes to the educational approach were appropriate to support the assignment goals. We recommend applying the SimProgramming approach in other educational contexts, to improve educational practices by including techniques to help students in their learning.

  9. Project-based learning strategy, supported by virtual mediations and computer tools in a poultry production course: case study in Agricultural Sciences

    Directory of Open Access Journals (Sweden)

    Luis Díaz S

    2017-12-01

    The project-based learning strategy, supported by virtual mediations and computer tools, was applied to 42 students of an Animal Science programme in the course Poultry Production, evaluating the degree of familiarization with the strategy, academic productivity, asynchronous mediation by e-mail, mastery of the Excel spreadsheet, and the students' appreciation of the implemented methodology. The results showed that none of the students (100%) knew the learning strategy beforehand, and that they expressed fears at the beginning of the activity. The final grades obtained (4.44 ± 0.14 for 38.09% of students; 3.67 ± 0.09 for 38.09%; 2.8 for 19.05%), the delivered products, and the degree of achievement (100% for 31 students; 88.88% for 5 students; 77.77% for 3 students; 55.55% for 3 students) were influenced by the degree of mastery of the spreadsheet (8 students showed mastery, 26 a basic to elementary level, and the rest a scarce level) and by the registered participation level. It was found that the strategy generated motivation in the students, reflected in the accomplishment of the goals and objectives set at the beginning of the course, increased student-teacher interaction, and led to high academic performance (final grades ≥ 3.7) in the majority of the participants (73.8%).

  10. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade; Manavi, Kasra; Burgos, Juan; Denny, Jory; Thomas, Shawna; Amato, Nancy M.

    2012-01-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.
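
    A single-machine sketch of the subdivision idea described above is given below: the configuration space is split into rectangular regions, a small roadmap is sampled and connected within each region in parallel, and connection attempts are then restricted to nodes of neighbouring regions. Collision checking, the connection heuristics, and the load-balancing details of the paper are omitted; all helper names and parameters are illustrative.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def build_regional_roadmap(args):
    """Sample nodes inside one rectangular C-space region and connect pairs
    closer than a radius. Collision checking is omitted in this sketch."""
    (xlo, xhi, ylo, yhi), n_samples, radius, seed = args
    rng = np.random.default_rng(seed)
    nodes = np.column_stack([rng.uniform(xlo, xhi, n_samples),
                             rng.uniform(ylo, yhi, n_samples)])
    edges = [(i, j) for i in range(n_samples) for j in range(i + 1, n_samples)
             if np.linalg.norm(nodes[i] - nodes[j]) < radius]
    return nodes, edges

def connect_adjacent(nodes_a, nodes_b, radius):
    """Attempt connections only between nodes of neighbouring regions."""
    return [(i, j) for i, a in enumerate(nodes_a) for j, b in enumerate(nodes_b)
            if np.linalg.norm(a - b) < radius]

if __name__ == "__main__":
    # Subdivide the unit square into a 2x2 grid of regions.
    regions = [(x, x + 0.5, y, y + 0.5) for x in (0.0, 0.5) for y in (0.0, 0.5)]
    jobs = [(r, 50, 0.15, k) for k, r in enumerate(regions)]
    with ProcessPoolExecutor() as pool:               # regional roadmaps in parallel
        roadmaps = list(pool.map(build_regional_roadmap, jobs))
    # Merge: connect regions that share a boundary (here: all pairs, for brevity).
    inter_edges = sum(len(connect_adjacent(roadmaps[a][0], roadmaps[b][0], 0.15))
                      for a in range(4) for b in range(a + 1, 4))
    print("regional edges:", sum(len(e) for _, e in roadmaps),
          "inter-region edges:", inter_edges)
```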

  11. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade

    2012-05-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.

  12. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    Science.gov (United States)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
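
    The central ingredient, a conjugate gradient solver truncated at a fixed, predetermined order, can be sketched in a few lines. The example below solves a toy symmetric positive-definite polarization-like system at several truncation orders; it omits the analytical-force machinery described in the paper, and the matrix used is a stand-in.

```python
import numpy as np

def truncated_cg(T, E, order=3):
    """Conjugate gradient truncated after a fixed number of iterations.
    The fixed iteration count gives a fixed, non-iterative cost per step,
    which is the central idea of the TCG approach; the analytical forces
    of the actual method are not reproduced here."""
    mu = np.zeros_like(E)          # initial guess for the induced dipoles
    r = E - T @ mu                 # residual
    p = r.copy()
    for _ in range(order):
        Tp = T @ p
        alpha = (r @ r) / (p @ Tp)
        mu += alpha * p
        r_new = r - alpha * Tp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return mu

# Toy symmetric positive-definite "polarization matrix".
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 50))
T = A @ A.T + 50 * np.eye(50)
E = rng.normal(size=50)
for k in (1, 2, 3):
    err = np.linalg.norm(T @ truncated_cg(T, E, order=k) - E)
    print(f"TCG order {k}: residual norm {err:.3e}")
```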

  13. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  14. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi; Zhou, Lan; Najibi, Seyed Morteza; Gao, Xin; Huang, Jianhua Z.

    2015-01-01

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  15. Strategies to Prevent Cholera Introduction during International Personnel Deployments: A Computational Modeling Analysis Based on the 2010 Haiti Outbreak

    Science.gov (United States)

    Lewnard, Joseph A.; Antillón, Marina; Gonsalves, Gregg; Miller, Alice M.; Ko, Albert I.; Pitzer, Virginia E.

    2016-01-01

    Background Introduction of Vibrio cholerae to Haiti during the deployment of United Nations (UN) peacekeepers in 2010 resulted in one of the largest cholera epidemics of the modern era. Following the outbreak, a UN-commissioned independent panel recommended three pre-deployment intervention strategies to minimize the risk of cholera introduction in future peacekeeping operations: screening for V. cholerae carriage, administering prophylactic antimicrobial chemotherapies, or immunizing with oral cholera vaccines. However, uncertainty regarding the effectiveness of these approaches has forestalled their implementation by the UN. We assessed how the interventions would have impacted the likelihood of the Haiti cholera epidemic. Methods and Findings We developed a stochastic model for cholera importation and transmission, fitted to reported cases during the first weeks of the 2010 outbreak in Haiti. Using this model, we estimated that diagnostic screening reduces the probability of cases occurring by 82% (95% credible interval: 75%, 85%); however, false-positive test outcomes may hamper this approach. Antimicrobial chemoprophylaxis at time of departure and oral cholera vaccination reduce the probability of cases by 50% (41%, 57%) and by up to 61% (58%, 63%), respectively. Chemoprophylaxis beginning 1 wk before departure confers a 91% (78%, 96%) reduction independently, and up to a 98% reduction (94%, 99%) if coupled with vaccination. These results are not sensitive to assumptions about the background cholera incidence rate in the endemic troop-sending country. Further research is needed to (1) validate the sensitivity and specificity of rapid test approaches for detecting asymptomatic carriage, (2) compare prophylactic efficacy across antimicrobial regimens, and (3) quantify the impact of oral cholera vaccine on transmission from asymptomatic carriers. Conclusions Screening, chemoprophylaxis, and vaccination are all effective strategies to prevent cholera introduction
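
    A deliberately simplified Monte Carlo sketch of the comparison described above is shown below: it estimates the probability that at least one undetected carrier is deployed, with and without a pre-deployment intervention. The cohort size, carriage prevalence, and intervention effects are hypothetical placeholders, not the values fitted in the study, which additionally models onward transmission after arrival.

```python
import numpy as np

def prob_introduction(n_troops, carriage_prev, detect_or_clear_prob,
                      n_sim=200_000, seed=0):
    """Monte Carlo estimate of the probability that at least one undetected,
    untreated carrier is deployed. All parameter values used below are
    hypothetical; the paper fits a far richer transmission model."""
    rng = np.random.default_rng(seed)
    carriers = rng.binomial(n_troops, carriage_prev, size=n_sim)
    missed = rng.binomial(carriers, 1.0 - detect_or_clear_prob)
    return np.mean(missed > 0)

baseline = prob_introduction(1000, 0.002, 0.0)
for name, eff in [("screening", 0.8), ("chemoprophylaxis", 0.9), ("vaccination", 0.6)]:
    p = prob_introduction(1000, 0.002, eff)
    print(f"{name:>16}: P(introduction) = {p:.3f} "
          f"({100 * (1 - p / baseline):.0f}% reduction vs. {baseline:.3f})")
```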

  16. Complex-plane strategy for computing rotating polytropic models - efficiency and accuracy of the complex first-order perturbation theory

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1988-01-01

    In this paper, a numerical method is developed for determining the structure distortion of a polytropic star which rotates either uniformly or differentially. This method carries out the required numerical integrations in the complex plane. The method is implemented to compute indicative quantities, such as the critical perturbation parameter which represents an upper limit in the rotational behavior of the star. From such indicative results, it is inferred that this method achieves impressive improvement against other relevant methods; most importantly, it is comparable to some of the most elaborate and accurate techniques on the subject. It is also shown that the use of this method with Chandrasekhar's first-order perturbation theory yields an immediate drastic improvement of the results. Thus, there is no need (for most applications concerning rotating polytropic models) to proceed to the further use of the method with higher-order techniques, unless the maximum accuracy of the method is required. 31 references

  17. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler.

    Science.gov (United States)

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O'Connor, Mary; Shapiro, Bruce A

    2008-10-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes.

  18. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler

    Science.gov (United States)

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  19. Effects of a radiation dose reduction strategy for computed tomography in severely injured trauma patients in the emergency department: an observational study.

    Science.gov (United States)

    Kim, Soo Hyun; Jung, Seung Eun; Oh, Sang Hoon; Park, Kyu Nam; Youn, Chun Song

    2011-11-03

    Severely injured trauma patients are exposed to clinically significant radiation doses from computed tomography (CT) imaging in the emergency department. Moreover, this radiation exposure is associated with an increased risk of cancer. The purpose of this study was to determine some effects of a radiation dose reduction strategy for CT in severely injured trauma patients in the emergency department. We implemented the radiation dose reduction strategy in May 2009. A prospective observational study design was used to collect data from patients who met the inclusion criteria during this one-year study (intervention group) from May 2009 to April 2010. The prospective data were compared with data collected retrospectively for one year prior to the implementation of the radiation dose reduction strategy (control group). By comparing the cumulative effective dose and the number of CT examinations in the two groups, we evaluated the effects of the radiation dose reduction strategy. All the patients met the institutional adult trauma team activation criteria. The radiation doses calculated by the CT scanner were converted to effective doses by multiplication by a conversion coefficient. A total of 118 patients were included in this study. Among them, 33 were admitted before May 2009 (control group), and 85 were admitted after May 2009 (intervention group). There were no significant differences between the two groups regarding baseline characteristics, such as injury severity and mortality. Additionally, there was no difference between the two groups in the mean number of total CT examinations per patient (4.8 vs. 4.5, respectively; p = 0.227). However, the mean effective dose of the total CT examinations per patient significantly decreased from 78.71 mSv to 29.50 mSv (p …). The radiation dose reduction strategy for CT in severely injured trauma patients effectively decreased the cumulative effective dose of the total CT examinations in the emergency department, but did not effectively decrease the number of CT examinations.

  20. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    Science.gov (United States)

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

    Changing the configuration of a cooperative whole arm manipulator is not easy while enclosing an object. This difficulty arises mainly from the risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes a perturbation along object-motion-inducing directions when the movement is largely along the manipulator's redundant directions. The obstacle avoidance problem is then addressed by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding jamming as well as obstacles for a 6-DOF dual-arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition
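
    A compact sketch of steps (ii), (iii), and (v) above is given below: a Latin hypercube sample of the uncertain inputs is generated, propagated through a toy stand-in model, and a rank (Spearman-type) correlation is computed for each input as a simple sensitivity measure. The model and sample size are illustrative assumptions.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """Stratified (Latin hypercube) sample on the unit hypercube:
    one sample per equal-probability stratum in every dimension."""
    u = (rng.permuted(np.tile(np.arange(n_samples), (n_vars, 1)), axis=1).T
         + rng.uniform(size=(n_samples, n_vars))) / n_samples
    return u

def rank(x):
    return np.argsort(np.argsort(x))

def toy_model(X):
    """Stand-in for a computationally expensive analysis."""
    return 4.0 * X[:, 0] + np.sin(2 * np.pi * X[:, 1]) + 0.2 * X[:, 2]

rng = np.random.default_rng(7)
X = latin_hypercube(200, 3, rng)          # (i)-(ii): characterize and sample inputs
y = toy_model(X)                          # (iii): propagate samples through the model
# (v): simple sensitivity measure -- rank (Spearman-type) correlation per input.
for k in range(X.shape[1]):
    r = np.corrcoef(rank(X[:, k]), rank(y))[0, 1]
    print(f"input x{k + 1}: rank correlation with output = {r:+.2f}")
```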

  2. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD. (.; .); Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  3. Caps and gaps: a computer model for studies on brood incubation strategies in honeybees (Apis mellifera carnica)

    Science.gov (United States)

    Fehler, Manuel; Kleinhenz, Marco; Klügl, Franziska; Puppe, Frank; Tautz, Jürgen

    2007-08-01

    In addition to heat production on the comb surface, honeybee workers frequently visit open cells (“gaps”) that are scattered throughout the sealed brood area, and enter them to incubate adjacent brood cells. We examined the efficiency of this heating strategy under different environmental conditions and for gap proportions from 0 to 50%. For gap proportions from 4 to 10%, which are common to healthy colonies, we find a significant reduction in the incubation time per brood cell to maintain the correct temperature. The savings make up 18 to 37% of the time, which would be required for this task in completely sealed brood areas without any gaps. For unnatural high proportions of gaps (>20%), which may be the result of inbreeding or indicate a poor condition of the colony, brood nest thermoregulation becomes less efficient, and the incubation time per brood cell has to increase to maintain breeding temperature. Although the presence of gaps is not essential to maintain an optimal brood nest temperature, a small number of gaps make heating more economical by reducing the time and energy that must be spent on this vital task. As the benefit depends on the availability, spatial distribution and usage of gaps by the bees, further studies need to show the extent to which these results apply to real colonies.

  4. Distributed computing strategies for processing of FT-ICR MS imaging datasets for continuous mode data visualization

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco; Kilic, Mehmet; Heeren, Ronald M.

    2015-03-01

    High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data (“big data”) that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high spatial resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous mode “Mosaic Datacube” approach allows high mass resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing as compared to feature-based processing. We describe the use of distributed computing for processing of FT-ICR MS imaging datasets with generation of continuous mode Mosaic Datacubes for high mass resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.
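
    The generic pattern behind the distributed processing described above, partitioning the imaging pixels across workers and reducing each chunk independently, can be sketched on a single machine as follows. The data shapes and the per-pixel peak-picking step are placeholders, not the actual FT-ICR processing chain or the cloud deployment used in the paper.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    """Per-pixel reduction of raw transients; a placeholder FFT magnitude
    peak pick stands in for the real FT-ICR processing chain."""
    spectra = np.abs(np.fft.rfft(chunk, axis=1))
    return spectra.argmax(axis=1)          # index of the strongest peak per pixel

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    transients = rng.normal(size=(1024, 4096))    # 1024 pixels, toy transients
    chunks = np.array_split(transients, 8)        # one chunk per worker
    with ProcessPoolExecutor(max_workers=8) as pool:
        peaks = np.concatenate(list(pool.map(process_chunk, chunks)))
    print("processed pixels:", peaks.shape[0])
```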

  5. Waiting is the hardest part: comparison of two computational strategies for performing a compelled-response task

    Directory of Open Access Journals (Sweden)

    Emilio Salinas

    2010-12-01

    The neural basis of choice behavior is commonly investigated with tasks in which a subject analyzes a stimulus and reports his or her perceptual experience with an appropriate motor action. We recently developed a novel task, the compelled-saccade task, with which the influence of the sensory information on the subject's choice can be tracked through time with millisecond resolution, thus providing a new tool for correlating neuronal activity and behavior. This paradigm has a crucial feature: the signal that instructs the subject to make an eye movement is given before the cue that indicates which of two possible choices is the correct one. Previously, we found that psychophysical performance in this task could be accurately replicated by a model in which two developing oculomotor plans race to a threshold and the incoming perceptual information differentially accelerates their trajectories toward it. However, the task design suggests an alternative mechanism: instead of modifying an ongoing oculomotor plan on the fly as the sensory information becomes available, the subject could try to wait, withholding the oculomotor response until the sensory cue is revealed. Here, we use computer simulations to explore and compare the performance of these two types of models. We find that both reproduce the main features of the psychophysical data in the compelled-saccade task, but they give rise to distinct behavioral and neurophysiological predictions. Although, superficially, the waiting model is intuitively appealing, it is ultimately inconsistent with experimental results from this and other tasks.
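
    A toy simulation of the race-to-threshold mechanism described above is sketched below: two motor plans build up at a common rate and, once the cue arrives, the plan favoured by the sensory information is accelerated while the other is suppressed. All parameter values are illustrative and are not fitted to the psychophysical data discussed in the article.

```python
import numpy as np

def race_trial(cue_delay_ms, threshold=300.0, base_rate=1.0, accel=3.0,
               noise=4.0, rng=np.random.default_rng(0)):
    """Simulate one compelled-response trial: the go signal arrives at t=0 and
    both plans build up at a common baseline rate; once the cue arrives at
    t=cue_delay_ms, the correct plan is accelerated and the wrong one is
    suppressed. Returns (reaction time in ms, correct choice?)."""
    correct = wrong = 0.0
    t = 0
    while max(correct, wrong) < threshold:
        informed = t >= cue_delay_ms
        correct += (accel if informed else base_rate) + noise * rng.normal()
        wrong += (-accel if informed else base_rate) + noise * rng.normal()
        wrong = max(wrong, 0.0)
        t += 1
    return t, correct > wrong

for delay in (50, 150, 250):
    trials = [race_trial(delay, rng=np.random.default_rng(s)) for s in range(500)]
    print(f"cue delay {delay:3d} ms: accuracy "
          f"{np.mean([ok for _, ok in trials]):.2f}, "
          f"mean RT {np.mean([t for t, _ in trials]):.0f} ms")
```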

  6. Stent sizing strategies in renal artery stenting: the comparison of conventional invasive renal angiography with renal computed tomographic angiography.

    Science.gov (United States)

    Kadziela, Jacek; Michalowska, Ilona; Pregowski, Jerzy; Janaszek-Sitkowska, Hanna; Lech, Katarzyna; Kabat, Marek; Staruch, Adam; Januszewicz, Andrzej; Witkowski, Adam

    2016-01-01

    Randomized trials comparing invasive treatment of renal artery stenosis with standard pharmacotherapy did not show substantial benefit from revascularization. One of the potential reasons for that may be suboptimal procedure technique. To compare renal stent sizing using two modalities: three-dimensional renal computed tomography angiography (CTA) versus conventional angiography. Forty patients (41 renal arteries), aged 65.1 ±8.5 years, who underwent renal artery stenting with preprocedural CTA performed within 6 months, were retrospectively analyzed. In CTA analysis, reference diameter (CTA-D) and lesion length (CTA_LL) were measured and proposed stent diameter and length were recorded. Similarly, angiographic reference diameter (ANGIO_D) and lesion length (ANGIO_LL) as well as proposed stent dimensions were obtained by visual estimation. The median CTA_D was 0.5 mm larger than the median ANGIO_D (p < 0.001). Also, the proposed stent diameter in CTA evaluation was 0.5 mm larger than that in angiography (p < 0.0001). The median CTA_LL was 1 mm longer than the ANGIO_LL (p = NS), with significant correlation of these variables (r = 0.66, p < 0.0001). The median proposed stent length with CTA was equal to that proposed with angiography. The median diameter of the implanted stent was 0.5 mm smaller than that proposed in CTA (p < 0.0005) and identical to that proposed in angiography. The median length of the actual stent was longer than that proposed in angiography (p = 0.0001). Renal CTA has potential advantages as a tool adjunctive to angiography in appropriate stent sizing. Careful evaluation of the available CTA scans may be beneficial and should be considered prior to the planned procedure.

  7. Computer simulation of nuclear pollutant diffusion from Shimane nuclear power plants and development of an evacuation strategy in the event of a nuclear incident

    International Nuclear Information System (INIS)

    Yakura, Haruna; Kurimasa, Akihiro

    2012-01-01

    Following the Tohoku (northeastern) Earthquake on March 11, 2011, an accident at the Fukushima Dai-ichi Nuclear Power Plant resulted in a substantial release of radioactivity to the environment. The accident forced a large number of residents to evacuate from the surrounding areas. Moreover, the nuclear incident was life-threatening for the elderly and for people with serious illnesses who were confined in hospitals or nursing homes. Strikingly, the causes of death were not directly attributed to radiation exposure but to problems encountered during evacuation. Using nuclear diffusion data from the Fukushima incident, we simulated nuclear pollutant dispersion with the computer software A2C in an area of Tottori and Shimane Prefectures surrounding the Shimane Nuclear Power Plant. We generated a model for the spread of nuclear pollutants around the Emergency Planning Zone (EPZ). From these findings, we proposed evacuation strategies for residents near the power plant to ensure a safe and reliable escape from nuclear pollutants. Our recommendations include: immediate evacuation from the PAZ area (within 5 km); securing indoor shelter in the UPZ area (from 5 km to 30 km) and preparing to evacuate further outward from the nuclear plant site; daytime evacuation within the 30 km area after considering wind direction and velocity; and preparation of a planned evacuation strategy that identifies secure facilities for vulnerable people in the event of a disaster. (author)

  8. Comparison of imaging strategies with conditional versus immediate contrast-enhanced computed tomography in patients with clinical suspicion of acute appendicitis

    International Nuclear Information System (INIS)

    Atema, J.J.; Gans, S.L.; Boermeester, M.A.; Randen, A. van; Stoker, J.; Lameris, W.; Es, H.W. van; Heesewijk, J.P.M. van; Ramshorst, B. van; Bouma, W.H.; Hove, W. ten; Keulen, E.M. van; Dijkgraaf, M.G.W.; Bossuyt, P.M.M.

    2015-01-01

    To compare the diagnostic accuracy of conditional computed tomography (CT), i.e. CT when initial ultrasound findings are negative or inconclusive, and immediate CT for patients with suspected appendicitis. Data were collected within a prospective diagnostic accuracy study on imaging in adults with acute abdominal pain. All patients underwent ultrasound and CT, read by different observers who were blinded from the other modality. Only patients with clinical suspicion of appendicitis were included. An expert panel assigned a final diagnosis to each patient after 6 months of follow-up (clinical reference standard). A total of 422 patients were included with final diagnosis appendicitis in 251 (60 %). For 199 patients (47 %), ultrasound findings were inconclusive or negative. Conditional CT imaging correctly identified 241 of 251 (96 %) appendicitis cases (95 %CI, 92 % to 98 %), versus 238 (95 %) with immediate CT (95 %CI, 91 % to 97 %). The specificity of conditional CT imaging was lower: 77 % (95 %CI, 70 % to 83 %) versus 87 % for immediate CT (95 %CI, 81 % to 91 %). A conditional CT strategy correctly identifies as many patients with appendicitis as an immediate CT strategy, and can halve the number of CTs needed. However, conditional CT imaging results in more false positives. (orig.)

  9. A linguistic analysis of grooming strategies of online child sex offenders: Implications for our understanding of predatory sexual behavior in an increasingly computer-mediated world.

    Science.gov (United States)

    Black, Pamela J; Wollis, Melissa; Woodworth, Michael; Hancock, Jeffrey T

    2015-06-01

    There is a large body of evidence to suggest that child sex offenders engage in grooming to facilitate victimization. It has been speculated that this step-by-step grooming process is also used by offenders who access their underage victims online; however, little research has been done to examine whether there are unique aspects of computer-mediated communication that impact the traditional face-to-face grooming process. This study considered the similarities and differences in the grooming process in online environments by analyzing the language used by online offenders when communicating with their victims. The transcripts of 44 convicted online offenders were analyzed to assess a proposed theory of the online grooming process (O'Connell, 2003). Using a stage-based approach, computerized text analysis examined the types of language used in each stage of the offender-victim interaction. The transcripts also were content analyzed to examine the frequency of specific techniques known to be employed by both face-to-face and online offenders, such as flattery. Results reveal that while some evidence of the strategies used by offenders throughout the grooming process are present in online environments, the order and timing of these stages appear to be different. The types (and potential underlying pattern) of strategies used in online grooming support the development of a revised model for grooming in online environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Comparison of imaging strategies with conditional versus immediate contrast-enhanced computed tomography in patients with clinical suspicion of acute appendicitis

    Energy Technology Data Exchange (ETDEWEB)

    Atema, J.J.; Gans, S.L.; Boermeester, M.A. [Academic Medical Centre, Department of Surgery (G4-142), Amsterdam (Netherlands); Randen, A. van; Stoker, J. [Academic Medical Centre, Department of Radiology, Amsterdam (Netherlands); Lameris, W. [Academic Medical Centre, Department of Surgery (G4-142), Amsterdam (Netherlands); Spaarne Hospital, Department of Surgery, Hoofddorp (Netherlands); Es, H.W. van; Heesewijk, J.P.M. van [St Antonius Hospital, Department of Radiology, Nieuwegein (Netherlands); Ramshorst, B. van [St Antonius Hospital, Department of Surgery, Nieuwegein (Netherlands); Bouma, W.H. [Gelre Hospital, Department of Surgery, Apeldoorn (Netherlands); Hove, W. ten [Gelre Hospital, Department of Radiology, Apeldoorn (Netherlands); Keulen, E.M. van [Tergooi Hospital, Department of Radiology, Hilversum (Netherlands); Dijkgraaf, M.G.W. [Academic Medical Centre, Clinical Research Unit, Amsterdam (Netherlands); Bossuyt, P.M.M. [Academic Medical Center, Department of Clinical Epidemiology, Biostatistics, and Bioinformatics, Amsterdam (Netherlands)

    2015-08-15

    To compare the diagnostic accuracy of conditional computed tomography (CT), i.e. CT when initial ultrasound findings are negative or inconclusive, and immediate CT for patients with suspected appendicitis. Data were collected within a prospective diagnostic accuracy study on imaging in adults with acute abdominal pain. All patients underwent ultrasound and CT, read by different observers who were blinded from the other modality. Only patients with clinical suspicion of appendicitis were included. An expert panel assigned a final diagnosis to each patient after 6 months of follow-up (clinical reference standard). A total of 422 patients were included with final diagnosis appendicitis in 251 (60 %). For 199 patients (47 %), ultrasound findings were inconclusive or negative. Conditional CT imaging correctly identified 241 of 251 (96 %) appendicitis cases (95 %CI, 92 % to 98 %), versus 238 (95 %) with immediate CT (95 %CI, 91 % to 97 %). The specificity of conditional CT imaging was lower: 77 % (95 %CI, 70 % to 83 %) versus 87 % for immediate CT (95 %CI, 81 % to 91 %). A conditional CT strategy correctly identifies as many patients with appendicitis as an immediate CT strategy, and can halve the number of CTs needed. However, conditional CT imaging results in more false positives. (orig.)

  11. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    Science.gov (United States)

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

    Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and the standard deviation of the error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
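
    The progressive-sampling component can be sketched independently of the Bayesian optimization machinery. The example below swaps in a plain successive-halving search over a small, fixed candidate grid: every surviving (algorithm, hyper-parameter) configuration is evaluated on a progressively larger training subsample, and the weaker half is discarded at each round. This only illustrates the sampling schedule, not the selection method proposed in the paper; the candidate grid and data set are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Candidate (algorithm, hyper-parameter) configurations; an illustrative grid.
candidates = ([("logreg", LogisticRegression, {"C": c, "max_iter": 1000})
               for c in (0.01, 0.1, 1, 10)]
              + [("tree", DecisionTreeClassifier, {"max_depth": d})
                 for d in (2, 5, 10, None)])

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_val, y_val = X[-5000:], y[-5000:]
X_train, y_train = X[:-5000], y[:-5000]

n = 500                                    # initial training-sample size
while len(candidates) > 1 and n <= len(X_train):
    scored = []
    for name, cls, params in candidates:   # evaluate every surviving candidate
        model = cls(**params).fit(X_train[:n], y_train[:n])
        scored.append((model.score(X_val, y_val), name, cls, params))
    scored.sort(reverse=True, key=lambda s: s[0])
    candidates = [(name, cls, params) for _, name, cls, params in
                  scored[:max(1, len(scored) // 2)]]   # keep the better half
    print(f"n={n:6d}: best so far {scored[0][1]} {scored[0][3]} "
          f"(accuracy {scored[0][0]:.3f})")
    n *= 2                                 # progressively enlarge the sample
```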

  12. A sampling-based Bayesian model for gas saturation estimationusing seismic AVA and marine CSEM data

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinsong; Hoversten, Michael; Vasco, Don; Rubin, Yoram; Hou,Zhangshuan

    2006-04-04

    We develop a sampling-based Bayesian model to jointly invert seismic amplitude versus angles (AVA) and marine controlled-source electromagnetic (CSEM) data for layered reservoir models. The porosity and fluid saturation in each layer of the reservoir, the seismic P- and S-wave velocity and density in the layers below and above the reservoir, and the electrical conductivity of the overburden are considered as random variables. Pre-stack seismic AVA data in a selected time window and real and quadrature components of the recorded electrical field are considered as data. We use Markov chain Monte Carlo (MCMC) sampling methods to obtain a large number of samples from the joint posterior distribution function. Using those samples, we obtain not only estimates of each unknown variable, but also its uncertainty information. The developed method is applied to both synthetic and field data to explore the combined use of seismic AVA and EM data for gas saturation estimation. Results show that the developed method is effective for joint inversion, and the incorporation of CSEM data reduces uncertainty in fluid saturation estimation, when compared to results from inversion of AVA data only.
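
    The general pattern, a joint likelihood over two data types explored with MCMC, is sketched below for a single unknown (gas saturation) using a random-walk Metropolis sampler. The forward models, noise level, and prior are toy placeholders, far simpler than the layered rock-physics and electromagnetic models used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward models standing in for the seismic AVA and CSEM responses.
def f_ava(sat):  return 2.0 - 1.2 * sat          # e.g. an AVA attribute proxy
def f_csem(sat): return 0.5 + 2.5 * sat ** 2     # e.g. a resistivity proxy

true_sat = 0.35
d_ava = f_ava(true_sat) + 0.05 * rng.normal()
d_csem = f_csem(true_sat) + 0.05 * rng.normal()

def log_post(sat, sigma=0.05):
    if not 0.0 <= sat <= 1.0:                    # uniform prior on [0, 1]
        return -np.inf
    return (-0.5 * ((d_ava - f_ava(sat)) / sigma) ** 2
            - 0.5 * ((d_csem - f_csem(sat)) / sigma) ** 2)

# Random-walk Metropolis sampling of the joint posterior.
samples, sat, lp = [], 0.5, log_post(0.5)
for _ in range(20000):
    prop = sat + 0.05 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        sat, lp = prop, lp_prop
    samples.append(sat)
post = np.array(samples[5000:])                  # discard burn-in
print(f"posterior saturation: {post.mean():.3f} +/- {post.std():.3f} "
      f"(true value {true_sat})")
```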

  13. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

    This paper presents a method for safe spacecraft autonomous maneuvering that leverages robotic motion-planning techniques for spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.

  14. Status of XSUSA for sampling based nuclear data uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzydacz-Hausmann; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-01-01

    In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations. (authors)

  15. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of the birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different; the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.

  16. A sample-based method for perishable good inventory control with a service level constraint

    NARCIS (Netherlands)

    Hendrix, Eligius M.T.; Pauls-Worm, Karin G.J.; Rossi, Roberto; Alcoba, Alejandro G.; Haijema, Rene

    2015-01-01

    This paper studies the computation of so-called order-up-to levels for a stochastic programming inventory problem of a perishable product. Finding a solution is a challenge as the problem combines a perishable product, a fixed ordering cost and non-stationary stochastic demand with a service level constraint.
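
    A minimal sample-average sketch of an order-up-to level that meets a service-level constraint is shown below. It treats each review period in isolation and ignores the perishability and fixed ordering cost that make the paper's problem hard; the Poisson demand parameters and sample size are invented for illustration.

    ```python
    import numpy as np

    def order_up_to_level(demand_samples, service_level):
        """Sample-average approximation of an order-up-to level S such that
        P(demand over the review period <= S) >= service_level."""
        return float(np.quantile(demand_samples, service_level))

    rng = np.random.default_rng(1)
    # Hypothetical non-stationary demand: each column is one review period.
    period_means = [20, 35, 50, 30]
    demand = rng.poisson(lam=period_means, size=(5000, len(period_means)))
    levels = [order_up_to_level(demand[:, t], 0.95) for t in range(len(period_means))]
    print(levels)
    ```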

  17. A moving blocker-based strategy for simultaneous megavoltage and kilovoltage scatter correction in cone-beam computed tomography image acquired during volumetric modulated arc therapy

    International Nuclear Information System (INIS)

    Ouyang, Luo; Lee, Huichen Pam; Wang, Jing

    2015-01-01

    Purpose: To evaluate a moving blocker-based approach in estimating and correcting megavoltage (MV) and kilovoltage (kV) scatter contamination in kV cone-beam computed tomography (CBCT) acquired during volumetric modulated arc therapy (VMAT). Methods and materials: During the concurrent CBCT/VMAT acquisition, a physical attenuator (i.e., “blocker”) consisting of equally spaced lead strips was mounted and moved constantly between the CBCT source and patient. Both kV and MV scatter signals were estimated from the blocked region of the imaging panel, and interpolated into the unblocked region. A scatter corrected CBCT was then reconstructed from the unblocked projections after scatter subtraction using an iterative image reconstruction algorithm based on constraint optimization. Experimental studies were performed on a Catphan® phantom and an anthropomorphic pelvis phantom to demonstrate the feasibility of using a moving blocker for kV–MV scatter correction. Results: Scatter induced cupping artifacts were substantially reduced in the moving blocker corrected CBCT images. Quantitatively, the root mean square error of Hounsfield units (HU) in seven density inserts of the Catphan phantom was reduced from 395 to 40. Conclusions: The proposed moving blocker strategy greatly improves the image quality of CBCT acquired with concurrent VMAT by reducing the kV–MV scatter induced HU inaccuracy and cupping artifacts
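
    The core of the moving-blocker idea (read the scatter signal in the detector columns shadowed by the lead strips and interpolate it across the open region before subtraction) can be sketched as follows. The toy projection, strip spacing, and 1-D linear interpolation are simplifying assumptions; the actual work reconstructs the volume from the unblocked rays with a constrained iterative algorithm.

    ```python
    import numpy as np

    def scatter_correct_projection(projection, blocked_cols):
        """Estimate the kV+MV scatter from columns shadowed by the lead strips and
        subtract an interpolated scatter map from the open (unblocked) columns."""
        cols = np.arange(projection.shape[1])
        open_cols = np.setdiff1d(cols, blocked_cols)
        corrected = projection.astype(float)
        for r in range(projection.shape[0]):
            # Behind the blocker the detector sees (approximately) scatter only.
            scatter = np.interp(open_cols, blocked_cols, projection[r, blocked_cols])
            corrected[r, open_cols] = np.clip(projection[r, open_cols] - scatter, 0.0, None)
        return corrected, open_cols

    # Toy projection: a smooth scatter background everywhere plus primary fluence
    # reaching only the unblocked columns.
    n_rows, n_cols = 4, 64
    blocked = np.arange(0, n_cols, 8)                    # equally spaced blocked columns
    open_cols = np.setdiff1d(np.arange(n_cols), blocked)
    proj = np.full((n_rows, n_cols), 20.0)               # scatter level
    proj[:, open_cols] += 50.0                           # primary signal
    corr, _ = scatter_correct_projection(proj, blocked)
    print(np.allclose(corr[0, open_cols], 50.0))         # True: scatter removed
    ```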

  18. Design Sensitivity Method for Sampling-Based RBDO with Fixed COV

    Science.gov (United States)

    2015-04-29

    Contours of the input model at the initial design d0 and the RBDO optimum design dopt are shown. As the limit state functions are not linear and some input...

  19. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  20. An efficient modularized sample-based method to estimate the first-order Sobol' index

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    Sobol' index is a prominent methodology in global sensitivity analysis. This paper aims to directly estimate the Sobol' index based only on available input–output samples, even if the underlying model is unavailable. For this purpose, a new method to calculate the first-order Sobol' index is proposed. The innovation is that the conditional variance and mean in the formula of the first-order index are calculated at an unknown but existing location of model inputs, instead of an explicit user-defined location. The proposed method is modularized in two aspects: 1) index calculations for different model inputs are separate and use the same set of samples; and 2) model input sampling, model evaluation, and index calculation are separate. Due to this modularization, the proposed method is capable of computing the first-order index when only input–output samples are available and the underlying model is unavailable, and its computational cost is not proportional to the dimension of the model inputs. In addition, the proposed method can also estimate the first-order index with correlated model inputs. Considering that the first-order index is a desired metric to rank model inputs but current methods can only handle independent model inputs, the proposed method helps to fill this gap. - Highlights: • An efficient method to estimate the first-order Sobol' index. • Estimate the index from input–output samples directly. • Computational cost is not proportional to the number of model inputs. • Handle both uncorrelated and correlated model inputs.
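
    A generic binning estimator of the first-order index from input-output samples (not the unknown-location construction proposed in the paper) can be written in a few lines; the Ishigami-style test function and the bin count below are assumptions made for the example.

    ```python
    import numpy as np

    def first_order_sobol(x, y, n_bins=32):
        """Estimate S_i = Var(E[Y|X_i]) / Var(Y) for each input column of x
        directly from input-output samples by binning X_i."""
        n, d = x.shape
        var_y = y.var()
        indices = np.empty(d)
        for i in range(d):
            edges = np.quantile(x[:, i], np.linspace(0, 1, n_bins + 1))
            bins = np.clip(np.searchsorted(edges, x[:, i], side="right") - 1, 0, n_bins - 1)
            occupied = [b for b in range(n_bins) if np.any(bins == b)]
            cond_means = np.array([y[bins == b].mean() for b in occupied])
            weights = np.array([np.mean(bins == b) for b in occupied])
            indices[i] = np.sum(weights * (cond_means - y.mean()) ** 2) / var_y
        return indices

    # Ishigami-like test: analytic first-order indices are roughly 0.31, 0.44, 0.
    rng = np.random.default_rng(2)
    x = rng.uniform(-np.pi, np.pi, size=(20000, 3))
    y = np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])
    print(first_order_sobol(x, y).round(2))
    ```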

  1. A Global Sampling Based Image Matting Using Non-Negative Matrix Factorization

    Directory of Open Access Journals (Sweden)

    NAVEED ALAM

    2017-10-01

    Full Text Available Image matting is a technique in which a foreground is separated from the background of a given image along with the pixel wise opacity. This foreground can then be seamlessly composited in a different background to obtain a novel scene. This paper presents a global non-parametric sampling algorithm over image patches and utilizes a dimension reduction technique known as NMF (Non-Negative Matrix Factorization). Although some existing non-parametric approaches use large nearby foreground and background regions to sample patches, they fail to sample patches from the whole image because of the high memory and computational requirements. The use of NMF in the proposed algorithm allows the dimension reduction which reduces the computational cost and memory requirement. The use of NMF also allows the proposed approach to use the whole foreground and background region in the image, reduces the patch complexity, and helps in efficient patch sampling. The use of patches not only allows the incorporation of the pixel colour but also the local image structure. The use of local structures in the image is important to estimate a high-quality alpha matte, especially in images which have regions containing high texture. The proposed algorithm is evaluated on the standard data set and the obtained results are comparable to the state-of-the-art matting techniques.
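
    To make the dimension-reduction step concrete, here is a small sketch (with an assumed patch size, component count, and a random stand-in image) that compresses non-negative image patches with scikit-learn's NMF and matches a query patch by nearest neighbour in the reduced code space; the full matting pipeline with alpha estimation is not reproduced.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF
    from sklearn.feature_extraction.image import extract_patches_2d

    rng = np.random.default_rng(3)
    image = rng.random((64, 64))                      # stand-in for a foreground/background region

    # Sample patches globally (the whole region, not just a nearby band).
    patches = extract_patches_2d(image, (7, 7), max_patches=2000, random_state=0)
    X = patches.reshape(len(patches), -1)             # non-negative patch matrix

    # NMF compresses each 49-dimensional patch to a small code, which makes
    # exhaustive global patch comparison affordable in memory and time.
    nmf = NMF(n_components=8, init="nndsvda", max_iter=400, random_state=0)
    codes = nmf.fit_transform(X)                      # (n_patches, 8)

    # Matching an unknown patch then reduces to nearest neighbour in code space.
    query = nmf.transform(X[:1])
    best = np.argmin(np.linalg.norm(codes - query, axis=1))
    print(best)                                       # typically 0, i.e. the query matches itself
    ```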

  2. A global sampling based image matting using non-negative matrix factorization

    International Nuclear Information System (INIS)

    Alam, N.; Sarim, M.; Shaikh, A.B.

    2017-01-01

    Image matting is a technique in which a foreground is separated from the background of a given image along with the pixel wise opacity. This foreground can then be seamlessly composited in a different background to obtain a novel scene. This paper presents a global non-parametric sampling algorithm over image patches and utilizes a dimension reduction technique known as NMF (Non-Negative Matrix Factorization). Although some existing non-parametric approaches use large nearby foreground and background regions to sample patches, they fail to sample patches from the whole image because of the high memory and computational requirements. The use of NMF in the proposed algorithm allows the dimension reduction which reduces the computational cost and memory requirement. The use of NMF also allows the proposed approach to use the whole foreground and background region in the image, reduces the patch complexity, and helps in efficient patch sampling. The use of patches not only allows the incorporation of the pixel colour but also the local image structure. The use of local structures in the image is important to estimate a high-quality alpha matte, especially in images which have regions containing high texture. The proposed algorithm is evaluated on the standard data set and the obtained results are comparable to the state-of-the-art matting techniques. (author)

  3. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad

    2014-07-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR) which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.
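
    For orientation, the sketch below shows a plain cross-correlation time-delay estimate between two microphone signals at a deliberately low sampling rate. It is a baseline only, since the paper instead reconstructs the sparse room impulse response with orthogonal clustering and reads the direct-path arrival from it; the synthetic clap waveform, sampling rate, and delay are assumptions.

    ```python
    import numpy as np

    def estimate_tde(mic_a, mic_b, fs):
        """Generic cross-correlation time-delay estimate: positive when mic_a
        lags mic_b (a baseline, not the paper's sparse-RIR approach)."""
        corr = np.correlate(mic_a, mic_b, mode="full")
        lag = np.argmax(corr) - (len(mic_b) - 1)
        return lag / fs

    fs = 8000                                  # deliberately low sampling rate [Hz]
    t = np.arange(0, 0.05, 1 / fs)
    impulse = np.exp(-400 * t) * np.sin(2 * np.pi * 900 * t)    # clap-like source
    delay_samples = 6                          # ~0.75 ms, i.e. ~26 cm extra path at 343 m/s
    mic1 = np.concatenate([impulse, np.zeros(64)])
    mic2 = np.concatenate([np.zeros(delay_samples), impulse, np.zeros(64 - delay_samples)])
    print(estimate_tde(mic2, mic1, fs))        # 7.5e-4 s (6 samples at 8 kHz)
    ```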

  4. Sub-sampling-based 2D localization of an impulsive acoustic source in reverberant environments

    KAUST Repository

    Omer, Muhammad; Quadeer, Ahmed A; Sharawi, Mohammad S; Al-Naffouri, Tareq Y.

    2014-01-01

    This paper presents a robust method for two-dimensional (2D) impulsive acoustic source localization in a room environment using low sampling rates. The proposed method finds the time delay from the room impulse response (RIR) which makes it robust against room reverberations. We consider the RIR as a sparse phenomenon and apply a recently proposed sparse signal reconstruction technique called orthogonal clustering (OC) for its estimation from the sub-sampled received signal. The arrival time of the direct path signal at a pair of microphones is identified from the estimated RIR, and their difference yields the desired time delay estimate (TDE). Low sampling rates reduce the hardware and computational complexity and decrease the communication between the microphones and the centralized location. Simulation and experimental results of an actual hardware setup are presented to demonstrate the performance of the proposed technique.

  5. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  6. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.; Chakravorty, S.; Amato, N. M.

    2013-01-01

    In this paper we present feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.
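
    The belief node at the heart of a stationary-LQG FIRM construction is the steady-state Kalman covariance that the node controller drives the belief to; the sketch below computes it for a toy planar kinematic model via the dual discrete algebraic Riccati equation. The system matrices and noise levels are invented for illustration, and the roadmap construction, edge costs, and collision probabilities are not shown.

    ```python
    import numpy as np
    from scipy.linalg import solve_discrete_are

    # Single-integrator robot discretized at dt = 0.1 s: x_{k+1} = A x_k + B u_k + w_k.
    dt = 0.1
    A = np.eye(2)
    B = dt * np.eye(2)
    C = np.eye(2)                       # position measurements
    W = 0.01 * np.eye(2)                # process noise covariance (assumed)
    V = 0.05 * np.eye(2)                # measurement noise covariance (assumed)

    # Stationary Kalman-filter covariance at a node: the belief a stationary LQG
    # controller stabilizes to, independent of how the robot arrived, which is
    # what makes FIRM edge costs independent of each other.
    P_pred = solve_discrete_are(A.T, C.T, W, V)
    K = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + V)
    P_post = (np.eye(2) - K @ C) @ P_pred
    print(np.round(P_post, 4))
    ```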

  7. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy

    2014-09-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.
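
    The simplest instance of a reachable volume, assuming an in-plane chain of revolute joints with unbounded joint angles, is an annulus that can be computed in linear time and sampled in constant time, as sketched below. The spherical and prismatic joints, closed chains, and end-effector constraints handled in the paper are beyond this toy.

    ```python
    import numpy as np

    def planar_chain_reachable_annulus(link_lengths):
        """Reachable volume of a planar serial chain with unlimited revolute
        joints: an annulus with outer radius sum(l) and inner radius
        max(0, 2*max(l) - sum(l)).  O(n) to compute, O(1) to sample from."""
        total = float(np.sum(link_lengths))
        inner = max(0.0, 2.0 * float(np.max(link_lengths)) - total)
        return inner, total

    def sample_end_effector(link_lengths, rng):
        """Draw an end-effector position uniformly from the reachable annulus
        (constraint-satisfying by construction, so no rejection is needed)."""
        r_in, r_out = planar_chain_reachable_annulus(link_lengths)
        r = np.sqrt(rng.uniform(r_in**2, r_out**2))   # area-uniform radius
        theta = rng.uniform(0.0, 2.0 * np.pi)
        return np.array([r * np.cos(theta), r * np.sin(theta)])

    rng = np.random.default_rng(4)
    print(planar_chain_reachable_annulus([3.0, 1.0, 0.5]))   # (1.5, 4.5)
    print(sample_end_effector([3.0, 1.0, 0.5], rng))
    ```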

  8. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.

    2013-11-15

    In this paper we present feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.

  9. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  10. Strategy Instruction in Mathematics.

    Science.gov (United States)

    Goldman, Susan R.

    1989-01-01

    Experiments in strategy instruction for mathematics have been conducted using three models (direct instruction, self-instruction, and guided learning) applied to the tasks of computation and word problem solving. Results have implications for effective strategy instruction for learning disabled students. It is recommended that strategy instruction…

  11. Adaptive local learning in sampling based motion planning for protein folding.

    Science.gov (United States)

    Ekenna, Chinwe; Thomas, Shawna; Amato, Nancy M

    2016-08-01

    Simulating protein folding motions is an important problem in computational biology. Motion planning algorithms, such as Probabilistic Roadmap Methods, have been successful in modeling the folding landscape. Probabilistic Roadmap Methods and variants contain several phases (i.e., sampling, connection, and path extraction). Most of the time is spent in the connection phase and selecting which variant to employ is a difficult task. Global machine learning has been applied to the connection phase but is inefficient in situations with varying topology, such as those typical of folding landscapes. We develop a local learning algorithm that exploits the past performance of methods within the neighborhood of the current connection attempts as a basis for learning. It is sensitive not only to different types of landscapes but also to differing regions in the landscape itself, removing the need to explicitly partition the landscape. We perform experiments on 23 proteins of varying secondary structure makeup with 52-114 residues. We compare the success rate when using our methods and other methods. We demonstrate a clear need for learning (i.e., only learning methods were able to validate against all available experimental data) and show that local learning is superior to global learning producing, in many cases, significantly higher quality results than the other methods. We present an algorithm that uses local learning to select appropriate connection methods in the context of roadmap construction for protein folding. Our method removes the burden of deciding which method to use, leverages the strengths of the individual input methods, and it is extendable to include other future connection methods.
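
    A minimal sketch of the local-learning idea, choosing a connection method from the success rates of the k nearest past attempts rather than from a single global statistic, is given below. The method names, the synthetic attempt history, and the neighbourhood size are illustrative assumptions, not the paper's actual connection primitives.

    ```python
    import numpy as np

    def select_connection_method(past_attempts, query, k=20):
        """Choose the connection method whose success rate is highest among the
        k past connection attempts nearest to the current query configuration."""
        locations = np.array([a["loc"] for a in past_attempts])
        nearest = np.argsort(np.linalg.norm(locations - query, axis=1))[:k]
        rates = {}
        for method in {a["method"] for a in past_attempts}:
            outcomes = [past_attempts[i]["success"] for i in nearest
                        if past_attempts[i]["method"] == method]
            rates[method] = np.mean(outcomes) if outcomes else 0.5   # optimistic prior
        return max(rates, key=rates.get)

    # Synthetic history: method B succeeds more often than method A in this region.
    rng = np.random.default_rng(8)
    attempts = [{"loc": rng.uniform(0.0, 10.0, 2), "method": m, "success": bool(rng.random() < p)}
                for m, p in [("method-A", 0.3), ("method-B", 0.7)] for _ in range(100)]
    print(select_connection_method(attempts, query=np.array([5.0, 5.0])))
    ```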

  12. Examining Motivational Orientation and Learning Strategies in Computer-Supported Self-Directed Learning (CS-SDL) for Mathematics: The Perspective of Intrinsic and Extrinsic Goals

    Science.gov (United States)

    Lao, Andrew Chan-Chio; Cheng, Hercy N. H.; Huang, Mark C. L.; Ku, Oskar; Chan, Tak-Wai

    2017-01-01

    One-to-one technology, which allows every student to receive equal access to learning tasks through a personal computing device, has shown increasing potential for self-directed learning in elementary schools. With computer-supported self-directed learning (CS-SDL), students may set their own learning goals through the suggestions of the system…

  13. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-01-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the…

  14. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  15. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  16. A review of single-sample-based models and other approaches for radiocarbon dating of dissolved inorganic carbon in groundwater

    Science.gov (United States)

    Han, L. F; Plummer, Niel

    2016-01-01

    Numerous methods have been proposed to estimate the pre-nuclear-detonation 14C content of dissolved inorganic carbon (DIC) recharged to groundwater that has been corrected/adjusted for geochemical processes in the absence of radioactive decay (14C0) - a quantity that is essential for estimation of radiocarbon age of DIC in groundwater. The models/approaches most commonly used are grouped as follows: (1) single-sample-based models, (2) a statistical approach based on the observed (curved) relationship between 14C and δ13C data for the aquifer, and (3) the geochemical mass-balance approach that constructs adjustment models accounting for all the geochemical reactions known to occur along a groundwater flow path. This review discusses first the geochemical processes behind each of the single-sample-based models, followed by discussions of the statistical approach and the geochemical mass-balance approach. Finally, the applications, advantages and limitations of the three groups of models/approaches are discussed. The single-sample-based models constitute the prevailing use of 14C data in hydrogeology and hydrological studies. This is in part because the models are applied to an individual water sample to estimate the 14C age, therefore the measurement data are easily available. These models have been shown to provide realistic radiocarbon ages in many studies. However, they usually are limited to simple carbonate aquifers and selection of model may have significant effects on 14C0, often resulting in a wide range of estimates of 14C ages. Of the single-sample-based models, four are recommended for the estimation of 14C0 of DIC in groundwater: Pearson's model (Ingerson and Pearson, 1964; Pearson and White, 1967), Han & Plummer's model (Han and Plummer, 2013), the IAEA model (Gonfiantini, 1972; Salem et al., 1980), and Oeschger's model (Geyh, 2000). These four models include all processes considered in single-sample-based models, and can be used in different ranges of

  17. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  18. Developing a strategy for computational lab skills training through Software and Data Carpentry: Experiences from the ELIXIR Pilot action [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Aleksandra Pawlik

    2017-07-01

    Full Text Available Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community.

  19. Effectiveness of ESL Students' Performance by Computational Assessment and Role of Reading Strategies in Courseware-Implemented Business Translation Tasks

    Science.gov (United States)

    Tsai, Shu-Chiao

    2017-01-01

    This study reports on investigating students' English translation performance and their use of reading strategies in an elective English writing course offered to senior students of English as a Foreign Language for 100 minutes per week for 12 weeks. A courseware-implemented instruction combined with a task-based learning approach was adopted.…

  20. Five-Year-Olds' Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection : A Computational Modeling Study

    NARCIS (Netherlands)

    Arslan, Burcu; Taatgen, Niels A; Verbrugge, Rineke

    2017-01-01

    The focus of studies on second-order false belief reasoning generally was on investigating the roles of executive functions and language with correlational studies. Different from those studies, we focus on the question how 5-year-olds select and revise reasoning strategies in second-order false

  1. Cardiac computed tomography guided treatment strategy in patients with recent acute-onset chest pain: Results from the randomised, controlled trial

    DEFF Research Database (Denmark)

    Linde, Jesper James; Kofoed, Klaus Fuglsang; Sørgaard, Mathias

    2013-01-01

    In patients admitted on suspicion of acute coronary syndrome, with normal electrocardiogram and troponins, we evaluated the clinical impact of a coronary CT angiography (CCTA) strategy on referral rate for invasive coronary angiography (ICA), detection of significant coronary stenoses (positive...

  2. Motivational Qualities of Instructional Strategies and Computer Use for Mathematics Teaching in Japan and the United States: Results from the TIMSS 1999 Assessment.

    Science.gov (United States)

    House, J. Daniel

    2005-01-01

    Recent mathematics assessments have indicated that students in several Asian countries have tended to score above international averages. Research findings indicate that there are cultural differences in expectations for student achievement in mathematics and in classroom practices and instructional strategies. The importance of the motivational…

  3. Relationships among Individual Task Self-Efficacy, Self-Regulated Learning Strategy Use and Academic Performance in a Computer-Supported Collaborative Learning Environment

    Science.gov (United States)

    Wilson, Kimberly; Narayan, Anupama

    2016-01-01

    This study investigates relationships between self-efficacy, self-regulated learning strategy use and academic performance. Participants were 96 undergraduate students working on projects with three subtasks (idea generation task, methodical task and data collection) in a blended learning environment. Task self-efficacy was measured with…

  4. Contemporary evolution strategies

    CERN Document Server

    Bäck, Thomas; Krause, Peter

    2013-01-01

    Evolution strategies have more than 50 years of history in the field of evolutionary computation. Since the early 1990s, many algorithmic variations of evolution strategies have been developed, characterized by the fact that they use the so-called derandomization concept for strategy parameter adaptation. Most importantly, the covariance matrix adaptation strategy (CMA-ES) and its successors are the key representatives of this group of contemporary evolution strategies. This book provides an overview of the key algorithm developments between 1990 and 2012, including brief descriptions of the a

  5. Learners' Use of Communication Strategies in Text-Based and Video-Based Synchronous Computer-Mediated Communication Environments: Opportunities for Language Learning

    Science.gov (United States)

    Hung, Yu-Wan; Higgins, Steve

    2016-01-01

    This study investigates the different learning opportunities enabled by text-based and video-based synchronous computer-mediated communication (SCMC) from an interactionist perspective. Six Chinese-speaking learners of English and six English-speaking learners of Chinese were paired up as tandem (reciprocal) learning dyads. Each dyad participated…

  6. Assessing the economic impact of North China’s water scarcity mitigation strategy: a multi-region, water-extended computable general equilibrium analysis

    NARCIS (Netherlands)

    Qin, Changbo; Qin, C.; Su, Zhongbo; Bressers, Johannes T.A.; Jia, Y.; Wang, H.

    2013-01-01

    This paper describes a multi-region computable general equilibrium model for analyzing the effectiveness of measures and policies for mitigating North China’s water scarcity with respect to three different groups of scenarios. The findings suggest that a reduction in groundwater use would negatively

  7. Making Friends in Dark Shadows: An Examination of the Use of Social Computing Strategy Within the United States Intelligence Community Since 9/11

    Directory of Open Access Journals (Sweden)

    Andrew Chomik

    2011-01-01

    Full Text Available The tragic events of 9/11/2001 in the United States highlighted failures in communication and cooperation in the U.S. intelligence community. Agencies within the community failed to “connect the dots” by not collaborating in intelligence gathering efforts, which resulted in severe gaps in data sharing that eventually contributed to the terrorist attack on American soil. Since then, and under the recommendation made by the 9/11 Commission Report, the United States intelligence community has made organizational and operational changes to intelligence gathering and sharing, primarily with the creation of the Office of the Director of National Intelligence (ODNI). The ODNI has since introduced a series of web-based social computing tools to be used by all members of the intelligence community, primarily with its closed-access wiki entitled “Intellipedia” and their social networking service called “A-Space”. This paper argues that, while these and other social computing tools have been adopted successfully into the intelligence workplace, they have reached a plateau in their use and serve only as complementary tools to otherwise pre-existing information sharing processes. Agencies continue to ‘stove-pipe’ their respective data, a chronic challenge that plagues the community due to bureaucratic policy, technology use and workplace culture. This paper identifies and analyzes these challenges, and recommends improvements in the use of these tools, both in the business processes behind them and the technology itself. These recommendations aim to provide possible solutions for using these social computing tools as part of a more trusted, collaborative information sharing process.

  8. A multi-sample based method for identifying common CNVs in normal human genomic structure using high-resolution aCGH data.

    Directory of Open Access Journals (Sweden)

    Chihyun Park

    Full Text Available BACKGROUND: It is difficult to identify copy number variations (CNVs) in normal human genomic data due to noise and non-linear relationships between different genomic regions and signal intensity. A high-resolution array comparative genomic hybridization (aCGH) containing 42 million probes, which is very large compared to previous arrays, was recently published. Most existing CNV detection algorithms do not work well because of noise associated with the large amount of input data and because most of the current methods were not designed to analyze normal human samples. Normal human genome analysis often requires a joint approach across multiple samples. However, the majority of existing methods can only identify CNVs from a single sample. METHODOLOGY AND PRINCIPAL FINDINGS: We developed a multi-sample-based genomic variations detector (MGVD) that uses segmentation to identify common breakpoints across multiple samples and a k-means-based clustering strategy. Unlike previous methods, MGVD simultaneously considers multiple samples with different genomic intensities and identifies CNVs and CNV zones (CNVZs); a CNVZ is a more precise measure of the location of a genomic variant than the CNV region (CNVR). CONCLUSIONS AND SIGNIFICANCE: We designed a specialized algorithm to detect common CNVs from extremely high-resolution multi-sample aCGH data. MGVD showed high sensitivity and a low false discovery rate for a simulated data set, and outperformed most current methods when real, high-resolution HapMap datasets were analyzed. MGVD also had the fastest runtime compared to the other algorithms evaluated when actual, high-resolution aCGH data were analyzed. The CNVZs identified by MGVD can be used in association studies for revealing relationships between phenotypes and genomic aberrations. Our algorithm was developed with standard C++ and is available in Linux and MS Windows format in the STL library. It is freely available at: http://embio.yonsei.ac.kr/~Park/mgvd.php.

  9. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Shoji Kawahito

    2016-11-01

    Full Text Available This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction with respect to the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms).

  10. Colorimetric assay for on-the-spot alcoholic strength sensing in spirit samples based on dual-responsive lanthanide coordination polymer particles with ratiometric fluorescence

    International Nuclear Information System (INIS)

    Deng, Jingjing; Shi, Guoyue; Zhou, Tianshu

    2016-01-01

    This study demonstrates a new strategy for colorimetric detection of alcoholic strength (AS) in spirit samples based on dual-responsive lanthanide infinite coordination polymer (Ln-ICP) particles with ratiometric fluorescence. The ICP particles used in this study are composed of two components: one is the supramolecular Ln-ICP network formed by the coordination between the ligand 2,2’-thiodiacetic acid (TDA) and the central metal ion Eu3+; and the other is a fluorescent dye, i.e., coumarin 343 (C343), both as the cofactor ligand and as the sensitizer, doped into the Ln-ICP network through self-adaptive chemistry. Upon being excited at 300 nm, the red fluorescence of the Ln-ICP network itself at 617 nm is highly enhanced due to the concomitant energy transfer from C343 to Eu3+, while the fluorescence of C343 at 495 nm is suppressed. In pure ethanol solvent, the as-formed C343@Eu-TDA is well dispersed and quite stable. However, the addition of water into the ethanolic dispersion of C343@Eu-TDA disrupts the Eu-TDA network structure, resulting in the release of C343 from the ICP network into the solvent. Consequently, the fluorescence of Eu-TDA turns off and the fluorescence of C343 turns on, leading to the fluorescent color change of the dispersion from red to blue, which constitutes a new mechanism for colorimetric sensing of AS in commercial spirit samples. With the method developed here, we could clearly distinguish the AS of different spirit samples within a wide linear range from 10% vol to 100% vol directly by “naked eye” with the help of a UV-lamp (365 nm). This study not only offers a new method for on-the-spot visible detection of AS, but also provides a strategy for a dual-responsive sensing mode by rationally designing the optical properties of the Ln-ICP network and the guest, respectively. - Highlights: • Dual responsive lanthanide coordination polymer particles C343@Eu-TDA were synthesized. • The guest molecule coumarin 343 sensitized the luminescence of the Eu-TDA network

  11. Colorimetric assay for on-the-spot alcoholic strength sensing in spirit samples based on dual-responsive lanthanide coordination polymer particles with ratiometric fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Jingjing, E-mail: jjdeng@des.ecnu.edu.cn [School of Ecological and Environmental Sciences, East China Normal University, 500 Dongchuan Road, Shanghai 200241 (China); Shi, Guoyue [Department of Chemistry, East China Normal University, 500 Dongchuan Road, Shanghai 200241 (China); Zhou, Tianshu, E-mail: tszhou@des.ecnu.edu.cn [School of Ecological and Environmental Sciences, East China Normal University, 500 Dongchuan Road, Shanghai 200241 (China)

    2016-10-26

    This study demonstrates a new strategy for colorimetric detection of alcoholic strength (AS) in spirit samples based on dual-responsive lanthanide infinite coordination polymer (Ln-ICP) particles with ratiometric fluorescence. The ICP particles used in this study are composed of two components: one is the supramolecular Ln-ICP network formed by the coordination between the ligand 2,2’-thiodiacetic acid (TDA) and the central metal ion Eu3+; and the other is a fluorescent dye, i.e., coumarin 343 (C343), both as the cofactor ligand and as the sensitizer, doped into the Ln-ICP network through self-adaptive chemistry. Upon being excited at 300 nm, the red fluorescence of the Ln-ICP network itself at 617 nm is highly enhanced due to the concomitant energy transfer from C343 to Eu3+, while the fluorescence of C343 at 495 nm is suppressed. In pure ethanol solvent, the as-formed C343@Eu-TDA is well dispersed and quite stable. However, the addition of water into the ethanolic dispersion of C343@Eu-TDA disrupts the Eu-TDA network structure, resulting in the release of C343 from the ICP network into the solvent. Consequently, the fluorescence of Eu-TDA turns off and the fluorescence of C343 turns on, leading to the fluorescent color change of the dispersion from red to blue, which constitutes a new mechanism for colorimetric sensing of AS in commercial spirit samples. With the method developed here, we could clearly distinguish the AS of different spirit samples within a wide linear range from 10% vol to 100% vol directly by “naked eye” with the help of a UV-lamp (365 nm). This study not only offers a new method for on-the-spot visible detection of AS, but also provides a strategy for a dual-responsive sensing mode by rationally designing the optical properties of the Ln-ICP network and the guest, respectively. - Highlights: • Dual responsive lanthanide coordination polymer particles C343@Eu-TDA were synthesized. • The guest molecule coumarin 343 sensitized the luminescence of Eu

  12. Computing in Research.

    Science.gov (United States)

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  13. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that then can be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While no bias is desirable, only those biases that affect feedbacks affect scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, which is a set of python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico where the PI (Huerta) and the Postdocs (Nosedal, Hattab and Karki) worked on the project.

  14. Dietetics students' ability to choose appropriate communication and counseling methods is improved by teaching behavior-change strategies in computer-assisted instruction.

    Science.gov (United States)

    Puri, Ruchi; Bell, Carol; Evers, William D

    2010-06-01

    Several models and theories have been proposed to help registered dietitians (RDs) counsel and communicate nutrition information to patients. However, there is little time for students or interns to observe and/or participate in counseling sessions. Computer-assisted instruction (CAI) can be used to give students more opportunity to observe the various methods and theories of counseling. This study used CAI simulations of RD-client communications to examine whether students who worked through the CAI modules would choose more appropriate counseling methods. Modules were created based on information from experienced RDs. They contained videos of RD-patient interactions and demonstrated helpful and less helpful methods of communication. Students in didactic programs in dietetics accessed the modules via the Internet. The intervention group of students received a pretest module, two tutorial modules, and a posttest module. The control group only received the pretest and posttest modules. Data were collected during three semesters in 2006 and 2007. Two sample t tests were used to compare pretest and posttest scores. The influence of other factors was measured using factorial analysis of variance. Statistical significance was set at P<.05. Working through the CAI modules improved dietetics students' ability to choose appropriate communication and counseling methods. 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  15. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    Science.gov (United States)

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.
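
    An in-memory sketch of the core search, peptide fragments shared by every target strain and absent from every background strain, is shown below with hypothetical toy proteomes; the published pipeline performs this kind of set operation at scale over distributed databases and cloud resources.

    ```python
    from itertools import chain

    def kmers(seq, k):
        """All length-k peptide fragments of one protein sequence."""
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def candidate_targets(target_proteomes, background_proteomes, k=8):
        """Peptide fragments of length k present in every target strain and in no
        background strain: candidates for strain-group-specific aptamer targets."""
        per_target = [set(chain.from_iterable(kmers(p, k) for p in proteome))
                      for proteome in target_proteomes]
        shared = set.intersection(*per_target)
        background = set(chain.from_iterable(
            kmers(p, k) for proteome in background_proteomes for p in proteome))
        return shared - background

    # Hypothetical toy proteomes (lists of protein sequences per strain).
    mrsa_like = [["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"],
                 ["MKTAYIAKQRQISFVKSHFARQLEERLGLIEVQ"]]
    background = [["MKTAYIAKQAAISFVKSHFSRQLEE"]]
    print(sorted(candidate_targets(mrsa_like, background, k=8))[:5])
    ```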

  16. A distributed computational search strategy for the identification of diagnostics targets: Application to finding aptamer targets for methicillin-resistant staphylococci

    Directory of Open Access Journals (Sweden)

    Flanagan Keith

    2014-06-01

    Full Text Available The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.

  17. Two Topics in Data Analysis: Sample-based Optimal Transport and Analysis of Turbulent Spectra from Ship Track Data

    Science.gov (United States)

    Kuang, Simeng Max

    This thesis contains two topics in data analysis. The first topic consists of the introduction of algorithms for sample-based optimal transport and barycenter problems. In chapter 1, a family of algorithms is introduced to solve both the L2 optimal transport problem and the Wasserstein barycenter problem. Starting from a theoretical perspective, the new algorithms are motivated from a key characterization of the barycenter measure, which suggests an update that reduces the total transportation cost and stops only when the barycenter is reached. A series of general theorems is given to prove the convergence of all the algorithms. We then extend the algorithms to solve sample-based optimal transport and barycenter problems, in which only finite sample sets are available instead of underlying probability distributions. A unique feature of the new approach is that it compares sample sets in terms of the expected values of a set of feature functions, which at the same time induce the function space of optimal maps and can be chosen by users to incorporate their prior knowledge of the data. All the algorithms are implemented and applied to various synthetic examples and practical applications. On synthetic examples it is found that both the SOT algorithm and the SCB algorithm are able to find the true solution and often converge in a handful of iterations. On more challenging applications including Gaussian mixture models, color transfer and shape transform problems, the algorithms give very good results throughout despite the very different nature of the corresponding datasets. In chapter 2, a preconditioning procedure is developed for the L2 and more general optimal transport problems. The procedure is based on a family of affine map pairs, which transforms the original measures into two new measures that are closer to each other, while preserving the optimality of solutions. It is proved that the preconditioning procedure minimizes the remaining transportation cost
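
    For context, a standard entropy-regularized Sinkhorn iteration for sample-based optimal transport between two point clouds is sketched below. It is not the feature-function algorithms developed in the thesis, and the regularization strength, sample sizes, and Gaussian test clouds are assumptions made for the example.

    ```python
    import numpy as np

    def sinkhorn_transport_cost(x, y, eps=0.1, n_iter=300):
        """Entropy-regularized optimal transport cost between two sample sets with
        uniform weights, computed by Sinkhorn's fixed-point iteration."""
        a = np.full(len(x), 1.0 / len(x))
        b = np.full(len(y), 1.0 / len(y))
        C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)   # squared-L2 cost
        K = np.exp(-C / eps)
        u = np.ones_like(a)
        for _ in range(n_iter):
            v = b / (K.T @ u)
            u = a / (K @ v)
        plan = u[:, None] * K * v[None, :]
        return float((plan * C).sum())

    rng = np.random.default_rng(5)
    x = rng.normal([0.0, 0.0], 1.0, size=(300, 2))
    y = rng.normal([2.0, 0.0], 1.0, size=(300, 2))    # same shape, shifted by (2, 0)
    print(sinkhorn_transport_cost(x, y))              # roughly the squared shift, ~4
    ```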

  18. Parameterizing Spatial Models of Infectious Disease Transmission that Incorporate Infection Time Uncertainty Using Sampling-Based Likelihood Approximations.

    Directory of Open Access Journals (Sweden)

    Rajat Malik

    Full Text Available A class of discrete-time models of infectious disease spread, referred to as individual-level models (ILMs), are typically fitted in a Bayesian Markov chain Monte Carlo (MCMC) framework. These models quantify probabilistic outcomes regarding the risk of infection of susceptible individuals due to various susceptibility and transmissibility factors, including their spatial distance from infectious individuals. The infectious pressure from infected individuals exerted on susceptible individuals is intrinsic to these ILMs. Unfortunately, quantifying this infectious pressure for data sets containing many individuals can be computationally burdensome, leading to a time-consuming likelihood calculation and, thus, computationally prohibitive MCMC-based analysis. This problem worsens when using data augmentation to allow for uncertainty in infection times. In this paper, we develop sampling methods that can be used to calculate a fast, approximate likelihood when fitting such disease models. A simple random sampling approach is initially considered, followed by various spatially stratified schemes. We test and compare the performance of our methods with both simulated data and data from the 2001 foot-and-mouth disease (FMD) epidemic in the U.K. Our results indicate that substantial computation savings can be obtained, albeit with some information loss, suggesting that such techniques may be of use in the analysis of very large epidemic data sets.
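
    The sketch below contrasts an exact ILM-style infectious pressure on each susceptible individual with a simple-random-sample approximation rescaled by N/m, which is the flavour of likelihood speed-up studied. The power-law spatial kernel, parameter values, and synthetic coordinates are assumptions, and the spatially stratified schemes and infection-time augmentation of the paper are omitted.

    ```python
    import numpy as np

    def infectious_pressure_exact(susceptible_xy, infected_xy, alpha=1.0, beta=2.0):
        """Exact ILM-style infectious pressure: sum over all infectives of a
        spatial kernel d(i,j)^(-beta), scaled by alpha."""
        d = np.linalg.norm(susceptible_xy[:, None, :] - infected_xy[None, :, :], axis=-1)
        return alpha * np.sum(d ** (-beta), axis=1)

    def infectious_pressure_sampled(susceptible_xy, infected_xy, m, rng, alpha=1.0, beta=2.0):
        """Approximate the same sum from a simple random sample of m infectives,
        rescaled by N/m, to cut the cost of each likelihood evaluation."""
        n_inf = len(infected_xy)
        idx = rng.choice(n_inf, size=m, replace=False)
        d = np.linalg.norm(susceptible_xy[:, None, :] - infected_xy[idx][None, :, :], axis=-1)
        return alpha * (n_inf / m) * np.sum(d ** (-beta), axis=1)

    rng = np.random.default_rng(6)
    sus = rng.uniform(0, 10, size=(5, 2))
    inf = rng.uniform(0, 10, size=(2000, 2))
    print(infectious_pressure_exact(sus, inf)[:3])
    print(infectious_pressure_sampled(sus, inf, m=200, rng=rng)[:3])
    ```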

  19. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem up-to-date. This paper investigates the effectiveness of three computational intelligence techniques, namely, covariance matrix adaptation evolution strategies, particle swarm optimization, as well as, differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibria of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
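
    As a concrete instance of computing a Nash equilibrium as the global minimum of a nonnegative function, the sketch below minimizes the summed squared positive regrets of a 2x2 Matching Pennies game with SciPy's differential evolution. The game, the two-parameter strategy encoding, and the solver settings are choices made for this example rather than the paper's exact setup.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Matching Pennies: unique mixed Nash equilibrium at (0.5, 0.5) for both players.
    A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # row player's payoffs
    B = -A                                      # column player's payoffs (zero-sum)

    def nash_error(z):
        """Nonnegative function whose global minima (value 0) are Nash equilibria:
        the summed squared positive regrets of all pure-strategy deviations."""
        x = np.array([z[0], 1.0 - z[0]])        # row player's mixed strategy
        y = np.array([z[1], 1.0 - z[1]])        # column player's mixed strategy
        regret_row = np.maximum(A @ y - x @ A @ y, 0.0)
        regret_col = np.maximum(x @ B - x @ B @ y, 0.0)
        return float(np.sum(regret_row**2) + np.sum(regret_col**2))

    result = differential_evolution(nash_error, bounds=[(0.0, 1.0), (0.0, 1.0)], seed=0)
    print(result.x, result.fun)                 # approximately [0.5, 0.5] and 0
    ```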

  20. MZDASoft: a software architecture that enables large-scale comparison of protein expression levels over multiple samples based on liquid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Ghanat Bari, Mehrab; Ramirez, Nelson; Wang, Zhiwei; Zhang, Jianqiu Michelle

    2015-10-15

    Without accurate peak linking/alignment, only the expression levels of a small percentage of proteins can be compared across multiple samples in Liquid Chromatography/Mass Spectrometry/Tandem Mass Spectrometry (LC/MS/MS) due to the selective nature of tandem MS peptide identification. This greatly hampers biomedical research that aims at finding biomarkers for disease diagnosis, treatment, and the understanding of disease mechanisms. A recent algorithm, PeakLink, has allowed the accurate linking of LC/MS peaks without tandem MS identifications to their corresponding ones with identifications across multiple samples collected from different instruments, tissues and labs, which greatly enhanced the ability of comparing proteins. However, PeakLink cannot be implemented practically for large numbers of samples based on existing software architectures, because it requires access to peak elution profiles from multiple LC/MS/MS samples simultaneously. We propose a new architecture based on parallel processing, which extracts LC/MS peak features, and saves them in database files to enable the implementation of PeakLink for multiple samples. The software has been deployed in High-Performance Computing (HPC) environments. The core part of the software, MZDASoft Parallel Peak Extractor (PPE), can be downloaded with a user and developer's guide, and it can be run on HPC centers directly. The quantification applications, MZDASoft TandemQuant and MZDASoft PeakLink, are written in Matlab, which are compiled with a Matlab runtime compiler. A sample script that incorporates all necessary processing steps of MZDASoft for LC/MS/MS quantification in a parallel processing environment is available. The project webpage is http://compgenomics.utsa.edu/zgroup/MZDASoft. The proposed architecture enables the implementation of PeakLink for multiple samples. Significantly more (100%-500%) proteins can be compared over multiple samples with better quantification accuracy in test cases. MZDASoft
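
    The architectural point (extract features per sample in parallel and persist them to per-sample database files so that later cross-sample peak linking never needs all raw runs in memory at once) can be sketched with only the standard library, as below. The file names, the feature tuples, and the extraction stub are hypothetical placeholders for the real mzML parsing and peak detection.

    ```python
    import sqlite3
    from multiprocessing import Pool

    def extract_features(sample_path):
        """Hypothetical per-sample feature extraction; in the real pipeline this
        would parse an LC/MS/MS run and detect peak elution profiles."""
        # Placeholder features: (m/z, retention time, intensity) triples.
        return [(500.25, 12.3, 1.0e6), (612.40, 15.8, 4.2e5)]

    def process_sample(sample_path):
        """Extract features for one sample and persist them to its own database."""
        features = extract_features(sample_path)
        db = sample_path + ".features.sqlite"
        with sqlite3.connect(db) as con:
            con.execute("CREATE TABLE IF NOT EXISTS peaks (mz REAL, rt REAL, intensity REAL)")
            con.executemany("INSERT INTO peaks VALUES (?, ?, ?)", features)
        return db

    if __name__ == "__main__":
        samples = ["run_A.mzML", "run_B.mzML", "run_C.mzML"]   # hypothetical file names
        with Pool(processes=3) as pool:
            print(pool.map(process_sample, samples))
    ```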

  1. Global Strategy

    DEFF Research Database (Denmark)

    Li, Peter Ping

    2013-01-01

    Global strategy differs from domestic strategy in terms of content and process as well as context and structure. The content of global strategy can contain five key elements, while the process of global strategy can have six major stages. These are expounded below. Global strategy is influenced by rich and complementary local contexts with diverse resource pools and game rules at the national level to form a broad ecosystem at the global level. Further, global strategy dictates the interaction or balance between different entry strategies at the levels of internal and external networks.

  2. A time and imaging cost analysis of low-risk ED observation patients: a conservative 64-section computed tomography coronary angiography "triple rule-out" compared to nuclear stress test strategy.

    Science.gov (United States)

    Takakuwa, Kevin M; Halpern, Ethan J; Shofer, Frances S

    2011-02-01

    The study aimed to examine time and imaging costs of 2 different imaging strategies for low-risk emergency department (ED) observation patients with acute chest pain or symptoms suggestive of acute coronary syndrome. We compared a "triple rule-out" (TRO) 64-section multidetector computed tomography protocol with nuclear stress testing. This was a prospective observational cohort study of consecutive ED patients who were enrolled in our chest pain observation protocol during a 16-month period. Our standard observation protocol included a minimum of 2 sets of cardiac enzymes at least 6 hours apart followed by a nuclear stress test. Once a week, observation patients were offered a TRO (to evaluate for coronary artery disease, thoracic dissection, and pulmonary embolus) multidetector computed tomography with the option of further stress testing for those patients found to have evidence of coronary artery disease. We analyzed 832 consecutive observation patients including 214 patients who underwent the TRO protocol. Mean total length of stay was 16.1 hours for TRO patients, 16.3 hours for TRO plus other imaging test, 22.6 hours for nuclear stress testing, 23.3 hours for nuclear stress testing plus other imaging tests, and 23.7 hours for nuclear stress testing plus TRO (P < .0001 for TRO and TRO + other test compared to stress test ± other test). Mean imaging times were 3.6, 4.4, 5.9, 7.5, and 6.6 hours, respectively (P < .05 for TRO and TRO + other test compared to stress test ± other test). Mean imaging costs were $1307 for TRO patients vs $945 for nuclear stress testing. Triple rule-out reduced total length of stay and imaging time but incurred higher imaging costs. A per-hospital analysis would be needed to determine if patient time savings justify the higher imaging costs. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  4. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp......Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional...
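
    The abstracts above are truncated, but the pedigree-determined covariance structure they refer to is, in the standard animal model, the additive (numerator) relationship matrix. Below is a minimal sketch (not the authors' code) of building that matrix with the classical tabular method for a small hypothetical pedigree:

        # A sketch: additive (numerator) relationship matrix A via the tabular
        # method for a small hypothetical pedigree; not the authors' code.
        import numpy as np

        def relationship_matrix(sires, dams):
            # Animals are numbered 1..n with parents listed before offspring;
            # parent id 0 means unknown. A[i, j] is the additive relationship.
            n = len(sires)
            A = np.zeros((n, n))
            for i in range(n):
                s, d = sires[i], dams[i]
                A[i, i] = 1.0 + (0.5 * A[s - 1, d - 1] if s and d else 0.0)
                for j in range(i):
                    contrib = 0.0
                    if s:
                        contrib += 0.5 * A[j, s - 1]
                    if d:
                        contrib += 0.5 * A[j, d - 1]
                    A[i, j] = A[j, i] = contrib
            return A

        # animals 1-3 are founders; animal 4 = (1 x 2); animal 5 = (3 x 4)
        print(relationship_matrix(sires=[0, 0, 0, 1, 3], dams=[0, 0, 0, 2, 4]))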

  5. Marine Corps Private Cloud Computing Environment Strategy

    Science.gov (United States)

    2012-05-15

    leveraging economies of scale through the MCEITS PCCE, the Marine Corps will measure consumed IT resources more effectively, increase or decrease... flexible broad network access, resource pooling, elastic provisioning and measured services. By leveraging economies of scale the Marine Corps will be able...

  6. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  7. Sample-based engine noise synthesis using an enhanced pitch-synchronous overlap-and-add method.

    Science.gov (United States)

    Jagla, Jan; Maillard, Julien; Martin, Nadine

    2012-11-01

    An algorithm for the real-time synthesis of internal combustion engine noise is presented. Through the analysis of a recorded engine noise signal of continuously varying engine speed, a dataset of sound samples is extracted, allowing the real-time synthesis of the noise induced by arbitrary evolutions of engine speed. The sound samples are extracted from a recording spanning the entire engine speed range. Each sample is delimited so as to contain the sound emitted during one cycle of the engine plus the overlap necessary to ensure smooth transitions during the synthesis. The proposed approach, an extension of the PSOLA method introduced for speech processing, takes advantage of the specific periodicity of engine noise signals to locate the extraction instants of the sound samples. During the synthesis stage, the sound samples corresponding to the target engine speed evolution are concatenated with an overlap-and-add algorithm. It is shown that this method produces high-quality audio reproduction with a low computational load. It is therefore well suited for real-time applications.
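
    A minimal overlap-and-add sketch of the synthesis stage described above (a plain linear cross-fade, not the paper's enhanced pitch-synchronous variant); the one-cycle samples are synthetic stand-ins:

        # A sketch of plain overlap-and-add concatenation with a linear cross-fade;
        # the one-cycle samples are synthetic stand-ins, and this is not the
        # paper's enhanced pitch-synchronous method.
        import numpy as np

        def ola_concatenate(samples, overlap):
            # Concatenate 1-D sound samples, cross-fading `overlap` points between them.
            fade_in = np.linspace(0.0, 1.0, overlap)
            fade_out = 1.0 - fade_in
            out = samples[0].astype(float).copy()
            for s in samples[1:]:
                s = s.astype(float)
                out[-overlap:] = out[-overlap:] * fade_out + s[:overlap] * fade_in
                out = np.concatenate([out, s[overlap:]])
            return out

        rng = np.random.default_rng(0)
        cycles = [rng.standard_normal(480 + 64) for _ in range(5)]  # one cycle + overlap tail each
        signal = ola_concatenate(cycles, overlap=64)
        print(signal.shape)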

  8. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in Cloud Computing is becoming a challenge for next generation Data Centers. This project will focus on investigating new security strategies for Cloud Computing systems. Cloud Computing is a recent paradigm to deliver services over the Internet. Businesses grow drastically because of it. Researchers focus their work on it. The rapid access to flexible and low-cost IT resources in an on-demand fashion allows users to avoid planning ahead for provisioning, and enterprises to save money ...

  9. Fiscal 1999 research report on long-term energy technology strategy. Basic research on industrial technology strategy (Individual technology strategy). Electronic information technology field (Human process ware field for computers); 1999 nendo choki energy gijutsu senryaku nado ni kansuru chosa hokokusho. Sangyo gijutsu senryaku sakutei kiban chosa (bun'yabetsu gijutsu senryaku) denshi joho gijutsu bun'ya (computer kanren bun'ya no uchi, human process ware bun'ya)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This report summarizes the fiscal 1999 basic research results on industrial technology strategy for the human process ware (HPW) field of electronic information. Although the HPW industry is not yet mature, the various technologies concerned are accumulating, and HPW is expected to become a major new industry for home and personal information equipment. The incomplete market, poor cooperation between industry and government, the ambiguous role of universities, and poor standardization were pointed out as factors hindering technological innovation in Japan. Technology trends were analyzed to extract seed technology fields from three viewpoints: mobile HPW available at any time, at any place and to anybody; HPW systems that can be customized for each personal use; and HPW that supports the development of creativity. As a general strategy, a technology seed chart and a map of technical development targets for each HPW were prepared in order to plot a route to practical use and to develop the element technologies required for it, based on the future trend of the market. (NEDO)

  11. A novel multi-scale adaptive sampling-based approach for energy saving in leak detection for WSN-based water pipelines

    Science.gov (United States)

    Saqib, Najam us; Faizan Mysorewala, Muhammad; Cheded, Lahouari

    2017-12-01

    In this paper, we propose a novel monitoring strategy for a wireless sensor network (WSN)-based water pipeline network. Our strategy uses a multi-pronged approach to reduce energy consumption based on the use of two types of vibration sensors and pressure sensors, all having different energy levels, and a hierarchical adaptive sampling mechanism to determine the sampling frequency. The sampling rate of the sensors is adjusted according to the bandwidth of the vibration signal being monitored by using a wavelet-based adaptive thresholding scheme that calculates the new sampling frequency for the following cycle. In this multimodal sensing scheme, the duty-cycling approach is used for all sensors to reduce the sampling instances, such that the high-energy, high-precision (HE-HP) vibration sensors have low duty cycles, and the low-energy, low-precision (LE-LP) vibration sensors have high duty cycles. The low duty-cycling (HE-HP) vibration sensor adjusts the sampling frequency of the high duty-cycling (LE-LP) vibration sensor. The simulated test bed considered here consists of a water pipeline network which uses pressure and vibration sensors, with the latter having different energy consumptions and precision levels, at various locations in the network. This is all the more useful for energy conservation during extended monitoring. It is shown that by using the novel features of our proposed scheme, a significant reduction in energy consumption is achieved and the leak is effectively detected by the sensor node that is closest to it. Finally, both the total energy consumed by monitoring and the time taken by a WSN node to detect the leak are computed, and they show the superiority of our proposed hierarchical adaptive sampling algorithm over a non-adaptive sampling approach.
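
    A simplified sketch of the adaptive-sampling idea above: estimate the occupied bandwidth of the last monitoring window and choose the next duty cycle's sampling frequency accordingly. The paper uses a wavelet-based thresholding scheme; an FFT-based bandwidth estimate is used here as a stand-in, and all parameter values are hypothetical.

        # A simplified stand-in for the adaptive sampling step: estimate the occupied
        # bandwidth of the last window with an FFT (the paper uses wavelet-based
        # thresholding) and set the next duty cycle's sampling rate; all parameter
        # values here are hypothetical.
        import numpy as np

        def next_sampling_rate(window, fs, energy_fraction=0.99,
                               fs_min=50.0, fs_max=2000.0, margin=2.5):
            # Return the sampling rate (Hz) suggested for the following cycle.
            spectrum = np.abs(np.fft.rfft(window - np.mean(window))) ** 2
            freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
            cumulative = np.cumsum(spectrum) / np.sum(spectrum)
            bandwidth = freqs[np.searchsorted(cumulative, energy_fraction)]
            return float(np.clip(margin * bandwidth, fs_min, fs_max))

        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        # synthetic vibration window with a hypothetical 120 Hz leak signature
        window = np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.default_rng(1).standard_normal(t.size)
        print(next_sampling_rate(window, fs))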

  12. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  13. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  14. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  15. Characterizing spatiotemporal information loss in sparse-sampling-based dynamic MRI for monitoring respiration-induced tumor motion in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Arai, Tatsuya J. [Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Nofiele, Joris; Yuan, Qing [Department of Radiology, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Madhuranthakam, Ananth J.; Pedrosa, Ivan; Chopra, Rajiv [Department of Radiology, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Advanced Imaging Research Center, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Sawant, Amit, E-mail: amit.sawant@utsouthwestern.edu [Department of Radiation Oncology, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Department of Radiology, UT Southwestern Medical Center, Dallas, Texas 75390 (United States); Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, Maryland, 21201 (United States)

    2016-06-15

    Purpose: Sparse-sampling and reconstruction techniques represent an attractive strategy to achieve faster image acquisition speeds, while maintaining adequate spatial resolution and signal-to-noise ratio in rapid magnetic resonance imaging (MRI). The authors investigate the use of one such sequence, broad-use linear acquisition speed-up technique (k-t BLAST) in monitoring tumor motion for thoracic and abdominal radiotherapy and examine the potential trade-off between increased sparsification (to increase imaging speed) and the potential loss of “true” information due to greater reliance on a priori information. Methods: Lung tumor motion trajectories in the superior–inferior direction, previously recorded from ten lung cancer patients, were replayed using a motion phantom module driven by an MRI-compatible motion platform. Eppendorf test tubes filled with water which serve as fiducial markers were placed in the phantom. The modeled rigid and deformable motions were collected in a coronal image slice using balanced fast field echo in conjunction with k-t BLAST. Root mean square (RMS) error was used as a metric of spatial accuracy as measured trajectories were compared to input data. The loss of spatial information was characterized for progressively increasing acceleration factor from 1 to 16; the resultant sampling frequency was increased approximately from 2.5 to 19 Hz when the principal direction of the motion was set along frequency encoding direction. In addition to the phantom study, respiration-induced tumor motions were captured from two patients (kidney tumor and lung tumor) at 13 Hz over 49 s to demonstrate the impact of high speed motion monitoring over multiple breathing cycles. For each subject, the authors compared the tumor centroid trajectory as well as the deformable motion during free breathing. Results: In the rigid and deformable phantom studies, the RMS error of target tracking at the acquisition speed of 19 Hz was approximately 0.3–0
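
    A minimal sketch of the accuracy metric used above: the RMS error between a measured tumor-centroid trajectory and the programmed input trajectory, after resampling onto a common time base. The breathing trace below is synthetic, not patient data.

        # A sketch of the accuracy metric: RMS error between a measured centroid
        # trajectory and the programmed input trajectory, after resampling onto a
        # common time base; the breathing trace below is synthetic.
        import numpy as np

        def rms_error(t_meas, y_meas, t_ref, y_ref):
            # RMS difference (same units as y) between measured and reference trajectories.
            y_ref_on_meas = np.interp(t_meas, t_ref, y_ref)
            return float(np.sqrt(np.mean((y_meas - y_ref_on_meas) ** 2)))

        t_ref = np.linspace(0.0, 49.0, 49 * 100)                  # "ground truth" trace, 100 Hz
        y_ref = 10.0 * np.sin(2 * np.pi * 0.25 * t_ref)           # 10 mm, 0.25 Hz breathing motion
        t_meas = np.arange(0.0, 49.0, 1.0 / 19.0)                 # monitored at 19 Hz
        y_meas = np.interp(t_meas, t_ref, y_ref) + 0.3 * np.random.default_rng(2).standard_normal(t_meas.size)
        print(rms_error(t_meas, y_meas, t_ref, y_ref))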

  16. Appropriate strategies.

    Science.gov (United States)

    Halty, M

    1979-01-01

    Technology strategies are concerned with the production, distribution, and consumption of technology. Observation of less developed countries (LDCs) and international organizations shows that little attention is given to the development of a technology strategy. LDCs need to formulate a strategy of self-reliant technological development for the next decade. They should no longer be content to stand in a technologically dependent relationship to the developed countries. Such strategies must balance the ratio between investment in indigenous technologies and expenditure for foreign technology. The strategies change according to the level of industrialization achieved. The following considerations come into development of technology strategies: 1) determination of an appropriate balance among the accumulation, consumption, and distribution of technology; 2) the amount and level of government support; and 3) the balance between depth and breadth of technology to be encouraged.

  17. Environmental strategy

    DEFF Research Database (Denmark)

    Zabkar, Vesna; Cater, Tomaz; Bajde, Domen

    2013-01-01

    perspective, appropriate environmental strategies in compliance with environmental requirements aim at building competitive advantages through sustainable development. There is no universal “green” strategy that would be appropriate for each company, regardless of its market requirements and competitive......Environmental issues and the inclusion of environmental strategies in strategic thinking is an interesting subject of investigation. In general, managerial practices organized along ecologically sound principles contribute to a more environmentally sustainable global economy. From the managerial...

  18. Proto-computational Thinking

    DEFF Research Database (Denmark)

    Tatar, Deborah Gail; Harrison, Steve; Stewart, Michael

    2017-01-01

    . Utilizing university students in co-development activities with teachers, the current study located and implemented opportunities for integrated computational thinking in middle school in a large, suburban, mixed-socioeconomic standing (SES) , mixed-race district. The co-development strategy resulted...

  19. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  20. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  1. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  2. Evaluation Strategy

    DEFF Research Database (Denmark)

    Coto Chotto, Mayela; Wentzer, Helle; Dirckinck-Holmfeld, Lone

    2009-01-01

    The paper presents an evaluation strategy based on deliberate ideals and principles of dialogue design. The evaluation strategy is based on experiential phenomenology taking the point of departure for design and evaluation processes in the experienced practitioners themselves. The article present...... the evaluation strategy and methodology of a research project Making Online Path to Enter new Markets, MOPEM. It is an EU-research project with partners from different Educational Institutions of Technology and Business in five European Countries.......The paper presents an evaluation strategy based on deliberate ideals and principles of dialogue design. The evaluation strategy is based on experiential phenomenology taking the point of departure for design and evaluation processes in the experienced practitioners themselves. The article presents...

  3. Explorations in computing an introduction to computer science

    CERN Document Server

    Conery, John S

    2010-01-01

    Introduction: Computation; The Limits of Computation; Algorithms; A Laboratory for Computational Experiments. The Ruby Workbench (Introducing Ruby and the RubyLabs environment for computational experiments): Interactive Ruby; Numbers; Variables; Methods; RubyLabs. The Sieve of Eratosthenes (An algorithm for finding prime numbers): The Sieve Algorithm; The mod Operator; Containers; Iterators; Boolean Values and the delete_if Method; Exploring the Algorithm; The sieve Method; A Better Sieve; Experiments with the Sieve. A Journey of a Thousand Miles (Iteration as a strategy for solving computational problems): Searching and Sortin
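
    Since the contents above center on the Sieve of Eratosthenes as a worked algorithm, a minimal version is sketched below (in Python rather than the book's Ruby):

        # A minimal Sieve of Eratosthenes (Python rather than the book's Ruby).
        def sieve(n):
            # Return all primes up to and including n.
            is_prime = [True] * (n + 1)
            is_prime[0:2] = [False, False]
            for p in range(2, int(n ** 0.5) + 1):
                if is_prime[p]:
                    for multiple in range(p * p, n + 1, p):
                        is_prime[multiple] = False
            return [i for i, flag in enumerate(is_prime) if flag]

        print(sieve(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]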

  4. A Computer Security Course in the Undergraduate Computer Science Curriculum.

    Science.gov (United States)

    Spillman, Richard

    1992-01-01

    Discusses the importance of computer security and considers criminal, national security, and personal privacy threats posed by security breakdown. Several examples are given, including incidents involving computer viruses. Objectives, content, instructional strategies, resources, and a sample examination for an experimental undergraduate computer…

  5. Requirements of modernization strategies

    International Nuclear Information System (INIS)

    Heinbuch, R.

    1997-01-01

    Instrumentation and control contributed a major share to the current level of safety, economic efficiency, and availability of the German nuclear power plants. German NPPs occupy a top position in this respect at the international level, but novel instrumentation and digital control technology alone will not guarantee further enhancements. Therefore, the owner/operators established carefully devised maintenance and modernization strategies in order to safeguard their NPPs' top position in the long run. The German NPPs are the most thoroughly automated plants in the world. In addition to the sweeping modernization strategies recommended by the plant manufacturers, based on computer-supported control, alternative modernization strategies have been considered in the evaluation process. This approach provides room for maneuver for manufacturers as well as for managers responsible for risk and cost optimization, which is a major task in view of the changing regulatory framework in the electricity market. (orig./CB) [de

  6. Safety strategy

    International Nuclear Information System (INIS)

    Schultheiss, G.F.

    1980-01-01

    The basis for safety strategy in the nuclear industry, and especially in nuclear power plants, is the prevention of radioactivity release inside or outside of the technical installation. Therefore, technical and administrative measures are combined into a general strategy concept. This introduction explains the following topics in more detail: basic principles of safety; lines of assurance (LOA); defense in depth; deterministic and probabilistic methods. This presentation is intended as an introduction to the more detailed discussion that follows in this course; nevertheless, some selected examples will be used to illustrate aspects of safety strategy development, although they might be repeated later on. (orig.)

  7. Export strategy

    DEFF Research Database (Denmark)

    Knudsen, Thorbjørn; Koed Madsen, Tage

    2002-01-01

    It is argued here that traditional export strategy research (encompassing the study of internationalization processes and export performance) is characterized by weak theoretical foundations and could benefit from a reorientation towards a dynamic capabilities perspective (DCP). We seek to draw...... on insights from DCP in order to devise a theoretical basis that could enrich export strategy research. Although our development of DCP insights builds on previous work, it also adds a crucial distinction between knowledge stocks and informational architecture. Changes in architecture are of greater...... importance. Following this elaboration of the dynamic capabilities perspective, we outline some implications and guidelines for future export strategy research....

  8. Quantitative Analysis of a Parasitic Antiviral Strategy

    OpenAIRE

    Kim, Hwijin; Yin, John

    2004-01-01

    We extended a computer simulation of viral intracellular growth to study a parasitic antiviral strategy that diverts the viral replicase toward parasite growth. This strategy inhibited virus growth over a wide range of conditions, while minimizing host cell perturbations. Such parasitic strategies may inhibit the development of drug-resistant virus strains.

  9. Pistage informatisé des stratégies de lecture : une étude de cas en contexte pédagogique Computer tracking of reading strategies : a case study in a pedagogical context

    Directory of Open Access Journals (Sweden)

    Dany Bréelle

    2004-12-01

    Full Text Available Reading is a private mental activity and its mechanisms are difficult to observe in the language classroom context. In order to allow instructors to better gauge the learners' reading skills, we have designed a computerized programme aiming to track data on the conditions of use of a selection of reading comprehension strategies frequently implemented by learners of French as a Foreign Language. The sixteen university students of intermediate level of proficiency who participated in the study were required to read two texts in French in Computer Tracking mode. During the exercise the readers were asked to make strategic choices relative to comprehension problems encountered while reading. The results subsequently obtained allowed us to observe the strategic effort made by the readers. The aim of this article is to explain what led us to design the Computer Tracking procedure, to describe our method, to present the results and, finally, to discuss the pedagogical usefulness of our approach.

  10. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours......The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  11. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  12. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  13. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  14. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  15. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance  Volumes I, II and III.   As in the previous volumes of this series, the  book consists of a series of  chapters each of  which was selected following a rigorous, peer-reviewed, selection process.  The chapters illustrate the application of a range of cutting-edge natural  computing and agent-based methodologies in computational finance and economics.  The applications explored include  option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading,  corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation.  While describing cutting edge applications, the chapters are  written so that they are accessible to a wide audience. Hence, they should be of interest  to academics, students and practitioners in the fields of computational finance and  economics.  

  16. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  17. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  18. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  19. FY 1999 New Sunshine Project survey research project - Survey on the long-term energy technology strategy, etc. Fundamental survey to decide on the industrial technology strategy - Technology strategy by field (Electronic information technology field - Human process ware technology field of the computer relation field); 1999 nendo choki energy gijutsu senryaku nado ni kansuru chosa hokokusho. Sangyo gijutsu senryaku sakutei kiban chosa (bun'yabetsu gijutsu senryaku (denshi joho gijutsu bun'ya (computer kanren bun'ya no uchi, human process ware gijutsu bun'ya)))

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This survey and study were conducted to contribute to the formulation of technology strategies, including analysis of the present state of technical competitiveness and forecasts for the human process ware industrial technology field within the computer-related field. Human process ware is a system aimed at fostering human capacities for thinking, problem solving and creation. Its main subjects are the following three: close-society technology, for communicating intentions by automatically learning the backgrounds of people joining a community; topic community technology, for forming communities of the appropriate people/organizations by grasping the true topics/purposes from conversation; and active interaction technology, for supporting the creative activities of users/communities and conversing with users. Further, in the region of technologies that meet the demands and constraints of society, the following are expected: energy/resource conservation; realization of a comfortable, safe and high-quality life in the aging society; and realization of the advanced information network society that will become the basis of a new economic society. (NEDO)

  20. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  1. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  2. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into future computers in order to give their components their functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  3. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  4. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  5. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  6. CSR STRATEGIES

    OpenAIRE

    LAURENTIU BARANGA; ION STEGAROIU

    2011-01-01

    Corporate Social Responsibility (CSR) has three components: economic responsibility to shareholders, corporate environmental responsibility, and corporate responsibility to society. Each component of CSR has its own features, according to which appropriate individual behaviour is established. Knowing these features is very important in CSR strategy development.

  7. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  8. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  9. Sex Equity: Increasing Girls' Use of Computers.

    Science.gov (United States)

    Lockheed, Marlaine E.; Frakt, Steven B.

    1984-01-01

    Indicates that although computer science has been free of male domination, the stereotype of computers as male machines is emerging with increasing growth in microcomputer use by children. Factors that account for this development and some strategies teachers can adopt to equalize computer use by boys and girls are presented. (MBR)

  10. DOT strategies versus orbiter strategies

    NARCIS (Netherlands)

    Rutten, R.J.

    2001-01-01

    The Dutch Open Telescope is a high-resolution solar imager coming on-line at La Palma. The definition of the DOT science niche, strategies, and requirements resemble Solar Orbiter considerations and deliberations. I discuss the latter in the light of the former, and claim that multi-line observation

  11. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes

  12. Introduction to computer ethics

    Directory of Open Access Journals (Sweden)

    Ćorić Dragana M.

    2015-01-01

    Full Text Available Ethics is becoming one of the most often used but also misinterpreted words. It is often taken as an additional, corrective parameter to the policies and strategies that have to be adopted in the areas of politics, the environment, business, and medicine. Computer ethics is thus the latest ethical discipline in the scientific sky. But its roots, as was the case with environmental ethics, go back decades; only the discussion and use of the term, as well as debates on the postulates of computer ethics, are the result of rapid IT development in the last decade or two. In this paper, in accordance with the title, an introduction to computer ethics will be given: its basis, its most important representatives, and its most important subsequent developments.

  13. Study Strategies

    DEFF Research Database (Denmark)

    Nielsen, Camilla Kirketerp; Noer, Vibeke Røn

    and theory forms the basis of the research. Veterinary students have been followed through alternating learning contexts, referring both to the scholastic academic classrooms and workplaces in commercial pig herds as well as to group work and game-based situations in a mandatory master course - "the pig...... as a student" and the process of professionalization. By maintaining the position of focusing upon the education as a situated and formative trajectory, the comparative analysis shows how students in profession-oriented educational settings manage the challenges of education and use study strategies......

  14. Arbitrage strategy

    OpenAIRE

    Kardaras, Constantinos

    2010-01-01

    An arbitrage strategy allows a financial agent to make certain profit out of nothing, i.e., out of zero initial investment. This has to be disallowed on an economic basis if the market is in an equilibrium state, as opportunities for riskless profit would result in an instantaneous movement of prices of certain financial instruments. The principle of not allowing for arbitrage opportunities in financial markets has far-reaching consequences, most notably the option-pricing and hedging formulas in c...

  15. Marketing Strategy

    OpenAIRE

    Andrýsková, Hana

    2013-01-01

    The aim of this thesis is to examine current marketing efforts of AARON - the leading seller of digital cameras. I will analyze current marketing tools and suggest improvements in efficiency and effectiveness of these tools. Marketing strategy is a method of managing and coordinating efforts in marketing field so that the goals can be reached. The aim is to increase sales, turnover, revenues, profits, drive out competition and also to build up a corporate image and corporate culture. I will a...

  16. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  17. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay, Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  18. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  19. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  20. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  1. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  2. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  3. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  4. Skill and strategy in games

    NARCIS (Netherlands)

    Dreef, M.R.M.

    2005-01-01

    This thesis consists of two parts. Part I deals with relative skill and the role of random factors in games. Part II is devoted to the computation of optimal strategies in two interesting classes of games: poker and take-and-guess games.
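
    As a generic illustration of computing optimal strategies (not the thesis's poker or take-and-guess games, nor its method), the minimax mixed strategy of a small zero-sum matrix game can be obtained by linear programming; the example game below is rock-paper-scissors:

        # A generic sketch (not the thesis's games or method): the minimax mixed
        # strategy of a zero-sum matrix game via linear programming; the example
        # game is rock-paper-scissors.
        import numpy as np
        from scipy.optimize import linprog

        M = np.array([[0.0,  1.0, -1.0],     # row player's payoffs
                      [-1.0, 0.0,  1.0],
                      [1.0, -1.0,  0.0]])

        m, n = M.shape
        # variables: x_1..x_m (row strategy) and v (game value); maximize v
        c = np.concatenate([np.zeros(m), [-1.0]])                  # linprog minimizes, so use -v
        A_ub = np.hstack([-M.T, np.ones((n, 1))])                  # v <= (M^T x)_j for every column j
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)  # probabilities sum to 1
        b_eq = [1.0]
        bounds = [(0, None)] * m + [(None, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("optimal row strategy:", res.x[:m], "game value:", res.x[-1])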

  5. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  6. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and its right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and of the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding as well as scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as a tool of that education. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with...

  7. Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities

    OpenAIRE

    Buyya, Rajkumar; Yeo, Chee Shin; Venugopal, Srikumar

    2008-01-01

    This keynote paper: presents a 21st century vision of computing; identifies various computing paradigms promising to deliver the vision of computing utilities; defines Cloud computing and provides the architecture for creating market-oriented Clouds by leveraging technologies such as VMs; provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; presents...

  8. Utility Computing: Reality and Beyond

    Science.gov (United States)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. Much like water, gas, electricity and telecommunications, the concept of computing as a public utility was announced in 1955. Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable 'on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?

  9. Correction of estimates of retention in care among a cohort of HIV-positive patients in Uganda in the period before starting ART: a sampling-based approach.

    Science.gov (United States)

    Nyakato, Patience; Kiragga, Agnes N; Kambugu, Andrew; Bradley, John; Baisley, Kathy

    2018-04-20

    The aim of this study was to use a sampling-based approach to obtain estimates of retention in HIV care before initiation of antiretroviral treatment (ART), corrected for outcomes in patients who were lost according to clinic registers. Retrospective cohort study of HIV-positive individuals not yet eligible for ART (CD4 >500). Three urban and three rural HIV care clinics in Uganda; information was extracted from the clinic registers for all patients who had registered for pre-ART care between January and August 2015. A random sample of patients who were lost according to the clinic registers (>3 months late to scheduled visit) was traced to ascertain their outcomes. The proportion of patients lost from care was estimated using a competing risks approach, first based on the information in the clinic records alone and then using inverse probability weights to incorporate the results from tracing. Cox regression was used to determine factors associated with loss from care. Of 1153 patients registered for pre-ART care (68% women, median age 29 years, median CD4 count 645 cells/µL), 307 (27%) were lost according to clinic records. Among these, 195 (63%) were selected for tracing; outcomes were ascertained in 118 (61%). Seven patients (6%) had died, 40 (34%) were in care elsewhere and 71 (60%) were out of care. Loss from care at 9 months was 30.2% (95% CI 27.3% to 33.5%). After incorporating outcomes from tracing, loss from care decreased to 18.5% (95% CI 13.8% to 23.6%). Estimates of loss from HIV care may be too high if based on routine clinic data alone. A sampling-based approach is a feasible way of obtaining more accurate estimates of retention, accounting for transfers to other clinics.
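
    The back-of-envelope sketch below, using only the counts quoted in the abstract, illustrates the idea of weighting traced outcomes to correct a register-based loss estimate. It deliberately ignores the competing-risks, time-to-event machinery of the actual analysis, so its figure differs from the published 18.5%.

      # Simplified correction of a register-based loss estimate using traced
      # outcomes (counts from the abstract; not the paper's competing-risks model).
      registered        = 1153   # patients registered for pre-ART care
      lost_per_record   = 307    # >3 months late according to clinic registers
      traced            = 118    # lost patients whose outcome was ascertained
      out_of_care       = 71     # of the traced: genuinely out of care
      in_care_elsewhere = 40     # of the traced: transferred, i.e. not truly lost

      naive_loss = lost_per_record / registered
      # Weight the traced outcomes up to all register-lost patients: only the
      # fraction found to be out of care counts as truly lost.
      corrected_loss = naive_loss * (out_of_care / traced)

      print(f"naive loss:     {naive_loss:.1%}")      # about 27%
      print(f"corrected loss: {corrected_loss:.1%}")  # about 16%, vs 18.5% from
                                                      # the time-to-event analysis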

  10. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  11. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  12. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  13. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  14. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element and boundary element formulations, and shows the solution of viscoelastic problems with Abaqus.

  15. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  16. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  17. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  18. Transfusion strategy

    DEFF Research Database (Denmark)

    Jakobsen, Carl-Johan

    2014-01-01

    Blood transfusion is associated with increased morbidity and mortality and numerous reports have emphasised the need for reduction. Following this there is increased attention to the concept of patient blood management. However, bleeding is relatively common following cardiac surgery and is further... In conclusion, the evidence supports that each institution establishes its own patient blood management strategy to both conserve blood products and maximise outcome.

  19. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  20. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  1. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  2. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1) an intermediary step between any theoretical construct and its targeted empirical space and 2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  3. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  4. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    The emergence of supercomputers led to the use of computer simulation as an ... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical layer.

  5. NATO strategy

    International Nuclear Information System (INIS)

    Tarr, D.W.

    1988-01-01

    Nuclear weapons are for NATO both the primary instrument for deterring a nuclear attack and the fall-back option if, in the event of a conventional assault, NATO's non-nuclear defence forces cannot hold. The dilemma of depending upon potentially self-destructive nuclear responses if deterrence fails and defence falters has always haunted NATO strategy, but after the Soviets achieved nuclear parity with the United States, this dilemma became more intensely perceived. Now the credibility, viability, acceptability, sensibility and morality of nuclear weapons for deterrence and defence are increasingly at issue. As a result, an increasing variety of non-nuclear (conventional) alternatives are being advanced to replace or supplement nuclear deterrence for the Alliance. The thrust of the author's remarks in this paper is to attempt to apply the definitions, requirements, and other key conceptual distinctions of nuclear deterrence found in the body of western strategic literature to three categories of non-nuclear alternatives currently in vogue: strengthened conventional forces; deployment of new, high-technology non-nuclear weapons; and adoption of an offensive manoeuvre strategy. Let me, therefore, begin by listing the basic definitions, concepts and apparent requirements of deterrence which the deterrence theory literature offers

  6. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computed tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of emission computed tomography (ECT), unknown in Poland, is described. An evaluation of the two ECT methods, namely positron and single-photon emission tomography, is made. (author)

  7. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of state-of-the-art research in Computational Sustainability as well as case studies of different application scenarios. These cover topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  8. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  9. A dip-and-read test strip for the determination of mercury(II) ion in aqueous samples based on urease activity inhibition.

    Science.gov (United States)

    Shi, Guo-Qing; Jiang, Guibin

    2002-11-01

    A sensitive dip-and-read test strip for the determination of mercury in aqueous samples, based on the inhibition of the urease reaction by the ion, has been developed. The strip has a circular sensing zone containing two layers: the top layer is a cellulose acetate membrane on which urease is immobilized; the bottom layer is a pH indicator wafer impregnated with urea. The measurement principle is based on the disappearance of a yellow spot on the pH indicator wafer. The elapsed time until the spot disappears, which depends on the concentration of mercury(II) ion, is measured with a stopwatch. Under the experimental conditions, as little as 0.2 ng/ml mercury can be detected, with a detection range from 0.2 to 200 ng/ml in water. Organomercury compounds give essentially the same response as inorganic mercury. Heavy-metal ions such as Ag(I), Cu(II), Cd(II), Ni(II), Zn(II), and Pb(II), as well as other sample matrices, basically do not interfere with the mercury measurement.
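
    As an illustration of how such a strip could be read out quantitatively (a sketch under stated assumptions, not the paper's procedure; the calibration points are invented), an unknown concentration can be interpolated from a calibration curve of disappearance time against log concentration.

      # Hypothetical calibration of a dip-and-read strip: a longer time to spot
      # disappearance corresponds to more urease inhibition, i.e. more Hg(II).
      import math

      calibration = [   # (Hg(II) in ng/mL, time to spot disappearance in s) - invented
          (0.2, 40), (2, 70), (20, 120), (200, 200),
      ]

      def concentration_from_time(t):
          """Linearly interpolate log10(concentration) against measured time."""
          pts = sorted(calibration, key=lambda p: p[1])
          for (c0, t0), (c1, t1) in zip(pts, pts[1:]):
              if t0 <= t <= t1:
                  f = (t - t0) / (t1 - t0)
                  return 10 ** ((1 - f) * math.log10(c0) + f * math.log10(c1))
          raise ValueError("measured time outside the calibrated range")

      print(f"{concentration_from_time(95):.1f} ng/mL")   # interpolated estimate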

  10. Identification of Susceptibility Genes for Peritoneal, Ovarian, and Deep Infiltrating Endometriosis Using a Pooled Sample-Based Genome-Wide Association Study

    Directory of Open Access Journals (Sweden)

    Bruno Borghese

    2015-01-01

    Full Text Available Characterizing genetic contributions to endometriosis might help to shorten the time to diagnosis, especially in the most severe forms, but represents a challenge. Previous genome-wide association studies (GWAS) made no distinction between peritoneal endometriosis (SUP), endometrioma (OMA), and deep infiltrating endometriosis (DIE). We therefore conducted a pooled sample-based GWAS and distinguished histologically confirmed endometriosis subtypes. We performed an initial discovery step on 10-individual pools (two pools per condition). After quality control filtering, a Monte-Carlo simulation was used to rank the significant SNPs according to the ratio of allele frequencies and the coefficient of variation. Then, a replication step of individual genotyping was conducted in an independent cohort of 259 cases and 288 controls. Our approach was very stringent but probably missed a lot of information due to the Monte-Carlo simulation, which likely explained why we did not replicate results from “classic” GWAS. Four variants (rs227849, rs4703908, rs2479037, and rs966674) were significantly associated with an increased risk of OMA. Rs4703908, located close to ZNF366, provided a higher risk of OMA (OR = 2.22; 95% CI: 1.26–3.92) and DIE, especially with bowel involvement (OR = 2.09; 95% CI: 1.12–3.91). ZNF366, involved in estrogen metabolism and progression of breast cancer, is a new biologically plausible candidate for endometriosis.

  11. Identification of susceptibility genes for peritoneal, ovarian, and deep infiltrating endometriosis using a pooled sample-based genome-wide association study.

    Science.gov (United States)

    Borghese, Bruno; Tost, Jörg; de Surville, Magalie; Busato, Florence; Letourneur, Frank; Mondon, Françoise; Vaiman, Daniel; Chapron, Charles

    2015-01-01

    Characterizing genetic contributions to endometriosis might help to shorten the time to diagnosis, especially in the most severe forms, but represents a challenge. Previous genome-wide association studies (GWAS) made no distinction between peritoneal endometriosis (SUP), endometrioma (OMA), and deep infiltrating endometriosis (DIE). We therefore conducted a pooled sample-based GWAS and distinguished histologically confirmed endometriosis subtypes. We performed an initial discovery step on 10-individual pools (two pools per condition). After quality control filtering, a Monte-Carlo simulation was used to rank the significant SNPs according to the ratio of allele frequencies and the coefficient of variation. Then, a replication step of individual genotyping was conducted in an independent cohort of 259 cases and 288 controls. Our approach was very stringent but probably missed a lot of information due to the Monte-Carlo simulation, which likely explained why we did not replicate results from "classic" GWAS. Four variants (rs227849, rs4703908, rs2479037, and rs966674) were significantly associated with an increased risk of OMA. Rs4703908, located close to ZNF366, provided a higher risk of OMA (OR = 2.22; 95% CI: 1.26-3.92) and DIE, especially with bowel involvement (OR = 2.09; 95% CI: 1.12-3.91). ZNF366, involved in estrogen metabolism and progression of breast cancer, is a new biologically plausible candidate for endometriosis.
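
    The abstract does not spell out the exact ranking rule, but one plausible reading is sketched below: score each SNP from per-pool allele-frequency estimates so that a large case/control frequency ratio is rewarded and high variability across pool replicates (the coefficient of variation) is penalized. The SNP name rs4703908 is taken from the abstract; the frequencies and the scoring formula are invented for illustration.

      # Toy ranking of SNPs from pooled allele-frequency estimates, in the spirit
      # of (but not identical to) the Monte-Carlo ranking described above.
      import statistics

      def rank_snps(pool_freqs):
          """pool_freqs: {snp: (case_pool_freqs, control_pool_freqs)}, one value per
          pool replicate. Sort by a score favouring large frequency ratios and low
          variability across replicates."""
          scored = []
          for snp, (cases, controls) in pool_freqs.items():
              ratio = statistics.mean(cases) / statistics.mean(controls)
              all_freqs = cases + controls
              cv = statistics.stdev(all_freqs) / statistics.mean(all_freqs)
              scored.append((abs(ratio - 1) / (cv + 1e-9), snp))
          return [snp for _, snp in sorted(scored, reverse=True)]

      example = {
          "rs4703908": ([0.31, 0.29], [0.18, 0.19]),   # invented pool frequencies
          "rs_noise":  ([0.22, 0.35], [0.25, 0.30]),
      }
      print(rank_snps(example))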

  12. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  13. Just-in-time : on strategy annotations

    NARCIS (Netherlands)

    J.C. van de Pol (Jaco)

    2001-01-01

    A simple kind of strategy annotations is investigated, giving rise to a class of strategies, including leftmost-innermost. It is shown that under certain restrictions, an interpreter can be written which computes the normal form of a term in a bottom-up traversal. The main contribution...
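
    A minimal sketch of the innermost, bottom-up evaluation described above is given below; the annotation machinery itself is not modelled, and the term representation and rewrite rules are invented for illustration.

      # Innermost (bottom-up) rewriting to normal form: normalize arguments first,
      # then rewrite at the root. Terms are tuples, e.g. ("add", ("s", ("0",)), ("0",)).
      RULES = {
          # Toy Peano addition: add(0, y) -> y ;  add(s(x), y) -> s(add(x, y))
          "add": lambda x, y: y if x == ("0",) else ("s", ("add", x[1], y)),
      }

      def normalize(term):
          """Compute the normal form of a term by a bottom-up traversal."""
          head, *args = term
          args = [normalize(a) for a in args]          # arguments first (innermost)
          if head in RULES:
              return normalize(RULES[head](*args))     # then rewrite at the root
          return (head, *args)

      one = ("s", ("0",))
      two = ("s", one)
      print(normalize(("add", two, one)))   # ("s", ("s", ("s", ("0",))))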

  14. Implementation Strategy

    Science.gov (United States)

    1983-01-01

    Meeting the identified needs of Earth science requires approaching EOS as an information system and not simply as one or more satellites with instruments. Six elements of strategy are outlined as follows: implement the individual discipline missions as currently planned; use the sustained observational capabilities offered by operational satellites without waiting for the launch of new missions; put first priority on the data system; deploy an Advanced Data Collection and Location System; put a substantial new observing capability in a low Earth orbit in such a way as to provide for sustained measurements; and group instruments to exploit their capabilities for synergism, maximize the scientific utility of the mission, and minimize the costs of implementation where possible.

  15. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  16. Proven Strategies for Teaching and Learning

    NARCIS (Netherlands)

    D.G. Brown (David)

    2004-01-01

    50 technology-using professors at 50 of America's most-wired campuses were asked to explain how their teaching strategies have been augmented by the use of computers. From their responses emerges a pattern. Most professors are using computers in teaching in order to enable more...

  17. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  18. Preparation of alumina-coated magnetite nanoparticle for extraction of trimethoprim from environmental water samples based on mixed hemimicelles solid-phase extraction.

    Science.gov (United States)

    Sun, Lei; Zhang, Chuanzhou; Chen, Ligang; Liu, Jun; Jin, Haiyan; Xu, Haoyan; Ding, Lan

    2009-04-13

    In this study, a new type of alumina-coated magnetite nanoparticles (Fe(3)O(4)/Al(2)O(3) NPs) modified by the surfactant sodium dodecyl sulfate (SDS) has been successfully synthesized and applied for extraction of trimethoprim (TMP) from environmental water samples based on mixed hemimicelles solid-phase extraction (MHSPE). The coating of alumina on Fe(3)O(4) NPs not only avoids the dissolving of Fe(3)O(4) NPs in acidic solution, but also extends their application without sacrificing their unique magnetization characteristics. Due to the high surface area of these new sorbents and the excellent adsorption capacity after surface modification by SDS, satisfactory concentration factor and extraction recoveries can be produced with only 0.1g Fe(3)O(4)/Al(2)O(3) NPs. Main factors affecting the adsolubilization of TMP such as the amount of SDS, pH value, standing time, desorption solvent and maximal extraction volume were optimized. Under the selected conditions, TMP could be quantitatively extracted. The recoveries of TMP by analyzing the four spiked water samples were between 67 and 86%, and the relative standard deviation (RSD) ranged from 2 to 6%. Detection and quantification limits of the proposed method were 0.09 and 0.24 microg L(-1), respectively. Concentration factor of 1000 was achieved using this method to extract 500 mL of different environmental water samples. Compared with conventional SPE methods, the advantages of this new Fe(3)O(4)/Al(2)O(3) NPs MHSPE method still include easy preparation and regeneration of sorbents, short times of sample pretreatment, high extraction yields, and high breakthrough volumes. It shows great analytical potential in preconcentration of organic compounds from large volume water samples.

  19. Understanding reasons for and outcomes of patients lost to follow-up in antiretroviral therapy programs in Africa through a sampling-based approach.

    Science.gov (United States)

    Geng, Elvin H; Bangsberg, David R; Musinguzi, Nicolas; Emenyonu, Nneka; Bwana, Mwebesa Bosco; Yiannoutsos, Constantin T; Glidden, David V; Deeks, Steven G; Martin, Jeffrey N

    2010-03-01

    Losses to follow-up after initiation of antiretroviral therapy (ART) are common in Africa and are a considerable obstacle to understanding the effectiveness of nascent treatment programs. We sought to characterize, through a sampling-based approach, reasons for and outcomes of patients who become lost to follow-up. Cohort study. We searched for and interviewed a representative sample of lost patients or close informants in the community to determine reasons for and outcomes among lost patients. Three thousand six hundred twenty-eight HIV-infected adults initiated ART between January 1, 2004 and September 30, 2007 in Mbarara, Uganda. Eight hundred twenty-nine became lost to follow-up (cumulative incidence at 1, 2, and 3 years of 16%, 30%, and 39%). We sought a representative sample of 128 lost patients in the community and ascertained vital status in 111 (87%). Top reasons for loss included lack of transportation or money and work/child care responsibilities. Among the 111 lost patients who had their vital status ascertained through tracking, 32 deaths occurred (cumulative 1-year incidence 36%); mortality was highest shortly after the last clinic visit. Lower pre-ART CD4 T-cell count, older age, low blood pressure, and a central nervous system syndrome at the last clinic visit predicted deaths. Of patients directly interviewed, 83% were in care at another clinic and 71% were still using ART. Sociostructural factors are the primary reasons for loss to follow-up. Outcomes among the lost are heterogeneous: both deaths and transfers to other clinics were common. Tracking a sample of lost patients is an efficient means for programs to understand site-specific reasons for and outcomes among patients lost to follow-up.
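
    A simplified sketch of the sampling idea, using the counts reported in the abstract: outcomes ascertained in the traced subsample are projected onto all lost patients. It ignores follow-up time, which is why its crude proportion is lower than the paper's 36% cumulative one-year incidence.

      # Projecting traced outcomes onto all patients lost to follow-up
      # (counts from the abstract; follow-up time is deliberately ignored).
      lost_total       = 829   # patients lost to follow-up after starting ART
      ascertained      = 111   # lost patients whose vital status was ascertained
      deaths_in_sample = 32    # deaths among the ascertained

      death_fraction   = deaths_in_sample / ascertained   # crude, about 29%
      projected_deaths = death_fraction * lost_total

      print(f"crude death fraction among traced: {death_fraction:.1%}")
      print(f"projected deaths among all lost:   {projected_deaths:.0f}")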

  20. Estimating average shock pressures recorded by impactite samples based on universal stage investigations of planar deformation features in quartz - Sources of error and recommendations

    Science.gov (United States)

    Holm-Alwmark, S.; Ferrière, L.; Alwmark, C.; Poelchau, M. H.

    2018-01-01

    Planar deformation features (PDFs) in quartz are the most widely used indicator of shock metamorphism in terrestrial rocks. They can also be used for estimating average shock pressures that quartz-bearing rocks have been subjected to. Here we report on a number of observations and problems that we have encountered when performing universal stage measurements and crystallographic indexing of PDF orientations in quartz. These include a comparison between manual and automated methods of indexing PDFs, an evaluation of the new stereographic projection template, and observations regarding the PDF statistics related to the c-axis position and rhombohedral plane symmetry. We further discuss the implications that our findings have for shock barometry studies. Our study shows that the currently used stereographic projection template for indexing PDFs in quartz might induce an overestimation of rhombohedral planes with low Miller-Bravais indices. We suggest, based on a comparison of different shock barometry methods, that a unified method of assigning shock pressures to samples based on PDFs in quartz is necessary to allow comparison of data sets. This method needs to take into account not only the average number of PDF sets/grain but also the number of high Miller-Bravais index planes, both of which are important factors according to our study. Finally, we present a suggestion for such a method (which is valid for nonporous quartz-bearing rock types), which consists of assigning quartz grains into types (A-E) based on the PDF orientation pattern, and then calculating a mean shock pressure for each sample.
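
    A schematic sketch of the suggested procedure (classify each grain into a type A-E from its PDF orientation pattern, then average a per-type pressure over the sample) is given below; the type-to-pressure mapping and the example grain list are placeholders, not the calibration proposed in the paper.

      # Placeholder workflow: per-grain type labels -> mean shock pressure.
      from statistics import mean

      TYPE_PRESSURE_GPA = {          # hypothetical type -> pressure assignment
          "A": 8, "B": 12, "C": 16, "D": 20, "E": 25,
      }

      def mean_shock_pressure(grain_types):
          """grain_types: list of per-grain type labels, e.g. ['A', 'C', 'C']."""
          return mean(TYPE_PRESSURE_GPA[t] for t in grain_types)

      sample = ["A", "B", "B", "C", "C", "C", "D"]   # classified grains of one sample
      print(f"mean shock pressure: {mean_shock_pressure(sample):.1f} GPa")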

  1. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  2. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  3. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... Up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...

  4. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  5. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  6. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  7. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  8. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  9. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  10. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  11. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping...

  12. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  13. Prototyping and Simulating Parallel, Distributed Computations with VISA

    National Research Council Canada - National Science Library

    Demeure, Isabelle M; Nutt, Gary J

    1989-01-01

    ...] to support the design, prototyping, and simulation of parallel, distributed computations. In particular, VISA is meant to guide the choice of partitioning and communication strategies for such computations, based on their performance...

  14. Public Communication Strategies for Public Engagement with RFT

    International Nuclear Information System (INIS)

    Kim, H. S.; Cho, S. K.; Kim, H. D.

    2006-12-01

    The objective of this study is to provide a discussion for improving the public understanding of products of RFT by investigating the following: perception of nuclear, radiation and products. This study conducted a telephone survey using a national sample of 800 adults. The sample was selected using probabilistic quota sampling based on sex, age and region. Different communication strategies are required for different types of products using RFT, and government and related institutes should consider and approach cautiously the promotion of the radiation technology industry.
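
    As an illustration of the quota-sampling step (the strata and population shares below are hypothetical, not the study's census figures), interview quotas for the 800 respondents could be allocated proportionally across sex-age-region strata.

      # Proportional quota allocation for a telephone survey of 800 adults.
      TOTAL_INTERVIEWS = 800

      population_share = {          # stratum -> assumed share of the adult population
          ("female", "20-39", "urban"): 0.14,
          ("female", "40+",   "urban"): 0.17,
          ("male",   "20-39", "urban"): 0.13,
          ("male",   "40+",   "urban"): 0.16,
          ("female", "20-39", "rural"): 0.09,
          ("female", "40+",   "rural"): 0.11,
          ("male",   "20-39", "rural"): 0.08,
          ("male",   "40+",   "rural"): 0.12,
      }

      quotas = {stratum: round(TOTAL_INTERVIEWS * share)
                for stratum, share in population_share.items()}

      for stratum, n in quotas.items():
          print(stratum, n)
      print("total:", sum(quotas.values()))   # rounding can leave this slightly off 800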

  15. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  16. Reconditioning of Computers

    DEFF Research Database (Denmark)

    Bundgaard, Anja Marie

    2017-01-01

    Fast technological development and short innovation cycles have resulted in shorter life spans for certain consumer electronics. Reconditioning is proposed as one of the strategies to close the loops in the circular economy and increase the lifespan of products and components. The paper therefore... qualitative research interviews. The case study of the contextual barriers indicated that trust from the buyer and the seller of the used computers was important for a viable business. If trust was not in place, it could be a potential barrier. Furthermore, economy obsolescence and a lack of influence...

  17. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, and emphasizes algorithmic advances that will allow re-application in other...

  18. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  19. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  20. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  1. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  2. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  3. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  4. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  5. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  6. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  7. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Background: Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results: We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions: The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  8. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
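
    The computational pattern in both records above is embarrassingly parallel job farming: every genome pair is an independent ortholog calculation. A minimal sketch of that pattern using local worker processes instead of Amazon Elastic MapReduce, with a placeholder in place of the actual RSD algorithm (file names and the comparison function are hypothetical):

        from concurrent.futures import ProcessPoolExecutor
        from itertools import combinations

        def rsd_like_comparison(pair):
            """Placeholder for one reciprocal-smallest-distance run on a genome pair."""
            genome_a, genome_b = pair
            # ... BLAST searches, distance estimation and the reciprocity check would go here ...
            return (genome_a, genome_b, "ok")

        if __name__ == "__main__":
            genomes = [f"genome_{i}.fasta" for i in range(10)]   # hypothetical inputs
            jobs = list(combinations(genomes, 2))                # one job per genome pair
            # Farm the independent jobs out to worker processes
            # (the study farmed them to 100 EC2 nodes via Elastic MapReduce).
            with ProcessPoolExecutor(max_workers=4) as pool:
                for result in pool.map(rsd_like_comparison, jobs):
                    print(result)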

  9. Sampling-based approaches to improve estimation of mortality among patient dropouts: experience from a large PEPFAR-funded program in Western Kenya.

    Directory of Open Access Journals (Sweden)

    Constantin T Yiannoutsos

    Full Text Available Monitoring and evaluation (M&E) of HIV care and treatment programs is impacted by losses to follow-up (LTFU) in the patient population. The severity of this effect is undeniable but its extent unknown. Tracing all lost patients addresses this, but census methods are not feasible in programs involving rapid scale-up of HIV treatment in the developing world. Sampling-based approaches and statistical adjustment are the only scalable methods permitting accurate estimation of M&E indices. In a large antiretroviral therapy (ART) program in western Kenya, we assessed the impact of LTFU on estimating patient mortality among 8,977 adult clients, of whom 3,624 were LTFU. Overall, dropouts were more likely male (36.8% versus 33.7%; p = 0.003) and younger than non-dropouts (35.3 versus 35.7 years old; p = 0.020), with lower median CD4 count at enrollment (160 versus 189 cells/ml; p<0.001) and WHO stage 3-4 disease (47.5% versus 41.1%; p<0.001). Urban clinic clients were 75.0% of non-dropouts but 70.3% of dropouts (p<0.001). Of the 3,624 dropouts, 1,143 were sought and 621 had their vital status ascertained. Statistical techniques were used to adjust mortality estimates based on information obtained from located LTFU patients. Observed mortality estimates one year after enrollment were 1.7% (95% CI 1.3%-2.0%), revised to 2.8% (2.3%-3.1%) when deaths discovered through outreach were added, and adjusted to 9.2% (7.8%-10.6%) and 9.9% (8.4%-11.5%) through statistical modeling, depending on the method used. The estimates 12 months after ART initiation were 1.7% (1.3%-2.2%), 3.4% (2.9%-4.0%), 10.5% (8.7%-12.3%) and 10.7% (8.9%-12.6%), respectively. Conclusions/Significance: Assessment of the impact of LTFU is critical in program M&E, as estimated mortality based on passive monitoring may underestimate true mortality by up to 80%. This bias can be ameliorated by tracing a sample of dropouts and statistically adjusting the mortality estimates to properly evaluate and guide large ...
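
    The arithmetic behind the adjustment described above is that deaths found in the traced subset of dropouts are extrapolated to all dropouts and pooled with the passively observed deaths. A minimal sketch of that idea, using illustrative proportions rather than the study's person-time and model-based estimators (all counts below are hypothetical):

        def adjusted_mortality(deaths_in_care, n_in_care, deaths_traced, n_traced, n_lost):
            """Extrapolate mortality found in a traced sample of dropouts to all dropouts,
            then pool with deaths observed through passive monitoring."""
            estimated_deaths_among_lost = deaths_traced / n_traced * n_lost
            return (deaths_in_care + estimated_deaths_among_lost) / (n_in_care + n_lost)

        # Hypothetical counts: passive monitoring alone would report 90 / 8977 = 1.0%,
        # while the sampling-based correction yields a substantially higher estimate.
        print(adjusted_mortality(deaths_in_care=90, n_in_care=5353,
                                 deaths_traced=60, n_traced=621, n_lost=3624))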

  10. Myocardial perfusion 320-row multidetector computed tomography-guided treatment strategy for the clinical management of patients with recent acute-onset chest pain: Design of the CArdiac cT in the treatment of acute CHest pain (CATCH)-2 randomized controlled trial.

    Science.gov (United States)

    Sørgaard, Mathias; Linde, Jesper J; Hove, Jens D; Petersen, Jan R; Jørgensen, Tem B S; Abdulla, Jawdat; Heitmann, Merete; Kragelund, Charlotte; Hansen, Thomas Fritz; Udholm, Patricia M; Pihl, Christian; Kühl, J Tobias; Engstrøm, Thomas; Jensen, Jan Skov; Høfsten, Dan E; Kelbæk, Henning; Kofoed, Klaus F

    2016-09-01

    Patients admitted with chest pain are a diagnostic challenge because the majority does not have coronary artery disease (CAD). Assessment of CAD with coronary computed tomography angiography (CCTA) is safe, cost-effective, and accurate, albeit with a modest specificity. Stress myocardial computed tomography perfusion (CTP) has been shown to increase the specificity when added to CCTA, without lowering the sensitivity. This article describes the design of a randomized controlled trial, CATCH-2, comparing a clinical diagnostic management strategy of CCTA alone against CCTA in combination with CTP. Patients with acute-onset chest pain older than 50 years and with at least one cardiovascular risk factor for CAD are being prospectively enrolled to this study from 6 different clinical sites since October 2013. A total of 600 patients will be included. Patients are randomized 1:1 to clinical management based on CCTA or on CCTA in combination with CTP, determining the need for further testing with invasive coronary angiography including measurement of the fractional flow reserve in vessels with coronary artery lesions. Patients are scanned with a 320-row multidetector computed tomography scanner. Decisions to revascularize the patients are taken by the invasive cardiologist independently of the study allocation. The primary end point is the frequency of revascularization. Secondary end points of clinical outcome are also recorded. The CATCH-2 will determine whether CCTA in combination with CTP is diagnostically superior to CCTA alone in the management of patients with acute-onset chest pain. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence; we will try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1

  12. Inventory control strategies

    International Nuclear Information System (INIS)

    Primrose, D.

    1998-01-01

    Finning International Inc. is in the business of selling, financing and servicing Caterpillar and complementary equipment. Its main markets are in western Canada, Britain and Chile. This paper discusses the parts inventory strategies system for Finning (Canada). The company's territory covers British Columbia, Alberta, the Yukon and the Northwest Territories. Finning's parts inventory consists of 80,000 component units valued at more than $150 M. Distribution centres are located in Langley, British Columbia and Edmonton, Alberta. To make inventory and orders easier to control, Finning has designed a computer-based system, with software written exclusively for Caterpillar dealers. The system makes use of a real time electronic interface with all Finning locations, plus all Caterpillar facilities and other dealers in North America. Details of the system are discussed, including territorial stocking procedures, addition to stock, exhaustion of stock, automatic/suggest order controls, surplus inventory management, and procedures for jointly managed inventory. 3 tabs., 1 fig

  13. SURF Model Calibration Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-10

    SURF and SURFplus are high explosive reactive burn models for shock initiation and propagation of detonation waves. They are engineering models motivated by the ignition & growth concept of hot spots and, for SURFplus, a second slow reaction for the energy release from carbon clustering. A key feature of the SURF model is that there is a partial decoupling between model parameters and detonation properties. This enables reduced sets of independent parameters to be calibrated sequentially for the initiation and propagation regimes. Here we focus on a methodology for fitting the initiation parameters to Pop plot data based on 1-D simulations to compute a numerical Pop plot. In addition, the strategy for fitting the remaining parameters for the propagation regime and failure diameter is discussed.
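
    For readers outside the detonation-modeling community: a Pop plot relates shock pressure to run distance to detonation and is conventionally close to a straight line on log-log axes, so the fitting step can be illustrated with an ordinary log-log least-squares fit. A sketch with synthetic data (this is the generic Pop plot form, not the SURF parameterization or the report's fitting code):

        import numpy as np

        # Hypothetical Pop plot data: shock pressure (GPa) vs run distance to detonation (mm).
        pressure = np.array([5.0, 7.0, 10.0, 14.0, 20.0])
        run_dist = np.array([9.0, 5.5, 3.2, 1.9, 1.1])

        # Fit the usual straight-line form log10(x_run) = a + b * log10(P).
        b, a = np.polyfit(np.log10(pressure), np.log10(run_dist), 1)
        print(f"Pop plot fit: log10(x_run) = {a:.3f} + {b:.3f} * log10(P)")

        # Predicted run distance at a new pressure, e.g. to compare against 1-D simulations.
        print(f"{10**(a + b * np.log10(12.0)):.2f} mm at 12 GPa")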

  14. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
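
    For orientation, the classical (non-computable) objects being made effective here are the standard frame and Banach frame conditions; the following is the usual textbook formulation, not the computable refinement developed in the paper. A frame for a Hilbert space H and a Banach frame for X with respect to a sequence space X_d satisfy, respectively,

        % Hilbert-space frame (f_k) with bounds 0 < A <= B:
        \[
          A\,\|x\|^{2} \;\le\; \sum_{k}\bigl|\langle x, f_k\rangle\bigr|^{2} \;\le\; B\,\|x\|^{2}
          \qquad \text{for all } x \in H,
        \]
        % Banach frame: functionals (g_k) in X^* and a bounded reconstruction operator
        % S : X_d \to X with S\bigl((g_k(x))_k\bigr) = x and
        \[
          A\,\|x\|_{X} \;\le\; \bigl\|(g_k(x))_k\bigr\|_{X_d} \;\le\; B\,\|x\|_{X}
          \qquad \text{for all } x \in X.
        \]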

  15. Heuristic Strategies in Systems Biology

    Directory of Open Access Journals (Sweden)

    Fridolin Gross

    2016-06-01

    Full Text Available Systems biology is sometimes presented as providing a superior approach to the problem of biological complexity. Its use of ‘unbiased’ methods and formal quantitative tools might lead to the impression that the human factor is effectively eliminated. However, a closer look reveals that this impression is misguided. Systems biologists cannot simply assemble molecular information and compute biological behavior. Instead, systems biology’s main contribution is to accelerate the discovery of mechanisms by applying models as heuristic tools. These models rely on a variety of idealizing and simplifying assumptions in order to be efficient for this purpose. The strategies of systems biologists are similar to those of experimentalists in that they attempt to reduce the complexity of the discovery process. Analyzing and comparing these strategies, or ‘heuristics’, reveals the importance of the human factor in computational approaches and helps to situate systems biology within the epistemic landscape of the life sciences.

  16. Interactive Control System, Intended Strategy, Implemented Strategy dan Emergent Strategy

    Directory of Open Access Journals (Sweden)

    Tubagus Ismail

    2012-09-01

    Full Text Available The purpose of this study was to examine the relationship between management control system (MCS) and strategy formation processes, namely: intended strategy, emergent strategy and implemented strategy. The focus of MCS in this study was the interactive control system. The study was based on Structural Equation Modeling (SEM) as its multivariate analysis instrument. The samples were upper middle managers of manufacturing companies in Banten Province, DKI Jakarta Province and West Java Province. The AMOS 16 software is used as an additional instrument to resolve the problem in SEM modeling. The study found that the interactive control system brought a positive and significant influence on intended strategy; the interactive control system brought a positive and significant influence on implemented strategy; the interactive control system brought a positive and significant influence on emergent strategy. The limitation of this study is that our empirical model only used a one-way relationship between the process of strategy formation and the interactive control system.

  17. Interactive Control System, Intended Strategy, Implemented Strategy dan Emergent Strategy

    OpenAIRE

    Tubagus Ismail; Darjat Sudrajat

    2012-01-01

    The purpose of this study was to examine the relationship between management control system (MCS) and strategy formation processes, namely: intended strategy, emergent strategy and implemented strategy. The focus of MCS in this study was the interactive control system. The study was based on Structural Equation Modeling (SEM) as its multivariate analysis instrument. The samples were upper middle managers of manufacturing companies in Banten Province, DKI Jakarta Province and West Java Province. AM...

  18. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of capabilities of the principal available systems and highlight one or two trends. The reference to the most recent full survey of computer algebra in relativity and brief descriptions of the Maple, REDUCE and SHEEP and other applications are given. (author)

  19. Scottish Nuclear's information systems strategy

    International Nuclear Information System (INIS)

    Inglis, P.

    1991-01-01

    Scottish Nuclear, the company which has owned and operated Scotland's nuclear power generating capacity since privatization, inherited a substantial amount of computer hardware and software from its predecessor, the South of Scotland Electricity Board. Each of the two power stations, Torness and Hunterston, were using Digital Vax clusters as the Scottish Nuclear company was formed. This had a major influence on the information systems strategy which has subsequently been adopted. (UK)

  20. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  1. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  2. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  3. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  4. Modeling and Generating Strategy Games Mechanics

    OpenAIRE

    Mahlmann, Tobias

    2013-01-01

    Strategy games are a popular genre of games with a long history, originating from games like Chess or Go. The first strategy games were published as “Kriegspiele” (Engl.: wargames) in the late 18th century, intended for the education of young cadets. Since then strategy games have been refined and transformed over two centuries into a medium of entertainment. Today’s computer strategy games have their roots in the board- and role-playing games of the 20th century and enjoy great popularity. We use str...

  5. Strategies in the HSCM

    Directory of Open Access Journals (Sweden)

    Hung-Chang Liao

    2015-01-01

    Full Text Available The purpose of this study was to establish a hospital supply chain management (HSCM) model in which three kinds of drugs in the same class and with the same indications were used in creating an optimal robust design and adjustable ordering strategies to deal with a drug shortage. The main assumption was that although each doctor has his/her own prescription pattern, when there is a shortage of a particular drug, the doctor may choose a similar drug with the same indications as a replacement. Four steps were used to construct and analyze the HSCM model. The computation technology used included a simulation, a neural network (NN), and a genetic algorithm (GA). The mathematical methods of the simulation and the NN were used to construct a relationship between the factor levels and performance, while the GA was used to obtain the optimal combination of factor levels from the NN. A sensitivity analysis was also used to assess the change in the optimal factor levels. Adjustable ordering strategies were also developed to prevent drug shortages.
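
    The computational core described above, a simulation-trained neural-network surrogate searched by a genetic algorithm, can be sketched compactly. Here the surrogate is just a stand-in quadratic function and the GA is a bare-bones real-coded one, not the paper's actual HSCM model (target values and GA settings are made up):

        import numpy as np

        rng = np.random.default_rng(0)

        def surrogate(x):
            """Stand-in for the trained NN mapping factor levels to performance (higher is better)."""
            target = np.array([0.3, 0.7, 0.5])            # hypothetical optimal factor levels
            return -np.sum((x - target) ** 2, axis=-1)

        def genetic_search(pop_size=40, n_factors=3, generations=60, sigma=0.1):
            pop = rng.random((pop_size, n_factors))        # factor levels scaled to [0, 1]
            for _ in range(generations):
                fitness = surrogate(pop)
                parents = pop[np.argsort(fitness)[-pop_size // 2:]]   # keep the better half
                children = parents[rng.integers(len(parents), size=pop_size)]
                pop = np.clip(children + rng.normal(0.0, sigma, children.shape), 0.0, 1.0)
            return pop[np.argmax(surrogate(pop))]

        print("best factor levels:", genetic_search())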

  6. Exploiting Virtualization and Cloud Computing in ATLAS

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    This work will present the current status of the Virtualization and Cloud Computing R&D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a "cloud factory" for managing cloud VM instances. Ne...

  7. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolution behaviour correctly in order to definitely achieve the desired and preset objective(s). As a case study, we propose a formal framework on parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results present the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspect by taking this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in evolutionary computation algorithm.
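
    To show where a strategy-selection rule plugs into an evolutionary loop, here is a much simpler stand-in than the paper's game-theoretic mechanism: probability-matching operator selection inside a (1+1) hill climber. This is explicitly not the Nash-equilibrium formulation of the record, just a minimal illustration of operators being chosen according to the payoff they earn (objective, operators and reward rule are all made up):

        import random

        OPERATORS = ["small_mutation", "large_mutation"]

        def mutate(x, op):
            step = 0.05 if op == "small_mutation" else 0.5
            return x + random.uniform(-step, step)

        def fitness(x):
            return -(x - 3.0) ** 2                    # toy objective with its maximum at x = 3

        def evolve(generations=500):
            x = 0.0
            credit = {op: 1.0 for op in OPERATORS}    # running payoff earned by each operator
            for _ in range(generations):
                op = random.choices(OPERATORS, weights=[credit[o] for o in OPERATORS])[0]
                candidate = mutate(x, op)
                gain = fitness(candidate) - fitness(x)
                if gain > 0:
                    x = candidate
                    credit[op] += gain                # reward the operator that produced the gain
            return x, credit

        print(evolve())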

  8. LIMS strategy review

    International Nuclear Information System (INIS)

    Ibsen, T.G.

    1995-01-01

    The evaluation team found that the LABCORE program has a viable Laboratory Information Management System (LIMS) that will handle the sample tracking and reporting necessary to accomplish the requirements established by the system specifications. The team also found that the LABCORE program has made real accomplishments and that it has been moving in the right direction, for example the phased implementation at the 222-S laboratory. The LABCORE program has many strengths but it also has shortcomings that need to be addressed. The strengths are that the hardware, software, staff, and the computer and training facilities are all first rate. The shortcomings are that it is missing a strategy outlining the goals which would clearly define when the project implementation has been successfully completed. It also had excessive cost during its acquisition. Specific recommendations are made. As a whole, the evaluation team believes that LABCORE is the system that was specified and will meet the needs of both laboratories, 222-S and the Waste Sampling and Characterization Facility. LABCORE has experienced all the challenges of a new computer system but with continued effort on the part of the LABCORE team and laboratory personnel the system will operate as needed

  9. Estimated mortality on HIV treatment among active patients and patients lost to follow-up in 4 provinces of Zambia: Findings from a multistage sampling-based survey.

    Directory of Open Access Journals (Sweden)

    Charles B Holmes

    2018-01-01

    treatment at 4.6/100 person-years (95% CI 3.9-5.5), 2.9/100 person-years (95% CI 2.1-3.9) in year 2, and approximately 1.6% per year through 8 years on treatment. In multivariate analysis, patient-level factors including male sex and pretherapy CD4 levels and WHO stage were associated with higher mortality among new ART users, while male sex and HIV disclosure were associated with mortality among all ART users. In both cases, being late (>14 days late for an appointment) or lost (>90 days late for an appointment) was associated with deaths. We were unable to ascertain the vital status of about one-quarter of those lost and selected for tracing and did not adjudicate causes of death. HIV treatment in Zambia is not optimally effective. The high and sustained mortality rates and marked under-reporting of mortality at the provincial level and unexplained heterogeneity between regions and sites suggest opportunities for the use of corrected mortality rates for quality improvement. A regionally representative sampling-based approach can bring gaps and opportunities for programs into clear epidemiological focus for local and global decision makers.

  10. Understanding and preventing computer vision syndrome.

    Science.gov (United States)

    Loh, KY; Reddy, SC

    2008-01-01

    The invention of the computer and advances in information technology have revolutionized and benefited society, but at the same time have caused symptoms related to computer usage such as ocular strain, irritation, redness, dryness, blurred vision and double vision. This cluster of symptoms is known as computer vision syndrome, which is characterized by the visual symptoms that result from interaction with a computer display or its environment. Three major mechanisms lead to computer vision syndrome: the extraocular mechanism, the accommodative mechanism and the ocular surface mechanism. The visual effects of the computer, such as brightness, resolution, glare and quality, are all known factors that contribute to computer vision syndrome. Prevention is the most important strategy in managing computer vision syndrome. Modification of the ergonomics of the working environment, patient education and proper eye care are crucial in managing computer vision syndrome.

  11. UNDERSTANDING AND PREVENTING COMPUTER VISION SYNDROME

    Directory of Open Access Journals (Sweden)

    REDDY SC

    2008-01-01

    Full Text Available The invention of the computer and advances in information technology have revolutionized and benefited society, but at the same time have caused symptoms related to computer usage such as ocular strain, irritation, redness, dryness, blurred vision and double vision. This cluster of symptoms is known as computer vision syndrome, which is characterized by the visual symptoms that result from interaction with a computer display or its environment. Three major mechanisms lead to computer vision syndrome: the extraocular mechanism, the accommodative mechanism and the ocular surface mechanism. The visual effects of the computer, such as brightness, resolution, glare and quality, are all known factors that contribute to computer vision syndrome. Prevention is the most important strategy in managing computer vision syndrome. Modification of the ergonomics of the working environment, patient education and proper eye care are crucial in managing computer vision syndrome.

  12. ATLAS Distributed Computing in LHC Run2

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2015-01-01

    The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run2. An increased data rate and computing demands of the Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (ProdSys2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward the flexible computing model. The flexible computing utilization exploring the opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model, the data access mechanisms have been enhanced with the remote access, and the network topology and performance is deeply integrated into the core of the system. Moreover a new data management strategy, based on defined lifetime for each dataset, has been defin...

  13. Analysis on the security of cloud computing

    Science.gov (United States)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it is leading a revolution in IT and the information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems that are the main obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  14. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
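
    The article's central product is a machine-readable metadata file. The element names below are hypothetical and only illustrate the kind of XML such a designer might emit for genotype, growth conditions, a perturbation and sampling times; they are not the real XEML schema:

        import xml.etree.ElementTree as ET

        # Hypothetical structure, not the actual XEML dialect or Xeml Environment Ontology terms.
        exp = ET.Element("experiment", id="example-01")
        ET.SubElement(exp, "genotype", species="Arabidopsis thaliana", line="Col-0")
        growth = ET.SubElement(exp, "growthConditions")
        ET.SubElement(growth, "parameter", name="photoperiod", value="12", unit="h")
        ET.SubElement(growth, "parameter", name="temperature", value="21", unit="degC")
        ET.SubElement(exp, "perturbation", type="extended_night", start="day20")
        sampling = ET.SubElement(exp, "sampling")
        ET.SubElement(sampling, "timepoint", time="end_of_day", tissue="rosette")

        # Write the metadata file alongside the experimental data.
        ET.ElementTree(exp).write("experiment_metadata.xml", encoding="utf-8", xml_declaration=True)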

  15. A pedagogic JavaScript program for point location strategies

    KAUST Repository

    de Castro, Pedro; Devillers, Olivier

    2011-01-01

    Point location in triangulations is a classical problem in computational geometry. And walking in a triangulation is often used as the starting point for several nice point location strategies. We present a pedagogic JavaScript program demonstrating some of these strategies, which is available at: www-sop.inria.fr/geometrica/demo/point location strategies/.
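
    One of the walking strategies the demo illustrates, the visibility walk, starts from a known triangle and repeatedly steps through an edge that separates the current triangle from the query point, using orientation tests. A minimal Python sketch under assumed data structures (each triangle stores its vertex indices counter-clockwise and the neighbor across each edge); for Delaunay triangulations this walk is guaranteed to terminate:

        def orient(a, b, p):
            """> 0 if p is left of the directed line a->b, < 0 if right, 0 if collinear."""
            return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

        def visibility_walk(points, triangles, neighbors, start, q):
            """Walk from triangle `start` toward the query point q.
            triangles[t] = (i, j, k): vertex indices in counter-clockwise order.
            neighbors[t] = triangle across edge (i,j), (j,k), (k,i), or None on the hull."""
            t = start
            while True:
                tri = triangles[t]
                for e in range(3):
                    a, b = points[tri[e]], points[tri[(e + 1) % 3]]
                    if orient(a, b, q) < 0:          # q lies on the far side of this edge
                        t = neighbors[t][e]
                        if t is None:
                            return None              # q is outside the triangulation
                        break
                else:
                    return t                         # q is not beyond any edge: triangle found

        # Tiny example: a unit square split along its diagonal into two triangles.
        pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
        tris = [(0, 1, 2), (0, 2, 3)]
        nbrs = [(None, None, 1), (0, None, None)]
        print(visibility_walk(pts, tris, nbrs, start=0, q=(0.2, 0.8)))   # locates triangle 1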

  16. Mental Time Travel, Memory and the Social Learning Strategies Tournament

    Science.gov (United States)

    Fogarty, L.; Rendell, L.; Laland, K. N.

    2012-01-01

    The social learning strategies tournament was an open computer-based tournament investigating the best way to learn in a changing environment. Here we present an analysis of the impact of memory on the ability of strategies entered into the social learning strategies tournament (Rendell, Boyd, et al., 2010) to modify their own behavior to suit a…

  17. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  18. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs

  19. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    [Presentation slides; only fragments of the text are recoverable.] We run IT systems that: provide medical care, pay the warfighters, manage maintenance ... users; 1,400 applications; 18 facilities; 180 software vendors; 18,000+ copies of executive software products; virtually every type of mainframe and ...

  20. Bacterial computing with engineered populations.

    Science.gov (United States)

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms by describing a number of results from the recently completed bacterial computing with engineered populations project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  2. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  3. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
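
    The Token Bucket policer that the model is built around is itself simple; a minimal sketch of its dynamics (independent of the paper's multiplexor model and feedback-control extension):

        import time

        class TokenBucket:
            """Classic token bucket: tokens accrue at `rate` per second up to `capacity`;
            a packet needing `size` tokens conforms only if enough tokens remain."""

            def __init__(self, rate, capacity):
                self.rate = rate
                self.capacity = capacity
                self.tokens = capacity
                self.last = time.monotonic()

            def allow(self, size):
                now = time.monotonic()
                self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= size:
                    self.tokens -= size
                    return True      # packet conforms to the traffic profile
                return False         # packet violates the profile: drop, delay or mark it

        bucket = TokenBucket(rate=1000.0, capacity=1500.0)   # e.g. tokens counted in bytes
        print(bucket.allow(1200), bucket.allow(1200))        # second packet exceeds the bucket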

  4. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves for identifying learner readiness levels and defining the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good example of planning for any subject, given its unanticipated convergence of visual and technical abilities with linguistic abilities.

  5. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region of the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools

  6. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  7. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign the computing to a great number of distributed computers, rather than local computer ...

  8. A facile and selective approach for enrichment of l-cysteine in human plasma sample based on zinc organic polymer: Optimization by response surface methodology.

    Science.gov (United States)

    Bahrani, Sonia; Ghaedi, Mehrorang; Ostovan, Abbas; Javadian, Hamedreza; Mansoorkhani, Mohammad Javad Khoshnood; Taghipour, Tahere

    2018-02-05

    In this research, a facile and selective method was described to extract l-cysteine (l-Cys), an essential α-amino acid for anti-ageing playing an important role in human health, from a human blood plasma sample. The importance of this research was the mild and time-consuming synthesis of zinc organic polymer (Zn-MOP) as an adsorbent and evaluation of its ability for efficient enrichment of l-Cys by the ultrasound-assisted dispersive micro solid-phase extraction (UA-DMSPE) method. The structure of Zn-MOP was investigated by FT-IR, XRD and SEM. Analysis of variance (ANOVA) was applied to the experimental data to reach the best optimum conditions. The quantification of l-Cys was carried out by high performance liquid chromatography with UV detection set at λ = 230 nm. The calibration graph showed reasonable linear responses towards l-Cys concentrations in the range of 4.0-1000 μg/L (r² = 0.999) with a low limit of detection (0.76 μg/L, S/N = 3) and RSD ≤ 2.18 (n = 3). The results revealed the applicability and high performance of this novel strategy in detecting trace l-Cys by Zn-MOP in complicated matrices. Copyright © 2017 Elsevier B.V. All rights reserved.
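
    The figures of merit quoted above (linear range, r², detection limit) come from an ordinary least-squares calibration line. A sketch with made-up calibration points, using a blank-based detection limit of the common 3-sigma form (the paper itself states its LOD at S/N = 3):

        import numpy as np

        # Hypothetical calibration data: l-Cys spiking level (ug/L) vs chromatographic peak area.
        conc = np.array([4.0, 20.0, 100.0, 400.0, 1000.0])
        area = np.array([3.1, 15.4, 76.0, 305.0, 762.0])

        slope, intercept = np.polyfit(conc, area, 1)
        predicted = slope * conc + intercept
        r2 = 1.0 - np.sum((area - predicted) ** 2) / np.sum((area - area.mean()) ** 2)

        # Blank-based detection limit: 3 x standard deviation of blank responses / slope.
        blank_sd = 0.19                    # hypothetical noise of repeated blank injections
        lod = 3.0 * blank_sd / slope

        print(f"slope = {slope:.3f}, r^2 = {r2:.4f}, LOD = {lod:.2f} ug/L")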

  9. Strategy Development in Organisations

    DEFF Research Database (Denmark)

    Sørensen, Lene

    1999-01-01

    There exist certain ambiguities in the converging fields of information technology and organisational strategy development. The term "IT strategy" has evolved and reflects in some respects this confusion. This paper discusses some of the ambiguities and difficulties of the term "IT strategy" as used in practice and literature. Emphasis is put on how the term is related to the problem, the organisation, the strategy process and the practical way of methodologically developing the strategy. Finally, alternative strategy development perspectives are presented.

  10. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous as they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to mathematically or analytically solve them completely at low cost. On the other hand, nature already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems such that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As only a few researchers in that field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On occasion of his 60th birthday, a collection of original papers of leading researchers in the field of computational intell...

  11. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general...... kinds of evolutionary algorithms, have been prudently analyzed. This analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been...... topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most...

  12. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years. Among the systems to be replaced are the PDC's for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDC's, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  13. Nurturing a growing field: Computers & Geosciences

    Science.gov (United States)

    Mariethoz, Gregoire; Pebesma, Edzer

    2017-10-01

    Computational issues are becoming increasingly critical for virtually all fields of geoscience. This includes the development of improved algorithms and models, strategies for implementing high-performance computing, or the management and visualization of the large datasets provided by an ever-growing number of environmental sensors. Such issues are central to scientific fields as diverse as geological modeling, Earth observation, geophysics or climatology, to name just a few. Related computational advances, across a range of geoscience disciplines, are the core focus of Computers & Geosciences, which is thus a truly multidisciplinary journal.

  14. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  15. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  16. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  17. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science has been evolving continuously over the past decades. This has brought forth new knowledge that should be incorporated, and new learning strategies that must be adopted, for the successful teaching of all its sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…

  18. WE-G-BRD-02: Characterizing Information Loss in a Sparse-Sampling-Based Dynamic MRI Sequence (k-T BLAST) for Lung Motion Monitoring

    International Nuclear Information System (INIS)

    Arai, T; Nofiele, J; Sawant, A

    2015-01-01

    Purpose: Rapid MRI is an attractive, non-ionizing tool for soft-tissue-based monitoring of respiratory motion in thoracic and abdominal radiotherapy. One big challenge is to achieve high temporal resolution while maintaining adequate spatial resolution. K-t BLAST, a sparse-sampling and reconstruction sequence based on a-priori information, represents a potential solution. In this work, we investigated how much “true” motion information is lost as a-priori information is progressively added for faster imaging. Methods: Lung tumor motions in the superior-inferior direction obtained from ten individuals were replayed on an in-house, MRI-compatible, programmable motion platform (50 Hz refresh and 100 micron precision). Six water-filled 1.5 ml tubes were placed on it as fiducial markers. Dynamic marker motion within a coronal slice (FOV: 32×32 cm², resolution: 0.67×0.67 mm², slice thickness: 5 mm) was collected on a 3.0 T body scanner (Ingenia, Philips). Balanced-FFE (TE/TR: 1.3 ms/2.5 ms, flip angle: 40 degrees) was used in conjunction with k-t BLAST. Each motion was repeated four times as four k-t acceleration factors 1, 2, 5, and 16 (corresponding frame rates 2.5, 4.7, 9.8, and 19.1 Hz, respectively) were compared. For each image set, one average motion trajectory was computed from the six marker displacements. Root mean square (RMS) error was used as a metric of spatial accuracy, where measured trajectories were compared to the original data. Results: Tumor motion was approximately 10 mm. The mean (standard deviation) of respiratory rates over the ten patients was 0.28 (0.06) Hz. Cumulative distributions of tumor motion frequency spectra (0-25 Hz) obtained from the patients showed that 90% of motion fell at 3.88 Hz or less; therefore, the frame rate must be double that or higher for accurate monitoring. The RMS errors over patients for k-t factors of 1, 2, 5, and 16 were 0.10 (0.04), 0.17 (0.04), 0.21 (0.06) and 0.26 (0.06) mm, respectively. Conclusions: A k-t factor of 5 or higher can cover the high
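
    The accuracy metric used above is a plain root-mean-square difference between the trajectory measured at each k-t acceleration factor and the programmed ground-truth motion. A sketch of that computation (the sinusoidal "tumor" trace, frame rate and noise level below are made up; the study replayed real patient traces):

        import numpy as np

        def rms_error(t_meas, x_meas, t_true, x_true):
            """RMS difference between a measured trajectory and the programmed ground truth,
            with the ground truth interpolated onto the measurement time stamps."""
            x_ref = np.interp(t_meas, t_true, x_true)
            return np.sqrt(np.mean((x_meas - x_ref) ** 2))

        # Hypothetical 10 mm peak-to-peak motion at 0.28 Hz, sampled at 4.7 Hz with noise.
        t_true = np.linspace(0.0, 30.0, 3001)
        x_true = 5.0 * np.sin(2.0 * np.pi * 0.28 * t_true)       # mm, superior-inferior
        t_meas = np.arange(0.0, 30.0, 1.0 / 4.7)
        x_meas = np.interp(t_meas, t_true, x_true) + np.random.default_rng(1).normal(0.0, 0.2, t_meas.size)

        print(f"RMS error: {rms_error(t_meas, x_meas, t_true, x_true):.2f} mm")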

  19. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  20. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.